Backing up a server, physical or virtual, and its associated software applications should be easy, yet nine times out of ten there are problems. For years, enterprise customers have relied on industry leaders such as Commvault Software, NAKIVO, Veritas Technologies, Retrospect, Veeam Software, Dell Technologies, and IBM to provide complex and diverse backup solutions that ensure data is backed up and secured.
Any large-scale business today has multiple levels of data protection in place to stop unauthorised access to the network: anti-virus, malware scanners, firewalls and so on. The list is endless, and these products are kept up to date with patches and updates, even replaced on a regular basis. I bet the same isn't true of the business's backup software. To be honest, it's probably doing an okay job, but it isn't great.
The reason: a restore could be required in 5 or 10 years, and without the backup application that wrote the data, it is lost forever. Humans like to feel comfortable, and we don't like change that makes us feel uncomfortable. Because our backups are okay, we really don't want the hassle of thinking about how we are going to bring back data from 5-10 years ago. Ideally, restore that legacy data and put it in the cloud; after "X" years, delete the lot and put the money towards infrastructure upgrades.
How many times a week does a van turn up to collect backup data to store offsite, adding to our CO₂ footprint? Are businesses still doing this today? Of course they are; remember, we don't like change. Sometimes I want to shout, "Wake up, people, get out of the comfort zone", and tell them to talk to a business that understands all types of data.
The problem is that many of these backup applications were written when the only things that needed backing up were servers and a few software applications. Within the past 10 years there have been massive advances in computing applications, processing power, and storage density. Backup software has gradually become more monolithic.
Today, performing a nightly or weekly backup of a system or application isn't enough. Businesses are looking for CDP (continuous data protection) and near-instantaneous recovery of applications and systems in the event of data loss or failure. A robust data backup strategy solution must scale, perform well, be easy to use, and meet the needs of the business by backing up a multitude of applications and systems.
Data today is stored and accessible in a huge variety of ways. It is this complexity that makes it difficult to find a data backup strategy solution that can simply and easily back up from a diverse range of platforms. Most large businesses today do not rely on a single vendor for backup, and this adds to the cost and complexity of implementing an effective data backup strategy.
Humans like to generate and share data. By 2020, an estimated 44 ZB (zettabytes) of data were being generated annually, and the figure is still increasing. By 2025, IDC estimates we will be creating 463 EB (exabytes) of data daily, or roughly 168 ZB annually: nearly a fourfold increase over the 2020 figure. Clearly, any data backup strategy solution needs to evolve, scale, and adapt to meet these requirements.
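The arithmetic behind those figures is easy to check. A minimal sketch (the unit conversion is mine; the 44 ZB and 463 EB figures are the IDC estimates quoted above):

```python
# Sanity-check the quoted data-growth estimates.
EB_PER_ZB = 1000  # 1 zettabyte = 1000 exabytes

daily_2025_eb = 463                              # 463 EB per day (2025 estimate)
annual_2025_zb = daily_2025_eb * 365 / EB_PER_ZB # convert to ZB per year
annual_2020_zb = 44                              # 44 ZB per year (2020 estimate)

print(f"2025 annual volume: {annual_2025_zb:.0f} ZB")            # ~169 ZB
print(f"Growth over 2020:   {annual_2025_zb / annual_2020_zb:.1f}x")  # ~3.8x
```

Roughly 169 ZB a year against 44 ZB: close to a fourfold jump in five years, which is the scale any backup solution now has to plan for.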
More people have access to computers, mobile devices, drones, and IoT devices, and above all to technology that enables higher-resolution images, software that analyses complex data sets, messaging apps, and smart devices such as phones, fridges, washing machines, and TVs. The possibilities are endless, and they all necessitate a future-proof data backup strategy.
The AMD EPYC 7002 Series Processors set a new standard for the modern data centre with 64 cores and 8.34 billion transistors on a 7 nm die, and they are being used to analyse and process information. Just five years ago, an Intel Xeon E5 had 18 cores and 5.5 billion transistors on a 22 nm die. Things are getting smaller and faster. Hard disk capacities five years ago topped out at 4 TB; today 18 TB drives exist from Seagate, and it is predicted that by 2025 hard disk capacities will reach 100 TB. A well-thought-out data backup strategy is essential to manage these advancements.
What can we do? Sit back, do nothing, and wonder why we aren't providing data protection?
Every three years, a company should evaluate its data backup strategy and ask, "Is it good enough?" Or better yet, "Is it the best we can afford?" Backup software and data backup strategy solutions should be continuously updated to provide the highest level of protection.
When I started my business 29 years ago, backup software was hideously complex to configure, requiring users to load backup agents, create backup schedules, and sort out tape rotations. Today, with the right data backup strategy, businesses can avoid these headaches and ensure reliable data protection.
The data many businesses will generate over the next five years is certainly going to grow. That data needs to be stored, protected, and backed up. Planning a scalable data backup strategy now will prevent future challenges.
Backing up 50TB today could easily become 250TB in six years. Remember, upgrading the data backup strategy solution will likely involve a network infrastructure upgrade as well. A strong data backup strategy must consider network performance and scalability to handle such growth.
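The 50 TB to 250 TB example above implies a steady compound growth rate, and it also shows why the network matters. A minimal sketch (the 10 Gbit/s link speed and 80% effective-throughput figure are illustrative assumptions, not recommendations):

```python
# Growth and backup-window arithmetic for the 50 TB -> 250 TB example.

start_tb, end_tb, years = 50, 250, 6
cagr = (end_tb / start_tb) ** (1 / years) - 1   # compound annual growth rate
print(f"Implied annual data growth: {cagr:.0%}")  # ~31%

# Time for a single full backup of 250 TB over a dedicated 10 Gbit/s link,
# assuming ~80% effective throughput (an illustrative rule of thumb).
link_gbps = 10
effective_bytes_per_sec = link_gbps * 1e9 / 8 * 0.8
hours = end_tb * 1e12 / effective_bytes_per_sec / 3600
print(f"Full backup of {end_tb} TB over {link_gbps} GbE: ~{hours:.0f} hours")  # ~69 hours
```

Fivefold growth in six years is roughly 31% a year, and a full backup of the resulting 250 TB would tie up a 10 GbE link for the best part of three days: exactly why a backup upgrade usually drags a network upgrade along with it.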
Any business today needs to think about running at least CDP (continuous data protection). There is no way UK WAN link speeds will increase five to tenfold by 2025, so a petascale business needs a data backup strategy to store data locally in the data centre rather than relying solely on the cloud. A robust data backup strategy ensures that data can be recovered efficiently, whether restoring a single file or 10TB of data from a failed server.