Why speedy backups are key for MSPs
Ineffective tools mean many businesses are unable to carry out daily backups
Backup isn’t simply about storing copies of customers' data; it’s more about what happens when you need to restore it.
When backups take a long time, they burden the IT department that has to manage them, and the process slows the network that carries the data. In practice, the time a backup takes doesn't just influence backup behaviour; it dictates it.
While a daily backup is considered good practice, even this may not be regular enough. If there is a serious crash, a full day’s data could be lost, with restoration taking at least that long, if not longer.
Due to the state of most backup tools, few businesses back up as often as they should. In a SolarWinds MSP survey of 200 IT professionals at SMBs, 32% of respondents admitted to not carrying out daily backups, saying they are too time-consuming, while 31% wished their backups were more efficient.
This is partly due to the sheer size of some backups, but it's also down to the number of organisations that still use tape, which can be a difficult, unreliable and time-consuming medium for backups.
Even when a backup seems to go fine, it may well have failed. In the average data centre, backups only succeed around 85% of the time, according to Gartner. That drops to 75% in the average remote office. Many failures like this aren’t discovered until the data needs to be restored.
For MSPs, speed should be a critical consideration when performing backups. If the selection of data and the time a backup takes can be optimised without compromising the protection of customers' data, applications and systems, then high-speed backup can actually improve data protection while reducing the cost of each backup.
Essentially, only the files that have changed since the last backup need to be copied. When choosing an online backup tool, one that can identify which files have been modified, and what has changed within them, will significantly reduce the raw backup data size, helping to keep backup windows shorter and allowing backups to run more frequently.
Simply put, the less data that has to be sent over a network, the faster the backup will be. Data reduction on this scale opens up a whole new world of possibilities for cloud-based backups, which can be performed faster and far more often.
This approach also benefits the recovery process in the event of a disaster. High-priority data can be restored directly from the cloud right away, so that critical operations can get up and running quickly, followed by the lower-priority files.
Direct-to-cloud backups are also becoming more straightforward for many MSPs. Opting for a cloud-first backup solution satisfies part of the well-known 3-2-1 backup strategy: three copies of your data stored across two media, with one copy located offsite. The production data plus the version in the cloud make two of the recommended three copies, and the cloud copy is offsite by default. For the most critical data, a local backup copy can also be held, meeting the full 3-2-1 requirement.
Keeping pace in today’s digital world demands speed and flexibility, especially for MSPs backing up business intelligence on behalf of their customers.
Prioritising speed when choosing an online backup tool will enable greater flexibility and help to keep customer data current when the time comes to recover it.
Esther is a freelance media analyst, podcaster, and one-third of Media Voices. She has previously worked as a content marketing lead for Dennis Publishing and the Media Briefing. She writes frequently on topics such as subscriptions and tech developments for industry sites such as Digital Content Next and What’s New in Publishing. She is co-founder of the Publisher Podcast Awards and Publisher Podcast Summit; the first conference and awards dedicated to celebrating and elevating publisher podcasts.