October 06, 2021
Datto Cloud vs Cloud Backup Replication
There was a time when taking a backup and ensuring a copy existed in another physical location was considered best practice, but this is no longer the case. Rather than relying on on-premises backup alone, more managed service providers (MSPs) are looking to the cloud to save their clients time. For companies still taking a backup and shipping it elsewhere, a true cloud failover should be a consideration, because it provides access to data within 24 hours in the event of a system failure. In the worst-case scenario, when the primary location where data resides is inaccessible, retrieving data from another location and restoring from backup can be so time-consuming that the resulting downtime cripples operations and can potentially ruin the business.
The other limitation is that offsite backup storage typically comes with a pay-per-consumption pricing model, which is variable and hard to predict over time. Solutions that provide an option for “running offsite” will typically charge for CPU/RAM utilisation, not only during an actual disaster event but for testing as well. Again, these costs are variable, challenging to predict, and can become extremely expensive if utilised for an extended period of time.
Conversely, Datto Cloud offers a flat-rate model, whether for storage-based one-year or infinite cloud retention. This predictable pricing allows MSPs to give clients a cost that will not change as data grows over time. With Datto Cloud, disaster recovery testing can also be performed as often as quarterly at no additional charge, and there is no additional cost for CPU/RAM utilisation when an end-user needs to leverage the service. More importantly, downtime is significantly reduced.
The MSP can work with our dedicated support team to bring up a client’s entire environment, not only to show how long the process took, but also to show what the performance of the client’s environments looks like when running in the cloud.
Documenting the following key steps to data recovery will also drastically reduce the time for an end-user to connect to their systems running in Datto Cloud.
- Orchestration – Determine the order of processes to follow to bring systems online
- Resource Dedication – CPU/RAM dedication for each system
- Networking – Does everything live on a single flat network, or have multiple subnets/VLANs been created?
- Remote Access – Remote Desktop, OpenVPN, or site-to-site IPsec VPN
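The checklist above amounts to a written runbook the MSP can follow under pressure. As a minimal sketch of what such a documented plan might look like in machine-readable form, the snippet below captures boot order, dedicated resources, network placement, and remote access for a handful of hypothetical systems. The system names, VLANs, and resource figures are illustrative assumptions, not a Datto API or real client configuration.

```python
# Hypothetical DR runbook sketch -- system names, VLANs, and sizing
# are invented for illustration; this is not a Datto interface.
from dataclasses import dataclass
from typing import Optional


@dataclass
class SystemPlan:
    name: str                  # system identifier
    boot_order: int            # orchestration: lower values boot first
    cpus: int                  # dedicated vCPUs for this system
    ram_gb: int                # dedicated RAM in GB
    vlan: Optional[str] = None # None means a single flat network


# Orchestration: bring core infrastructure (domain controller,
# database) online before the application servers that depend on it.
runbook = [
    SystemPlan("app01", boot_order=3, cpus=4, ram_gb=16, vlan="servers"),
    SystemPlan("dc01",  boot_order=1, cpus=2, ram_gb=8,  vlan="servers"),
    SystemPlan("sql01", boot_order=2, cpus=4, ram_gb=32, vlan="servers"),
]

# Remote access method agreed with the client in advance.
remote_access = "site-to-site IPsec VPN"  # or Remote Desktop / OpenVPN


def boot_sequence(plans):
    """Return system names in the order they should be brought online."""
    return [p.name for p in sorted(plans, key=lambda p: p.boot_order)]


print(boot_sequence(runbook))  # -> ['dc01', 'sql01', 'app01']
```

Having the order, sizing, and networking written down ahead of time is what lets an MSP and the support team bring an environment up quickly instead of reconstructing these decisions during the disaster itself.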
While the end-user is working offsite, their dataset is loaded onto a new SIRIS, a Datto disaster recovery appliance, since the original was likely destroyed in the disaster. When seeding is complete, the SIRIS is shipped overnight directly to the end-user. Once it arrives, they have the option of turning off the systems running in the Datto Cloud, downloading the last 24 hours of block-level changed data, and running and recovering the entire environment from the new local appliance.
In moments of disaster, every second counts. Storing data in the cloud can be a lifesaver for accessing critical data quickly and reducing the risk of serious downtime and loss of productivity.