Relying on local storage alone is the IT equivalent of walking on thin ice.
Since everyone hates the term “backup,” I refer to it instead as the “pre-restore process” when speaking with my clients. And when I explain the pre-restore process, people treat it very seriously.
That sets the stage for a discussion of where to store the results of your pre-restore process. Locally stored pre-restore files are great because they restore fast, which makes them your first line of defense for data safety. Recovering a file is just a matter of running the restore software and a few mouse clicks, and you're back to work.
But bad things can happen to hardware that clobber your local copies as well. Say someone steals your server; those cretins will likely take your backup server too. Or a water pipe breaks and everything gets wet, including your pre-restore file repository. In a weather emergency, everything suffers.
Replicating your pre-restore files offsite not only adds a layer of protection, it also satisfies auditing requirements and moves you into the realm of disaster recovery. There are hundreds of providers that offer cloud-based backup services — I mean, pre-restore services — but you can also do it yourself.
Since every business has at least two locations (the office and the owner’s home), disk images — what I call pre-restore file sets — can be copied from Location A to Location B, and vice versa, to protect each location’s data. Say you have two dry cleaning businesses on different sides of town. Copy files from one to the other, and you’re covered for most disasters.
For small pre-restore file sets, you may not need any extra equipment. Just write a script that copies files from one site to another, and your latest files will be safe. Most of the newer network-attached storage (NAS) appliances will do this for you with a few minutes of configuration, transferring file sets over the WAN to the NAS in the other location.
If you have large file sets, you may need to upgrade the routers (and Internet connections) at both ends to transfer files in a timely manner. For very large file sets, deduplicate before transfer and you'll save a boatload of time and bandwidth.
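To see why deduplication saves so much, consider that nightly file sets from the same machines are mostly identical. Commercial products deduplicate at the block level; the sketch below only illustrates the idea at the file level, by hashing each file and staging a single copy per unique checksum before transfer. A real tool would also keep a manifest so duplicates can be restored under their original names; the paths here are placeholders:

```shell
#!/bin/sh
# File-level deduplication sketch: stage only one copy of each unique
# file (by SHA-256 checksum) so the WAN transfer skips duplicates.

stage_unique() {
  src="$1"; stage="$2"
  mkdir -p "$stage"
  seen=""
  for f in "$src"/*; do
    [ -f "$f" ] || continue
    sum=$(sha256sum "$f" | cut -d' ' -f1)
    case " $seen " in
      *" $sum "*) ;;                          # same content already staged: skip
      *) seen="$seen $sum"; cp "$f" "$stage/" ;;
    esac
  done
}

# Example: stage_unique /srv/prerestore /srv/prerestore-staged
```

If three branch servers produce near-identical file sets, only one copy of the shared content crosses the wire, which is where the bandwidth savings come from.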
What About the Cloud?
Does this do-it-yourself method protect your data adequately? It depends. In case of fire, yes. Theft, yes. Hurricane or earthquake? A regional disaster can hit both of your locations at once, so for that you will need to go a bit further.
That means the cloud, and there are many options when it comes to cloud data storage. You probably need cloud storage simply because you’ll never carry that external drive offsite, no matter how many Post-It Notes you put on it.
The half-cloud model, where vendors provide hardware for on-premises backup that replicates to the cloud for a second layer of protection, is a hot trend. A few service providers offer software for your server, but more offer hardware as part of the service.
The half-cloud model may offer the best of both worlds. You have your pre-restore process file sets hosted locally, for the fastest possible restore. But the service replicates your files offsite as well, automatically, for added safety.
If you already have a local file set repository, you’re halfway to the hybrid model. Just contract with one of the hundreds of available cloud-based storage services for offsite storage.
Consumers know Carbonite and Mozy because they advertise so much. Both of those services have business-class options too, but once you get past a few dozen users, you'll want a real business solution. There are many to choose from, and prices have dropped because storage costs have dropped. You can now protect a wealth of data for pennies per gigabyte.