As we will discuss in our upcoming webinar, creating a second copy to protect important data is becoming increasingly inefficient, a fact driven home by the relentless growth of ‘copy’ data in companies of all sizes. Setting up dedicated infrastructure to capture and store this extra copy is also expensive. But the time it takes to create that copy, and especially to restore it during a recovery, may be the biggest issue, and the one making traditional backups obsolete.
In an effort to make this process less painful and to keep backups completing in the face of exploding data sets, backup applications have added ways to maintain a logical full copy of current data without actually recopying all of that data. Incremental and differential backups, plus changed block tracking (CBT), can reduce the amount of data that must be captured from the primary data set and handled by the backup system.
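The idea behind CBT can be shown in a few lines. This is a minimal sketch with illustrative names (a toy volume holding byte blocks), not any vendor's actual implementation:

```python
class TrackedVolume:
    """Toy volume that records which blocks change between backups."""

    def __init__(self, num_blocks):
        self.blocks = [b"\x00"] * num_blocks
        self.changed = set()  # block indices dirtied since the last backup

    def write(self, index, data):
        self.blocks[index] = data
        self.changed.add(index)

    def incremental_backup(self):
        """Capture only the blocks changed since the last backup."""
        delta = {i: self.blocks[i] for i in self.changed}
        self.changed.clear()
        return delta

vol = TrackedVolume(1000)
vol.write(3, b"a")
vol.write(42, b"b")
delta = vol.incremental_backup()
# Only 2 of 1,000 blocks are captured, instead of recopying the full set
```

Because the tracker already knows which blocks are dirty, the backup never has to scan the whole volume to find changes.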
Deduplication and image backup can also cut down the sheer volume of data that’s sent to and stored on the backup system. These technologies can effectively shorten the window required to complete a given backup. But they create additional complexity in the process and essentially push the data capacity problem over to the restore side.
Is Traditional Recovery Obsolete?
While these methods help the backup get done, they consume resources, complicate the backup process and, most importantly, don’t help with restores. When a data recovery is needed, most of these processes save no time at all. Deduplicated data must usually be rehydrated, and change-based backups must be applied to the last full copy before a restore can begin. The result is a longer Recovery Time Objective (RTO), the overall time required to restore lost or corrupted data to the application or user that needs it.
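The restore-side cost described above can be sketched as follows. This is a hypothetical illustration, not a real backup tool: before any data is usable, every incremental delta taken since the last full copy must be replayed onto it, in order.

```python
def restore(full_copy, incrementals):
    """Rebuild current state from a full backup plus ordered deltas."""
    blocks = dict(full_copy)      # start from the last full copy
    for delta in incrementals:    # each delta maps block index -> data
        blocks.update(delta)      # replay the changes in order
    return blocks

full = {0: b"base0", 1: b"base1"}
deltas = [{1: b"mon"}, {0: b"tue"}, {1: b"wed"}]
state = restore(full, deltas)
# state == {0: b"tue", 1: b"wed"}; every delta had to be applied first
```

The more increments accumulate between full copies, the longer this replay takes, which is exactly why the RTO grows.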
In addition to recovery time, the Recovery Point Objective (RPO) for traditional backup systems can be inadequate. The RPO defines the points in time from which a restore can be initiated. When a backup is taken once a day, changes made to data sets during the day are vulnerable, since the only recovery available is from the previous night’s backup. Backup frequency can be increased to provide some relief, but this adds to the amount of data collected and to the overall complexity of the backup.
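The exposure is simple arithmetic. A toy illustration, assuming failures can strike at any point in the backup cycle (times are hours since midnight; names are illustrative):

```python
import bisect

def latest_recovery_point(backup_times, failure_time):
    """Return the most recent backup taken at or before the failure."""
    i = bisect.bisect_right(backup_times, failure_time) - 1
    return backup_times[i]

nightly = [0]                  # one backup, at 00:00
hourly = list(range(0, 24))    # a backup at the top of every hour

fail = 17.5                    # failure mid-afternoon
loss_nightly = fail - latest_recovery_point(nightly, fail)  # 17.5h of work lost
loss_hourly = fail - latest_recovery_point(hourly, fail)    # 0.5h of work lost
```

A nightly-only schedule can lose most of a working day, while tighter increments shrink the loss at the cost of more backup runs and more data handled.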
What’s needed is a data protection method that uses the primary data storage infrastructure and doesn’t create extra copies of data in the process.
Snapshots for Backup
Array-based snapshots can do just that. They run on the primary storage device and capture data changes in the smallest possible increments. As an array-based process, a snapshot can be taken in a few seconds, essentially eliminating the backup window altogether. This makes complex incremental backup technologies all but obsolete and allows the data protection process to keep pace with data growth.
Snapshots are also much more efficient than traditional backups, even incremental techniques like CBT, resulting in less storage consumed solely for data protection. And since no dedicated backup software or storage system is required, the infrastructure cost is largely unaffected, including the load on the network that would otherwise move backups from servers and storage systems to the backup target. But the biggest benefit is probably on the recovery side.
Recovering from a snapshot is almost instantaneous: simply load a new set of pointers to the original data blocks and the restore is complete. If the recovery is coming from a secondary array, say for DR purposes, a snapshot-based recovery will take longer. But it will still be dramatically faster than a traditional restore from a DR copy. Snapshots also enable a near-continuous RPO, without any appreciable performance impact if the right snapshot process is used.
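The pointer-swap restore can be sketched with a toy copy-on-write volume. This is a simplified model under stated assumptions (logical block numbers map to physical blocks; all names are illustrative), not any array vendor's design:

```python
class Volume:
    def __init__(self, data):
        self.store = dict(enumerate(data))           # physical blocks
        self.next_id = len(self.store)
        self.pointers = {i: i for i in self.store}   # logical -> physical

    def snapshot(self):
        return dict(self.pointers)   # capture only the pointer table

    def write(self, logical, data):
        # copy-on-write: new data lands in a fresh physical block,
        # so blocks referenced by older snapshots are never overwritten
        self.store[self.next_id] = data
        self.pointers[logical] = self.next_id
        self.next_id += 1

    def restore(self, snap):
        self.pointers = dict(snap)   # near-instant: a pointer swap, no data copied

    def read(self, logical):
        return self.store[self.pointers[logical]]

vol = Volume([b"a", b"b"])
snap = vol.snapshot()
vol.write(0, b"corrupted")
vol.restore(snap)
# vol.read(0) is b"a" again, without copying any data back
```

Since the snapshot is just a pointer table, taking one and restoring from one both cost a fraction of a second regardless of volume size.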
Modernizing Snapshots For Better Backup and Recovery
Snapshot technology has been around for years, but keeping snapshots for historical recovery was impractical because they took up too much space and hampered storage system performance. Now, thanks to the improvements seen in flash-powered hybrid storage systems, the long-term, cost-effective use of snapshots for primary data protection is a reality.
To learn more about how modern hybrid arrays can offer built-in data protection, register for this upcoming webinar from Storage Switzerland. In it, Lead Analyst George Crump will talk with two IT Managers who are leveraging snapshots to protect critical database applications using Nimble Storage hybrid arrays. When you register you’ll receive an advance copy of Storage Switzerland’s new report “Can Array-based Snapshots Save Backup?” prior to the webinar.