The new state of data protection, in which enterprises must protect exponentially growing volumes of copy data worldwide while delivering instant, up-to-the-minute recovery, places new demands on the secondary storage infrastructure.
The key problem is that data protection infrastructure is becoming more fragmented by use case. In the past, many organizations standardized on a single data protection solution. Today, it is common for multiple data protection solutions to coexist in the data center, largely because applications and workloads carry more individualized recovery and retention requirements that must be met. In addition, many data protection solutions, whether legacy or newer, do not yet support some new applications and workloads.
While most applications carry demanding recovery time objectives (RTOs) and recovery point objectives (RPOs), some are stricter than others; a transactional database may tolerate only minutes of downtime and data loss, for example, while a file archive can accept far looser targets. Certain applications might also be fueling secondary business initiatives such as application development or analytics, with implications for which data needs to be restored and how quickly it must get back online. Meanwhile, the growing number of data privacy regulations affects enterprise offices in specific regions of the world and can touch certain applications more than others. Application owners typically want a best-of-breed solution that meets their particular need. Further contributing to this sprawl, it is becoming more common for some data protection capabilities to be built directly into the applications themselves, which the application owner might want to take advantage of.
Fragmented data protection implementations create an expensive problem in the form of secondary storage hardware and software sprawl. Data protection systems consume large, and growing, portions of IT capex budgets and data center floorspace, both of which are at a premium. Costs escalate further with power and cooling requirements and with the IT staff time needed to deploy and manage these systems. On top of that, multiple software licenses must be procured and managed. This not only eats up budget, but also hinders IT’s ability to be agile and react quickly to business needs.
Many enterprises are turning to the cloud for a lower and more predictable cost structure, and for a data protection implementation that is more centralized and streamlined yet still addresses applications’ varying requirements. However, if not done correctly, the cloud can in fact add costs and introduce new silos. Storage Switzerland’s recent webinar with Actifio, “How to Create an Infrastructure-less Backup Strategy,” dives deeper into what storage professionals should look for in a software-as-a-service model to meet changing data protection requirements.