The Impact of Changing Data Protection Requirements on Your Infrastructure

Enterprises must now protect exponentially growing quantities of copy data across global operations while delivering instant, up-to-the-minute recovery. This new state of data protection places new demands on secondary storage infrastructure.

The key problem is that data protection infrastructure is becoming fragmented by use case. In the past, many organizations standardized on a single data protection solution. Today, it is common for multiple data protection solutions to coexist in the same data center, largely because applications and workloads carry increasingly individualized recovery and retention requirements. In addition, many data protection solutions, whether legacy or newer, do not yet support some newer applications and workloads.

While most applications carry demanding recovery time objectives (RTOs) and recovery point objectives (RPOs), some are stricter than others. Certain applications might also fuel secondary business initiatives such as application development or analytics, with implications for which data needs to be restored and how quickly it must come back online. Meanwhile, the growing number of data privacy regulations affects enterprise offices in specific regions of the world and can impact certain applications more than others. Application owners typically want a best-of-breed solution that meets their particular needs. Further contributing to this sprawl, it is becoming more common for data protection capabilities to be built directly into the applications themselves, and application owners might want to take advantage of them.
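To make the RPO side of this concrete, the worst-case data loss under a simple periodic backup model is bounded by the interval between backups. The sketch below is illustrative only; the function names are hypothetical, and real RPO planning must also account for replication lag, job duration, and failure modes.

```python
# Simplified model: with periodic backups, a failure just before the next
# backup loses up to one full backup interval of data. So the effective
# worst-case RPO equals the backup interval.

def worst_case_rpo_minutes(backup_interval_minutes: float) -> float:
    """Worst-case data loss (minutes) for a periodic backup schedule."""
    return backup_interval_minutes

def meets_rpo(backup_interval_minutes: float, rpo_minutes: float) -> bool:
    """Check whether a backup schedule can satisfy a stated RPO."""
    return worst_case_rpo_minutes(backup_interval_minutes) <= rpo_minutes

# Nightly backups (every 24 hours) cannot meet a 15-minute RPO,
# but 5-minute snapshots can.
print(meets_rpo(24 * 60, 15))  # False
print(meets_rpo(5, 15))        # True
```

This is why applications with stricter objectives often pull in their own snapshot or replication tooling, while less critical workloads stay on a shared nightly backup, one of the forces behind the fragmentation described above.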

Fragmented data protection implementations create an expensive problem: secondary storage hardware and software sprawl. Data protection systems consume large, and growing, portions of IT capex budgets and data center floorspace, both of which are at a premium. Power and cooling requirements escalate costs further, and IT staff must deploy and manage each of these systems. Multiple software licenses must also be procured and managed. This not only eats up budget; it also hinders IT's ability to be agile and react quickly to business needs.

Many enterprises are turning to the cloud to obtain a lower and more predictable cost structure, and to obtain a data protection implementation that is more centralized and streamlined but that still addresses applications’ varying requirements. However, if not done correctly, the cloud can in fact add costs and introduce new silos. Storage Switzerland’s recent webinar with Actifio, “How to Create an Infrastructure-less Backup Strategy,” dives further into what storage professionals should look for in a software-as-a-service model to meet changing data protection requirements.


Senior Analyst Krista Macomber produces analyst commentary and contributes to a range of client deliverables, including white papers, webinars, and videos for Storage Switzerland. She has a decade of experience covering all things storage, data center, and cloud infrastructure, including technology and vendor portfolio developments; customer buying behavior trends; and vendor ecosystems, go-to-market positioning, and business models. Her previous experience includes leading the IT infrastructure practice of analyst firm Technology Business Research and leading market intelligence initiatives for media company TechTarget.

