Rethink Your Storage Strategy to Enable the Software-defined Data Center

It’s nearly impossible to turn anywhere without hearing about “IT modernization.” The term may be overused, but it is hard to overstate its importance. At its core, IT modernization means adopting a software-defined data center architecture that provides the agility and scalability of public cloud services on-premises. These characteristics are paramount because businesses must respond to market dynamics more quickly than ever before, and they are relying on IT to do so.

You don’t just flip a switch and have a modernized data center, though. In fact, with business and application needs more dynamic than ever, the software-defined data center is less a hard-and-fast destination than an ongoing initiative characterized by constant infrastructure optimization.

Along those lines, projects typically start out small and are dispersed across multiple teams that are working with a variety of technologies to address specific application requirements. Further adding to this sprawl is the fact that business-critical applications such as Oracle still run on legacy infrastructure in many cases, and as a result must still be accommodated. Legacy infrastructure is typically also application-specific. As a result, IT is faced with a management nightmare while at the same time being pressured to deliver unprecedented levels of data sovereignty, data protection, and agility.

Massive transformation around application architectures and delivery is the tip of the spear driving the need for data center modernization. Traditional applications such as Oracle database, Microsoft SQL Server and virtual desktop infrastructure were designed with the philosophy that the infrastructure is resilient, providing availability and redundancy. Emerging applications such as NoSQL databases, Hadoop, Splunk, and Spark, however, take on this redundancy themselves at the application layer – thus requiring a fundamentally different architecture. Especially at the beginning of the modernization journey, IT is faced with the challenge of supporting both types of applications.

Storage sits at the heart of changing data center architecture requirements – and as a result is the key roadblock to the software-defined data center. The shared block storage that legacy applications expect does not meet the latency, scalability and other requirements of modern applications and unstructured data – lending credence to new approaches including local and object storage.

The advent of containerization creates further challenges. Where a virtual machine has a single, recognizable IP address, a containerized approach might require 10 or more IP addresses that are impractical to track manually. Dozens, hundreds, thousands or more of these containers might be spun up and down by the business every day. Meanwhile, larger, newer classes of data (for example, machine learning data sets) are emerging and growing rapidly. Legacy infrastructure was simply not designed for this level of change and agility.

StorageSwiss Take

The transition to the modern, software-defined data center requires a full revamp of traditional storage environments. Serving application-specific needs is more important than ever in today’s application-driven economy. This, along with the fact that legacy applications still need to coexist with modern applications, means that IT departments will contend with silo sprawl for the foreseeable future. As a result, IT departments should prioritize a storage architecture that automates data placement, can orchestrate heterogeneous workloads, and has autonomous capabilities. With application service level agreements ruling not just storage – but broader, data center-level infrastructure decisions – the storage architecture should be “data center aware,” capable of actions such as identifying network traffic changes and taking corrective measures as a result. In this vein, as more functionality becomes abstracted into software, the storage software should be aware of the capabilities of specific nodes for optimization.

For additional insights on the storage requirements of a software-defined data center strategy, watch Storage Switzerland’s webinar in conjunction with Datera, “Overcoming the Storage Roadblock to Data Center Modernization,” on demand.

Senior Analyst, Krista Macomber produces analyst commentary and contributes to a range of client deliverables including white papers, webinars and videos for Storage Switzerland. She has a decade of experience covering all things storage, data center and cloud infrastructure, including: technology and vendor portfolio developments; customer buying behavior trends; and vendor ecosystems, go-to-market positioning, and business models. Her previous experience includes leading the IT infrastructure practice of analyst firm Technology Business Research, and leading market intelligence initiatives for media company TechTarget.
