Rethink Your Storage Strategy to Enable the Software-defined Data Center

It’s nearly impossible to turn anywhere without hearing about “IT modernization.” The term may be overused, but its importance is hard to overstate. At its core, IT modernization means adopting a software-defined data center architecture that delivers the agility and scalability of public cloud services on-premises. These characteristics are paramount for modern data centers because businesses must respond to market dynamics faster than ever, and they are relying on IT more than ever to do so.

You don’t just flip a switch and have a modernized data center, though. With business and application needs more dynamic than ever, the software-defined data center is an ongoing initiative characterized by constant infrastructure optimization rather than a hard-and-fast destination.

Along those lines, projects typically start small and are dispersed across multiple teams working with a variety of technologies to address specific application requirements. Further adding to this sprawl is the fact that business-critical applications such as Oracle still run on legacy infrastructure in many cases, and as a result must still be accommodated. That legacy infrastructure is typically also application-specific. Consequently, IT faces a management nightmare while being pressured to deliver unprecedented levels of data sovereignty, data protection, and agility.

Massive transformation in application architectures and delivery is the tip of the spear driving the need for data center modernization. Traditional applications such as Oracle Database, Microsoft SQL Server and virtual desktop infrastructure were designed with the philosophy that the infrastructure provides resiliency, availability and redundancy. Emerging applications such as NoSQL databases, Hadoop, Splunk, and Spark instead handle redundancy themselves, which requires a fundamentally different architecture. Especially at the beginning of the modernization journey, IT is faced with the challenge of supporting both types of applications.
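To make the contrast concrete, here is a minimal sketch (all names hypothetical) of what application-level redundancy looks like: rather than trusting a RAID array beneath it, a modern data platform places multiple copies of each data block on distinct nodes itself.

```python
import zlib

# Hypothetical sketch: a modern application replicates each block across
# nodes on its own, instead of relying on RAID-style resilience in the
# storage layer beneath it.
def place_block(block_id, nodes, replicas=3):
    """Pick `replicas` distinct nodes to hold a copy of the block."""
    if replicas > len(nodes):
        raise ValueError("not enough nodes for the requested replica count")
    # Simple deterministic spread: start at a checksum-derived offset and wrap.
    start = zlib.crc32(block_id.encode()) % len(nodes)
    return [nodes[(start + i) % len(nodes)] for i in range(replicas)]

nodes = ["node-a", "node-b", "node-c", "node-d"]
print(place_block("blk-0001", nodes))  # three distinct nodes hold the block
```

Because the application decides placement, losing a node means re-replicating blocks, not rebuilding a disk group, which is exactly the behavior legacy shared storage was never designed around.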

Storage sits at the heart of changing data center architecture requirements, and as a result is the key roadblock to the software-defined data center. The shared block storage that legacy applications expect does not meet the latency, scalability and other requirements of modern applications and unstructured data, giving credence to new approaches including local and object storage.

The advent of containerization creates further challenges. Where a virtual machine has a single, recognizable IP address, a containerized application might require 10 or more addresses, far too many to track by hand. Dozens, hundreds, thousands or more of these containers might be spun up and torn down by the business every day. Meanwhile, larger, newer types of data, such as machine learning data sets, are emerging and growing rapidly. Legacy infrastructure was simply not designed for this level of change and agility.
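A small simulation (all addresses and counts hypothetical) illustrates the scale of the problem: a long-lived VM keeps one stable IP, while a single day of short-lived containers can consume hundreds of addresses from an overlay network's pool.

```python
import ipaddress
import random

random.seed(0)  # deterministic for illustration only
pool = ipaddress.ip_network("10.244.0.0/16")  # a typical overlay-network CIDR
hosts = list(pool.hosts())

def days_container_ips(containers_per_day):
    """Return the distinct IPs handed out during one day's container churn."""
    return {random.choice(hosts) for _ in range(containers_per_day)}

vm_ips = {ipaddress.ip_address("10.0.0.5")}  # one VM, one stable address
container_ips = days_container_ips(500)      # hundreds of ephemeral ones

print(len(vm_ips), len(container_ips))
```

The point is not the specific numbers but the asymmetry: infrastructure built around per-VM addressing cannot keep up when addresses churn by the hundreds daily.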

StorageSwiss Take

The transition to the modern, software-defined data center requires a full revamp of traditional storage environments. Serving application-specific needs is more important than ever in today’s application-driven economy. This, along with the fact that legacy applications still need to coexist with modern applications, means that IT departments will contend with silo sprawl for the foreseeable future. As a result, IT departments should prioritize a storage architecture that automates data placement, can orchestrate heterogeneous workloads, and has autonomous capabilities. With application service level agreements driving not just storage but broader, data-center-level infrastructure decisions, the storage architecture should be “data center aware,” capable of actions such as identifying network traffic changes and taking corrective measures in response. In this vein, as more functionality becomes abstracted into software, the storage software should be aware of the capabilities of specific nodes in order to optimize for them.
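As a rough illustration of what “data center aware” could mean in practice, here is a hypothetical sketch (names, thresholds and the policy itself are all illustrative, not any vendor's implementation): given per-node network latency samples, flag congested nodes and propose migrating their data to the least-loaded healthy node.

```python
# Hypothetical "data center aware" policy sketch: watch a network-latency
# metric per node and suggest corrective data migrations. A real system
# would weigh many more signals (capacity, IOPS, SLAs) before acting.
def corrective_actions(latency_ms, threshold_ms=5.0):
    """Return (congested_node, target_node) migration suggestions."""
    healthy = {n: l for n, l in latency_ms.items() if l <= threshold_ms}
    if not healthy:
        return []  # nowhere to migrate; a real system would throttle instead
    target = min(healthy, key=healthy.get)  # least-loaded healthy node
    return [(node, target) for node, l in latency_ms.items() if l > threshold_ms]

samples = {"node-a": 1.2, "node-b": 9.8, "node-c": 0.9, "node-d": 7.5}
print(corrective_actions(samples))  # → [('node-b', 'node-c'), ('node-d', 'node-c')]
```

The design choice worth noting is that the decision is made from infrastructure-wide telemetry, not from the perspective of a single array, which is precisely what distinguishes this approach from legacy storage management.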

For additional insights on the storage requirements of a software-defined data center strategy, watch Storage Switzerland’s webinar in conjunction with Datera, “Overcoming the Storage Roadblock to Data Center Modernization,” on demand.

Twelve years ago George Crump founded Storage Switzerland with one simple goal: to educate IT professionals about all aspects of data center storage. He is the primary contributor to Storage Switzerland and a heavily sought-after public speaker. With over 25 years of experience designing storage solutions for data centers across the US, he has seen the birth of such technologies as RAID, NAS, SAN, virtualization, cloud and enterprise flash. Prior to founding Storage Switzerland he was CTO at one of the nation's largest storage integrators, where he was in charge of technology testing, integration and product selection.
