The Next Generation Data Center is a Model all Enterprises can use

It’s true that the concept of the Next Generation Data Center (NGDC) was born in the cloud provider space as a way to describe the infrastructure these companies need to meet their often extreme resource requirements. Cloud providers have to satisfy their customers’ growing appetites for storage, compute and networking, and do so for all of them simultaneously. They often have to do it with little advance notice, under some unforgiving service level agreements – and still turn a profit. Interestingly, these are the same kinds of demands now being placed on traditional “mainstream” environments.

While some companies actually build these kinds of data centers, many more probably never will. But the NGDC concept still has a lot of value for those with more “mainstream” IT environments. It’s really a model of infrastructure performance and efficiency, of operational excellence and cost-effectiveness, that all enterprises can use. In the final analysis, what IT manager doesn’t want to reduce cost, increase efficiency and improve services for the users they serve?

Watch the On Demand Webinar "Storage Requirements for the Next Generation Data Center"

Expanding storage has traditionally been done with a “worst case scenario” mindset, where systems were over-built because actual demand was difficult to predict and users couldn’t be allowed to run out of capacity or performance. Making matters worse, the scale-up storage systems in use didn’t offer the flexibility to expand gradually; they grew in “leaps and bounds,” adding frames and drive shelves to generate enough performance and wasting capacity in the process.
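As a back-of-the-envelope illustration of that “leaps and bounds” problem (all figures here are hypothetical, not taken from any vendor), consider a scale-up array where capacity and performance can only be added one shelf at a time:

```python
import math

# Hypothetical illustration of "leaps and bounds" scale-up expansion:
# resources are added in shelf-sized increments, so sizing for
# performance strands unused capacity. All figures are assumptions.

shelf_capacity_tb = 60        # capacity added per drive shelf (assumed)
shelf_iops = 10_000           # performance added per shelf (assumed)

needed_iops = 45_000          # workload's performance requirement
needed_capacity_tb = 80       # workload's capacity requirement

# Shelves must be bought to satisfy whichever requirement is larger.
shelves = max(math.ceil(needed_iops / shelf_iops),
              math.ceil(needed_capacity_tb / shelf_capacity_tb))

bought_tb = shelves * shelf_capacity_tb
wasted_tb = bought_tb - needed_capacity_tb
print(shelves, bought_tb, wasted_tb)  # 5 shelves, 300 TB bought, 220 TB stranded
```

In this sketch, hitting the performance target forces the purchase of nearly four times the capacity actually needed – exactly the waste the NGDC model is designed to avoid.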

Scalability and Performance

Like the cloud environment, an enterprise’s storage capacity needs to expand with demand, but do so while maintaining the performance levels that are required to meet service expectations of users. In more and more cases this means a flash-based array that can produce much higher performance as it scales out to the capacity levels needed, instead of the legacy disk-based architecture described above.

Flexible and Efficient

The NGDC concept puts a premium on flexibility: being able to add just the resources that are needed, when they’re needed. This creates a level of efficiency that drives another primary requirement – cost containment. While an enterprise data center may not be the profit center that a cloud provider is, minimizing costs for resources like storage is still a foundational requirement.

The NGDC infrastructure leverages data reduction technologies like deduplication, compression and thin provisioning, applied inline on the primary data storage system. This minimizes the capacity required and thereby reduces the effective cost per TB.
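The arithmetic behind “effective cost per TB” is simple to sketch. The reduction ratios below are illustrative assumptions only, not measured or vendor-quoted figures – actual ratios vary widely by workload:

```python
# Hypothetical illustration: how inline data reduction lowers effective $/TB.
# The ratios and prices below are assumptions, not vendor claims.

raw_capacity_tb = 100.0          # usable capacity purchased
cost_per_raw_tb = 1000.0         # purchase price per usable TB ($)

dedupe_ratio = 2.0               # deduplication: 2:1 (assumed)
compression_ratio = 1.5          # compression: 1.5:1 (assumed)

# Effective capacity is the logical data the array can hold after reduction.
effective_capacity_tb = raw_capacity_tb * dedupe_ratio * compression_ratio

# Total spend stays the same, but it is spread over more logical TB.
effective_cost_per_tb = (raw_capacity_tb * cost_per_raw_tb) / effective_capacity_tb

print(f"Effective capacity: {effective_capacity_tb:.0f} TB")   # 300 TB
print(f"Effective cost/TB: ${effective_cost_per_tb:.2f}")      # $333.33
```

With these assumed ratios, a 3:1 combined reduction cuts the effective cost per TB to a third of the raw price – which is why inline data reduction sits at the center of the NGDC cost model.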

From an operational perspective, the NGDC model emphasizes automation whenever possible, to maximize resource utilization (also called “performance and capacity efficiency”) and keep costs down. Automation also reduces administrative overhead.

But the system must also ensure that users get the capacity and performance they need, all the time, which means a quality of service approach to resource management. The NGDC model has at its core a way to guarantee the performance for each user, even when the storage infrastructure is nearing its actual capacity.
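The core of that quality of service approach is giving each workload a guaranteed performance floor and an upper cap. The class and policy values below are a minimal hypothetical sketch of the concept, not any specific vendor’s implementation:

```python
# Minimal sketch of per-volume QoS with min/max IOPS settings.
# The class, method and numbers are illustrative assumptions only.

class VolumeQoS:
    def __init__(self, name, min_iops, max_iops):
        self.name = name
        self.min_iops = min_iops   # guaranteed floor, honored even under contention
        self.max_iops = max_iops   # ceiling that keeps one user from starving others

    def allocate(self, requested_iops, system_headroom_iops):
        """Grant IOPS between the guaranteed floor and the cap."""
        grant = min(requested_iops, self.max_iops)
        # Under contention, a volume never drops below its guaranteed minimum.
        grant = max(min(grant, system_headroom_iops), self.min_iops)
        return grant

vol = VolumeQoS("db-01", min_iops=1000, max_iops=5000)
print(vol.allocate(requested_iops=8000, system_headroom_iops=3000))  # 3000
print(vol.allocate(requested_iops=8000, system_headroom_iops=500))   # 1000 (floor holds)
```

The key point is the second call: even when the system as a whole is nearly out of headroom, each volume still receives its guaranteed minimum – which is what lets the NGDC model promise predictable performance as utilization climbs.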

Storage Swiss Take

The Next Generation Data Center model was developed in the “crucible” of the cloud provider market where companies simply must do a better job of resource utilization, efficiency and cost containment while they guarantee service levels to their customers. They’ve figured out how to provide utility-like resource provisioning with a quality of service aspect that’s still economical enough to support a business.

Enterprises can leverage the lessons from these cloud providers by applying the NGDC model in their own environments to make themselves better. This includes their storage infrastructures. To learn more about this, tune in to the on-demand webcast from Storage Switzerland and SolidFire “Storage Requirements for the Next Generation Data Center”.

Click To Watch On Demand

SolidFire is a client of Storage Switzerland

Eric is an Analyst with Storage Switzerland and has over 25 years of experience in high-technology industries. He’s held technical, management and marketing positions in the computer storage, instrumentation, digital imaging and test equipment fields. He has spent the past 15 years in the data storage field, with storage hardware manufacturers and as a national storage integrator, designing and implementing open systems storage solutions for companies in the Western United States. Eric earned degrees in electrical/computer engineering from the University of Colorado and marketing from California State University, Humboldt. He and his wife live in Colorado and have twins in college.
