How to Track the Cost of Storage, Part 2 – Lowering TCO

In the first article we opened with a short discussion of why TCO calculations are worth the time they consume. Essentially, any realistic effort to reduce overall costs must start from an accurate baseline of what the existing system costs. We also noted the startling fact that many IT organizations have no real idea how much their storage is costing them, even though capacity is probably growing anywhere from 30 to 50 percent each year.

After generating at least an initial cost-of-ownership figure, IT managers can turn their attention to lowering that cost. Aside from the obvious measures, such as reducing duplicate data through snapshots, clones and deduplication, hidden costs need to be addressed. These include operational inefficiency, orphaned capacity and low utilization. To attack these areas, administrators may need better visibility into their storage infrastructures.

Efficiency

Factors like administrator efficiency can greatly impact the cost of storage. Administrative time is affected by the quality of the management software available, such as storage resource management (SRM) tools. Good SRM tools can reduce TCO by reducing the time needed to support each TB of storage, on both the data protection side (is information being backed up accurately and consistently?) and the capacity management side (how much storage does the environment have, how much of it is unused, and how does the virtual environment map back to physical storage?). In this area, third-party products like APTARE's StorageConsole 8 can be especially effective, since they often provide a better overview of the infrastructure than the tools included with a storage system.
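
As a back-of-the-envelope illustration of how administrative time feeds into TCO, the Python sketch below turns an assumed support effort per TB into an annual cost per TB. Every figure in it is hypothetical, not a measured benchmark:

    # Hypothetical illustration of the administrative-time component of storage TCO.
    # Every input here is an assumption, not a measured figure.
    loaded_cost = 120_000   # fully loaded annual cost per storage admin, USD
    work_hours = 2_000      # working hours per admin per year
    managed_tb = 1_500      # TB under management

    def admin_cost_per_tb(hours_per_tb):
        """Annual administration cost per TB at a given support effort."""
        return hours_per_tb * (loaded_cost / work_hours)

    # Suppose better SRM tooling cuts support effort from 6 to 4 hours per TB per year.
    for label, hours in (("without SRM", 6.0), ("with SRM", 4.0)):
        per_tb = admin_cost_per_tb(hours)
        print(f"{label}: {hours} h/TB/yr -> ${per_tb:,.0f}/TB/yr, "
              f"${per_tb * managed_tb:,.0f} across {managed_tb:,} TB")

Under these assumptions, trimming two hours of support per TB per year saves $120 per TB, or $180,000 annually across the environment; the exact numbers matter less than having the formula in hand.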

Capacity and Utilization

The amount of 'raw' storage required to provide adequate 'usable' capacity for new systems is also fundamental to a TCO calculation. This net usable percentage is a function of the RAID levels implemented and the features available, such as thin provisioning, deduplication and compression. But efficiency is also affected by the utilization rate: the percentage of a system's usable capacity that's actually holding data. Day-to-day operations in every environment can generate a significant amount of wasted space that erodes storage system utilization.
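
To make the raw-versus-usable arithmetic concrete, here is a minimal sketch. The RAID overheads, data-reduction ratio and capacities are illustrative assumptions; real systems will also lose capacity to spares and metadata:

    # Illustrative raw -> usable -> utilized capacity arithmetic.
    # RAID overheads and the data-reduction ratio are assumptions, not vendor figures.
    raw_tb = 100.0
    written_tb = 55.0            # data actually written by hosts
    data_reduction = 2.0         # assumed combined deduplication/compression ratio

    raid_usable_fraction = {     # approximate, ignoring spares and system metadata
        "RAID 10": 0.50,         # mirroring keeps half the raw capacity
        "RAID 6 (6+2)": 6 / 8,   # two parity drives per eight-drive group
    }

    for level, fraction in raid_usable_fraction.items():
        usable = raw_tb * fraction
        effective = usable * data_reduction
        print(f"{level}: {usable:.0f} TB usable, {effective:.0f} TB effective, "
              f"{written_tb / effective:.0%} utilized")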

Examples include unallocated LUNs, and LUNs that are allocated but undiscovered or unused by hosts. Likewise, 'orphaned' LUNs, those still allocated to hosts that have since been decommissioned, consume capacity that's doing no work yet can be hard to actually identify in the storage environment. Good SRM tools can identify wasted space throughout the infrastructure and help reclaim it, improving utilization. They can also be used to drive best practices in storage provisioning and help maintain higher utilization.
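
The logic an SRM tool applies to find this waste amounts to set arithmetic: compare what the arrays have allocated against what live hosts actually see. The inventories below are invented for illustration; a real tool would collect them from array and host APIs:

    # Sketch of orphaned/undiscovered LUN detection via set arithmetic.
    # Inventories are invented; a real SRM tool collects these from arrays and hosts.
    array_luns = {               # LUN id -> (mapped host, size in TB)
        "lun-01": ("db-prod-1", 4.0),
        "lun-02": ("web-03", 2.0),      # host decommissioned -> orphaned
        "lun-03": (None, 8.0),          # allocated on the array, mapped to no host
        "lun-04": ("db-prod-1", 4.0),
    }
    live_hosts = {"db-prod-1"}           # from a CMDB or host discovery
    host_visible_luns = {"lun-01"}       # LUNs hosts have discovered and are using

    reclaimable = 0.0
    for lun, (host, size_tb) in array_luns.items():
        if host is None:
            reason = "unmapped"
        elif host not in live_hosts:
            reason = "orphaned (host decommissioned)"
        elif lun not in host_visible_luns:
            reason = "mapped but undiscovered"
        else:
            continue                     # LUN is mapped, discovered and in use
        reclaimable += size_tb
        print(f"{lun}: {size_tb} TB {reason}")
    print(f"Potentially reclaimable: {reclaimable} TB")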

For example, a standard practice is to provision more storage than each application needs, mainly because storage admins don't have a clear idea how much an application will consume and don't want to impact users. An SRM tool can provide the accurate usage data needed to set realistic provisioning amounts up front. This can result in less waste and lower costs without adversely affecting users.
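
Given the usage history an SRM tool collects, right-sizing can be as simple as provisioning to observed demand plus a growth allowance. The policy sketched below is one hypothetical approach, not a feature of any particular product:

    # Hypothetical right-sizing policy: provision to observed peak usage plus a
    # growth allowance, instead of a flat oversized default.
    import statistics

    monthly_usage_tb = [1.1, 1.3, 1.4, 1.6, 1.7, 1.9]   # invented usage samples

    def recommended_allocation(usage, headroom=0.25, min_alloc=0.5):
        """Peak observed usage plus ~6 months of growth, plus headroom."""
        peak = max(usage)
        growth = statistics.mean(
            later - earlier for earlier, later in zip(usage, usage[1:])
        )
        return max(min_alloc, (peak + 6 * growth) * (1 + headroom))

    flat_default = 10.0   # assumed "safe" default allocation made without usage data
    rec = recommended_allocation(monthly_usage_tb)
    print(f"recommended: {rec:.1f} TB vs flat default: {flat_default:.1f} TB")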

Visibility

These tools can also increase visibility into the virtual infrastructure, the networks and the storage, reducing management time and improving performance. In addition, good SRM tools can provide critical information in a timely fashion, resulting in better decisions and better uptime. For example, many companies maintain spreadsheets to help with storage provisioning and management; others outsource the entire process. In those cases, the question to ask is: "What's the cost of waiting for critical data to be culled from spreadsheets, or to be collected and sent back by an outsourced infrastructure reporting service?" Similarly, is the intelligence gained from methods as simple as spreadsheets good enough to support the best decisions? Finally, is the organization skipping the collection of certain types of data altogether because doing so is too labor-intensive with the current software tools, like those provided by hardware vendors?

As infrastructures expand and data sets grow, the management costs of storage systems rise and their efficiency can diminish. TCO calculations provide a baseline for determining the efficiency of existing storage systems and a foundation for return on investment (ROI) analysis when considering new ones. In those cases, accurate TCO calculations are essential for a meaningful comparison of alternatives, whether that means buying new systems or upgrading existing ones.
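
What makes that comparison meaningful is an annualized cost per usable TB computed the same way for every alternative. A minimal sketch, with every figure assumed:

    # Minimal annual-TCO-per-TB comparison; every figure below is an assumption.
    def annual_tco_per_tb(purchase, years, annual_support, annual_admin,
                          annual_power_space, usable_tb):
        """Annualized total cost of ownership per usable TB."""
        annual = purchase / years + annual_support + annual_admin + annual_power_space
        return annual / usable_tb

    existing = annual_tco_per_tb(purchase=600_000, years=5, annual_support=90_000,
                                 annual_admin=250_000, annual_power_space=40_000,
                                 usable_tb=400)
    proposed = annual_tco_per_tb(purchase=450_000, years=5, annual_support=60_000,
                                 annual_admin=150_000, annual_power_space=25_000,
                                 usable_tb=500)
    print(f"existing: ${existing:,.0f}/TB/yr  proposed: ${proposed:,.0f}/TB/yr")

Holding the formula constant is the point: changing the amortization period or dropping a cost category for one option quietly invalidates the comparison.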

This article and the previous one have summarized the key components of a TCO analysis for an enterprise storage organization and discussed ways to lower that cost. The goal for any enterprise should be to figure out what it is paying to own and manage its storage annually, then project what will happen to that cost as data continues to grow rapidly. Once that analysis is complete, the objective should be to manage the long-term annual TCO and bring it down through the use of good SRM tools.
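
To see why that projection matters, it is enough to compound the 30 to 50 percent growth rates cited earlier. The sketch below assumes cost scales linearly with capacity, which ignores falling hardware prices but makes the trend clear:

    # Naive five-year projection: annual TCO grows with capacity at 30-50% per year.
    # Assumes cost per TB stays flat, which ignores hardware price declines.
    baseline_tco = 500_000   # assumed current annual TCO, USD
    for growth in (0.30, 0.50):
        year5 = baseline_tco * (1 + growth) ** 5
        print(f"{growth:.0%} annual growth -> ${year5:,.0f} annual TCO in 5 years")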

Twelve years ago George Crump founded Storage Switzerland with one simple goal: to educate IT professionals about all aspects of data center storage. He is the primary contributor to Storage Switzerland and a heavily sought-after public speaker. With over 25 years of experience designing storage solutions for data centers across the US, he has seen the birth of technologies such as RAID, NAS, SAN, virtualization, cloud and enterprise flash. Prior to founding Storage Switzerland, he was CTO at one of the nation's largest storage integrators, where he was in charge of technology testing, integration and product selection.
