Visualizing Before Adopting the Big Three Storage Trends

In a recent paper titled “3 Storage Trends, 1 Crucial Question” by Phil Godwin, Vice President of Marketing at Clear Tech, the author describes three storage trends: Flash Storage, Tiered Storage and Cloud Storage. He goes on to point out that IT managers are, at a minimum, aggressively investigating these trends and, in many cases, are already in the early phases of implementation. Each of these initiatives promises to reduce capital and operational costs while improving data center responsiveness, but each requires an understanding of what data sets the data center holds and what their access profiles are. This is an area where most storage hardware vendors fall short and, as Godwin correctly points out, one that IT planners should be well equipped to address.

Information Not Data

There are countless tools that can collect data from storage systems, file systems, applications and operating systems, most of them provided by the vendor that manufactured the system. While these tools have come a long way in recent years, mining their output for the valuable information it contains remains a problem. IT planners don’t have time to sift through raw data; they need actionable information that can be quickly understood, so they can determine how to get the most out of these emerging technologies.

Holistic Not Myopic

First and foremost, vendor-provided data collection tools are usually myopic, meaning they take an overly focused view of the environment. This probably stems from each vendor’s view of reality: that its product is the center of the universe. Unfortunately, this view is simply not representative of the modern data center, which has a mixture of storage systems, hypervisors, virtual machines and applications.

The trends themselves are excellent examples of this mixture. Flash systems are frequently bought from a different vendor to augment an existing storage system; cloud storage is often bought, or “outsourced,” to offload data from an existing storage system; and tiering is often used to facilitate the movement of data between the two new initiatives (flash and cloud) and legacy storage. A vendor-myopic approach may not be able to analyze data across these different storage platforms and provide the IT planner with the information needed to take advantage of each platform’s unique characteristics.

Continuous Not Quarterly

Another key factor in developing intelligence about the data in the data center is the ability to perform a storage asset analysis as frequently as possible. Traditionally, storage assessments have been done by a third-party contractor at most once per quarter, and it can take days, if not weeks, to organize data that quickly goes out of date. Instead, the data center is better served when this analysis runs on a continual basis. Near real-time access to information allows data to be relocated as needed and prevents the performance impact of data sitting on the wrong tier.
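To make the contrast concrete, here is a minimal sketch of what a continuous scan might look like, assuming a POSIX file system where access times (atime) are recorded. The /mnt/storage path, the hourly interval and the age buckets are illustrative assumptions for this example, not recommendations from Godwin’s paper.

```python
import os
import time

def scan_access_profile(root):
    """Walk a file tree and bucket files by days since last access."""
    now = time.time()
    profile = {"hot (<30d)": 0, "warm (30-180d)": 0, "cold (>180d)": 0}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                age_days = (now - os.stat(path).st_atime) / 86400
            except OSError:
                continue  # skip files that vanish mid-scan
            if age_days < 30:
                profile["hot (<30d)"] += 1
            elif age_days < 180:
                profile["warm (30-180d)"] += 1
            else:
                profile["cold (>180d)"] += 1
    return profile

if __name__ == "__main__":
    # Run continuously (hourly here), instead of once a quarter.
    while True:
        print(scan_access_profile("/mnt/storage"))  # hypothetical mount point
        time.sleep(3600)
```

Note that a volume mounted with the noatime option will not update access times, so on such systems a scan like this would have to fall back on modification times or application-level metadata.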

Visual Not Verbose

The final challenge with collecting data and transforming it into information is that the typical presentation of that data is textual and too verbose. As Godwin covers in his paper, 65% of people are visual learners. To make matters worse, IT professionals are stretched too thin; they don’t have time to plow through a verbose report. They need the data transformed into information, with specific problem areas brought to their attention visually. As the saying goes, “a picture is worth 1,000 words.” For the overworked IT professional, a visual presentation of this information may be the only form they have time to consume.
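Building on the scan sketch above, a few lines can turn those same buckets into a picture. This assumes the matplotlib plotting library is available; the counts in the usage line are made-up numbers for illustration.

```python
import matplotlib.pyplot as plt

def plot_profile(profile):
    """Render access-age buckets as a bar chart readable at a glance."""
    plt.bar(list(profile.keys()), list(profile.values()))
    plt.ylabel("File count")
    plt.title("Data by time since last access")
    plt.show()

# Made-up numbers for illustration only
plot_profile({"hot (<30d)": 1200, "warm (30-180d)": 5400, "cold (>180d)": 48000})
```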

Applying Visual Information To Storage Trends

Armed with a visual assessment of their storage assets, IT professionals can make better decisions about these new storage trends. Flash still carries a price premium over hard disk drives, so making sure the right data is on flash storage is critical to maximizing performance and minimizing cost. Cloud storage is ideal for data that has become less active or whose recall time will not impact revenue; again, visualizing which data belongs on this tier is critical to project success. Tiering is the engine that makes the other two technologies practical, and knowing what to feed the tiering engine, and when to override it, will ensure maximum benefit with minimum user impact.
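As a simple illustration of the kind of rule a tiering engine applies, the sketch below maps time since last access to one of the three tiers discussed here. The 30- and 180-day thresholds are assumptions chosen for the example; a real tiering engine would also weigh I/O activity, business value and recall-time tolerance.

```python
def suggest_tier(age_days, hot_days=30, cold_days=180):
    """Map days since last access to a storage tier (illustrative thresholds)."""
    if age_days < hot_days:
        return "flash"  # active data earns the premium tier
    if age_days < cold_days:
        return "disk"   # warm data stays on existing storage
    return "cloud"      # cold data is offloaded

# Example: a file untouched for a year is a cloud candidate
print(suggest_tier(365))  # -> "cloud"
```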

Conclusion

In his paper, which you can access here, Godwin goes into much more detail about these technologies, their value to the data center, and how to identify this information. In short, IT planners need to resist the often vendor-motivated suggestion to blindly throw hardware at the problem. Instead, it makes sense to step back, learn what types of data you have, and decide how best to apply these technologies to them.

Clear Technologies is a client of Storage Switzerland

Twelve years ago George Crump founded Storage Switzerland with one simple goal: to educate IT professionals about all aspects of data center storage. He is the primary contributor to Storage Switzerland and a highly sought-after public speaker. With over 25 years of experience designing storage solutions for data centers across the US, he has seen the birth of such technologies as RAID, NAS, SAN, virtualization, cloud and enterprise flash. Prior to founding Storage Switzerland he was CTO at one of the nation's largest storage integrators, where he was in charge of technology testing, integration and product selection.

