Storage architects are responsible for placing workloads on the right storage tier, which requires carefully balancing the required level of performance against the right price. The problem is that these architects do not have the tools they need to assess workload performance requirements or to validate the storage architecture’s ability to deliver what is required. As a result, they end up relying on ad hoc testing, vendor claims, and educated guesses. Because of this tool deficiency, organizations over-provision their storage architectures, which wastes money and still leaves them exposed to unpredictable performance.
Most organizations perform storage performance validation after the fact. An application or set of applications has scaled to a point where the current storage architecture can no longer keep up, or at least do so consistently. This scenario leads the organization to embark on a storage refresh, but the same tool deficiency means IT cannot accurately test new storage hardware and software to see whether, and by how much, it will outperform the current solution. Once again, the organization is at the mercy of vendor claims and various benchmark tools that bear little resemblance to the customer’s application workloads.
Another problem facing IT is that many storage systems today claim “mixed workload support.” The ability to mix workloads enables storage consolidation, which should reduce the cost of the overall storage architecture. But since most tools are limited to a specific type of basic test and a single workload, IT can neither confirm these claims nor know how far multiple workloads will scale on a given storage platform.
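The consolidation question above is essentially a capacity-fit calculation: do the combined demands of several workloads stay within the platform’s rated capability, and with how much headroom? A minimal Python sketch, in which all workload names, IOPS figures, and the platform limit are invented for illustration (not taken from any vendor’s tool or datasheet):

```python
# Illustrative mixed-workload fit check. All numbers are hypothetical
# examples, not measurements from any real storage platform.

workloads = {
    "oltp_db":   {"iops": 15000},   # transactional database
    "vdi_pool":  {"iops": 8000},    # virtual desktop pool
    "file_svcs": {"iops": 4000},    # general file services
}

PLATFORM_IOPS_LIMIT = 30000  # hypothetical rated capability of the array


def consolidation_headroom(workloads, limit):
    """Sum the demand of all workloads and return remaining headroom.

    A negative result means the mix will not fit on the platform."""
    total = sum(w["iops"] for w in workloads.values())
    return limit - total


headroom = consolidation_headroom(workloads, PLATFORM_IOPS_LIMIT)
print(headroom)  # 3000 IOPS of headroom remain
```

The arithmetic is trivial; the hard part, as the article notes, is obtaining trustworthy per-workload demand figures and a validated platform limit, which is exactly what single-workload benchmark tools fail to supply.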
The final challenge in managing workload placement is the lack of planning tools. Workload performance problems happen suddenly, which disrupts day-to-day IT activities and often leads to a rushed evaluation of the storage systems under consideration for the refresh.
Storage Performance Validation as a Practice
If IT treats workload placement as a practice, storage architects can better manage both workload requirements and storage refresh cycles. Not only will they know when the current system will hit the proverbial wall because of application traffic and data growth, but they will also be able to calculate the impact of a new application that was unknown when the organization originally purchased the storage system. With a storage performance validation practice in place, workload placement and storage refreshes become planned, scheduled events instead of drop-everything fire drills. The practice should lead to more effective IT spending, fewer surprise performance problems, and a more productive IT team. To enable a storage performance validation practice, however, IT needs better tools at its disposal.
Virtual Instruments recently announced a significantly enhanced and re-branded version of its Load DynamiX Enterprise storage performance validation solution: WorkloadWisdom. The product “records” current customer IO profiles, either in real time or from existing storage platforms, and then “plays them back” on any potential new storage system. More importantly, WorkloadWisdom gives IT the ability to “turn up” any aspect of its IO to understand how much further the storage platform will scale, or what the impact of a new or changing workload will be on performance. WorkloadWisdom can also replay a variety of workloads simultaneously, so the storage architect can see how well a potential new system handles the workload mix.
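Conceptually, the record-and-replay approach captures a production IO profile and then re-drives it, optionally amplified, against a candidate system. The sketch below illustrates that idea only; the profile fields, function names, and numbers are all hypothetical and are not the WorkloadWisdom API:

```python
# Conceptual sketch of "record, turn up, replay" for a captured IO profile.
# The profile format and helpers are illustrative inventions, not any
# vendor's actual interface.

def scale_profile(profile, factor):
    """Return a copy of a captured profile with its IO intensity turned up."""
    scaled = dict(profile)
    scaled["iops"] = int(profile["iops"] * factor)
    return scaled


def projected_throughput_mbps(profile):
    """Throughput implied by the profile: IOPS x block size, in MB/s."""
    return profile["iops"] * profile["block_size_kb"] / 1024


# A captured production workload: 8 KB blocks, 70% reads, 20,000 IOPS.
captured = {"iops": 20000, "block_size_kb": 8, "read_pct": 70}

# "Turn up" the workload 1.5x to probe headroom on a candidate array.
stressed = scale_profile(captured, 1.5)
print(stressed["iops"])                     # 30000
print(projected_throughput_mbps(stressed))  # 234.375
```

The value of driving a candidate array with a scaled copy of the real profile, rather than a synthetic benchmark, is that the read/write mix and block sizes match what production will actually generate.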
New features in WorkloadWisdom 6.0 include a new 25GbE workload generator to test high-performance Ethernet storage. The new platform also delivers a 5-10X improvement in results reporting over the previous version. Faster reporting means the storage evaluation team can run more tests more frequently, to better explore the full potential of the storage system. Lastly, the solution enhances the ability to test storage technologies accessed via the SMB protocol. The Workload Data Importer can now capture SMB production workloads directly from Virtual Instruments’ performance monitoring platform, VirtualWisdom. This addition means WorkloadWisdom can easily perform storage performance validation across NFS, Fibre Channel, Object Storage, iSCSI and now SMB protocols.
WorkloadWisdom also brings the former Load DynamiX product more in line with Virtual Instruments’ broader vision of an integrated application and storage performance monitoring and validation suite, what Virtual Instruments calls App-centric Infrastructure Performance Management. WorkloadWisdom 6.0 features an updated user interface, which now mirrors the look and feel of the easy-to-use VirtualWisdom interface.
Providing a consistent IT experience while managing costs is a constant challenge, made more difficult by the rapidly expanding data center. Storage is almost always a key ingredient in delivering that consistent user experience, yet IT is severely limited in its ability to determine whether new storage systems will meet current and future workload demands. The result is initial over-provisioning followed by eventual inconsistencies in performance.
Products like WorkloadWisdom from Virtual Instruments address the performance validation problem head-on, enabling customers to efficiently evaluate different vendors and to determine the capabilities of both current and future systems. When combined with Virtual Instruments’ VirtualWisdom, the customer has a complete infrastructure performance management platform that can predict the future as well as manage the present.