This week Storage Switzerland attended SimpliVity’s first analyst and influencer day. The event provided the company with a chance to impress the press and analysts on the capabilities of the product, gave a glimpse into upcoming capabilities, and provided a chance to interact with some of its customers and resellers.
Moving From Greenfield Projects to Data Center Ownership
One of the points that stood out at the event is SimpliVity’s statement that over 60% of the time, IT planners are selecting its solution not for greenfield or niche projects, but as the solution for core data center operations. In many cases, SimpliVity’s customers are now 100% hyperconverged on the SimpliVity platform.
The move from niche project to data center ownership is critical for SimpliVity and its customers. To fully realize the potential of a hyperconverged solution, a data center needs most, if not all, of its applications running on that platform.
Stepping Up to the Hyperconverged Challenges
Storage Switzerland, as we do with any IT concept or technology, points out areas of concern as well as capabilities we like. That does not mean we think the technology/concept is bad. We know no solution is perfect and IT professionals must be aware of weak spots and act accordingly.
One major hyperconvergence challenge is guaranteeing performance: in a shared-everything environment, assuring performance for a specific application is difficult. There are two approaches that we think will work. The first is to provide a performance priority rather than a specific numerical assurance. Essentially, mission-critical applications are assured X% of the available resources at all times. SimpliVity’s next generation of software will, I think, provide some tools to deliver these guarantees.
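To make the percentage-based priority idea concrete, here is a minimal, purely illustrative sketch. It is not SimpliVity’s implementation; the allocator, the `critical_share` parameter, and the workload names are all hypothetical, and it simply guarantees a mission-critical tier a fixed slice of a shared IOPS pool while the remainder is split among everything else.

```python
# Hypothetical sketch of percentage-based performance priority:
# critical workloads are guaranteed a fixed share of the resource pool;
# all other workloads divide whatever is left.

def allocate(total_iops, workloads, critical_share=0.40):
    """Split total_iops so workloads tagged 'critical' collectively
    receive critical_share of the pool; others share the remainder."""
    critical = [w for w in workloads if w["tier"] == "critical"]
    others = [w for w in workloads if w["tier"] != "critical"]
    alloc = {}
    reserved = total_iops * critical_share if critical else 0
    for w in critical:
        alloc[w["name"]] = reserved / len(critical)
    remaining = total_iops - reserved
    for w in others:
        alloc[w["name"]] = remaining / len(others)
    return alloc

pool = 100_000  # total IOPS available across the cluster (illustrative)
vms = [
    {"name": "erp-db", "tier": "critical"},
    {"name": "web-01", "tier": "standard"},
    {"name": "web-02", "tier": "standard"},
]
print(allocate(pool, vms))
# erp-db is guaranteed 40,000 IOPS; each web VM gets 30,000
```

The point of the priority model is visible even in this toy version: the critical workload’s floor never changes as standard workloads are added, only the slice each standard workload receives.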
The second way to assure performance is to make sure the whole environment performs so well that guarantees become less of an issue. SimpliVity provides this ability right now. Its efficiency in inter-node communication, its metadata handling capabilities, and its offloading of key processes onto a dedicated PCI card should smooth out performance spikes considerably. If you have an application that needs a million IOPS, can SimpliVity guarantee that level of performance 100 percent of the time? No. But the number of data centers needing that kind of assurance is a distinct minority. For 99% of the data centers in the world, the SimpliVity approach should work well.
Data Protection Built In
Most SimpliVity customers don’t start talking to the company because of its built-in data protection, but it seems to be the feature that clinches the deal for them. SimpliVity’s data protection leverages its built-in deduplication to make instant copies of virtual machines.
Normally, this would not pass the Storage Switzerland test. To pass, the technique has to be more than a glorified snapshot. SimpliVity’s difference is in the way it protects the data and, more importantly, protects the metadata table that tracks all the information when it creates a deduplicated copy. Assuming it replicates this data to a second SimpliVity cluster, which seems to be a common practice, it does pass the data protection test. The result is SimpliVity protects two copies of the data and the associated metadata, providing good protection for your files.
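The mechanics behind a deduplication-based instant copy can be sketched in a few lines. This is a hypothetical illustration, not SimpliVity’s code: data is stored as content-addressed blocks, so “copying” a VM means duplicating only its metadata table (the list of block hashes), which is why the copy is effectively instant and why protecting that metadata matters so much.

```python
# Hypothetical sketch of a deduplicated, content-addressed store:
# a "copy" duplicates the metadata table, not the underlying blocks.
import hashlib

block_store = {}  # hash -> block bytes (shared, deduplicated pool)

def write_vm(data, block_size=4):
    """Store data as deduplicated blocks; return the metadata table."""
    table = []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        h = hashlib.sha256(block).hexdigest()
        block_store[h] = block          # no-op if the block already exists
        table.append(h)
    return table

vm_meta = write_vm(b"ABCDABCDEFGH")     # 'ABCD' stored once, referenced twice
backup_meta = list(vm_meta)             # instant "copy": metadata only
print(len(block_store))                 # 2 unique blocks back 3 logical ones
```

The sketch also shows the fragility the article alludes to: because every copy resolves through the metadata table, losing that table (or the shared block pool) loses every copy at once, which is why replicating both data and metadata to a second cluster is what earns the approach a passing grade.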
As a data protection purist, I still prefer a second copy of data stored on a totally different system and even a different type of media, such as tape. But SimpliVity goes a long way toward making sure that data under its protection is truly safe from a variety of failures.
Hyperconvergence is at times over-hyped, and that is unfortunate, especially when you consider that vendors like SimpliVity are on the market with a unique set of products. Its solution is not just a re-packaged storage software solution. The good news for SimpliVity, and probably bad news for its competitors, is that there is more to come. SimpliVity is continuing its upward move into enterprise data centers by providing automation, advanced monitoring, and analytics to allow its architecture to truly scale.