Have you ever needed to get somewhere really quickly? Consider the things that go through your mind. This parallels how storage management is typically done in today’s data centers. But we could do better.
For example, what if you lived in San Diego and needed to be in San Francisco ASAP? You throw the two locations into Google Maps and see that you can drive there in eight hours. The train will take 12 hours, so that’s not going to work. You can fly from San Diego to San Francisco in 1.5 hours. Add a drive to the airport, arriving two hours prior to departure, and an hour or so to get from SFO to your customer, and it will take at least six hours to get where you’re going. Like many business travelers, you decide on the plane.
This is the way we manage data in today’s data centers. We first look at the objectives, and then we look at the options available to us. If the application needs the best performance available, we move it to flash. But what if the application also needs up-to-the-minute recovery capabilities, such as those CDP products provide? What if the data also has long-term storage needs such as those provided by an object storage solution? This requires you to place and manage the data on multiple platforms in order to meet those multiple needs.
IT can divide most storage needs into two categories: performance and protection. The performance category contains things like read/write speed, latency, bandwidth and IOPS. The protection category contains needs such as availability, persistence, security, disaster recovery, and long-term accessibility. What if you could simply define those needs and then have the data placed automatically in the appropriate location to meet them? This is referred to as managing data by its objectives.
Consider the data needs of a Docker container. The container has high performance needs because it’s possible to create, start, shut down and delete a container all in a matter of a few seconds. (Apple, for example, starts up a separate container for every Siri request.)
If you are managing data by objectives, you would simply push the performance slider all the way to the right – especially if the slider goes to 11. But the same data has no protection requirements; it is completely ephemeral. Push that slider completely to the left. A data virtualization product that manages by objectives will ensure that this container’s data resides only on flash, has no snapshots or backups, and will definitely not be on any long-term storage systems.
The opposite is true of PDF copies of every order your company has ever received for its services. Performance is almost irrelevant, but persistence and long-term accessibility are paramount. Slide the performance slider all the way to the left and the protection slider all the way to 11. The data will magically be written to object-based storage and replicated to multiple locations. It may also be copied into another system, depending on the capabilities of your environment.
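The two examples above can be sketched as a toy placement policy. Everything here is a hypothetical illustration of managing data by objectives — the slider scales, tier names, and function names are assumptions for the sketch, not any vendor’s actual API:

```python
# Toy sketch: map declared objectives (two "sliders") to a placement.
# All names and thresholds are illustrative assumptions, not a real product API.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Objectives:
    performance: int  # 0 (irrelevant) .. 11 (maximum)
    protection: int   # 0 (ephemeral) .. 11 (paramount)

@dataclass
class Placement:
    tier: str
    actions: List[str] = field(default_factory=list)

def place(obj: Objectives) -> Placement:
    """Choose a storage tier and protection actions from the objectives."""
    # The performance slider picks the media tier.
    if obj.performance >= 8:
        tier = "flash"
    elif obj.performance >= 4:
        tier = "hybrid"
    else:
        tier = "object-storage"
    # The protection slider picks copy and replication behavior.
    actions: List[str] = []
    if obj.protection >= 8:
        actions += ["replicate-multi-site", "snapshot", "backup"]
    elif obj.protection >= 4:
        actions += ["snapshot"]
    return Placement(tier, actions)

# Ephemeral container data: performance at 11, protection at 0.
print(place(Objectives(performance=11, protection=0)))
# Archived order PDFs: performance at 0, protection at 11.
print(place(Objectives(performance=0, protection=11)))
```

The point of the sketch is that the application owner declares only the two objective values; the tier choice and the snapshot/replication behavior fall out of policy, rather than being picked silo by silo.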
Storage Swiss Take
Managing data by objectives makes so much more sense than managing by storage silo capabilities. Businesses will need to be careful, of course, that they do not allow business units to always slide both sliders to 11, or they will fill every storage system to the brim. The key is to match objectives to business value and cost. Data generating the most revenue and driving the most value to the company can finally get whatever it needs, without ignoring the needs of “less important” data.
Sponsored by Primary Data