In our recent webinar, “The Truth About All-Flash Deduplication”, now available for on-demand viewing, we asked attendees where they are applying deduplication technology to improve storage efficiency. It was little surprise that backup storage, chosen by 36% of respondents, scored highly, since deduplicating backup data provides such a high level of efficiency. What was surprising was that primary storage deduplication was our top vote getter!
38% of our attendees indicated that they were already deduplicating data on primary storage, mostly in NAS use cases targeting home directory data sets. Also interesting was the number of respondents already using deduplication in their hybrid and All-Flash array systems. While the primary storage deduplication use case does not provide the same level of efficiency as deduplicated backup, primary storage costs more per gigabyte, so deduplicating it typically generates a much higher rate of return on investment, especially in the All-Flash and hybrid markets.
The adoption of primary storage deduplication is a significant development. When we asked a similar question about 18 months ago, fewer than 5% of respondents were using primary storage deduplication. That 30%+ gain in adoption is something both storage vendors and IT planners should be aware of.
Dedupe Is an Arrow in the Quiver
During this webinar we were also asked to compare deduplication to other data efficiency technologies. The fact that data is growing comes as no surprise to anyone; it is the rate at which data is growing that is catching many people, even IT veterans, off guard. To control this growth, IT professionals need a variety of tools at their disposal.
Our view is that deduplication should be looked at as one arrow in the data efficiency quiver. You can eliminate a lot of data redundancy by leveraging thin provisioning, snapshots, clones and compression. A vendor can have deduplication take the place of some of those technologies, or implement it alongside them to wring out every last bit of data efficiency.
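To make the idea concrete, here is a minimal sketch of content-addressed, fixed-block deduplication: identical blocks are detected by hashing and stored only once, while a per-file "recipe" of hashes allows the original data to be rebuilt. The 4 KB block size, SHA-256 hashing, and in-memory dictionary store are all illustrative assumptions, not how any particular array implements the feature.

```python
import hashlib

BLOCK_SIZE = 4096  # assumed fixed block size; real systems vary and may use variable-length blocks


def dedupe(data: bytes, block_size: int = BLOCK_SIZE):
    """Split data into fixed-size blocks, keeping each unique block once.

    Returns (store, recipe): store maps a SHA-256 digest to the block's
    bytes; recipe is the ordered list of digests needed to rebuild data.
    """
    store = {}
    recipe = []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # only the first copy is kept
        recipe.append(digest)
    return store, recipe


def rehydrate(store, recipe):
    """Reassemble the original data from the block store and recipe."""
    return b"".join(store[d] for d in recipe)


# Redundant data, as home directory data sets often contain.
data = b"A" * 8192 + b"B" * 4096 + b"A" * 8192
store, recipe = dedupe(data)
ratio = len(data) / sum(len(b) for b in store.values())  # 2.5:1 here
assert rehydrate(store, recipe) == data
```

The same hash-and-index mechanism underpins both backup and primary storage deduplication; what differs between implementations is where the index lives and when the hashing happens, which is exactly where flash can expose weak designs.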
Primary storage deduplication is here, and users are implementing it. The uncertainty around the technology is fading, and I think we are getting close to the point where it is an expected part of the array feature set, much like snapshots are today. Despite this level of acceptance, IT professionals should be aware that not all deduplication is created equal; as we discussed on the webcast, flash storage has the potential to expose the shortcomings of some implementations. Knowing the deduplication inside your storage system is as important as knowing the CPU inside your server.