Data management companies are struggling to gain adoption. These companies offer to save customers money by moving data to the most appropriate tier of storage based on current needs. The idea is to keep only a small fraction of data on a high-performance tier of storage and most of the data on less expensive tiers. Data management vendors, like archive vendors, claim impressive returns on investment (ROI). Why then are they struggling to gain adoption?
Data Management vs. Archive
The first step in understanding the problem is to understand what the products attempt to accomplish. Data management is the modernization of archive. Archive solutions focus on moving inactive data to progressively less expensive storage. It is, in essence, a one-way street. Recalls from the archive, when they occur, usually go back to the original location; the software does not try to determine a better location for the data.
Data management, on the other hand, provides more of a two-way street where data may move up a tier when its access pattern justifies it and down tiers when users access it less frequently. Some data management solutions can move data based on use case. For example, data management offers the ability to move a data set, based on policy, to the cloud, to leverage either cloud storage or cloud compute.
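The two-way movement described above can be sketched as a simple placement policy. The thresholds and tier names below are hypothetical assumptions for illustration, not any particular vendor's defaults:

```python
# Hypothetical two-way tiering policy: promote data whose access pattern
# justifies it, demote data that users access less frequently.
PROMOTE_ACCESSES_PER_WEEK = 5   # assumption: "hot" threshold
DEMOTE_IDLE_DAYS = 90           # assumption: "cold" threshold

def placement(accesses_last_week: int, days_since_last_access: int,
              current_tier: str) -> str:
    """Return the target tier for a data set under the policy above."""
    if accesses_last_week >= PROMOTE_ACCESSES_PER_WEEK:
        return "performance"    # move up: access pattern justifies it
    if days_since_last_access >= DEMOTE_IDLE_DAYS:
        return "capacity"       # move down: data has gone cold
    return current_tier         # otherwise leave the data where it is

# Example: a file untouched for 120 days moves down to the capacity tier.
print(placement(accesses_last_week=0, days_since_last_access=120,
                current_tier="performance"))  # → capacity
```

Contrast this with archive's one-way street: here data can earn its way back up a tier, which is what distinguishes data management from archiving.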
The Data Management Problem
The problem facing most data management solutions is they require the customer to implement a global file system and move all data into that file system. Moving all data to someone else’s file system is a significant commitment, especially if that file system is from a new vendor.
Another problem with the global file system approach is that some workloads just don't work well in a generic file system. Either they want a file system of their own, or they want to access data via a block protocol. This forces the customer to use the data management solution for only a portion of their data rather than all of it, which means the data center still has to manage multiple silos of storage.
There is also a problem with the ROI claim of these solutions. While the data management solution can identify that 80% of the organization's data can move to less expensive storage, no IT professional in their right mind is going to move data to another storage system the day after the implementation of the data management solution. The cost-effective storage tier needs to start small and gradually grow. The problem is many secondary storage systems start off way too large.
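To see why an oversized secondary tier hurts the ROI, consider a back-of-the-envelope calculation. All prices, capacities, and the 10% early-migration figure below are hypothetical assumptions chosen only to illustrate the shape of the problem:

```python
# Hypothetical figures; only the "80% of data can move" claim comes
# from the text above.
primary_cost_per_tb = 500.0    # $/TB/month, performance tier (assumption)
secondary_cost_per_tb = 100.0  # $/TB/month, capacity tier (assumption)
total_tb = 1000.0              # total capacity under management (assumption)
movable_fraction = 0.80        # the "80% can move" identification

# Full savings only materialize once all movable data has migrated.
full_savings = total_tb * movable_fraction * (
    primary_cost_per_tb - secondary_cost_per_tb)

# But if the secondary system is bought at full size on day one while
# only 10% of the movable data has actually migrated, most of that
# spend sits idle.
migrated_fraction = 0.10       # assumption: cautious early migration
realized_savings = full_savings * migrated_fraction
secondary_spend = total_tb * movable_fraction * secondary_cost_per_tb

print(f"potential savings:        ${full_savings:,.0f}/month")
print(f"realized savings so far:  ${realized_savings:,.0f}/month")
print(f"full-size secondary cost: ${secondary_spend:,.0f}/month")
```

Under these assumed numbers, the realized savings early on ($32,000/month) are well below the cost of the full-size secondary tier ($80,000/month); the ROI only turns positive as migration catches up, which is why the cost-effective tier needs to start small and grow.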
There is also the problem of performance. IT lives in fear of users screaming when application performance declines because the data management solution has moved the application's data to a lower-performing tier of storage. Making policies less aggressive and performance tiers larger can mitigate the problem to some degree, but doing so hurts the ROI.
Finally, there is the problem of price. Developers deserve payment for their creations, but only the market can dictate what that payment should be. Most data management solutions are so expensive that it is cheaper (at least with short-term thinking) to just buy more primary storage, which is exactly what most organizations do. This is why data management companies are struggling.
We at Storage Switzerland are big fans of archiving and data management. Implementing these processes not only reduces the long-term cost of storing data, it also improves backup and disaster recovery and helps ensure the organization meets the requirements of regulations like GDPR and of corporate governance.
Archive is getting closer to something that organizations can readily embrace. Object storage vendors are integrating archive capabilities directly into their solutions, or they are collaborating with archive vendors. Additionally, archive is not as big a leap, since it does not require the same file system everywhere, just in the archive.
Data management may just be a bridge too far, or a leap too great, at least right now. These vendors may be better off creating "lighter" versions of their software that are less expensive and focus on archiving, data migration, or data protection.