The volume of data that must be backed up continues to increase, both to meet compliance regulations and to serve secondary business purposes such as analytics. At the same time, recovery requirements are becoming more demanding. In the event of an outage, mission-critical applications must be back online within minutes, and they must be restored to a point in time as close to the outage as possible. The latter requirement further increases capacity demands, because more copies must be created and stored to support more frequent recovery points. Meanwhile, malware such as ransomware creates the need for more sophisticated restore capabilities, such as the ability to perform a system-level restore to a point months in the past alongside a data-level restore to the previous night.
Against this backdrop, a tiered approach to secondary storage infrastructure, paired with a sophisticated data management strategy, offers a path to storing more copy data and delivering production-level performance during recovery without breaking the budget.
Blending Flash and Object Storage for Copy Data
Today’s Tier One applications require near-immediate recovery and fast performance during the recovery process, creating the use case for integrating a tier of high-performance flash capacity into the secondary storage environment. Not all recovery requests are as urgent, however. For example, the business may have hours or days to respond to a legal discovery request, and it does not make sense to incur the premium price of flash storage for those workloads. This is especially true given the pace at which the volume of copy data is growing.
Adding lower-cost, more scalable, and searchable on-premises object storage has become a viable approach to serving long-term retention use cases. Object storage is always online, which helps accelerate data recovery. It also provides a flat, global namespace and rich metadata, laying the groundwork for granular search, access, and recovery of file data. Over the long term, large enterprises with vast and fast-growing pools of retention data will likely find on-premises object storage more cost-effective than the cloud services model.
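To illustrate the point about rich metadata, the sketch below shows how per-object attributes in a flat namespace allow copies to be found by what they are rather than where they sit in a directory tree. The in-memory object catalog and its attribute names (app, tier, retention_days) are illustrative assumptions, not a real object storage API.

```python
# Hypothetical sketch: rich per-object metadata in a flat, global
# namespace enables granular search without walking a directory tree.
# Keys and metadata fields below are invented for illustration.

objects = {
    "backups/db/2024-05-01.bak":   {"app": "erp",  "tier": "object", "retention_days": 2555},
    "backups/db/2024-05-02.bak":   {"app": "erp",  "tier": "flash",  "retention_days": 30},
    "backups/mail/2024-05-02.bak": {"app": "mail", "tier": "object", "retention_days": 365},
}

def search(metadata_filter):
    """Return object keys whose metadata matches every key/value in the filter."""
    return [key for key, meta in objects.items()
            if all(meta.get(k) == v for k, v in metadata_filter.items())]

# e.g. find long-retention ERP copies relevant to a legal discovery request
print(search({"app": "erp", "tier": "object"}))
```

A real deployment would back this with the object store's metadata index rather than an in-process dictionary, but the search pattern is the same.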
Intelligent Data Management
As secondary storage infrastructures become tiered, a smart data management strategy is required to ensure that data always resides on the most appropriate infrastructure resource for current business demands. Storage managers should look for a platform that provides policy-based backup and centralized visibility across the organization’s copy data, and that goes a step further by applying monitoring and analytics to move and tier data automatically across storage resources based on user requirements.
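A minimal sketch of what such policy-based tiering might look like follows, assuming a simple age-based policy: copies not accessed within a policy window are demoted from flash to object storage. The Copy class, tier names, and the 14-day window are assumptions for illustration only.

```python
from datetime import datetime, timedelta

FLASH, OBJECT = "flash", "object"

class Copy:
    """A backup copy tracked by the data management platform (illustrative)."""
    def __init__(self, name, tier, last_access):
        self.name, self.tier, self.last_access = name, tier, last_access

def apply_tiering_policy(copies, max_flash_age=timedelta(days=14), now=None):
    """Demote cold copies from flash to object storage; return the names moved."""
    now = now or datetime.now()
    moved = []
    for c in copies:
        if c.tier == FLASH and now - c.last_access > max_flash_age:
            c.tier = OBJECT   # in a real platform this triggers a data migration
            moved.append(c.name)
    return moved
```

In practice the policy engine would weigh access frequency, recovery SLAs, and retention rules rather than a single age threshold, but the control loop is the same: observe, evaluate policy, move data.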
Smart metadata management is core to achieving this. It can also help the organization ensure compliance with data privacy regulations and safeguard against malware, for instance by facilitating staged data recovery so that only the files that are supposed to be recovered actually are. Decoupling file data from metadata and storing the metadata on premises can provide a “map” of where data is stored, enabling faster access and preventing duplicate versions of the same data from being stored.
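One way to picture that decoupled metadata “map” is a small index keyed by content hash: the map records where each piece of data lives, so identical bytes are stored once and any logical copy can be located quickly. The class and its method names below are hypothetical, sketched under the assumption of content-addressed deduplication.

```python
import hashlib

class MetadataMap:
    """Illustrative on-premises metadata map, decoupled from the file data itself."""
    def __init__(self):
        self.locations = {}   # content hash -> storage location of the bytes
        self.files = {}       # logical file path -> content hash

    def ingest(self, path, data, location):
        """Record a file; store its bytes' location only if the content is new."""
        digest = hashlib.sha256(data).hexdigest()
        if digest not in self.locations:     # duplicate content is never stored twice
            self.locations[digest] = location
        self.files[path] = digest
        return self.locations[digest]        # where the data actually lives

    def locate(self, path):
        """Fast lookup: logical path -> content hash -> physical location."""
        return self.locations[self.files[path]]
```

Because the map lives on premises while the bulk data can sit in any tier, lookups stay fast and the platform can verify, during a staged recovery, exactly which stored objects correspond to the files being restored.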