When designing a storage infrastructure for an artificial intelligence (AI) or deep learning (DL) workload, the default assumption is that an all-flash array (AFA) or something even faster must be at the heart of the design. The problem is as…
Most data protection solutions use the public cloud as a digital dumping ground to lower the cost of on-premises data protection infrastructure. Vendors often store the backup data set in a low-cost object store like Amazon’s Simple…
There is a difference between availability and backup. Providing a workload with high availability means that if a component of the infrastructure fails, adequate measures are in place to ensure that the use of the workload continues with little or…
The modern data center is increasingly microservice or container-based. These workloads are dynamic and unpredictable. The datasets within these workloads range from thousands of large files to billions of small files. Artificial Intelligence (AI) and Machine Learning (ML) are being…
One part of File Virtualization is a global file system that abstracts the physical location of data from the logical directory structure. Even though data may move between physical file servers or network-attached storage (NAS) systems, users continue to access…
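The mapping a global file system maintains can be pictured as a lookup table between a stable logical namespace and movable physical locations. The sketch below illustrates the idea only; the class and method names (`GlobalNamespace`, `publish`, `migrate`, `resolve`) are hypothetical and do not belong to any real file virtualization product.

```python
# Illustrative sketch of the global-file-system idea behind file virtualization:
# a logical namespace maps stable, user-visible paths to physical locations,
# so data can move between file servers or NAS systems without changing the
# path users see. All names here are made up for illustration.

class GlobalNamespace:
    def __init__(self):
        # logical path -> (server, physical path)
        self._map = {}

    def publish(self, logical, server, physical):
        # Expose a file at a stable logical path.
        self._map[logical] = (server, physical)

    def migrate(self, logical, new_server, new_physical):
        # Data moves between systems; the logical path is unchanged.
        self._map[logical] = (new_server, new_physical)

    def resolve(self, logical):
        # Users and applications only ever reference the logical path.
        return self._map[logical]


ns = GlobalNamespace()
ns.publish("/projects/report.docx", "nas-old", "/vol1/reports/report.docx")
ns.migrate("/projects/report.docx", "nas-new", "/archive/report.docx")
print(ns.resolve("/projects/report.docx"))  # ('nas-new', '/archive/report.docx')
```

The point of the sketch is the last line: after the migration, the caller still resolves the same logical path and is transparently directed to the new physical location.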
The public cloud is typically looked on as the great consolidator. Organizations, small and large, use the cloud as a hub for data storage and distribution. Recently, however, there has been a proliferation of edge use cases, and the major…
When organizations try to establish a cloud strategy, one of the challenges they face is where to begin. The problem is that the cloud can do so much, such as running applications, providing advanced services, and storing data, to name just…