Organizations across industries are looking for ways to harness massive sets of streaming data. This challenge is especially acute in the media and entertainment industry. High-resolution content is dramatically increasing file sizes, and a growing number of users must collaborate on the same set of files. Storage professionals must therefore deliver globally distributed access to a heterogeneous set of files, along with real-time processing performance, as cost efficiently as possible.
Quantum has a background in managing large structured and unstructured data sets, dating back to its heritage in tape storage and more than 15 years of selling high-performance parallel file systems into media and entertainment companies, federal and state government agencies, research agencies, and enterprise backup implementations. Quantum continues to focus on the need to capture, protect, and monetize massive, streaming data stores cost efficiently.
Specifically, Quantum’s portfolio includes:
- StorNext scale-out parallel file systems for high performance video editing and processing of large unstructured data sets.
- Quantum F-Series ultra-fast, highly available non-volatile memory express (NVMe) storage arrays.
- Quantum QXS-Series Hybrid Storage arrays, which are high-performance and reliable solid-state drive (SSD) and hard-disk drive (HDD) arrays.
- Scalar-branded Linear Tape-Open (LTO) systems for lower-cost, long-term data retention and, because the media is offline and not disk-based, protection against ransomware.
- Lattus-branded object storage systems for a scalable, durable and geographically distributed nearline data library.
- Higher-performance DXi-branded backup appliances that provide replication as well as faster backups and restores.
According to Quantum, these portfolio pieces have facilitated a strong foothold in use cases including media production, research, and other large unstructured data sets. To become further entrenched in these use cases, Quantum has rounded out its portfolio with its new F-Series NVMe systems and its Quantum Cloud Storage Platform software-defined block storage.
Quantum’s newest family of storage appliances, the F-Series, introduces NVMe to its portfolio. Per Quantum, it designed the F-Series specifically to support the ultra-fast reads and writes and the massive parallel processing that are required for modern media workloads including editing, rendering and processing of high-definition video content. These workloads are performance-intensive, and they must draw from large unstructured data sets.
The first offering in the F-Series family, the Quantum F2000, is a 2U system with a dual-node configuration for availability. Each system carries two hot-swappable compute canisters and can hold up to 24 dual-ported NVMe drives for a total of 46 terabytes (TB), 92 TB or 184 TB of capacity. The fast-performing NVMe drives are complemented by Remote Direct Memory Access (RDMA) networking, which enables direct access between users' workstations and the storage system, and as a result also facilitates more predictable performance. Specifically, Fibre Channel storage area network (FC SAN) and 100 Gigabit Ethernet (100GbE) connections are supported.
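The listed capacity points line up with 24 drive slots populated with common NVMe drive sizes. Note that the brief does not state the per-drive capacities; 1.92 TB, 3.84 TB and 7.68 TB are assumptions based on typical NVMe SSD sizes, shown here only to check the arithmetic:

```python
# Raw-capacity check for a 24-slot chassis. The per-drive sizes below are
# assumed (typical NVMe SSD capacities), not stated by Quantum.
DRIVE_SLOTS = 24

for drive_tb in (1.92, 3.84, 7.68):
    raw_tb = DRIVE_SLOTS * drive_tb
    # 46.08, 92.16 and 184.32 TB, matching the rounded 46/92/184 TB figures.
    print(f"{DRIVE_SLOTS} x {drive_tb} TB = {raw_tb:.2f} TB raw")
```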
Quantum Cloud Storage Platform
Quantum’s F-Series arrays are the first to be built on the new Quantum Cloud Storage Platform, a software-defined block storage platform. The Quantum Cloud Storage Platform provides a centralized management application programming interface (API) that is designed to provide data services specifically relevant to streaming, large unstructured files such as video content. For example, it facilitates active/active clustering for high availability, and it handles key data protection tasks including erasure coding, RAID configuration and replication. The platform is completely decoupled from underlying hardware, and it can run in virtualized or bare metal environments. It supports a variety of storage devices, including SCSI, SAS, SATA and NVMe, as well as a variety of storage network interfaces, including FC SAN, iSCSI, RDMA and NVMe-oF.
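Quantum has not published the details of the platform's erasure coding, but the underlying principle can be illustrated with a minimal single-parity sketch (RAID-4/5 style): one parity block lets any one lost data stripe be rebuilt from the survivors. This is a generic illustration of the technique, not the platform's implementation:

```python
# Minimal single-parity sketch of parity-based data protection.
# Real erasure codes (e.g., Reed-Solomon) tolerate multiple failures;
# XOR parity shown here tolerates exactly one.

def xor_parity(stripes: list[bytes]) -> bytes:
    """XOR equal-length stripes together to produce a parity block."""
    parity = bytearray(len(stripes[0]))
    for stripe in stripes:
        for i, b in enumerate(stripe):
            parity[i] ^= b
    return bytes(parity)

data = [b"AAAA", b"BBBB", b"CCCC"]
parity = xor_parity(data)

# Simulate losing the second stripe; XOR of the survivors plus the
# parity block reconstructs it exactly.
rebuilt = xor_parity([data[0], data[2], parity])
assert rebuilt == data[1]
```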
The Quantum Cloud Storage Platform integrates with Quantum’s StorNext parallel file system to facilitate comprehensive data access and lifecycle support. The StorNext support adds file and object storage access protocol support, as well as parallelism to enable a large number of applications and users to concurrently access the files that they need, regardless of where they are stored. Additionally, StorNext facilitates automated, policy-based migration of data between tiers.
Quantum has expanded significantly beyond pure-play backup and tape archive use cases. The ongoing development of StorNext and the additions of NVMe and block storage support via the F-Series and Cloud Storage Platform substantially enhance Quantum’s value proposition for its targeted use cases including post-production editing and playback, live-capture broadcast and video analysis and simulation. They also make Quantum an increasingly viable consideration outside of media and video use cases where real-time processing of vast amounts of streaming data is also required, for instance in genomics and life science research and in supporting machine learning (ML).
When it comes to delivering the levels of throughput and latency required by these demanding workloads, storage managers should note that Quantum handles metadata out of band from data processing. The unstructured data that serves these workloads is highly metadata intensive; metadata can in fact account for 80% or more of the workload's traffic. Handling it out of band is a distinctive approach that can greatly help to speed user access and application performance. Meanwhile, the platform's ability to boost infrastructure utilization can help storage managers to reclaim valuable data center floorspace, which comes at a high premium. Looking ahead, storage managers should keep an eye out for more predictive management capabilities, for example more predictive data tiering based on user access, which can further optimize infrastructure utilization while streamlining the complex processes associated with getting the right data to the right user at the right time.