The need to solve data problems at scale is arguably the most prominent storage driver today, whether in hardware, software or cloud services. In this regard, Splunk is quickly becoming a textbook use case. Splunk is an application that analyzes…
While some applications in the data center require extreme performance, high performance is now a default requirement for all production applications. How performance is measured varies by application (for example, one application might require very high throughput while another might…
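The distinction between throughput and latency can be made concrete with a small measurement sketch. This is an illustrative helper, not from the article: `measure` is a hypothetical function name, and the lambda is a toy stand-in for a real storage I/O call.

```python
import time

def measure(op, iterations=10_000):
    """Time a workload two ways: aggregate throughput (ops/sec)
    and per-operation latency percentiles (seconds)."""
    latencies = []
    start = time.perf_counter()
    for _ in range(iterations):
        t0 = time.perf_counter()
        op()
        latencies.append(time.perf_counter() - t0)
    elapsed = time.perf_counter() - start
    latencies.sort()
    return {
        "throughput_ops_s": iterations / elapsed,
        "p50_latency_s": latencies[len(latencies) // 2],
        "p99_latency_s": latencies[int(len(latencies) * 0.99)],
    }

# Toy in-memory workload standing in for a storage request.
stats = measure(lambda: sum(range(100)))
print(stats)
```

Two applications can report the same throughput while one has a far worse p99 latency, which is why a single "performance" number rarely captures an application's real requirement.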
Simply put, unstructured data is breaking traditional network-attached storage (NAS) architectures. The scale-up nature of traditional NAS solutions makes the storage controller a bottleneck for the metadata-intensive operations associated with unstructured files, forcing…
Minimizing the total cost of ownership (TCO) and maximizing the uptime of storage infrastructure have never been more important. More data is being captured and utilized by the business, and a new tier of premium-priced non-volatile memory express (NVMe) arrays…
The modern data center has to support many different types of workloads, each of which makes different demands on the storage architecture. Today, standard all-flash arrays (AFAs) are the mainstream storage systems for data centers, and conventional best practice…
NVMe flash arrays promise an unprecedented level of performance thanks to their higher command counts, deeper queues and PCIe connectivity. Most NVMe arrays boast close to one million IOPS and latency in the low hundreds of…
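The relationship between queue depth, IOPS and latency follows from Little's Law (sustained throughput ≈ outstanding requests / average latency), which shows why NVMe's deep queues matter for hitting million-IOPS figures. A minimal sketch, with the function name and figures chosen for illustration:

```python
def required_queue_depth(target_iops, avg_latency_s):
    """Little's Law: outstanding I/Os needed to sustain target_iops
    when each request takes avg_latency_s on average."""
    return target_iops * avg_latency_s

# One million IOPS at 100 microseconds average latency:
depth = required_queue_depth(1_000_000, 100e-6)
print(depth)  # 100.0 outstanding commands
```

Roughly 100 concurrent commands is trivial for NVMe, whose specification allows up to 64K queues of up to 64K commands each, but impossible for a legacy AHCI/SATA device limited to a single queue of 32 commands.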