Big data is widely heralded as the be-all and end-all of modern competitive advantage, and it increasingly underpins storage requirements and buying decisions. Big data can indeed be invaluable in helping businesses streamline operations, unlock new revenue opportunities, and innovate and differentiate. However, the storage industry often neglects to discuss small data, which serves as a precursor to all big data initiatives and which is becoming increasingly prevalent.
Small versus Big Data
The term “big data” typically refers to data sets that are so large and complex that they require specialized tools (e.g., data mining and preparation software) and expertise (often a team of data scientists) to be processed into meaningful insights. It is typically defined by the “three Vs” – volume (a massive amount of data collected from a multitude of sources), variety (a broad mix of structured, semi-structured and unstructured file types), and velocity (data is generated, and must be processed, at a lightning-fast pace).
Big data analytics initiatives stand to uncover otherwise unattainable insights, including correlations and trends from which businesses can make better decisions. For example, a ski resort may analyze weather and ticket sale data to better forecast demand and plan staffing levels. The potential upside is tremendous, but getting there requires the ability to effectively harness small data first. In fact, big data may be overkill for many businesses, especially for the many that are still developing their artificial intelligence (AI), analytics and Internet of Things (IoT) strategies.
Like big data, small data is informative and actionable for the business. The key difference is that small data can be processed by business leaders without the assistance of a data science expert. For example, many enterprises have already integrated social networking data, such as influencer activity, into their customer relationship management (CRM) systems. The marketing team can use this small data directly to deliver a personalized customer experience.
Small data files are small individually, but an enterprise may be capturing millions or billions of them. The key problem is that this volume of small files consumes a tremendous number of inodes – forcing the enterprise to upgrade its network-attached storage (NAS) arrays before all of the usable capacity has actually been consumed. Many enterprises are turning to object storage to address this problem, but that architecture brings with it its own set of enterprise application compatibility and disaster recovery concerns.
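To make the inode problem concrete, here is a minimal sketch (not tied to any particular vendor's platform) that uses Python's standard os.statvfs call to compare inode utilization with capacity utilization on a mounted file system. The mount point path is a hypothetical example and should be replaced with an actual NAS mount in your environment.

```python
import os

def inode_vs_capacity(mount_point: str) -> None:
    """Compare inode utilization with capacity utilization for a mount point.

    Illustrates how millions of small files can exhaust inodes long before
    the usable capacity of a NAS share is consumed.
    """
    st = os.statvfs(mount_point)

    # Inode accounting: total vs. free file slots on the file system.
    inodes_total = st.f_files
    inodes_used = st.f_files - st.f_ffree

    # Capacity accounting: fragment size * block counts gives bytes.
    bytes_total = st.f_frsize * st.f_blocks
    bytes_used = st.f_frsize * (st.f_blocks - st.f_bfree)

    inode_pct = 100.0 * inodes_used / inodes_total if inodes_total else 0.0
    capacity_pct = 100.0 * bytes_used / bytes_total if bytes_total else 0.0

    print(f"{mount_point}: inodes {inode_pct:.1f}% used, "
          f"capacity {capacity_pct:.1f}% used")
    if inode_pct - capacity_pct > 20:
        print("Warning: small files are consuming inodes faster than capacity.")

if __name__ == "__main__":
    # Hypothetical NFS mount point for a NAS share; adjust for your environment.
    inode_vs_capacity("/mnt/nas_share")
```

When the inode percentage runs well ahead of the capacity percentage, the file system will fill up on file count rather than on bytes – the situation that pushes enterprises toward array upgrades or alternative architectures.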
Join experts from Storage Switzerland and Qumulo to learn how a modernized NAS architecture can address small data requirements. Watch on demand now.