Is it Time to Retire Your NAS?

It’s time to retire Network Attached Storage (NAS) – at least as we know it. These systems, glorified file servers, are over two decades old. In that time, users have become more mobile, and organizations more diverse. What seemed like a lot of storage 20 years ago pales in comparison to the dozens if not hundreds of terabytes of unstructured data IT needs to store, manage and protect today. It might be time to hand this IT technology its golden watch.

The NAS Status Quo

The NAS status quo is a single, scale-up architecture that was originally intended to replace file servers and store user home directories. Those users almost always came into a single headquarters location and worked at a company-provided desktop. The concept of a user working from multiple offices or having multiple devices seemed like science fiction.

Plus, the data users worked with typically came from office productivity applications like Word, Excel and PowerPoint. Today, the data has evolved, and users now create much larger and more complex data sets. The data created by architecture/engineering/construction (AEC), media and entertainment (M&E), energy, healthcare, manufacturing and semiconductor professionals is large in size and often requires cross-continent collaboration.

On top of that is the data created by machines themselves. Sensor data, video surveillance and data from the organization's other systems are all increasing in value, requiring more storage capacity and longer retention periods.

Most studies indicate that 90 percent or more of an organization’s data is file-based, unstructured data and the overwhelming majority of that data is stored on NAS systems.

The NAS On-Premises Issues

It would seem that the first NAS problem is the sheer demand of storing all of this data. Actually, NAS vendors have done a good job addressing that obvious need, creating file systems that can expand far enough to store all this data and all its files. And scale-out systems have resolved the raw capacity issue by clustering storage nodes together into a nearly limitless system with a single point of management.

The bigger challenge is dealing with the on-premises issues that a large unstructured data set creates for the organization. The first issue is the physical space required to house a NAS system, or systems large enough to store all this data. New data centers don’t come cheap and it is expensive to power and cool them. Almost every organization is under pressure to reduce the data center footprint and traditional NAS systems just don’t do the job.

The second in-house issue is protecting all this data. Every file saved has to be protected, in most cases multiple times. The expansion of the NAS system often has a 5X or greater ripple effect on the secondary storage systems that protect that data. There is also legitimate concern over how quickly this data can be protected and recovered. Backing up or restoring thousands of files at a time is one of the most difficult tasks for backup software. First, it takes time. Second, every file has to be entered into the backup software's database every time it is backed up, leading to very large backup databases.
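To see why those backup databases balloon, consider the arithmetic. The sketch below is a hypothetical illustration (the file counts, backup frequency and retention period are assumptions, not figures from any specific product) of how a catalog that records every file on every run grows:

```python
# Hypothetical sketch: estimate backup-catalog growth when every backup run
# records every protected file, as described above. All inputs are
# illustrative assumptions.

def catalog_entries(num_files: int, runs_per_week: int, retention_weeks: int) -> int:
    """Each backup run adds one catalog entry per file it protects."""
    return num_files * runs_per_week * retention_weeks

# 10 million files, nightly backups (7 runs per week), 12 weeks of retention
entries = catalog_entries(10_000_000, 7, 12)
print(entries)  # 840000000 -- 840 million entries to index and search
```

Even at these modest assumptions, the catalog holds hundreds of millions of entries, which is why large file counts strain backup software long before they strain raw capacity.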

The Multiple Location Issues

Organizations have changed. They now have multiple sites, each of which often creates unique data sets that sometimes require collaboration with other locations. Moving data in a logical way between these offices is another challenge that IT has to solve.

The status quo is to set up a NAS system at each location and manually copy or move data between sites. If users need immediate access to a file not on their local NAS, they have to log in to a remote NAS and copy the file to their own, which is a slow process. The result is often a version control problem, where multiple versions of the same file reside in different locations, each with its own unique modifications. And someone gets the thankless job of manually integrating those edits.
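The version control problem above can be made concrete with a small sketch. This is a hypothetical illustration (the site names and file contents are invented for the example): once two sites each edit their local copy, the copies fingerprint differently and no automated process can say which one is authoritative.

```python
import hashlib

# Hypothetical sketch of the divergent-copy problem described above.
# Site names and file contents are illustrative assumptions.

def fingerprint(data: bytes) -> str:
    """Content hash used to detect whether two copies are still identical."""
    return hashlib.sha256(data).hexdigest()

site_a_copy = b"Q3 budget v1 + edits made at headquarters"
site_b_copy = b"Q3 budget v1 + edits made at the branch office"

if fingerprint(site_a_copy) != fingerprint(site_b_copy):
    # Both sites edited their local copy; someone must merge the edits by hand.
    print("conflict: divergent copies of the same file")
```

A hash comparison can detect the conflict, but it cannot resolve it; that still falls to a person, which is exactly the thankless job described above.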

Each of these locations also inherits the same on-premises issues of physical space and data protection. But data protection is often even worse, since most organizations will try to consolidate all their data to a primary data center.

The Cloud Issues

The cloud seems like the ideal antidote to the problem. It is centrally located, physical space becomes someone else's problem, and pricing is becoming increasingly affordable. But the cloud has issues of its own. First and foremost, it lacks performance. Most creation and manipulation of unstructured data happens on-premises, not in the cloud. That means saving and retrieving data goes over an internet connection, which is barely acceptable for small office productivity files and unusable for the larger files generated by AEC, M&E and big data workloads.

The second issue is that the cloud by itself lacks the intelligent data services that enterprises have come to expect. Services like snapshots and cloning are critical to protecting data against accidental deletion or corruption.

A third issue is that not all data can or should be in the public cloud. Organizations need choice, and not just between public cloud providers. They may need to create their own cloud storage environment. Object storage gives them the toolkit to get there, but it has many of the same data service limitations as public cloud storage.

Finally, the cloud is disruptive to Windows and Linux workflows. The cloud does not natively support NFS or SMB. While there are gateways and agents available, they often have to be deployed on a per-client basis.
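The mismatch behind that disruption is worth spelling out. Object storage exposes whole-object PUT/GET semantics, while NFS and SMB clients expect in-place, byte-range writes. The sketch below is a hypothetical illustration (the ObjectStore class is a stand-in, not any real cloud API): even a small edit means downloading, modifying and re-uploading the entire object, which is the translation work a gateway has to do.

```python
# Hypothetical sketch: ObjectStore is an invented stand-in for an object
# store's whole-object PUT/GET interface, to contrast with POSIX-style
# in-place writes. It is not a real cloud API.

class ObjectStore:
    def __init__(self) -> None:
        self._objects: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        # Objects are immutable: any change rewrites the whole object.
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        return self._objects[key]

store = ObjectStore()
store.put("reports/q3.docx", b"draft one")

# A POSIX client would just seek and overwrite a few bytes. Against an
# object store, the same edit is: download, modify in memory, re-upload.
data = bytearray(store.get("reports/q3.docx"))
data[0:5] = b"DRAFT"
store.put("reports/q3.docx", bytes(data))
print(store.get("reports/q3.docx"))  # b'DRAFT one'
```

Doing this translation for every file operation is why gateways exist, and why deploying them per client quickly becomes a management burden.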

Break The Status Quo To Meet Unstructured Data Requirements

Abandoning NAS for the cloud wholesale is the technological equivalent of throwing the baby out with the bathwater. There are aspects of the cloud that will genuinely help organizations solve their unstructured data challenges, but IT needs to combine them with the good parts of legacy NAS to create a complete solution.

Organizations should look for an unstructured data solution that leverages on-premises appliances (virtual or physical) that provide local performance and deliver enterprise-class services. These appliances should use the cloud to interconnect with their counterparts at all of the organization's locations. Then add cloud-specific functionality like a global namespace, global file locking and cloud archiving, as well as encryption, deduplication and compression.

To learn more about leveraging the cloud to solve your unstructured data storage challenges, join us for our on-demand webinar "Overcoming the Top 3 Challenges with the NAS Status Quo".

Watch On Demand

George Crump is the Chief Marketing Officer at VergeIO, the leader in Ultraconverged Infrastructure. Prior to VergeIO he was Chief Product Strategist at StorONE. Before assuming roles with innovative technology vendors, George spent almost 14 years as the founder and lead analyst at Storage Switzerland. In his spare time, he continues to write blogs on Storage Switzerland to educate IT professionals on all aspects of data center storage. He is the primary contributor to Storage Switzerland and is a heavily sought-after public speaker. With over 30 years of experience designing storage solutions for data centers across the US, he has seen the birth of such technologies as RAID, NAS, SAN, Virtualization, Cloud, and Enterprise Flash. Before founding Storage Switzerland, he was CTO at one of the nation's largest storage integrators, where he was in charge of technology testing, integration, and product selection.

