Network Attached Storage (NAS) systems were supposed to replace the Windows file server, yet Windows file servers remain the predominant means of sharing data within an organization. With increases in processing power, internal storage capacity, and network bandwidth, the case against a file server gets harder to make. If your organization wants to continue with a Windows file server strategy, at some point that server will run out of storage space and you’ll look to upgrade, and maybe consider a NAS. Here is how to avoid that upgrade.
Out-of-Space Workarounds
Initially, a Windows file server is a very inexpensive option for enabling organizational file sharing and collaboration, and it remains inexpensive until it runs out of internal storage capacity. At that point the options are to add an external storage array, attach the file server to a shared storage system, or add a new server with more internal capacity and make it the organization’s file server. The organization could also abandon the Windows file server strategy altogether and select a NAS solution. All of these options add considerable hard costs, and they also add operational costs such as managing multiple systems and working through a delicate data migration process.
Windows File Servers Should Never Run out of Space
The reality is that Windows file servers should never (or at least rarely) run out of storage capacity. Most data on a Windows file server is only active for a few weeks from the point of creation. Industry surveys and reports consistently indicate that as much as 95 percent of the data on a Windows file server is inactive. By removing the inactive data, the current capacity on that server should never fill up.
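To make the idea concrete, here is a minimal sketch of how inactive data might be identified, using the file system’s last-access timestamp. The 30-day threshold and the function name are illustrative assumptions; production tiering software also weighs modification time, file type, and ownership policies.

```python
import os
import time

def find_inactive_files(root, days=30):
    """Return paths under root whose last-access time is older than `days`.

    Illustrative sketch only; the threshold is an assumption, and note
    that some volumes disable access-time updates for performance.
    """
    cutoff = time.time() - days * 86400
    inactive = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.stat(path).st_atime < cutoff:
                    inactive.append(path)
            except OSError:
                # File removed or inaccessible mid-scan; skip it.
                continue
    return inactive
```

A scan like this, run on a schedule, yields the candidate set that a tiering product would move off the primary server.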
Moving Data from A to B and B to A
While inactive data sets have been with us for a while, the ability to seamlessly move that data to a more cost-effective storage platform designed for long-term data preservation has not. What’s needed is a software solution that integrates directly with Windows to identify data for transparent movement to a secondary storage device, like an object storage system. This secondary system could then provide cost-effective capacity with software designed to preserve data and meet long-term retention requirements. The solution will also need to transparently move data back to the file server if a user accesses it in the future. The transparency of movement to and from secondary storage is critical; without it, IT’s workload goes up and the solution is no longer cost effective.
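The move-out-and-recall cycle described above can be sketched as a stub-file pattern. This is a simplified illustration, not how any particular product works: real Windows tiering solutions typically use NTFS reparse points or filter drivers so the move is invisible to users, and `archive_dir` here stands in for an object store.

```python
import json
import os
import shutil

def tier_out(path, archive_dir):
    """Move a file to secondary storage, leaving a small stub behind.

    Sketch only: assumes unique base names and a local directory in
    place of a real object storage system.
    """
    os.makedirs(archive_dir, exist_ok=True)
    archived = os.path.join(archive_dir, os.path.basename(path))
    shutil.move(path, archived)
    # The stub records where the real data went.
    with open(path + ".stub", "w") as f:
        json.dump({"archived_to": archived}, f)
    return archived

def recall(stub_path):
    """Bring an archived file back when a user accesses it again."""
    with open(stub_path) as f:
        archived = json.load(f)["archived_to"]
    original = stub_path[: -len(".stub")]
    shutil.move(archived, original)
    os.remove(stub_path)
    return original
```

In a real deployment the recall would be triggered automatically by the file open, which is what makes the movement transparent to the user.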
Quick Recalls are Key
For a Windows file server to appear as if it never runs out of space, the IT team needs to feel confident in the solution’s integration with Windows and its ability to quickly deliver data back to the file server if it becomes active again. The more aggressive the organization becomes in moving data to secondary storage, the more likely a recall becomes, so this seamlessness needs to deliver the file back to the user or application within seconds.
The result of a seamless and automated data management strategy is that as much as 95 percent of disk capacity can be freed up and the rate of capacity consumption slowed. In essence, “out of space” never comes up as a reason to upgrade. But as Storage Switzerland discusses in its on-demand webinar “Capacity – Ransomware – Protection – Three Windows File Server Upgrades to Avoid”, this type of solution can also eliminate the need to upgrade to improve resiliency against ransomware or to improve data protection.