StorageShort: Best Practices for 2018 – Eliminate Bad Practices

There are two storage habits that IT professionals should try to eliminate in 2018. The first is the practice of buying more primary storage whenever the current primary storage system reaches its capacity. The second is counting on the backup process to be the archive process. Both cost organizations money and put data at risk. The good news is that the means to put an end to both practices are well in hand.

Stop Buying Primary Storage

No matter how good an organization’s data management strategy is, there will come a time when it needs to buy more primary storage. But those purchases should be far less frequent than they currently are. Certainly, the amount of active data is growing, but its pace is much slower than that of the overall data set. With current flash array technology, most organizations should be able to standardize on a single, scalable primary storage system and not have to change technology for more than five years.
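To put rough numbers on that five-year claim, consider a quick sizing sketch. The growth rate and headroom figures below are illustrative assumptions, not benchmarks:

```python
# Illustrative sizing math: if active data grows at an assumed 15% per
# year, an array bought with 2x headroom over today's active data lasts
# roughly five years before it fills.
growth = 1.15    # assumed annual growth rate of active data
capacity = 2.0   # assumed headroom: array sized at 2x today's active data
size = 1.0       # today's active data, normalized to 1
years = 0
while size * growth <= capacity:
    size *= growth
    years += 1
print(years)  # -> 4 full years of growth; capacity is exhausted during year 5
```

The exact horizon obviously depends on your own growth rate, which is one more reason to audit the data before sizing the system.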

Stop Counting on Backup as Archive

An overwhelming number of organizations count on the backup process as the archive. The problem is that, while backup software has gotten much better at searching file content, it does not enable a critical aspect of archiving: the ability to remove data from the primary tier. There are other important archive capabilities, but removing data from primary storage is critical to eliminating the practice of buying additional primary capacity, and it makes the backups themselves perform better since there is less data to protect.

Archiving is Cool Again

The term data management sounds modern and is a big part of the lexicon now. But data archiving, whether part of a broader data management strategy or an archive-only solution, can provide the most immediate impact for your organization. Archiving is the movement of data from expensive storage to increasingly less expensive storage as its access pattern declines.

The hesitance with archiving is that it sounds like more trouble than it’s worth. What if, even though the data has not been accessed in years, a user wants it again the moment after it is moved? The way to break through this trepidation is to start slow.

Most data audits will show that 80% or more of the organization’s data assets have not been accessed in years, but most of that data is sitting on storage the organization has already bought and paid for. That is why the return on investment of an archive solution actually goes negative initially: the organization has to buy the archiving software and the archive storage system that will store the old data before the savings on primary capacity begin to accrue.
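If you want to verify that 80% figure in your own environment before buying anything, a simple audit of last-access times is a reasonable first step. The sketch below is a minimal, hypothetical example; note that it relies on atime, which many filesystems mount with noatime or relatime, so treat the result as an approximation:

```python
import os
import time

def audit_cold_data(root, years=2):
    """Walk a directory tree and report what fraction of the bytes
    have not been accessed in the given number of years."""
    cutoff = time.time() - years * 365 * 24 * 3600
    total_bytes = cold_bytes = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # skip files that disappear or are unreadable
            total_bytes += st.st_size
            if st.st_atime < cutoff:  # last access predates the cutoff
                cold_bytes += st.st_size
    pct = 100 * cold_bytes / total_bytes if total_bytes else 0
    print(f"{pct:.0f}% of {total_bytes / 1e12:.2f} TB not accessed in {years}+ years")

audit_cold_data("/mnt/primary")  # hypothetical mount point
```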

In reality, there are two relatively seamless ways for an organization to implement an archiving solution. The first option, for those very concerned about user reaction to archived data, is to implement a hybrid array for primary storage. These systems automatically move data between flash and hard disk.

There obviously is a performance difference between flash and hard disk, and if a user is accustomed to getting data from flash, they might notice when it is served from disk instead. If that is a significant concern, oversize the flash portion. A hybrid array that is 50% flash is still less expensive than an all-flash array that is 100% flash. And on a hybrid array with that much flash (far more than the norm), the chances of a cache miss on data that matters are almost zero.
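To see why the 50% hybrid still wins on cost, here is the back-of-the-envelope math. The per-terabyte prices are assumptions chosen only to illustrate the ratio, not vendor quotes:

```python
# Illustrative comparison; the $/TB figures are assumptions for the
# sake of the arithmetic, not quoted prices.
FLASH_PER_TB = 400   # assumed cost of flash capacity, $/TB
DISK_PER_TB = 100    # assumed cost of hard disk capacity, $/TB
capacity_tb = 100    # total usable capacity needed

all_flash = capacity_tb * FLASH_PER_TB
hybrid = capacity_tb * 0.5 * FLASH_PER_TB + capacity_tb * 0.5 * DISK_PER_TB

print(f"All-flash:  ${all_flash:,.0f}")   # $40,000
print(f"50% hybrid: ${hybrid:,.0f}")      # $25,000 -- about 38% less
```

Even with half its capacity on flash, the hybrid configuration comes in at roughly 60% of the all-flash price in this example, and real-world hybrids typically carry far less flash than that.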

The second option is to implement a secondary storage system and software that can automatically move data between primary storage and secondary storage. The software that performs the automated data movement typically creates a seamless link between the old and new file locations, meaning no disruption to users and only a small potential for a noticeable change in performance. To understand why an archive storage system can seem as fast as a primary storage system, see our article, “How Can Object Storage Recall as Fast as Primary Storage?”
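Commercial data movers implement this linkage with vendor-specific stub files or reparse points that trigger a recall when the file is opened. As a conceptual illustration only, the hypothetical sketch below approximates the idea on a POSIX system with an ordinary symlink:

```python
import os
import shutil
import time

def archive_cold_files(primary_root, archive_root, years=2):
    """Move files not accessed in `years` to archive storage,
    leaving a symlink behind so existing paths keep working."""
    cutoff = time.time() - years * 365 * 24 * 3600
    for dirpath, _, filenames in os.walk(primary_root):
        for name in filenames:
            src = os.path.join(dirpath, name)
            if os.path.islink(src):
                continue  # already archived on a previous pass
            if os.stat(src).st_atime >= cutoff:
                continue  # still active; leave it on primary storage
            dst = os.path.join(archive_root, os.path.relpath(src, primary_root))
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.move(src, dst)  # relocate the file to the archive tier
            os.symlink(dst, src)   # seamless link: the old path still resolves

archive_cold_files("/mnt/primary", "/mnt/archive")  # hypothetical mounts
```

A real product adds what this sketch omits: recall back to primary on access, policy engines, and protection of the archive copy.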

StorageSwiss Take

IT faces a data management conundrum. Most organizations buy too much primary storage, and most organizations count on their backup process to do more than it should, like archiving data. An explicit data management or archive strategy alleviates both problems: it significantly reduces the growth rate of the primary storage tier, and it leaves the backup process protecting only data that is actually changing.

To learn more about the importance of managing data, check out our latest StorageShort, “How to Solve the Data Management Conundrum.”

To learn about a best practice that eliminates these two bad practices, check out our on-demand webinar, “How to Create A Two Tier Enterprise With All-Flash and Object Storage.”


Twelve years ago George Crump founded Storage Switzerland with one simple goal: to educate IT professionals about all aspects of data center storage. He is the primary contributor to Storage Switzerland and a heavily sought-after public speaker. With over 25 years of experience designing storage solutions for data centers across the US, he has seen the birth of such technologies as RAID, NAS and SAN, virtualization, cloud and enterprise flash. Prior to founding Storage Switzerland, he was CTO at one of the nation's largest storage integrators, where he was in charge of technology testing, integration and product selection.

