One of the seemingly endless tasks of today’s IT infrastructure planner is finding more ways to squeeze costs out of the data center environment. With data growth soaring across all industries, it is not surprising that storage is often at…
While the cost of raw data storage keeps dropping, the cost of retaining obsolete or inappropriate data can be prohibitive. No one wants to delete the file that might be needed someday. This results in the accumulation of data that has both cost and compliance ramifications. The key is to lower that risk by identifying good candidates for deletion. In this article Storage Switzerland Senior Analyst Eric Slack details the costs associated with file deletion and how to lower them through comprehensive file analysis.
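The idea of identifying deletion candidates through file analysis can be sketched in a few lines. The sketch below is illustrative only, assuming a simple age-based policy (files not accessed in a year); real file-analysis tools combine many more attributes such as owner, type, and duplication.

```python
import os
import time

def stale_files(root, max_age_days=365):
    """Walk a directory tree and flag files whose last access time is
    older than max_age_days -- simple candidates for deletion review."""
    cutoff = time.time() - max_age_days * 86400
    candidates = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.stat(path).st_atime < cutoff:
                    candidates.append(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
    return candidates
```

Note that many systems mount file systems with access-time updates disabled, in which case modification time is a more reliable signal.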
Data sets are growing, but so are the periods of time that they’re being saved. Once primarily driven by regulatory compliance, companies are now finding that there are other factors pushing data retention to seemingly unlimited duration. The repurposing of…
Strictly speaking, object storage refers to a system where data is stored in discrete buckets or “objects”, in contrast to the directories and subdirectories of a traditional file system. It can be implemented in any storage architecture, but is usually…
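The contrast between a flat object namespace and a directory hierarchy can be illustrated with a toy model. This is a sketch, not a real object-store API: actual systems also attach rich metadata to each object and replicate data across nodes.

```python
class ObjectBucket:
    """Toy flat-namespace bucket: every object is addressed by a single
    opaque key rather than located via directories and subdirectories."""

    def __init__(self):
        self._objects = {}  # key -> bytes

    def put(self, key, data):
        self._objects[key] = data

    def get(self, key):
        return self._objects[key]

    def list(self, prefix=""):
        # "Folders" are just a key-prefix convention, not real directories
        return [k for k in self._objects if k.startswith(prefix)]
```

A key like `"reports/2015/q1.pdf"` looks like a path, but to the bucket it is one flat string; the slashes are purely a naming convention that listing by prefix can exploit.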
One of the challenges facing IT planners is where to start with services like Amazon’s AWS. The logical first step is as a data repository to reduce the amount of on-premises data that needs to be stored and managed. The…
Unstructured data is becoming a big headache for many organizations. By some estimates, this type of data (user files, emails, PDFs, images, videos, etc.) now accounts for up to 90% of all new data growth. While most of this information quickly…
Hadoop is an open source software framework maintained by the Apache Software Foundation that uses a distributed compute infrastructure to run large, batch analytics jobs on very large data sets. It does this by breaking these projects down into a…