IT professionals are tired of hearing that the growth of unstructured data is out of control. While attempting to curtail its growth is a noble cause, what IT really needs is the ability to control and manage it. Gaining control over unstructured data starts with knowledge. IT professionals need solutions that provide insight into the data set so they can understand where the growth is coming from and how quickly it will exceed existing resources.
The first step to gaining control over unstructured data is the creation of a holistic view of all file assets, regardless of what type of storage system they reside on or where they are located within the enterprise. Given the size and quantity of unstructured data, this data collection has to be automated. The days of tracking file servers via spreadsheets have long passed us by.
The second step, once the information about unstructured data is captured, is to present that data in a way that humans can understand. The goal should be pre-canned reports that provide information such as total capacity consumed and capacity consumed by file type, by department and by user, whether globally for the enterprise, by location or by file server. The solution should also report on file age, showing when data was last accessed and whether it can be archived or even deleted.
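To make the idea concrete, here is a minimal Python sketch of the kind of aggregation such a report performs, not NTP Software's implementation: it walks a share, totals bytes by file extension, and buckets capacity by time since last access. The share path and age buckets are hypothetical examples.

```python
import os
import time
from collections import defaultdict

def summarize(root):
    """Walk a directory tree and total bytes by file extension and by age bucket."""
    by_type = defaultdict(int)
    by_age = defaultdict(int)
    now = time.time()
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # skip files that vanish or deny access mid-scan
            ext = os.path.splitext(name)[1].lower() or "(none)"
            by_type[ext] += st.st_size
            # Bucket by days since last access as a rough archive/delete signal.
            days_idle = (now - st.st_atime) / 86400
            bucket = "<90d" if days_idle < 90 else "<1y" if days_idle < 365 else ">1y"
            by_age[bucket] += st.st_size
    return by_type, by_age

if __name__ == "__main__":
    types, ages = summarize("/srv/shares")  # hypothetical share root
    for ext, size in sorted(types.items(), key=lambda kv: -kv[1])[:10]:
        print(f"{ext:10s} {size / 2**30:8.1f} GiB")
    for bucket, size in ages.items():
        print(f"last accessed {bucket:5s} {size / 2**30:8.1f} GiB")
```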
Another area where a file analysis solution should provide help is dealing with governmental, industry and corporate regulations on data governance. For example, it should be able to provide insight into files that contain credit card information, social security numbers or other types of personal information.
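The simplest form of this kind of detection is pattern matching over file contents. The Python sketch below is purely illustrative and assumes a hypothetical share root; production data-governance tools use far more robust techniques (validation, context analysis, structured-document parsing) than these two regular expressions.

```python
import os
import re

# Illustrative patterns only: an SSN-style XXX-XX-XXXX and a 13-16 digit card-like number.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def flag_sensitive(root, max_bytes=1_000_000):
    """Yield paths of files whose text appears to contain SSN- or card-like patterns."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", errors="ignore") as fh:
                    text = fh.read(max_bytes)  # only sample the first chunk of each file
            except OSError:
                continue
            if SSN_RE.search(text) or CARD_RE.search(text):
                yield path

if __name__ == "__main__":
    for hit in flag_sensitive("/srv/shares"):  # hypothetical share root
        print("review:", hit)
```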
Finally, the file analysis tool should give IT the ability to forecast file growth so it can predict when the next storage purchase or file archive project needs to occur. Forecasting allows the organization to plan and budget for its next purchase, or to better plan the implementation of an archiving solution.
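A basic growth forecast can be as simple as fitting a trend line to periodic capacity samples and extrapolating to the capacity ceiling. The sketch below uses made-up monthly samples and a hypothetical 50 TB server; it illustrates the idea rather than any vendor's forecasting model.

```python
from datetime import date

# Hypothetical monthly capacity samples for one file server, in TB consumed.
samples = [
    (date(2017, 1, 1), 38.2),
    (date(2017, 2, 1), 39.5),
    (date(2017, 3, 1), 41.1),
    (date(2017, 4, 1), 42.4),
]
capacity_tb = 50.0  # total usable capacity of the server

# Ordinary least-squares fit of consumption versus days elapsed.
x = [(d - samples[0][0]).days for d, _ in samples]
y = [tb for _, tb in samples]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
intercept = my - slope * mx

# Days until the trend line crosses the capacity ceiling.
days_to_full = (capacity_tb - intercept) / slope
print(f"growth rate: {slope * 30:.2f} TB/month")
print(f"projected full in ~{days_to_full - x[-1]:.0f} days from the last sample")
```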
Most importantly, the solution should “just work.” IT professionals are already overwhelmed with the tasks of running the data center, and they don’t have time to babysit file management. The solution should not only be automated, it should also deliver many of the above capabilities through pre-built templates and reports.
NTP Software’s File Reporter
NTP Software’s File Reporter is file data analysis software that can scan a wide variety of Network Attached Storage (NAS) systems and file systems to gather information about the data stored there. It stores the information it captures in a SQL database so that organizations can run their own custom queries against it, if they so desire. But the software also comes with a wide selection of preconfigured reports and capabilities, so after the scan is complete IT can jump right in and start managing its file data better.
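File Reporter's actual SQL schema is not documented here, but a custom query against scan results might look something like the following Python/SQLite sketch. The table and column names, sample rows, and query are hypothetical; the point is that once scan metadata lives in a database, standard SQL answers questions like "which servers hold the most stale data, by file type."

```python
import sqlite3

# Hypothetical, simplified schema standing in for a real scan-results database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE files (
    server        TEXT,
    path          TEXT,
    owner         TEXT,
    extension     TEXT,
    size_bytes    INTEGER,
    last_accessed TEXT
);
INSERT INTO files VALUES
    ('filesrv01', '\\\\filesrv01\\eng\\build.iso',  'jsmith', '.iso',  4200000000, '2015-06-01'),
    ('filesrv01', '\\\\filesrv01\\hr\\review.docx', 'akhan',  '.docx',      84000, '2017-03-12');
""")

# A custom query of the sort an organization might run: top space consumers
# by server and extension, limited to data not touched in over a year.
rows = conn.execute("""
    SELECT server, extension, SUM(size_bytes) / 1e9 AS gb
    FROM files
    WHERE last_accessed < date('now', '-1 year')
    GROUP BY server, extension
    ORDER BY gb DESC
""").fetchall()
for server, ext, gb in rows:
    print(f"{server}  {ext:6s} {gb:8.1f} GB")
```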
Armed with File Reporter, an IT professional can visually see which locations, departments and users are consuming the most files. IT can also see which types of files are driving the biggest growth, again by file system, location or organizationally. The software also provides file growth forecasting so IT can predict when its next purchase will need to occur or when it will be time to start an archiving process.
File Reporter 8.1
NTP Software recently released version 8.1 of the solution, which adds powerful new capabilities as well as enhancing existing ones. First, reporting can now be filtered by server. Essentially, a filter is set once and all subsequent reports will only show information for that particular server.
Capacity understanding is now more granular. With Storage Host Drill Down, users can inspect the directories and files that consume the largest amount of disk space. The software can now also show data consumption by specific file type; users are no longer limited to broad categories like business or temporary files.
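Conceptually, a directory drill-down is a rollup of file sizes to every ancestor directory, ranked largest first. The Python sketch below shows that rollup under a hypothetical share root; it is a generic illustration, not how the Storage Host Drill Down feature is built.

```python
import os
from collections import defaultdict

def largest_directories(root, top=10):
    """Roll file sizes up to every ancestor directory and rank the heaviest."""
    root = os.path.abspath(root)
    totals = defaultdict(int)
    for dirpath, _, filenames in os.walk(root):
        size = 0
        for name in filenames:
            try:
                size += os.stat(os.path.join(dirpath, name)).st_size
            except OSError:
                continue  # skip unreadable or vanished files
        # Credit this directory's bytes to itself and every ancestor up to root,
        # so parent directories reflect everything stored beneath them.
        d = dirpath
        while True:
            totals[d] += size
            if d == root:
                break
            parent = os.path.dirname(d)
            if parent == d:  # reached the filesystem root; stop climbing
                break
            d = parent
    return sorted(totals.items(), key=lambda kv: -kv[1])[:top]

for path, size in largest_directories("/srv/shares"):  # hypothetical share root
    print(f"{size / 2**30:8.1f} GiB  {path}")
```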
In addition to more granularity at the storage host, the new release also provides greater granularity about the file owner. The software can now report on the types of files each owner holds, the space the owner consumes and the age of their files.
Projections can now be run on storage hosts, volumes or directories, including items that are not in a critical state. This capability allows organizations to identify servers that are growing slowly and are under-subscribed, making them candidates for new workloads or for having existing workloads migrated to them.
Scan flexibility has also improved. First, Windows servers can be remotely scanned without having to install the scanning agent. Remote scanning is ideal for capturing information from remote offices where Windows file servers may spring up.
In addition to remote Windows scanning, the new release adds support for NetApp Cluster Mode so users can scan vServers (Storage Virtual Machines), and support for EMC Unity file system scanning is being added as well.
Administrators can also control whether mapped drives and symbolic links are scanned and accounted for, which should eliminate potential overlap in file system information gathering.
StorageSwiss Take
Knowledge is power. Yet most IT professionals are under-informed when it comes to the data that resides on their file systems. They just blindly answer the never-ending call for more capacity. The time has come to regain control over unstructured data, but that control requires not only information about the unstructured data set but also the ability to visually see what’s happening with it. Solutions like NTP Software’s File Reporter provide that insight.