Mergers and acquisitions are commonplace in today’s business climate. A key aspect of any such transaction is how to combine the IT infrastructures of the two organizations. IT plays a critical role in making sure the combination is more than the sum of its parts, but it can also be a primary reason the combination fails. In the rush to complete the merger, there are seven common mistakes IT professionals make.
Mistake #1 – Jumping to Execution
The first mistake is rushing to execution. Before anything else, IT should scope the digital assets available to the combined entity. This step involves using a software solution to inventory both the physical storage systems and the digital data those systems store.
For physical systems, the assessment should inventory such things as make, model, types of storage media and which physical hosts are attached to each system. It should also provide information on how much load each storage system is under. Load is different from available capacity; it describes how busy the system is while supporting its attached hosts.
The goal of the physical audit is to give the IT team the information it needs to determine which storage systems need decommissioning, which are available for increased workloads and which should stay as they are.
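As an illustration of what the audit's output enables, the sketch below encodes that decommission/expand/keep triage as a simple rule over each system's utilization figures. It is a minimal sketch: the StorageSystem fields and the thresholds are assumptions for illustration, not output from any particular assessment tool.

```python
from dataclasses import dataclass

@dataclass
class StorageSystem:
    """One physical storage system captured during the assessment."""
    make: str
    model: str
    media_type: str           # e.g. "SSD", "10K SAS", "NL-SAS"
    attached_hosts: int
    capacity_used_pct: float  # how full the system is
    avg_load_pct: float       # how busy it is serving attached hosts

def triage(system: StorageSystem) -> str:
    """Classify a system as decommission, expand, or keep.

    Thresholds are illustrative placeholders; a real strategy
    would be driven by the assessment data itself.
    """
    if system.attached_hosts == 0 and system.capacity_used_pct < 5:
        return "decommission"
    if system.avg_load_pct < 40 and system.capacity_used_pct < 60:
        return "available for increased workloads"
    return "keep as-is"

# Example: a lightly loaded mid-range array can absorb new workloads.
array = StorageSystem("Acme", "FS-2000", "NL-SAS", 12, 45.0, 25.0)
print(triage(array))  # -> "available for increased workloads"
```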
For the data itself, the assessment should provide detail on the actual data being stored. The organization is looking to categorize data based on activity and criticality. In most cases, the majority of this data (more than 85%) is inactive and a candidate for archiving or migration to a less expensive tier of storage. Identifying this data is critical at this juncture because it shows the IT planners exactly which subset of data they need to be concerned with.
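Here is a minimal sketch of the data-side audit, assuming last-access time (atime) is a usable proxy for activity; real assessment tools examine far more metadata, and some file systems disable atime updates. The scan path is hypothetical.

```python
import os
import time

INACTIVE_AFTER = 365 * 24 * 3600  # one year, in seconds

def inactive_share(root: str) -> float:
    """Return the fraction of bytes under `root` not accessed in a year."""
    now = time.time()
    total = inactive = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # skip broken links and permission errors
            total += st.st_size
            if now - st.st_atime > INACTIVE_AFTER:
                inactive += st.st_size
    return inactive / total if total else 0.0

# Example: test the "more than 85% inactive" rule of thumb on a volume.
print(f"{inactive_share('/mnt/production') * 100:.1f}% inactive")
```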
Mistake #2 – Working in an IT Vacuum
The second mistake is that IT moves forward with migration without understanding the various processes both organizations were running, as well as the concerns of the stakeholders. It is important that IT engage the various stakeholders to get their input on the merger and to understand their concerns and priorities.
The reason IT and stakeholders avoid these meetings is that they are often unproductive: neither side is armed with enough background information or a realistic vision of what a future strategy might be. To make these meetings productive, it is critical that the assessment step come before engaging the stakeholders. At the end of the assessment, IT should have a basic strategy for which applications should move and where. The exact logistics of the move are less important at this stage than having a vision for how it should look.
Essentially, IT wants to open the conversation with an inventory of the storage options the merger makes available, and use that inventory as the starting point for the discussion between IT and the stakeholders. Again, it is important that the conversation take place before applying any changes. The feedback from these conversations will shape how the vision is actually executed.
Mistake #3 – Limiting Your Options
The third mistake most data centers make is limiting their options by trying to consolidate to a single data center, or even a single storage system within that data center. In the past, the motivation for this consolidation was based largely on the limits of technology: interconnecting multiple data centers and managing multiple storage systems was expensive and complicated.
Today, thanks to readily available bandwidth, software-defined networking (SDN) and modern data management software, it is possible to cost-effectively manage multiple storage systems across multiple data centers.
A multi-location organization has several advantages. First, the talent pool of potential employees increases significantly. No matter the state of the company, finding talented employees is always a challenge, and forcing those employees to relocate to a centralized location makes matters worse. However, having a remote office is not always enough. Team members in remote locations need direct access to certain applications and data sets, which in many cases means a localized data center.
The second advantage is having data closer to the point of creation. Many industries create or capture data in specific regions. Media and entertainment, for example, has on-location shoots. Energy and exploration has field services. Having a data center close to these locations may deliver advantages in time to value.
Finally, these additional sites create redundancy because they can back each other up. In the event of a data center outage, one of the other data centers can stand in its place. If IT implements technologies like SDN and data management, failover and failback between these sites can be seamless.
The key for the multi-site organization is to identify the role of each location and which data each location needs to support that role. IT needs to be careful not to let the multi-site approach lead to data inefficiencies; even in a multi-site organization, opportunities to consolidate and optimize data storage exist.
Mistake #4 – Moving a Cluttered House
A multi-site data center will still require data movement. Data will often need to move to another location to better support that location’s role. There should also be a large amount of data that can be removed (but not deleted) from its current storage system. In most data centers, at least 85% of the data on production systems has not been accessed in a year. The amount of inactive data should have been verified by the steps taken to avoid mistake #1, “Jumping to Execution.”
The question is what to do with all this inactive data. In reality, most of it can probably be deleted. The problem is that some data within that 85% absolutely cannot be deleted, and other data within that 85% might be of value in the future. Sorting out which is which, especially during an organizational merger, is a daunting task. In many cases it is less expensive, and safer, just to keep all data. That does not mean, however, that all of it should be on primary storage.
Secondary storage systems, such as Quantum’s Lattus object storage, Scalar tape libraries and cloud services like Glacier, are designed specifically to store this type of data. To lower costs, they are built from commodity hardware and leverage high-capacity drives. They also have data resiliency features to make sure data stored on them is protected from both media failure and data corruption.
The key to realizing the full benefit of a secondary storage system is a data management software solution, which automatically moves inactive data from primary storage to secondary storage. These solutions also provide transparent access to the data’s new location, so users can access it as if it had never moved.
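For illustration, the transparent-access behavior can be approximated with stub links: move a file to the secondary tier and leave a symlink behind so the original path still resolves. Commercial data management products implement this inside the file system rather than with symlinks, but the principle is similar; the paths in the example are hypothetical.

```python
import os
import shutil

def tier_down(path: str, secondary_root: str) -> None:
    """Move a file to secondary storage and leave a symlink in its place,
    so applications reading the original path are redirected transparently."""
    dest = os.path.join(secondary_root, path.lstrip(os.sep))
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    shutil.move(path, dest)  # copies across devices, then removes the source
    os.symlink(dest, path)   # the original path now points at the new tier

# Example with hypothetical paths:
# tier_down("/mnt/production/projects/2015/render.mov", "/mnt/archive")
```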
Assembling a solution can be challenging because the organization is asked to put together a cohesive strategy from dissimilar parts: a NAS storage system, data management software and several layers of secondary storage (object, tape and cloud). Fortunately, Quantum’s Artico appliance, powered by StorNext, provides front-end NAS data management as well as the ability to connect to object, tape or cloud storage for long-term retention of data.
With a secondary storage system and data management solution in place, IT can safely archive inactive data, greatly reducing the working set of data and allowing focus on the active data. For example, in a data center with 500TB of data, reducing the working set to 75TB makes management much easier.
Mistake #5 – Missing Operational Savings
The fifth mistake most organizations make during a merger is assuming the merger will require significant additional IT spending. In fact, implementing the secondary storage system and data management software should provide immediate cost savings by freeing up capacity on production storage; many organizations will not need to buy additional primary storage for years.
The implementation of a data management solution should also lead to operational savings. IT spends hours every day making sure the right data is on the right storage: rebalancing storage resources because a system has run out of capacity, or moving data to gain better performance. The data management solution handles these tasks automatically, freeing IT for other work.
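As a small illustration of the kind of daily chore being automated, the sketch below flags volumes that have crossed a capacity threshold; the mount points and the 80% threshold are assumptions for the example.

```python
import shutil

MOUNTS = ["/mnt/tier1", "/mnt/tier2"]  # hypothetical mount points
THRESHOLD = 0.80                       # flag volumes more than 80% full

for mount in MOUNTS:
    try:
        usage = shutil.disk_usage(mount)
    except OSError:
        continue  # mount point not present on this host
    used_frac = usage.used / usage.total
    if used_frac > THRESHOLD:
        print(f"{mount}: {used_frac:.0%} full -- rebalance or archive")
```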
Mistake #6 – The Endless Upgrade
IT assumes storage upgrades and system refreshes are a way of life. This is especially true in a merger, where “new systems” may actually be older systems no longer needed by one area of the combined entity but needed by another. In the past, migrating data from an old system to a new one was a time-consuming task with a high chance of error: data had to be copied to the new system, and then profiles updated to point to the data’s new location.
With a data management solution like Artico in place, the data management software can move data to the new system automatically. Since the data management solution is the central point of reference for data location, no profiles need to be updated; data moves to the new system without interruption to the users.
Mistake #7 – Leaving the Back Door Unlocked
Another consolidation target after a merger is the data protection process. Organizations can spend an inordinate amount of time trying to choose one backup application to protect the entire enterprise. While a noble effort, the reality is that most organizations need multiple applications to get the job done. Some applications are simply better at protecting certain environments, but few are good at protecting everything.
An area where consolidation can occur is data protection hardware, which is often siloed within each data protection application. Hardware solutions like Quantum’s DXi data protection appliances enable all of an organization’s data protection applications to send data to a single appliance, consolidating backup storage and increasing overall secondary storage efficiency. These systems are also WAN-efficient and can replicate data to other sites, cross-replicate data between sites and even replicate data to the cloud.
With consolidated backup storage in place, IT can start the process of selecting a single backup solution for the entire enterprise without time pressure. And since access to historical backups is still needed, a centralized storage solution also allows IT to easily keep one instance of an old backup application running for restores.
Conclusion
The goal of a merger or acquisition is to make the combined organization better than two stand-alone companies. In other words, the goal is to make 1 + 1 = 3 (or more). The same opportunity is available to IT – to create a better IT organization that is more responsive to the needs of the business. First, it must get a handle on the data assets of the two organizations, which can appear to be a daunting project. If IT avoids the common merger mistakes by establishing a strong data management foundation and consolidating backup, the merger process can result in a stronger, more efficient data center.
Quantum is a leading expert in scale-out tiered storage, archive and data protection, providing high-performance solutions for capturing, sharing and preserving digital assets in demanding workflows.