More than Migration: Developing a Holistic Cloud Strategy

Most organizations are considering how to take advantage of the tremendous resources made available by the cloud. The first step for many of these organizations is to create a cloud migration strategy. As part of this strategy, the organization decides which solution is best suited to copy or move its data to a cloud provider. The problem is that cloud migration implies a one-way, one-time event, which leaves the full potential of the cloud untapped.

The reality is that most organizations want to leverage a hybrid-cloud architecture where some applications stay on-premises, others move to the cloud and still others move back and forth. Another critical factor is that cloud providers are developing unique areas of expertise. Organizations need to move data between clouds to take advantage of unique cloud capabilities. IT needs to develop a holistic cloud strategy that embraces the hybrid and multi-cloud models.

The challenge in creating a more comprehensive cloud strategy is the time it takes to design and implement it. Organizations need a solution that enables them to start small with a single use case, but which scales to cover a wide variety of use cases.

Creating a Cloud Data Fabric

Most organizations chip away at the edges of the cloud’s potential, using it for a variety of initial use cases. Each of these initiatives typically leverages a different product, and those products are seldom compatible with each other.

A holistic cloud strategy starts with a solid foundation. The organization should build that foundation on a data structure more powerful than a file system: one that acts as a fabric, bringing together on-premises storage, cloud storage, and even multiple cloud providers. The fabric’s seamless integration of multiple storage points enables the organization to start its cloud journey with small steps like backup and grow to fully cloud-native applications.

Requirements for a Cloud Data Fabric

Requirement 1 – Broad Range of Protocol Support

The Cloud Data Fabric (CDF) needs to enable the organization to continue to use its existing storage protocols. Most organizations’ applications and services use file protocols like NFS, SMB, and the Apple Filing Protocol (AFP), as well as block protocols like iSCSI. The problem is that the cloud, by default, does not. The lack of compatible cloud-based protocols makes leveraging the cloud more difficult and more limited for the organization.

The CDF needs to provide the same protocols across both on-premises and cloud-based storage resources. Compatible protocols make the movement of data between on-premises storage and cloud storage seamless. The fabric requires no data conversion, so legacy applications and services run unchanged. Using compatible protocols also better enables the movement of data back on-premises and creates a true hybrid IT model.
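
As a simple illustration of what protocol compatibility buys, the sketch below (Python, with a hypothetical mount point) shows an application that reads and writes through a standard NFS or SMB mount point; it needs no code changes whether the fabric backs that mount with an on-premises NAS or with cloud storage.

```python
# Minimal sketch: the application only sees a POSIX path. Whether the data
# fabric backs /mnt/fabric/projects with an on-premises NAS or with cloud
# storage is invisible to it. The mount point is a hypothetical example.
from pathlib import Path

SHARE = Path("/mnt/fabric/projects")  # same NFS/SMB mount point either way

def save_report(name: str, contents: bytes) -> None:
    (SHARE / name).write_bytes(contents)

def load_report(name: str) -> bytes:
    return (SHARE / name).read_bytes()
```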

Requirement 2 – Accelerate Cloud Transfers

The seamless transfer of data between on-premises storage and the cloud provides great benefit to the organization, but the time it takes to move data in either direction may make the organization more resistant to the change. A long transfer time may eliminate any potential performance gains from moving the workload.

Because it is a fabric, the CDF controls both ends of the cloud connection (on-premises and cloud). The CDF should optimize data transfers between these storage points instead of counting on basic IP transfers. Without breaking compatibility, the CDF provides a custom packet transfer method, so customers should expect better link efficiency and reduced packet loss. The resulting 5X or better transfer speed justifies data movement even for a small performance gain.
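
To show why transfer speed matters, the back-of-the-envelope arithmetic below (illustrative numbers, not vendor benchmarks) compares moving a 10 TB data set over a 1 Gbps link at an assumed 60% effective utilization with the same move at a 5X improvement.

```python
# Back-of-the-envelope transfer-time arithmetic; all numbers are illustrative.
data_bytes = 10 * 10**12            # 10 TB data set
link_bps = 1 * 10**9                # 1 Gbps WAN link
baseline_efficiency = 0.6           # assumed utilization of a plain TCP/IP transfer

baseline_hours = data_bytes * 8 / (link_bps * baseline_efficiency) / 3600
accelerated_hours = baseline_hours / 5   # the 5X improvement described above

print(f"baseline:    {baseline_hours:.1f} hours")    # roughly 37 hours
print(f"accelerated: {accelerated_hours:.1f} hours") # roughly 7.4 hours
```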

Requirement 3 – Optimize Cloud Data

Cloud storage provides the advantage of reduced upfront costs, but it does not necessarily assure reduced long-term costs. The CDF needs to help the organization optimize its cloud storage spend. On-premises storage systems typically leverage data efficiency techniques like deduplication and compression, but cloud-based storage solutions seldom provide data reduction capabilities.

The CDF needs to provide deduplication and compression for the cloud storage resources it manages. Providing data efficiency enables CDF customers to enjoy the upfront cost savings of the cloud while also reducing its long-term costs.
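
The sketch below shows the basic idea behind deduplication and compression: split data into chunks, keep only one copy of each unique chunk, and compress the chunks that are actually stored. It is a simplified illustration of the technique, not a description of any particular product’s implementation.

```python
# Simplified deduplication-plus-compression sketch: fixed-size chunking,
# content hashing to detect duplicates, and compression of unique chunks.
import hashlib
import zlib

CHUNK_SIZE = 64 * 1024  # real systems often use variable-size chunking

def store_with_reduction(data: bytes, store: dict) -> list:
    """Return the ordered list of chunk hashes needed to rebuild `data`."""
    recipe = []
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:                 # only new chunks consume capacity
            store[digest] = zlib.compress(chunk)
        recipe.append(digest)
    return recipe

store = {}
recipe = store_with_reduction(b"A" * CHUNK_SIZE * 16, store)
print(len(recipe), "chunks referenced;", len(store), "unique chunk stored")
```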

Requirement 4 – Leverage All of the Cloud

Every cloud provider has multiple storage tiers. The provider typically offers a performance tier for applications and active data, and a moderately performing tier for less frequently accessed or less performance-sensitive data sets. Providers now also offer cold storage tiers for data retention. Each tier is less expensive than the tier above it. If the organization intelligently moves data between these tiers, it further drives down cloud storage costs.

Typical cloud storage products lack support for these various tiers. A CDF should support multiple cloud tiers by automatically moving data between tiers of storage based on data access parameters or administrator-established policies. The CDF could even span tiers between on-premises and cloud storage, moving inactive data from on-premises storage to cloud storage.
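
As one concrete example of policy-driven tiering at a single provider, the sketch below uses Amazon S3 lifecycle rules via boto3 to demote objects to an infrequent-access class after 30 days and to an archive class after 90 days. The bucket name and day counts are hypothetical; a CDF would apply comparable policies across providers and between on-premises and cloud storage.

```python
# Illustrative lifecycle policy on a single provider (Amazon S3 via boto3).
# The bucket name and day thresholds are hypothetical examples.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="example-cdf-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-inactive-data",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply the rule to all objects
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},  # moderate tier
                    {"Days": 90, "StorageClass": "GLACIER"},      # cold tier
                ],
            }
        ]
    },
)
```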

Using the Foundation

The CDF enables many use cases and makes the organization truly hybrid-IT ready. With a cloud-based file fabric, the organization can easily copy data to the cloud and direct backup copies there. It can also implement more interactive initiatives that use the cloud for more than just a digital dumping ground, such as file services consolidation.

As we’ll detail in chapter 4, file services consolidation uses the cloud as the central storage repository for data typically stored on multiple NAS systems spread throughout the organization. The CDF creates a global SMB and NFS mount point, accessible by all the organization’s locations, eliminating the need for on-premises file servers and multi-site coordination of data.

StorageSwiss Take

Many organizations stumble into a cloud strategy, ending up with multiple strategies based on the various potential use cases. Instead, organizations need to start with a strong foundation, a cloud data fabric. The CDF enables and optimizes data movement between on-premises and cloud storage and across clouds. Once in place, the CDF makes the first few cloud use cases straightforward and lays the groundwork for more sophisticated future use.

Sponsored by SoftNAS

George Crump is the Chief Marketing Officer at VergeIO, the leader in Ultraconverged Infrastructure. Prior to VergeIO he was Chief Product Strategist at StorONE. Before assuming roles with innovative technology vendors, George spent almost 14 years as the founder and lead analyst at Storage Switzerland. In his spare time, he continues to write blogs on Storage Switzerland to educate IT professionals on all aspects of data center storage. He is the primary contributor to Storage Switzerland and is a heavily sought-after public speaker. With over 30 years of experience designing storage solutions for data centers across the US, he has seen the birth of such technologies as RAID, NAS, SAN, Virtualization, Cloud, and Enterprise Flash. Before founding Storage Switzerland, he was CTO at one of the nation's largest storage integrators, where he was in charge of technology testing, integration, and product selection.
