The Challenges of After-The-Fact Encryption

Encryption is a foundational component of a cyber-secure storage system. The problem is that most organizations implement encryption after the fact, months if not years after the initial deployment. When IT evaluates a new storage system, it tends to focus on capacity or performance rather than the system's security features.

The Problems with After-the-Fact Encryption

Once an organization realizes it needs encryption, it typically implements it by either installing self-encrypting drives (SEDs) into the system or installing software that creates an encrypted volume or file system. Both methods create problems for IT. Using SEDs requires placing the drives into a RAID group, after which IT needs to create a new volume. Choosing a software method requires IT to assign capacity to the new volume or file system.

SEDs protect data only when a drive is replaced or discarded. A disconnected SED loses access to its hardware encryption and is unreadable. Once an SED is back in the storage system, however, it is readable by anyone with access to that system.

Software-based encryption requires that an attacker have not only access to the storage system but also an authorized login ID for the volumes on it. The file system or volume that software-based encryption creates, however, may not have the same features as the one it replaces, and the lack of feature parity may force the organization to sacrifice data protection or data management capabilities. As we discussed in our first blog, this approach also leads to a separate encryption solution for each storage type in the environment.

The next step is for IT to move data from the unencrypted volumes to the new encrypted stores. Even if the data is moving to a volume on the same storage system, it must travel out of the system through the server initiating the move and then back to the encrypted volume. The network overhead only adds to the time the migration takes. Finally, IT needs to test the applications that depend on this data to make sure they still function correctly.
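The cost of that round trip is easy to underestimate. A back-of-envelope sketch (the dataset size and link speed below are illustrative assumptions, not figures from any particular environment) shows why the migration window grows quickly: every block leaves the array, passes through the host, and comes back, so the data crosses the network twice.

```python
def migration_hours(dataset_tb: float, network_gbps: float) -> float:
    """Rough transfer time for a host-mediated copy to an encrypted volume.

    The host reads every block off the array and writes it back to the
    encrypted volume, so the data crosses the network twice.
    """
    bits = dataset_tb * 1e12 * 8            # dataset size in bits
    traffic = 2 * bits                      # out to the host, then back
    seconds = traffic / (network_gbps * 1e9)
    return seconds / 3600

# Example: 100 TB over a single 10 Gbps link is roughly 44 hours of
# pure transfer time, before application testing or cutover windows.
print(round(migration_hours(100, 10), 1))
```

Real migrations rarely saturate the link, so these numbers are a floor, not an estimate of the full project.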

Another challenge for IT is the overhead encryption can introduce. Encrypting every write and decrypting every read adds processing time. The problem is that developers built, and users experienced, these applications without that overhead, so a performance loss is more likely to be noticed because everyone is accustomed to the unencrypted state. Additionally, because the storage software doesn't integrate encryption as a core feature, the encryption process isn't as efficient as it could be.
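To make the per-I/O cost concrete, here is a minimal sketch that times a 1 MiB write-then-read through a toy stream cipher. The cipher itself (a SHA-256 keystream XOR) is a deliberate stand-in, not what any storage product uses; real systems use hardware-accelerated AES. The point is only that every read and write now pays a cipher cost that the unencrypted path never did.

```python
import hashlib
import os
import time

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream.

    Illustrative only; stands in for the AES encryption a real
    storage stack would apply on every write and read.
    """
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        block = hashlib.sha256(
            key + nonce + counter.to_bytes(8, "big")
        ).digest()
        stream.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

key, nonce = os.urandom(32), os.urandom(16)
payload = os.urandom(1024 * 1024)   # a 1 MiB application "write"

start = time.perf_counter()
ciphertext = keystream_xor(key, nonce, payload)      # cost added to the write
plaintext = keystream_xor(key, nonce, ciphertext)    # cost added to the read
elapsed = time.perf_counter() - start

assert plaintext == payload  # the round trip is lossless
print(f"1 MiB write+read added {elapsed * 1000:.1f} ms of cipher work")
```

The absolute milliseconds will vary by machine; what matters is that the cost recurs on every I/O, which is why bolted-on encryption, which cannot be optimized inside the storage software, is felt by users.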

The combination of the time required to move data to the encrypted volumes, the lack of feature parity, and the lack of consistent encryption throughout the data lifecycle leads organizations to be more selective about which data is encrypted, which of course exposes them to greater security risk.

The Value of Always-On Encryption

Always-on encryption is integrated directly into the same storage software that provides features like snapshots, replication, and auditing. Tighter integration with the core storage software makes the encryption process more efficient and lowers its overall overhead. The storage system encrypts all data from the point of initial power-up onward, so developers create applications, and users experience data, in only one state: encrypted. If a performance problem does present itself, IT can address it up front, before it impacts the user experience.

If the system is designed to consolidate storage silos and provide storage for all workloads, then the always-on encryption is consistent throughout the data’s lifecycle.

StorageSwiss Take

Encryption is the foundational component of a cyber-secure storage system, but implementing it after the fact causes problems that may make broad adoption difficult. Encryption should not require a strategy in which IT tries to decide which data needs encryption and which does not. Encryption should be a holistic process, applied to all data, all the time. Always-on encryption makes that holistic approach far more feasible.

In our prior blog we discussed the importance of not watering down always-on encryption by limiting user access and leveraging automation. The reality, though, is that even with limited user access and always-on encryption, a breach, while much less likely, is still possible. In our next blog, we discuss the importance of real-time interactive reporting to confirm a secure state and to enable a quick reaction if a breach occurs.


George Crump is the Chief Marketing Officer at VergeIO, the leader in Ultraconverged Infrastructure. Prior to VergeIO he was Chief Product Strategist at StorONE. Before assuming roles with innovative technology vendors, George spent almost 14 years as the founder and lead analyst at Storage Switzerland. In his spare time, he continues to write blogs on Storage Switzerland to educate IT professionals on all aspects of data center storage. He is the primary contributor to Storage Switzerland and is a heavily sought-after public speaker. With over 30 years of experience designing storage solutions for data centers across the US, he has seen the birth of such technologies as RAID, NAS, SAN, Virtualization, Cloud, and Enterprise Flash. Before founding Storage Switzerland, he was CTO at one of the nation's largest storage integrators, where he was in charge of technology testing, integration, and product selection.
