
The Impact of Computing Power on Cryptography

Blog Article Published: 09/21/2012

Advanced technology is a beautiful thing. Not only has it enabled the creation of new, more efficient methods of application delivery and data storage (the cloud is a prime example), but it has also helped propel the development of more sophisticated solutions for data protection (think tokenization and encryption). That said, a challenge accompanies this evolution of technology – the savvy cybercriminal. Determined to keep pace with, or even ahead of, each advance in data protection, these criminals pose a huge threat to corporations and governments worldwide. Professor Fred Piper, a renowned cryptographer from Royal Holloway, University of London, recently shared his views on the issue, and they included a sobering assertion: cybercriminals – with the assistance of anticipated breakthroughs in computing known as quantum computing – could theoretically break the encryption algorithms in use today. Imagine the consequences.

But quantum computing is not a reality yet, and it will take longer still for it to get into the hands of cybercriminals, so why worry, right? It turns out that while Piper was focused on the impact of quantum computing, he was also shining a light on a nearer-term threat: the increased access to supercomputing made possible by the cloud. While evangelists of cloud-based supercomputing tout the ease with which Small and Medium Enterprises (SMEs) can now harness computing power to run things such as fluid-dynamics models, that same power can also be turned against computer security systems, and weak cryptography in particular. Knowing that cybercriminals can rent this sort of computing power gives enterprises yet another reason to use the strongest cryptographic approaches available when encrypting their most sensitive data. Sub-standard encryption is far more likely to be cracked using the tools now available via the cloud.
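
To put rough numbers on the point, here is a back-of-the-envelope sketch in Python. The assumed attack rate of one billion key tests per second is purely illustrative, standing in for rentable cloud compute rather than any measured figure; the arithmetic simply shows how brute-force cost scales with key length.

```python
# Back-of-the-envelope sketch (not a benchmark): how key length changes the
# cost of a brute-force key search, assuming a hypothetical rented rig that
# tests one billion keys per second. The rate is an illustrative assumption.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365
KEYS_PER_SECOND = 1e9  # assumed attack rate

for name, bits in [("DES (56-bit)", 56), ("AES-128", 128), ("AES-256", 256)]:
    keyspace = 2 ** bits
    # On average an attacker finds the key after searching half the keyspace.
    years = (keyspace / 2) / KEYS_PER_SECOND / SECONDS_PER_YEAR
    print(f"{name}: ~{years:.3g} years on average")
```

Under these assumptions a 56-bit key falls in about a year, while a 128-bit or 256-bit key remains far beyond reach; the lesson is that only the weak end of the spectrum is threatened by raw computing power.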

This strong security approach needs to be applied to data being transferred to the cloud ("in flight"), being processed in the cloud, or being stored in the cloud ("at rest"). To help the industry stay ahead of these issues, organizations such as the National Institute of Standards and Technology (NIST) have issued standards such as the Federal Information Processing Standards (FIPS) for use across the United States federal government. FIPS 140-2 defines security requirements for cryptographic modules, and its accompanying accreditation program validates that modules produced by private-sector companies meet those well-defined requirements. FIPS 140-2 has also been adopted as a minimum standard within many industries, including finance and healthcare.
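
As a concrete, simplified illustration, the sketch below encrypts a record with AES-256 in GCM mode using the open-source Python cryptography package before the data leaves the enterprise. Note the hedge: FIPS 140-2 validation applies to a tested cryptographic module, not to application code like this; the sketch only shows the pattern of a strong algorithm with a key that never leaves home.

```python
# Minimal sketch: protecting a sensitive record with AES-256-GCM before it
# is sent to ("in flight") or stored in ("at rest") the cloud. Uses the
# open-source 'cryptography' package; a FIPS 140-2 deployment would require
# a validated cryptographic module, which this sketch is not.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # stays inside the enterprise
aesgcm = AESGCM(key)

nonce = os.urandom(12)                     # must be unique per message
plaintext = b"account=4111111111111111"
ciphertext = aesgcm.encrypt(nonce, plaintext, None)

# Only nonce + ciphertext travel to the cloud; the key never does.
recovered = AESGCM(key).decrypt(nonce, ciphertext, None)
assert recovered == plaintext
```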

Case closed, right? Wrong. While standards are extremely valuable, they have to be applied correctly. And, regrettably, some players have caused confusion in the market by attaching terms such as "military grade encryption" to a technique known as "Functionality Preserving Encryption" (which carries lesser validation than FIPS 140-2). Organizations should carefully consider the strength of the encryption being used to safeguard their information and avoid proprietary, "closed" approaches that have not been published or peer reviewed. There may also be industry or regulatory mandates to use a certain type of encryption, depending on the business realm(s) in which the organization operates. And if preserving the functionality of SaaS applications, such as searching and sorting, is important, the organization should ensure this remains possible when implementing the level of encryption it wants (or is required) to use.

The challenge with encryption is that once attackers obtain the key, it is effectively broken: they can decipher all the data encrypted with that key. Weak cryptography that can be broken using newly available supercomputing power poses a serious risk to organizations, which face criminal charges, civil liabilities, and brand damage should a data breach occur. It is therefore imperative that organizations use the strongest encryption they can, to forestall accusations that slipshod security, especially when tied to cost-saving efforts, contributed to a breach.

Enterprises should also strongly consider tokenization as an option for obfuscating sensitive information. Tokenization is a process by which a data field, such as a primary account number (PAN) from a credit or debit card, is replaced with a surrogate value called a token. Only the token value is transmitted to the cloud, and the real value is securely stored inside the enterprise network. (De-tokenization is the reverse process of redeeming a token for its associated original value, and the process must occur within the enterprise firewall.)

While there are various approaches to creating tokens, they are typically just randomly generated values that have no mathematical relation to the original data field. Herein lies the inherent security of the approach – it is practically impossible to determine the original value of the sensitive data field from the surrogate token value alone. The best an attacker can do is guess. This means that if a criminal gained access to the token in the cloud, not even a supercomputer could "reverse" the token into its original value, because there is simply no path back to the original. (Even a quantum computer could not decipher it back into its original form.) The true data value never leaves the safety of the organization's firewall.
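
A minimal sketch of this idea follows. The TokenVault class and its method names are hypothetical, standing in for a real token server that would live behind the enterprise firewall; the essential property is that tokens are drawn at random, so the mapping exists only in the vault.

```python
# Illustrative token vault sketch: tokens are random values with no
# mathematical relationship to the PAN, so no computation -- on a
# supercomputer or otherwise -- can recover the original from the token.
# TokenVault, tokenize, and detokenize are hypothetical names.
import secrets

class TokenVault:
    """Lives behind the enterprise firewall; only tokens go to the cloud."""
    def __init__(self):
        self._token_to_value = {}

    def tokenize(self, pan: str) -> str:
        # Random 16-digit surrogate (format-preserving); regenerate on the
        # vanishingly rare chance of a collision with an existing token.
        while True:
            token = "".join(secrets.choice("0123456789") for _ in range(16))
            if token not in self._token_to_value:
                break
        self._token_to_value[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # De-tokenization must occur inside the enterprise network.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(token)                    # surrogate value sent to the cloud
print(vault.detokenize(token))  # real PAN, recoverable only via the vault
```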

Some companies determine that they don't even have a choice in the matter, since legal requirements in certain jurisdictions mandate that data physically reside within country borders at all times. Even with strong encryption, these restrictions previously kept cloud computing solutions from even being considered. Tokenization technology provides a workable solution in these instances, overcoming the strict data residency rules enforced in many countries and satisfying both the need to capitalize on the latest breakthroughs in cloud computing and the need to ensure the security and compliance of sensitive information.

So, while advanced technology and computing models – coupled with increasing threats from hackers, code-breakers, and cybercriminals – are forcing new innovations in cloud security, companies should know they have solid options here and now. Strong encryption is a requirement for any organization putting sensitive data in the cloud. Tokenization – often overlooked as a data protection method – offers one of the most compelling options for securing sensitive data, preserving application functionality, and enabling regulatory compliance.

Eric Hay is PerspecSys’ worldwide director, field engineering. Eric and his team are responsible for deploying PerspecSys solutions for enterprise customers needing to secure sensitive information when using Cloud applications. A software industry veteran, Eric has specialized in computer security throughout his career at companies like Netegrity, Credant Technologies and Invincea.
