When Good Is Not Good Enough: NIST Raises the Bar for Cloud Data Protection Vendors

February 21, 2013

In 2012, the National Institute of Standards and Technology (NIST) released a publication titled Cloud Computing Synopsis and Recommendations (Special Publication 800-146), which describes the current cloud computing environment in detail, explains the economic opportunities and risks associated with cloud adoption, and openly addresses the security and data privacy challenges. NIST makes numerous recommendations for companies and agencies considering a move to the cloud, including a strong case for uniform management practices in the data security and governance arenas.


The report highlights several reasons why cloud-based SaaS applications present heightened security risks. To offset these threats, NIST's recommendation on cloud encryption is clear-cut: organizations should require FIPS 140-2 validated encryption to protect their sensitive data assets. This applies to stored data as well as application data, and for Federal agencies it is a firm requirement, not simply a best practice or recommended guideline.


What does FIPS 140-2 validation mean? A vendor whose cryptographic module attains this validation has demonstrated, through testing by an accredited laboratory, that its solution:


  1. Uses an approved algorithm,
  2. Handles the encryption keys appropriately, and
  3. Always handles the data to be encrypted in a defined way: with a set block size, appropriate padding, and enough randomness that the ciphertext cannot be searched.


Compare this to another standard, FIPS 197, which defines the Advanced Encryption Standard (AES) algorithm itself. AES is used worldwide and is approved by the U.S. government, but on its own it satisfies only one condition listed above: condition (1), "uses an approved algorithm." An encryption solution that merely incorporates the FIPS 197 validated algorithm does not meet security requirements (2) and (3), and hence cannot be validated under FIPS 140-2, which limits its usefulness for those who need demonstrably strong encryption.


Why is validation important? It is a big deal for security professionals entrusted with deploying systems that protect sensitive data. These differing standards leave the door open for confusion amid competing market claims. Some vendors say, "We can do AES encryption, so our encryption is good," or "We use military-grade encryption." The reality is that if a solution is not FIPS 140-2 validated, calling it military grade is misleading.


One of the hottest areas for encryption technology is the cloud: specifically, encrypting sensitive data stored and processed within SaaS or PaaS applications such as Oracle CRM On Demand or Salesforce.com. Strongly encrypting data, for example with a FIPS 140-2 validated cryptographic module, can "break" the end user's experience with an application. What happens when you search on a field like LAST NAME if all of the values stored in that field, such as "Smith," have been encrypted? Your search comes back empty, and you are left a frustrated user.
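A few lines of Python make the search problem concrete. This is a toy randomized cipher standing in for AES with a random IV; the XOR-with-hash construction below is for illustration only and is not a real encryption primitive:

```python
import os
import hashlib

def toy_randomized_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy stand-in for a randomized cipher (e.g., AES with a random IV):
    # a fresh random nonce makes every ciphertext unique.
    nonce = os.urandom(16)
    keystream = hashlib.sha256(key + nonce).digest()
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))
    return nonce + ciphertext

key = b"sixteen byte key"
c1 = toy_randomized_encrypt(key, b"Smith")
c2 = toy_randomized_encrypt(key, b"Smith")
print(c1 != c2)  # True: same name, different ciphertexts
# A cloud-side equality search that compares stored ciphertexts against a
# freshly encrypted "Smith" therefore finds no matches.
```

This is exactly why strong, properly randomized encryption and naive server-side search do not mix: two encryptions of the same value share no recognizable structure.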


A new class of products, which Gartner calls Cloud Encryption Gateways, has emerged to tackle this challenge. These solutions encrypt sensitive data before it leaves an organization's firewall, so only an encrypted value goes into the cloud for processing and storage. They also promise to "preserve functionality": you can still pull up a last name like SMITH with a search for SMI* even though the names stored in the cloud are encrypted. Cool, right?
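As a rough illustration of the gateway's substitution flow, here is a minimal sketch; the field names are hypothetical and the reversible "cipher" is a placeholder, not an encryption scheme:

```python
# Hypothetical gateway flow: sensitive fields are replaced with ciphertext
# before a record leaves the firewall, and restored on the way back.
SENSITIVE_FIELDS = {"last_name", "ssn"}

def outbound(record: dict, encrypt) -> dict:
    # Substitute ciphertext for sensitive values before sending to the cloud.
    return {k: encrypt(v) if k in SENSITIVE_FIELDS else v
            for k, v in record.items()}

def inbound(record: dict, decrypt) -> dict:
    # Restore plaintext when responses come back through the gateway.
    return {k: decrypt(v) if k in SENSITIVE_FIELDS else v
            for k, v in record.items()}

enc = lambda s: s[::-1]  # placeholder "cipher" for illustration only
dec = lambda s: s[::-1]

sent = outbound({"last_name": "Smith", "city": "Austin"}, enc)
print(sent["last_name"])      # only the substituted value goes to the cloud
print(inbound(sent, dec))     # the original record is restored behind the firewall
```

The design point is that the cloud application only ever sees the substituted values; the real data and the keys stay on-premises.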


But be careful: some vendors perform this "magic" by modifying the encryption scheme so that a few characters always line up the same way, in order to preserve the functionality I described (common operations like searching and sorting). This approach uses a weakened form of encryption that is certainly not FIPS 140-2 validated. From a certification standpoint it has no strength behind it; at best it carries a certification that says, "If you run these strings through a certain way, you will get a result that looks like this" (FIPS 197).
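To see why this is weaker, consider a hypothetical sketch of such a functionality-preserving scheme (the function name and parameters are invented for illustration): it keeps the first few characters in the clear so wildcard searches like SMI* still work, and deterministically scrambles the rest.

```python
import hashlib

def prefix_preserving_pseudo_encrypt(value: str, keep: int = 3) -> str:
    # Hypothetical "weakened" scheme: leave a prefix in the clear so
    # wildcard search still works, deterministically scramble the rest.
    head, tail = value[:keep], value[keep:]
    scrambled = hashlib.sha256(tail.encode()).hexdigest()[:8]
    return head + scrambled

print(prefix_preserving_pseudo_encrypt("Smith"))
print(prefix_preserving_pseudo_encrypt("Smithson"))
# Both outputs start with "Smi", so a SMI* search still matches, but the
# scheme leaks the prefix outright and is deterministic, so identical
# values always produce identical outputs, inviting frequency analysis.
```

Compare this with the randomized toy cipher earlier: there, two encryptions of "Smith" differ; here, they are identical and partially readable, which is the tradeoff the vendor has quietly made.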


It is important to remember that the U.S. Federal government treats an implementation of AES without FIPS 140-2 validation as clear text. Why? When you water down an encryption algorithm, as in the earlier example, you open the encryption engine to cryptanalysis, creating a much easier path to cracking the data and, by definition, putting sensitive data at risk. Solutions using these weakened algorithms force enterprises into a difficult tradeoff between meeting requirements for data privacy and protection and preserving the overall usability of their applications. This is a no-win scenario.


The good news is that there are some innovative approaches out there that do not rely on this sort of methodology. So my advice is to do your homework, ask the hard questions of your suppliers, and make sure your information is protected by the strongest techniques possible. Enterprises can find solutions that will keep all of their interested parties satisfied:


  • Privacy & Security Professionals: Can use industry-acknowledged strong encryption techniques, such as FIPS 140-2 validated encryption, or tokenization
  • Business End-Users: Can get all of the SaaS or PaaS application functionality they need – security does not “break” the application’s usability
  • IT Professionals: Can deploy a standards-based, scalable platform that meets security and business needs and scales to support multiple clouds


An alternative technique, tokenization, also deserves a mention. Tokenization, a sort of "first cousin" of encryption, is a process by which a data field, such as a primary account number (PAN) from a credit or debit card, is replaced with a surrogate value called a token. De-tokenization is the reverse process of redeeming a token for its associated original value.


While there are various approaches to creating tokens, they typically are simply randomly generated values that have no mathematical relation to the original data field. Herein lies the inherent security of the approach – it is nearly impossible to determine the original value of the sensitive data field by knowing only the surrogate token value. So if a criminal got access to the token in the cloud, there is no “key” that could ever decipher it. The true data value never leaves the safety of the token vault stored securely behind an organization’s firewall.
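A minimal sketch of this vault-based approach, with hypothetical names, shows the essential property: the token is pure randomness, and the only way back to the real value is the mapping held behind the firewall.

```python
import secrets

class TokenVault:
    # Minimal sketch of a token vault kept behind the firewall: tokens are
    # random surrogates with no mathematical link to the original value.
    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token for a known value; otherwise mint a
        # fresh random one and record the mapping in the vault.
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can redeem a token for the original value.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")  # a sample PAN
print(token)                   # only this random surrogate goes to the cloud
print(vault.detokenize(token)) # the vault redeems it for the original
```

Because the token carries no key material and no derivation from the PAN, an attacker who steals it from the cloud holds nothing decipherable; recovery requires the vault itself.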


Tokenization as an obfuscation technique has proven especially useful for organizations in some geographic jurisdictions with legal requirements specifying that sensitive data physically reside within country borders at all times. Privacy and security professionals find that tokenization technology provides a workable solution in these instances and overcomes the strict data residency rules enforced in many countries.


So whether you choose industry-acknowledged strong encryption or tokenization, make sure the techniques are robust and validated and will allow you to meet your security objectives. Never lose sight of the primary goal of adopting a security solution, and avoid the temptation to sacrifice security strength for usability benefits. In the end, it truly is not worth the risk.


David Stott is senior director, product management, at PerspecSys where he leads efforts to ensure products and services meet market requirements.

