The Dark Side of Big Data: CSA Opens Peer Review Period for the “Top Ten Big Data and Privacy Challenges” Report

February 25, 2013

Big Data seems to be on the lips of every organization’s CXO these days. By exploiting Big Data, enterprises are able to gain valuable new insights into customer behavior via advanced analytics. However, what often gets lost amidst all the excitement are the many very real security and privacy issues that go hand in hand with Big Data. Traditional security mechanisms were simply never designed to deal with the reality of Big Data, which often relies on distributed, large-scale cloud infrastructures, a diversity of data sources, and the high volume and frequency of data migration between different cloud environments.

To address these challenges, the CSA Big Data Working Group released an initial report, The Top 10 Big Data Security and Privacy Challenges, at CSA Congress 2012. It was the first such industry report to take a holistic view of the wide variety of Big Data challenges facing enterprises. Since then, the group has been working to further its research, assembling detailed information and use cases for each threat. The result is the first Top 10 Big Data and Privacy Challenges report and, beginning today, the report is open for peer review, during which CSA members are invited to review and comment on it prior to its final release. The 35-page report outlines the unique challenges presented by Big Data through narrative use cases and identifies the dimension of difficulty for each challenge.

The Top 10 Big Data and Privacy Challenges have been enumerated as follows:

  1. Secure computations in distributed programming frameworks
  2. Security best practices for non-relational data stores
  3. Secure data storage and transaction logs
  4. End-point input validation/filtering
  5. Real-time security monitoring
  6. Scalable and composable privacy-preserving data mining and analytics
  7. Cryptographically enforced data-centric security
  8. Granular access control
  9. Granular audits
  10. Data provenance

The goal of outlining these challenges is to raise awareness among security practitioners and researchers so that industry-wide best practices might be adopted to address these issues as they continue to evolve. The open review period ends March 18, 2013. To review the report and provide comments, please visit https://interact.cloudsecurityalliance.org/index.php/bigdata/top_ten_big_data_2013.

Tweet this: The Dark Side of Big Data: CSA Releases Top 10 Big Data and Privacy Challenges Report. http://bit.ly/VHmk0d

CSA Releases CCM v3.0

February 25, 2013

The Cloud Security Alliance (CSA) today released a draft of the latest version of the Cloud Controls Matrix, CCM v3.0. This latest revision to the industry standard for cloud computing security controls realigns the CCM control domains to achieve tighter integration with the CSA’s “Security Guidance for Critical Areas of Focus in Cloud Computing version 3” and introduces three new control domains. Beginning February 25, 2013, the draft version of CCM v3.0 will be available for peer review through the CSA Interact website, with the peer review period closing March 27, 2013, and the final release of CCM v3.0 scheduled for April 1, 2013.

The three new control domains, “Mobile Security,” “Supply Chain Management, Transparency and Accountability,” and “Interoperability & Portability,” address the rapidly expanding methods by which cloud data is accessed, the need to ensure due care is taken across the cloud provider’s supply chain, and the minimization of service disruptions in the face of a change to a cloud provider relationship.

The “Mobile Security” controls are built upon the CSA’s “Security Guidance for Critical Areas of Mobile Computing, v1.0” and are the first mobile-device-specific controls incorporated into the Cloud Controls Matrix.

The “Supply Chain Management, Transparency and Accountability” control domain seeks to address risks associated with governing data within the cloud, while the “Interoperability & Portability” domain brings to the forefront considerations for minimizing service disruptions in the face of a change in a cloud vendor relationship or an expansion of services.

The realigned control domains have also benefited from changes in language that improve the clarity and intent of each control and, in some cases, from realignment within the expanded control domains to ensure cohesiveness within each domain and minimize overlap.

The draft of the Cloud Controls Matrix can be downloaded from the Cloud Security Alliance website, and the CSA welcomes peer review through the CSA Interact website.

The CSA invites all interested parties to participate in the peer review and in the CSA Cloud Controls Matrix Working Group Meeting to be held during the week of the RSA Conference, at 4 pm PT on February 28, 2013, at the Sir Francis Drake Hotel, Franciscan Room, 450 Powell St., San Francisco, CA.

CSA Drafts New SOC Position Paper

February 25, 2013

Phil Agcaoili, Founding Member, Cloud Security Alliance

David Barton, Principal, UHY Advisors

 

In June 2011, the American Institute of Certified Public Accountants (AICPA) eliminated SAS 70, which had been a commonly used reporting standard within the information technology industry for providing third-party audits of controls. At that time, the AICPA introduced three Service Organization Control (SOC) reporting options intended to replace SAS 70: SOC 1 (and the associated SSAE 16 guidance), SOC 2, and SOC 3.

The new AICPA reporting framework was created to eliminate confusion in the marketplace for service organizations (including cloud providers) wishing to provide their customers with third-party assurance on their controls. Part of this confusion stems from buyers’ lack of knowledge regarding the purpose of each type of available report and its intended use.

The Cloud Security Alliance (CSA) has drafted the following position paper as a means to educate its members and provide guidance on selecting the most appropriate reporting standard.

After careful consideration of alternatives, the Cloud Security Alliance has determined that for most cloud providers, a SOC 2 Type 2 attestation examination conducted in accordance with AICPA standard AT Section 101 (AT 101) utilizing the CSA Cloud Controls Matrix (CCM) as additional suitable criteria is likely to meet the assurance and reporting needs of the majority of users of cloud services.

We’d like to thank the following people for their time and energy over the past year to help develop, define, harden, and deliver this guidance that we know will help our industry—Chris Halterman (E&Y), David Barton (UHY Advisors), Jon Long (CompliancePoint), Dan Schroeder (Habif, Arogeti & Wynne), Ryan Buckner (BrightLine), Beth Ross (E&Y), Jim Reavis (Cloud Security Alliance), Daniele Catteddu (Cloud Security Alliance), Audrey Katcher (Rubin Brown), Erin Mackler (AICPA), Janis Parthun (AICPA), and Phil Agcaoili (Cox Communications).

 

When Good Is Not Good Enough: NIST Raises the Bar for Cloud Data Protection Vendors

February 21, 2013

Earlier this year, the National Institute of Standards and Technology (NIST) released a publication titled Cloud Computing Synopsis & Recommendations (Special Publication 800-146) describing in detail the current cloud computing environment, explaining the economic opportunities and risks associated with cloud adoption, and openly addressing the security and data privacy challenges. NIST makes numerous recommendations for companies or agencies considering the move to the cloud (including delivering a strong case for uniform management practices in the data security and governance arenas).

 

The report highlights several reasons why cloud-based SaaS applications present heightened security risks. As a means to offset the threats, NIST’s recommendation on cloud encryption is clear-cut: organizations should require FIPS 140-2 compliant encryption to protect their sensitive data assets. This should apply to stored data as well as application data, and for Federal agencies, it’s a firm requirement, not simply a best practice or recommended guideline.

 

What does FIPS 140-2 validation mean? An encryption vendor whose cryptographic module attains this validation attests that its solution:

 

  • Uses an approved algorithm,
  • Handles the encryption keys appropriately, and
  • Always handles the data to be encrypted in a certain way, in a certain block size, with a certain amount of padding, and with some amount of randomness so the ciphertext can’t be searched.

 

Compare this to another level of validation, FIPS 197. FIPS 197 is an algorithmic standard that addresses the Advanced Encryption Standard (AES). As a standard used worldwide, AES is approved by the U.S. government to satisfy only one of the conditions listed above, condition (1), “Uses an approved algorithm.” However, an encryption solution that only incorporates the validated algorithms of FIPS 197 does not meet security requirements (2) and (3) above, and hence is insufficient for FIPS 140-2 certification (minimizing its usefulness for those looking to use strong encryption).

 

Why is validation important? Well, it is a big deal for security professionals entrusted with deploying systems to protect sensitive data. These differing standards leave the door open for confusion amid various market claims. Some solution vendors say, “We can do AES encryption, so our encryption is good,” or “We use military-grade encryption.” The reality is that if a solution is not FIPS 140-2 validated, calling it military grade is clearly misleading.

 

One of the hottest areas for encryption technology is the cloud – specifically, encrypting sensitive data stored and processed within SaaS or PaaS applications such as Oracle CRM On Demand or Salesforce.com. When you strongly encrypt data, for example using a FIPS 140-2 validated algorithm, it can “break” the end user’s experience with an application. What happens when you try to search on a field like LAST NAME if all of the values, such as “Smith,” stored in the LAST NAME field have been encrypted? Well, your search will come back empty (and you’d be a pretty frustrated user).
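
To make the problem concrete, here is a minimal Python sketch (my own illustration, not from the article) of why that search comes back empty. It assumes the third-party “cryptography” package and uses AES-GCM, a FIPS-approved authenticated mode: every value is encrypted with a fresh random nonce, so two encryptions of “Smith” do not even match each other, let alone an encrypted query.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)
    aesgcm = AESGCM(key)

    def encrypt_field(value):
        # Fresh random nonce per value: required for AES-GCM, and the reason
        # two encryptions of the same plaintext never look alike.
        nonce = os.urandom(12)
        return nonce + aesgcm.encrypt(nonce, value.encode(), None)

    # The cloud application only ever sees ciphertext in the LAST NAME column.
    stored = [encrypt_field("Smith"), encrypt_field("Smith"), encrypt_field("Jones")]

    print(stored[0] == stored[1])             # False: the ciphertexts don't even match each other
    query = encrypt_field("Smith")
    print([c for c in stored if c == query])  # []: the naive "search" finds nothing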

 

A new class of products, which Gartner calls Cloud Encryption Gateways, has emerged to tackle this challenge. These solutions encrypt sensitive data before it leaves an organization’s firewall so that only an encrypted value goes into the cloud for processing and storage. And they also promise to “preserve functionality,” so you can still pull up a last name like SMITH on a search of SMI* even though the last names put in the cloud have been encrypted. Cool, right?

 

But you have to be careful, as some vendors do this “magic” by modifying the encryption algorithms to ensure that a few characters always line up the same way in order to preserve the functionality I described (common operations like searching, sorting, etc.). This approach utilizes a weakened form of encryption that is certainly not FIPS 140-2 encryption. From a certification standpoint it doesn’t have any strength behind it; it just has a certification that says “If you run these strings through a certain way, you will get a result that looks like this” (FIPS 197).
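
To see why that tradeoff matters, here is a deliberately simplified caricature (my own, not any vendor’s actual algorithm) in Python: a fixed, deterministic character-for-character substitution. Because equal characters always encrypt the same way, a prefix search such as SMI* still works over the ciphertext, but the very property that preserves searchability also preserves the repetition and letter frequencies a cryptanalyst needs.

    import random

    random.seed(42)                              # a fixed "key": the same mapping every run
    alphabet = [chr(c) for c in range(32, 127)]  # printable ASCII
    substitute = alphabet[:]
    random.shuffle(substitute)
    MAPPING = dict(zip(alphabet, substitute))

    def weak_encrypt(value):
        # Deterministic, character-by-character substitution: equal characters
        # always "line up the same way" in the ciphertext.
        return "".join(MAPPING[ch] for ch in value)

    names = ["SMITH", "SMYTHE", "JONES"]
    ciphertexts = [weak_encrypt(n) for n in names]

    # Prefix search still works over the ciphertext, which is the selling point...
    prefix = weak_encrypt("SMI")
    print([n for n, c in zip(names, ciphertexts) if c.startswith(prefix)])  # ['SMITH']

    # ...but repeated values and letter frequencies survive intact, which is
    # exactly what cryptanalysis feeds on. Nowhere near FIPS 140-2 strength.
    print(weak_encrypt("SMITH") == ciphertexts[0])                          # True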

 

It is important to remember that an implementation of AES without FIPS 140-2 validation is treated by the U.S. Federal government as clear text. Why? When you water down an encryption algorithm (as in the example above), you open up the encryption engine to cryptanalysis, which creates a much easier path to cracking the data. This, by definition, puts sensitive data at risk. Solutions using these weakened algorithms force enterprises to wrestle with a difficult tradeoff between meeting requirements for data privacy and protection and preserving the overall usability of their applications. This is a no-win scenario.

 

The good news is that there are some innovative approaches out there that do not rely on this sort of methodology. So my advice is to do your homework, ask the hard questions of your suppliers, and make sure your information is protected by the strongest techniques possible. Enterprises can find solutions that will keep all of their interested parties satisfied:

 

  • Privacy & Security Professionals: Can use industry-acknowledged strong encryption techniques, such as FIPS 140-2 validated encryption, or tokenization
  • Business End-Users: Can get all of the SaaS or PaaS application functionality they need – security does not “break” the application’s usability
  • IT Professionals: Can deploy a standards-based platform that meets security and business needs and scales to support multiple clouds

 

And an alternative technique, called tokenization, also deserves a mention. Tokenization, sort of a “first cousin” of encryption, is a process by which a data field, such as a primary account number (PAN) from a credit or debit card, is replaced with a surrogate value called a token. De-tokenization is the reverse process of redeeming a token for its associated original value.

 

While there are various approaches to creating tokens, they are typically just randomly generated values that have no mathematical relationship to the original data field. Herein lies the inherent security of the approach: it is nearly impossible to determine the original value of the sensitive data field from the surrogate token value alone. So even if a criminal got access to the token in the cloud, there is no “key” that could ever decipher it. The true data value never leaves the safety of the token vault, stored securely behind the organization’s firewall.
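
Here is a minimal sketch of the idea in Python (assumptions and names are mine, not any particular product): tokens are drawn at random, and the only way back to the original value is a lookup in the token vault, which stays on premises.

    import secrets

    class TokenVault:
        """Token <-> value mapping that never leaves the organization's firewall."""
        def __init__(self):
            self._token_to_value = {}

        def tokenize(self, value):
            # A token is just a random surrogate: no key, and no mathematical
            # relationship to the original data field.
            token = secrets.token_hex(8)
            self._token_to_value[token] = value
            return token

        def detokenize(self, token):
            # Redeeming a token only works with access to the vault itself.
            return self._token_to_value[token]

    vault = TokenVault()
    token = vault.tokenize("4111 1111 1111 1111")  # the PAN stays on premises
    print(token)                                   # only this random value goes to the cloud
    print(vault.detokenize(token))                 # the original is recovered via the vault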

 

Tokenization as an obfuscation technique has proven especially useful for organizations in some geographic jurisdictions with legal requirements specifying that sensitive data physically reside within country borders at all times. Privacy and security professionals find that tokenization technology provides a workable solution in these instances and overcomes the strict data residency rules enforced in many countries.

 

So whether it is industry acknowledged strong encryption or tokenization, make sure you choose robust, strong and validated techniques that will allow you to meet your security objectives. Never lose sight of the primary goal of adopting a security solution and avoid the temptation to sacrifice security strength for usability benefits. In the end, it truly is not worth the risk.

 

David Stott is senior director, product management, at PerspecSys where he leads efforts to ensure products and services meet market requirements.

 

Critical Infrastructure and the Cloud

February 1, 2013

Cloud computing continues to be a hot topic. But so what if people are talking about it; who is actually adopting it? One of the questions I have been asking myself is: will cloud be adopted for critical infrastructure, and what is the security perspective on this?

Naturally, a blog post will never really do the topic justice. But it is a crucial issue. I wrote about critical cloud computing a year ago on my blog (http://blogs.mcafee.com/enterprise/the-audacity-of-cloud-for-critical-infrastructure), and over the past years I have worked on these issues, for example with the European Network and Information Security Agency (ENISA), which has published the white paper Critical Cloud Computing: A CIIP Perspective on cloud computing services.

The ENISA paper focuses on large cyber disruptions and large cyber attacks (as addressed in the EU’s Critical Information Infrastructure Protection (CIIP) plan, for example) and looks at the relevant underlying threats, such as natural disasters, power network outages, software bugs, resource exhaustion due to overload, and cyber attacks. It underlines the strengths of cloud computing when it comes to dealing with natural disasters, regional power cuts, and DDoS attacks. At the same time, it highlights that the impact of cyber attacks could be very large because of the concentration of resources. Every day people discover exploits in widely used software (this week UPnP, last month Ruby on Rails, and so on). What would be the impact if there were a software exploit for a cloud platform used widely across the globe?

As an expert on the ENISA Cloud Security and Resilience Working Group, I see this white paper as the starting point for discussions about the big cloud computing risks from a CIIP perspective. Revisiting the risk assessments we worked on in the past is important, mainly because the use of cloud computing is now so different and because cloud computing is being adopted in critical sectors like finance, energy, transport and even governmental services.

A discussion about the CIIP perspective on cloud computing becomes all the more relevant in light of the EU’s Cyber Security Strategy, which will focus on critical sectors and on preventing large-scale cyber attacks and disruptions. The strategy will be revealed by the European Commission in February, and it will be interesting to see what role cloud computing plays in it.

The report is available on the ENISA website at https://resilience.enisa.europa.eu/cloud-security-and-resilience/cloud-computing-benefits-risks-and-recommendations-for-information-security/view

There is no doubt that internet connections and cloud computing are becoming the backbone of our society. Their adoption within critical infrastructure sectors means that resilience and security become even more imperative for all of us.

By Raj Samani, EMEA Strategic Advisor, CSA, and EMEA CTO, McAfee

Twitter: @Raj_Samani
