Five Distinguished Security Experts to Keynote SecureCloud 2014

January 22, 2014

SecureCloud 2014 is just around the corner and the CSA is pleased to announce the keynote speaker lineup for this must-attend event, which is taking place in Amsterdam on April 1-2.

Secure Cloud Speakers

This year’s event will feature keynote addresses from the following five security experts on a wide range of cloud security topics:

  • Prof. Dr. Udo Helmbrecht, executive director of the European Network and Information Security Agency (ENISA), will speak on the uptake of cloud computing in Europe and how ENISA supports cloud security in the Member States.
  • Prof. Dr. Reinhard Posch, CIO for the Austrian Federal Government, will present on the European Cloud Partnership and the Austrian government's approach to the cloud.
  • Alan Boehme, Chief of Enterprise Architecture for The Coca-Cola Company, will present on the CSA Software Defined Perimeter initiative.
  • Jim Reavis, CEO of the Cloud Security Alliance, will discuss trends and innovation in cloud security and CSA activities in 2014.
  • Richard Mogull, CEO of Securosis, will give the closing keynote on Automation & DevOps.

If you haven’t already registered, early bird discount pricing is being offered through February 14. Registration information can be found at:

https://cloudsecurityalliance.org/events/securecloud2014/#_reg

We look forward to seeing all of you in Amsterdam in the spring!

The Dark Side of Big Data: CSA Opens Peer Review Period for the “Top Ten Big Data and Privacy Challenges” Report

February 25, 2013

Big Data seems to be on the lips of every organization’s CXO these days. By exploiting Big Data, enterprises are able to gain valuable new insights into customer behavior via advanced analytics. However, what often gets lost amidst all the excitement are the many very real security and privacy issues that go hand in hand with Big Data. Traditional security mechanisms were simply never designed to deal with the reality of Big Data, which often relies on distributed, large-scale cloud infrastructures, a diversity of data sources, and the high volume and frequency of data migration between different cloud environments.

To address these challenges, the CSA Big Data Working Group released an initial report, “Top 10 Big Data Security and Privacy Challenges,” at CSA Congress 2012. It was the first such industry report to take a holistic view of the wide variety of big data challenges facing enterprises. Since then, the group has been working to further its research, assembling detailed information and use cases for each threat. The result is the first Top 10 Big Data and Privacy Challenges report and, beginning today, the report is open for peer review, during which CSA members are invited to review and comment on it prior to its final release. The 35-page report outlines the unique challenges presented by Big Data through narrative use cases and identifies the dimension of difficulty for each challenge.

The Top 10 Big Data and Privacy Challenges have been enumerated as follows:

  1. Secure computations in distributed programming frameworks
  2. Security best practices for non-relational data stores
  3. Secure data storage and transaction logs
  4. End-point input validation/filtering
  5. Real-time security monitoring
  6. Scalable and composable privacy-preserving data mining and analytics
  7. Cryptographically enforced data-centric security
  8. Granular access control
  9. Granular audits
  10. Data provenance

The goal of outlining these challenges is to raise awareness among security practitioners and researchers so that industry-wide best practices might be adopted to address these issues as they continue to evolve. The open review period ends March 18, 2013. To review the report and provide comments, please visit https://interact.cloudsecurityalliance.org/index.php/bigdata/top_ten_big_data_2013.

Tweet this: The Dark Side of Big Data: CSA Releases Top 10 Big Data and Privacy Challenges Report. http://bit.ly/VHmk0d

CSA Releases CCM v3.0

February 25, 2013

The Cloud Security Alliance (CSA) today released a draft of the latest version of the Cloud Controls Matrix, CCM v3.0. This latest revision to the industry standard for cloud computing security controls realigns the CCM control domains to achieve tighter integration with the CSA’s “Security Guidance for Critical Areas of Focus in Cloud Computing version 3” and introduces three new control domains. Beginning February 25, 2013, the draft version of CCM v3.0 will be available for peer review through the CSA Interact website, with the peer review period closing March 27, 2013, and final release of CCM v3.0 on April 1, 2013.

The three new control domains, “Mobile Security”, “Supply Chain Management, Transparency and Accountability”, and “Interoperability & Portability”, address the rapidly expanding ways in which cloud data is accessed, the need to ensure due care is taken across the cloud provider’s supply chain, and the minimization of service disruptions in the face of a change to a cloud provider relationship.

The “Mobile Security” controls are built upon the CSA’s “Security Guidance for Critical Areas of Mobile Computing, v1.0” and are the first mobile-device-specific controls incorporated into the Cloud Controls Matrix.

The “Supply Chain Management, Transparency and Accountability” control domain seeks to address risks associated with governing data within the cloud, while the “Interoperability & Portability” domain brings to the forefront considerations for minimizing service disruptions in the face of a change in a cloud vendor relationship or an expansion of services.

The realigned control domains have also benefited from changes in language that improve the clarity and intent of each control; in some cases, controls have been moved within the expanded domains to ensure cohesiveness within each control domain and to minimize overlap.

The draft of the Cloud Control Matrix can be downloaded from the Cloud Security Alliance website and the CSA welcomes peer review through the CSA Interact website.

The CSA invites all interested parties to participate in the peer review and in the CSA Cloud Controls Matrix Working Group meeting to be held during the week of the RSA Conference, at 4 p.m. PT on February 28, 2013, at the Sir Francis Drake Hotel, Franciscan Room, 450 Powell St., San Francisco, CA.

Towards a “Permanent Certified Cloud”: Monitoring Compliance in the Cloud with CTP 3.0

January 29, 2013

Cloud services can be monitored for system performance but can they also be monitored for compliance? That’s one of the main questions that the Cloud Trust Protocol aims to address in 2013.

Compliance and transparency go hand in hand.

The Cloud Trust Protocol (CTP) is designed to allow cloud customers to query cloud providers in real time about the security level of their service. This is measured by evaluating “security attributes” such as availability, elasticity, confidentiality, location of processing, or incident management performance, to name a few examples. To achieve this, CTP will provide two complementary features:

  • First, CTP can be used to automatically retrieve information about the security offering of cloud providers, as typically represented by an SLA.
  • Second, CTP is designed as a mechanism to report the current level of security actually measured in the cloud, enabling customers to be alerted about specific security events.

These features will help cloud customers compare competing cloud offerings to discover which ones provide the level of security, transparency and monitoring capabilities that best match the control objectives supporting their compliance requirements. Additionally, once a cloud service has been selected, the cloud customer will also be able to compare what the cloud provider offered with what was later actually delivered.

For example, a cloud customer might decide to implement a control objective related to incident management through a procedure that requires certain security events to be reported back to a specific team within a well-defined time frame. This customer could then use CTP to ask for the maximum delay the cloud provider commits to when reporting incidents to customers during business hours. The same cloud customer may also ask for the percentage of incidents that were actually reported back to customers within that specific time limit during the preceding two-month period. The first example is typical of an SLA, while the second one describes the real measured value of a security attribute.
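Since the CTP 3.0 data model and API are still in draft, the following is only a rough, hypothetical sketch of what such a pair of queries might look like over a RESTful interface; the endpoint paths, attribute names and JSON fields are invented for illustration and do not reflect the actual specification.

```python
import requests

# Hypothetical CTP-style endpoint exposed by a cloud provider (illustrative only).
BASE_URL = "https://provider.example.com/ctp/v1"

# 1. Query the SLA commitment: maximum incident-reporting delay during business hours.
sla = requests.get(
    f"{BASE_URL}/attributes/incident-reporting-delay/sla",
    headers={"Accept": "application/json"},
    timeout=10,
).json()
print("Committed maximum reporting delay (minutes):", sla.get("max_delay_minutes"))

# 2. Query the measured value: percentage of incidents actually reported within
#    that limit over the preceding two months.
measured = requests.get(
    f"{BASE_URL}/attributes/incident-reporting-delay/measurements",
    params={"period": "P2M"},  # ISO 8601 duration covering the last two months
    headers={"Accept": "application/json"},
    timeout=10,
).json()
print("Incidents reported within the limit (%):", measured.get("percent_within_limit"))
```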

CTP is thus designed to promote transparency and accountability, enabling cloud customers to make informed decisions about the use of cloud services, as a complement to the other components of the GRC stack. Real-time compliance monitoring should encourage more businesses to move to the cloud by putting more control in their hands.

From CTP 2.0 to CTP 3.0

CTP 2.0 was born in 2010 as an ambitious framework designed by our partner CSC to provide a tool for cloud customers to “ask for and receive information about the elements of transparency as applied to cloud service providers”. CSA research has now undertaken the task of transforming this original framework into a practical and implementable protocol, referred to as CTP 3.0.

We are moving fast, and the first results are already ready for review. On January 15th, CSA completed a first review version of the data model and a RESTful API to support the exchange of information between cloud customers and cloud providers, in a way that is independent of any cloud deployment model (IaaS, PaaS or SaaS). This is now going through the CSA peer review process.

Additionally, a preliminary set of reference security attributes is also undergoing peer review. These attributes are an attempt to describe and standardize the diverse approaches taken by cloud providers to express the security features reported by CTP. For example, we have identified more than five different ways of measuring availability. Our aim is to make explicit the exact meaning of the metrics used. For example, what does unavailability really mean for a given provider? Is their system considered unavailable if a given percentage of users report complete loss of service? Is it considered unavailable according to the results of some automated test to determine system health?
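To make the ambiguity concrete, here is a small, hypothetical sketch of one possible definition of availability, namely the fraction of automated health-check probes that succeed over a reporting window; a provider could legitimately define the same attribute quite differently, which is exactly why CTP tries to make the metric explicit.

```python
from dataclasses import dataclass

@dataclass
class HealthSample:
    timestamp: float   # seconds since epoch
    healthy: bool      # result of an automated end-to-end health check

def availability(samples: list[HealthSample]) -> float:
    """Availability as the fraction of successful health checks in the window.

    This is only one of several plausible definitions; a provider could instead
    count user-reported outages, or weight samples by affected capacity.
    """
    if not samples:
        return 0.0
    return sum(s.healthy for s in samples) / len(samples)

# Example: 1440 one-minute probes, 12 of which failed -> roughly 99.17% availability.
window = [HealthSample(timestamp=60.0 * i, healthy=(i % 120 != 0)) for i in range(1, 1441)]
print(f"Measured availability: {availability(window):.2%}")
```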

Alongside all this nice theory, we are also planning to get our hands dirty and build a working prototype implementation of CTP 3.0 in the second half of 2013.

Challenges and research initiatives

While CTP 3.0 may offer a novel approach to compliance and accountability in the cloud, it also creates interesting challenges.

To start with, providing metrics for some security attributes or control measures can be tricky. For example, evaluating the quality of vulnerability assessments performed on an information system is not trivial if we want results to be comparable across cloud providers. Other examples are data location and retention, which are equally complex to monitor because of the difficulty of providing supporting evidence.

As a continuous monitoring tool, CTP 3.0 is a nice complement to traditional audit and certification mechanisms, which typically only assess compliance at a specific point in time. In theory, this combination brings up the exciting possibility of a “permanently certified cloud”, where a certification could be extended over time through automated monitoring. In practice, however, making this approach “bullet-proof” requires a strong level of trust in the monitoring infrastructure.

As an opportunity to investigate these points and several other related questions, CSA has recently joined two ambitious European Research projects: A4Cloud and CUMULUS. A4Cloud will produce an accountability framework for the entire cloud supply chain, by combining risk analysis, creative policy enforcement mechanisms and monitoring. CUMULUS aims to provide novel cloud certification tools by combining hybrid, incremental and multi-layer security certification mechanisms, relying on service testing, monitoring data and trusted computing proofs.

We hope to bring back plenty of new ideas for CTP!

Help us make compliance monitoring a reality!

A first draft of the “CTP 3.0 Data Model and API” is currently undergoing expert review and will then be opened to public review. If you would like to provide your expert feedback, please do get in touch!

by Alain Pannetrat 

apannetrat@cloudsecurityalliance.org

Dr. Alain Pannetrat is a Senior Researcher at Cloud Security Alliance EMEA. He works on CSA’s Cloud Trust Protocol, providing monitoring mechanisms for cloud services, as well as on CSA research contributions to EU-funded projects such as A4Cloud and CUMULUS. He is a security and privacy expert, specialized in cryptography and cloud computing. He previously worked as an IT Specialist for the CNIL, the French data protection authority, and was an active member of the Technology Subgroup of the Article 29 Working Party, which informs European policy on data protection. He received a PhD in Computer Science after conducting research at Institut Eurecom on novel cryptographic protocols for IP multicast security.


EMEA Congress Recap

October 24, 2012

The inaugural EMEA Congress in Amsterdam was an unqualified success, with hundreds of security visionaries in attendance and presentations from some of the leading voices across the cloud security landscape. What follows is just a sample of the discussions and some of the key takeaways from the two-day event:

EMEA Congress Presenters

  • Monica Josi, Microsoft’s Chief Security Adviser EMEA, presented on Microsoft’s compliance strategy, emphasizing the importance of a common mapping strategy to define compliance standards. Microsoft has mapped over 600 controls and 1,500 audit obligations onto the ISO 27001 framework and is using CSA’s CCM and ISO 27001 to certify its Dynamics CRM, Azure and Office 365 platforms. Microsoft has also published all relevant documentation on the CSA’s STAR repository.
  • Chad Woolf, Global Risk and Compliance Leader for Amazon Web Services, highlighted the difference between security IN the cloud and security OF the cloud. According to Chad, security IN the cloud presents the greater risk; he also discussed some of the different assurance mechanisms provided by AWS.
  • Data security and privacy expert Stewart Room provided an update on some of the more pressing legal issues facing cloud security, including a plea for more realistic legislation (e.g., the subcontractor recommendations of the Art. 29 WP).
  • Mark O’Neill, CTO for Vordel, gave an update on IDM standards, including OAuth 2.0 and OpenID Connect, and how they fit into the cloud ecosystem. OAuth 2.0 is now a stable standard which can be used to grant granular, revocable access control; it is lighter than SAML and therefore more suitable for mobile/REST scenarios (see the sketch after this list).
  • Phil Dunkelberger made an impassioned call to arms for the industry to create a standard authentication protocol which would allow for the integration of appropriate authentication mechanisms into diverse services.
  • Jean-François Audenard, Cloud Security Advisor for Orange Business Services, presented their Secure Development Lifecycle, which covers security and legal obligations, mitigation plans, security reviews and ongoing operational security, as well as the roles of their security advisors, architects and managers in the lifecycle.
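As a rough illustration of the OAuth 2.0 point above, here is a minimal sketch of the client credentials grant followed by a bearer-token API call; the token endpoint, client identifiers and resource URL are hypothetical placeholders rather than any specific vendor’s API.

```python
import requests

# Hypothetical endpoints and credentials (illustrative only).
TOKEN_URL = "https://auth.example.com/oauth2/token"
API_URL = "https://api.example.com/v1/reports"

# 1. Client credentials grant: exchange a client ID/secret for a short-lived access token.
token_response = requests.post(
    TOKEN_URL,
    data={"grant_type": "client_credentials", "scope": "reports:read"},
    auth=("my-client-id", "my-client-secret"),
    timeout=10,
)
access_token = token_response.json()["access_token"]

# 2. Call the protected resource with the bearer token. Access can later be revoked
#    at the authorization server without changing the client's own credentials.
reports = requests.get(
    API_URL,
    headers={"Authorization": f"Bearer {access_token}"},
    timeout=10,
)
print(reports.status_code, reports.json())
```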

Panel Discussion Takeaways:

  • While Gartner has some 26 definitions for Cloud, according to Bruce Schneier it can be boiled down to the fact that it’s simply your data on somebody else’s hard disk that you access over the Internet!
  • Cloud provider specialization and reputation means better security in many respects. As to the question of what can be more difficult in the cloud, forensics is a major issue (e.g., ‘freezing the crime scene’, confiscation of hardware, etc)
  • As a customer, there is a lot you can and should do to monitor the cloud service provider (either independently and/or via executive dashboards). This also allows you to establish trust in smaller companies with less history.
  • Internal IT teams are not redundant. There are lots of security-related tasks that still need to be taken care of; this is especially true with IaaS providers (e.g., credential management). The cloud provides opportunities for many of these individuals to perform higher-value tasks (e.g., security training of staff, service monitoring, etc.).
  • Business is consuming technology quicker than IT can provide it; as a result, more internal business users are utilising external third-party and cloud vendors to process their information. For example, MARS Information Services is using a modified version of ISO 27001 (“ISO++”) and the CSA’s CCM to risk-assess their third-party vendors. As engagements move from IaaS to PaaS and SaaS, the level of risk increases because more of the controls are handed to the service provider.
  • Historically, organizations have been largely concerned with securing the network, not the information that resides on it. We now need to protect information based on the risk associated with the compromise of that data. As such, a risk-based approach to security requires data to be classified, at least at a high level.
  • Once data has migrated to the Cloud, access and authentication become key. Authentication is currently taken for granted (passport, room key, ID badge, airline ticket, cards), except online, where credentials are often re-used. If they are compromised, all systems using those credentials are vulnerable.
  • As data moves to the Cloud, there will be situations that require the data to be recovered in a forensically sound way. The use of multi-tenant environments across multiple jurisdictions introduces numerous e-disclosure and chain-of-custody challenges that are yet to be solved.


“Great conference with a number of speakers that really provided up to date, timely and in-depth information” - Peter Demmink, Merck / MSD

“The CSA delivered an excellent intro to all the aspects of cloud security and compliance” - Albert Brouwer, AEGON


Will the Cloud Cause the Reemergence of Security Silos?

January 19, 2011

by: Matthew Gardiner

Generally in the world, silos relate to things that are beneficial, such as silos for grain or corn. However, in the world of IT security, silos are very bad. In many forensic investigations, application silos turn up as a key culprit that enabled data leakage of one sort or another. It is not that any one application silo is inherently a problem – one can repair and manage a single silo much as a farmer would do – it is the existence of many silos, and silos of so many types, that is the core problem. Farmers generally don’t use thousands of grain silos to handle their harvest; they have a handful of large, sophisticated, and centralized ones.

The same approach has proven highly effective in the world of application security, particularly since the emergence of the Web and its explosion of applications and users. Managing security as a centralized service and applying it across large swaths of an organization’s infrastructure and applications is clearly a best practice. However, with the emergence of the Cloud as the hot application development and deployment platform going forward, organizations are at significant risk of returning to the bad old days of security silos. When speed overruns architecture, say hello to security silos and the weaknesses that they bring.

What do I mean by security silos? I think of silos as application “architectures” which cause security (as well as IT management in general) to be conducted in bits and pieces, uniquely within each specific platform or system. Applications are built this way because it feels faster in the short term; after all, the project needs to get done. But after this approach is executed multiple times, the organization is left with many inconsistent, custom, and diverse implementations and related security systems. These systems are inevitably complex to operate and expensive to maintain, as well as easy to breach on purpose or by accident.

Perhaps this time it is different? Perhaps IT complexity will magically decline with the Cloud? Do you really think that the move to the Cloud is going to make the enterprise IT environment homogeneous and thus inherently easier to manage and secure? Not a chance. In fact, just the opposite is most likely. How many organizations will move all of their applications and data to public clouds? And, for that matter, to a single public cloud provider? Very few. Given this, it is imperative that security architects put in place security systems that are designed to operate in a highly heterogeneous, hybrid (mixed public cloud and on-premise) world. The cloud-connected world is one where applications and data will one day be inside the organization on a traditional platform, the next day hosted within the organization’s private cloud, the next day migrated to live within a public cloud service, and then back again, based on what is best for the organization at that time.

Are security silos inevitable with the move to the Cloud?  In the short term, unfortunately, probably yes.  With every new IT architecture the security approach has to do some catch-up.  It is the security professionals’ job to make this catch-up period as short as possible.

How should we shorten the catch-up period?

  • First, update your knowledge base around the Cloud and security. There are a lot of good sources out there; one in particular that I like is from the Cloud Security Alliance (CSA), Security Guidance for Critical Areas of Focus in Cloud Computing.
  • Second, rethink your existing people, processes, and technology (sorry for the classic IT management cliché) in terms of the cloud. You will find the control objectives don’t change, but how you accomplish them will.
  • Third, start making the necessary investments to prepare your organization for the transition to the cloud that is likely already underway.

While there are many areas covered in the above CSA document, let me focus on one area that in particular highlights some cloud specific security challenges, specifically around Identity and Access Management.

The CSA document says it well, “While an enterprise may be able to leverage several Cloud Computing services without a good identity and access management strategy, in the long run extending an organization’s identity services into the cloud is a necessary precursor towards strategic use of on-demand computing services.” Issues such as user provisioning, authentication, session management, and authorization are not new issues to security professionals. However, accomplishing them in the context of the cloud requires that the identity management systems that are on-premise in the enterprise automatically “dance” with the equivalent systems at the various cloud service providers. This dance is best choreographed through the use of standards, such as SAML, XACML, and others. In fact, the rise of the cloud also raises the possibility of outsourcing even some of your identity management services, such as multi-factor authentication, access management, and other capabilities, to specialized cloud security providers.
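As a rough sketch of what that “dance” can look like in practice, the snippet below builds an SP-initiated SAML 2.0 AuthnRequest using the HTTP-Redirect binding; the identity provider and service provider URLs are hypothetical, and a production deployment would sign the request and rely on a maintained SAML library rather than hand-rolled XML.

```python
import base64
import uuid
import zlib
from datetime import datetime, timezone
from urllib.parse import urlencode

# Hypothetical identity provider and service provider endpoints (illustrative only).
IDP_SSO_URL = "https://idp.example.com/sso/redirect"
SP_ACS_URL = "https://app.cloudprovider.example.com/saml/acs"
SP_ENTITY_ID = "https://app.cloudprovider.example.com"

def build_sso_redirect_url() -> str:
    """Build an SP-initiated SAML 2.0 AuthnRequest for the HTTP-Redirect binding.

    The on-premise identity provider authenticates the user and posts an assertion
    back to the cloud service's assertion consumer service (ACS) URL.
    """
    authn_request = f"""
    <samlp:AuthnRequest xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol"
        ID="_{uuid.uuid4().hex}"
        Version="2.0"
        IssueInstant="{datetime.now(timezone.utc).isoformat()}"
        AssertionConsumerServiceURL="{SP_ACS_URL}">
      <saml:Issuer xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">{SP_ENTITY_ID}</saml:Issuer>
    </samlp:AuthnRequest>
    """.strip()
    # HTTP-Redirect binding: raw DEFLATE, then base64, then URL-encode as SAMLRequest.
    deflated = zlib.compress(authn_request.encode())[2:-4]  # strip zlib header/checksum
    saml_request = base64.b64encode(deflated).decode()
    return f"{IDP_SSO_URL}?{urlencode({'SAMLRequest': saml_request})}"

print(build_sso_redirect_url())
```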

While in the short term it would seem that the emergence of some security silos is inevitable with organizations’ aggressive move to the cloud, it doesn’t have to be this way forever. We know security silos are bad, we know how to avoid them, and we have much of the necessary technology already available to eliminate them. All that remains is to take action.

Matthew Gardiner is a Director working in the Security business unit at CA Technologies. He is a recognized industry leader in the security & Identity and Access Management (IAM) markets worldwide. He is published, blogs, and is interviewed regularly in leading industry media on a wide range of IAM, cloud security, and other security-related topics. He is a member of the Kantara Initiative Board of Trustees. Matthew has a BSEE from the University of Pennsylvania and an SM in Management from MIT’s Sloan School of Management.  He blogs regularly at: http://community.ca.com/members/Matthew-Gardiner.aspx and also tweets @jmatthewg1234.  More information about CA Technologies can be found at www.ca.com.

Multi-tenancy and bad landlords

June 13, 2010

So there’s been a lot of discussion about multi-tenancy recently and what it means for cloud providers and users. To put it simply: multi-tenancy is highly desirable to providers because they can offer a service or a platform (such as WordPress) and cram a kajillion users into it without having to constantly customize it, modify it or otherwise do much work to sell it individually. The reality is that whether or not users like multi-tenancy, the providers love it, so it’s here to stay.

So what happens when you have a bad, or just unlucky landlord? In the last few months WordPress.com has had a number of outages:

What Happened: We are still gathering details, but it appears an unscheduled change to a core router by one of our datacenter providers messed up our network in a way we haven’t experienced before, and broke the site. It also broke all the mechanisms for failover between our locations in San Antonio and Chicago. All of your data was safe and secure, we just couldn’t serve it.
http://en.blog.wordpress.com/2010/02/19/wp-com-downtime-summary/

And more recently:

If you tried to access TechCrunch any time in the last hour or so, you probably noticed that it wasn’t working at all. Instead, you were greeted by the overly cheery notice “WordPress.com will be back in a minute!” Had we written that message ourselves, there would have been significantly more profanity.

http://techcrunch.com/2010/06/10/wordpress-gives-us-the-vip-treatment-goes-down-on-us-again/

So what can we do to support this leg (availability) of the A-I-C triad of information security?

I don’t honestly know. It’s such a service- and provider-specific issue (do they control DNS? do you control DNS? can you redirect to another provider with the same service who has a recent copy of your data? if you do, can you then export any updates/orders/etc. back to your original provider when they come back? etc.) that pretty much any answer you get is useless unless it’s specifically tailored to that provider or service.

If you have an answer to this, please post it in the comments.

Backups and security for cloud applications

June 10, 2010

Backups, the thing we all love to hate, and hate to love. Recreating data is rarely cheap, especially if it involves detailed analysis and combination. So we back it up.

Take this blog, for example: it’s based on WordPress, which is about as standard and well supported as you can get for a blog. Backing up the entire blog isn’t that bad: just grab a copy of the database and you are mostly good to go, except for minor things like custom web pages and CSS files (a do-it-yourself sketch follows the link below). So what is one to do? Well, the obvious thought is to outsource your cloud service backups to a cloud service backup service.

http://blog.vaultpress.com/2010/03/30/announcing/
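For those who prefer the do-it-yourself route described above, here is a minimal sketch of a WordPress backup script; it assumes shell access to the host, a local mysqldump binary, and hypothetical database credentials and paths.

```python
import subprocess
import tarfile
from datetime import date

# Hypothetical credentials and paths; adjust to the actual WordPress install.
DB_NAME = "wordpress"
DB_USER = "wp_user"
WP_CONTENT_DIR = "/var/www/html/wp-content"  # themes, plugins, uploads, custom CSS
STAMP = date.today().isoformat()

# 1. Dump the database (-p prompts for the password, or use a ~/.my.cnf credentials file).
with open(f"wordpress-db-{STAMP}.sql", "wb") as dump_file:
    subprocess.run(
        ["mysqldump", "-u", DB_USER, "-p", DB_NAME],
        stdout=dump_file,
        check=True,
    )

# 2. Archive the wp-content directory to catch the "minor things" the database misses.
with tarfile.open(f"wordpress-files-{STAMP}.tar.gz", "w:gz") as archive:
    archive.add(WP_CONTENT_DIR, arcname="wp-content")

print("Backup complete; copy both files somewhere off the host.")
```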

Update: Trend Micro appears to be getting in on the secure online backup thing.

http://www.mspmentor.net/2010/06/14/trend-micro-cloud-storage-and-saas-security-converge/

Put your chauffeur on the upgrade treadmill

June 3, 2010

I don’t know if anyone here remembers “Billion Dollar Brain” by Len Deighton. One scene that stuck with me is General Midwinter making his minion (a chauffeur or bodyguard, I can’t remember which) do his time on the exercise bike for him and then asking “how many miles did we bike today?”

Wouldn’t it be great if we were all rich enough to hire someone to do the horrible chores that have to be done every day (or weekly) like exercising in order to keep our bodies fit?

This is one of the more appealing aspects of Software-as-a-Service (SaaS). In fact, this blog is a perfect example of upgrade and maintenance avoidance. Rather than hosting the blog in-house and having to maintain and upgrade WordPress every few weeks, we decided to simply outsource it to WordPress.com. Now there are some downsides; we can’t run all the plugins we’d like to (basically you get what WordPress gives you and you learn to like it), but on the upside I will never have to upgrade a WordPress plugin or WordPress itself ever again (which is a security disaster waiting to happen, as many have found out).

News roundup for May 28, 2010

May 28, 2010

Financial Services Like The Cloud, Provided It’s Private - http://www.informationweek.com/cloud-computing/blog/archives/2010/05/financial_servi.html

Novell Identity Manager extended to cloud - http://www.computerworlduk.com/technology/applications/software-service/news/index.cfm?newsid=20357

Amazon CEO Jeff Bezos: Cloud services can be as big as retail business - http://www.zdnet.com/blog/btl/amazon-ceo-jeff-bezos-cloud-services-can-be-as-big-as-retail-business/35111
