Cloud Fundamentals Video Series: Bring Your Own Device and the Cloud

March 28, 2012

Another great video out on the Trustworthy Computing site…

This latest video features Tim Rains, director, Trustworthy Computing, speaking with Jim Reavis of the CSA about the consumerization of IT and the issues that arise when employees place an organization’s data on personal devices such as smartphones. The challenges are heightened by the varied ways people share data today, such as through cloud services and social networks, and organizations are still learning how to manage the risks to their data security. Here’s the video, along with more details:

http://blogs.technet.com/b/trustworthycomputing/archive/2012/03/27/cloud-fundamentals-video-series-bring-your-own-device-and-the-cloud.aspx

Jim notes, “We’ve certainly seen an acceleration in cloud adoption. A lot of the organizations and enterprises we’re tracking are not only adopting both public and private cloud, but we’re seeing quite a change in Bring Your Own Device (BYOD).”

“The reality is, we’re going to see a lot of these highly mobile devices that may never make it inside of an enterprise actually need to be managed by cloud based services. That’s why at this RSA Conference we announced the CSA Mobile Initiative, in which we’re going to do a lot of research very similar to our original guidance to break down the different issues,” said Jim. “We’re looking to get the whole ecosystem involved.”

Secure Cloud – Myth or Reality?

March 19, 2012

Cloud security is not a myth.  It can be achieved.  The biggest hindrance to debunking this myth is getting enterprise businesses to begin thinking about the Cloud differently.  It is not the same as co-located dedicated servers or on-premises technology: it is changeable, flexible and transforming every day at the speed of light.  With these changes come better security and technology to protect ‘big data’.  The other issue to take into consideration is the human factor: there will always be people involved in building clouds and managing them, and there will always be people who want to attack them.  Enterprises need to weigh both of these key factors when choosing their Cloud with security in mind.

First, technology and layers of security. “It’s more about giving up control of our assets and data (and not controlling the associated risk) than any technology specific to the cloud.” This quote is from the ‘2011 Data Breach Investigations Report’, a study conducted by the Verizon RISK Team. When a cloud environment is architected with security in mind, there is no evidence that it is any more or less secure than a dedicated environment.  In fact, regulatory compliance such as PCI-DSS 2.0 for credit card information and HIPAA for healthcare data is regularly achieved in the public cloud.  The biggest reservation of organizations resistant to moving into the Cloud seems to be the fact that a majority of the infrastructure is shared.

 

Depending on your goals, there are essentially two key ingredients for true security in the Cloud.  The first and most important is separation.  This is absolutely essential – not only should your data be segregated from that of other tenants on the infrastructure, but your network traffic, virtual machines and even security policies should be separate as well.

 

For instance, although a firewall or web application firewall may be shared, it’s imperative that policy modification does not impact anyone other than the tenant it was modified for.
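As a purely illustrative sketch (in Python, not any particular vendor’s API), this kind of tenant isolation comes down to keeping each tenant’s policy as a separate object, so a rule change for one tenant cannot touch another tenant sharing the same appliance:

    from dataclasses import dataclass, field

    @dataclass
    class TenantPolicy:
        tenant_id: str
        blocked_paths: set = field(default_factory=set)

    class SharedWaf:
        """One shared appliance, but one isolated policy object per tenant."""
        def __init__(self):
            self._policies = {}

        def policy_for(self, tenant_id: str) -> TenantPolicy:
            return self._policies.setdefault(tenant_id, TenantPolicy(tenant_id))

        def block_path(self, tenant_id: str, path: str) -> None:
            # The modification is scoped to a single tenant's policy.
            self.policy_for(tenant_id).blocked_paths.add(path)

    waf = SharedWaf()
    waf.block_path("tenant-a", "/admin")                  # tenant A tightens its rules
    assert not waf.policy_for("tenant-b").blocked_paths   # tenant B is unaffected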

 

The other key ingredient is transparency and auditability.  So you’ve decided to move to the cloud?  Great.  But how do you know you are getting what was advertised?  Simply put, you don’t.  Transparency is essential in keeping tabs on your Cloud hosting provider.  Being able to see behind the curtain shows you exactly how your environment is being protected.  Not only does it give you peace of mind, it is also required to perform regulatory compliance audits.

 

With data separated and a watchful eye kept on your resources, most organizations are, security-wise, better off moving to the Cloud.  Reducing cost by paying only for the resources you need, when you need them, is a substantial benefit, but being able to leverage a provider’s security infrastructure is even better.  Most organizations don’t have the expertise, much less the budget, to implement security measures such as high-end firewalls, DDoS mitigation, VPN with two-factor authentication, web application firewalls, IDS, IPS, patch management, anti-virus and a host of other measures.  As a result, some may actually be more secure in the cloud.

 

Secondly, defending against attacks.  Cyber attacks are created and launched by people, and they happen in many ways, some more common than others.  In April 2011, Sony PlayStation players were compromised.  It is estimated that data for 100 million players, containing names, addresses, e-mail accounts and passwords, was stolen.  Some customers were hacked over and over again, as many as 10 times per customer.  This was the result of a planned and calculated hack.  In a letter to the U.S. House Commerce Committee, the Chairman of Sony said the company had shut down the affected system while it investigated the attack and beefed up security.  This larger breach followed another: Sony believes that while it was tracking and defending against a large DDoS attack, the vulnerability was exposed, allowing the group Anonymous to infiltrate the system and cause the larger, more troubling breach.  Security updates and bug fixes must be constantly monitored and applied to all applications.

 

Last year, Citigroup was hacked by criminals who stole more than 200,000 customers’ bank account details.  Unfortunately for Citigroup, the damage was done through what was apparently a trivial insecure direct object reference vulnerability, number four on the OWASP Top Ten.  By simply manipulating the URL in the address bar, authenticated users were able to jump from account to account, as the attackers did tens of thousands of times.  This vulnerability could easily have been prevented by avoiding direct references to account numbers, and detected through secure code review, web application firewalls, or application log monitoring and review.
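As an illustrative sketch of the “avoid direct references” fix (Python; the class and account numbers below are hypothetical, not Citigroup’s code), an application can hand each authenticated session opaque tokens that resolve only to the accounts that user actually owns:

    import secrets

    class AccountReferenceMap:
        """Per-user map of opaque tokens to real account numbers."""
        def __init__(self, owned_account_numbers):
            self._map = {secrets.token_urlsafe(8): acct for acct in owned_account_numbers}

        def tokens(self):
            return list(self._map)       # safe, unguessable values to embed in URLs

        def resolve(self, token):
            try:
                return self._map[token]  # fails closed for accounts the user does not own
            except KeyError:
                raise PermissionError("account reference not owned by this user")

    refs = AccountReferenceMap(["4001-2203", "4001-9918"])
    print(refs.resolve(refs.tokens()[0]))   # resolves one of the user's own accounts
    # refs.resolve("guessed-token")         # raises PermissionError instead of leaking data

Because the URL never carries a real account number, manipulating it can at worst reference another token in the same user’s map, never someone else’s account.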

 

In conclusion, from a security perspective, there are a number of perceived obstacles to implementing a public Cloud infrastructure.  All of these may appear, at first sight, to be perfectly valid.  This is largely because many existing public Cloud environments have been built with capacity, connectivity, scalability and other core hosting attributes as the priority, with security implemented as a secondary layer.  A truly secure public Cloud is possible, but only if it is built upon a secure framework.  That way, no matter how hosting technologies change and develop, and no matter what new tactics hackers devise to exploit them, a secure foundation always underpins the entire architecture.

 

Chris Hinkley is a Senior Security Engineer at managed hosting provider FireHost where he maintains and configures network security devices, and develops policies and procedures to secure customer servers and websites. Hinkley has been with FireHost since the company’s inception. In his various roles within the organization, he’s serviced hundreds of customer servers, including Windows and Linux, and overseen the security of hosting environments to meet PCI, HIPAA and other compliance guidelines.

 

Seeing Through the Clouds: Gaining confidence when physical access to your data is removed

March 12, 2012

Cloud computing brings with it new opportunities, new frontiers, new challenges, and new chances for loss of intellectual property.  From hosting simple websites to running entire development environments, companies have been experimenting with cloud-based services for some time.  Whether a company decides to put a single application or entire datacenters in the cloud, there are different risks and threats that the business and IT need to think about, and each of these scenarios requires thorough planning to make sure whatever gets put in the cloud is protected.  When implemented properly, companies may actually find that they have improved their overall security posture.

 

When putting systems and information into your own datacenter, certain security measures have to be in place to ensure external threats are minimized.  One of the big security measures is the datacenter itself, with a security boundary allowing only authorized personnel to have direct access to the physical systems.  Within the datacenter, dedicated network connections ensure the data flows properly with little concern about unauthorized snooping.

 

These and other physical controls go away when working in a cloud environment.  Regardless of whether you choose an Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) or Software-as-a-Service (SaaS) cloud model, the physical boundary has gone from a select few authorized people to an unknown number of people who are not even part of your company.

 

Other controls inherent to locally hosted systems include firewalls, network segmentation, physical separation of systems and data, and a dizzying array of monitoring tools.  When moving to a cloud model, whether a public or private cloud, most of these controls either go away entirely or come with significant limitations.  The controls may still be there but not under your direct management; in other cases some of them are removed entirely.  The three tenets of security are confidentiality, integrity, and availability.  When our data sits in our own datacenters, we feel confident that we have a good level of control over all three of those tenets.  When we put our data in the cloud, we feel that we have lost control of all three.  This doesn’t mean that a cloud-based solution is bad; rather, it means we need to look at what we’re migrating to the cloud and make sure the three tenets are still covered.

 

Simply picking up an application or an in-house service and moving it to a cloud-based solution isn’t good enough, and will most likely leave information exposed.  You need to review:

  • How the information is secured
  • How access is authorized
  • How integrity and confidentiality are controlled

 

Addressing these questions may require new technologies, but it may also just be a matter of tighter IT processes around how systems are configured and managed.

 

One of the attractive components of moving to a cloud-based service is the ability to expand on demand.  This model allows a company to handle high-load periods and release those resources when they are no longer needed.  Implementing additional authentication and authorization controls, along with data encryption at rest and in motion, will also help increase the level of security and control a company keeps when using cloud services. Since a number of components of cloud-based services will sit outside an administrator’s control, additional security controls such as host-based and next-generation firewalls and monitoring will further enhance security while providing peace of mind.
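As a minimal sketch of that idea (Python, assuming the third-party cryptography package; the upload call is a hypothetical placeholder), data can be encrypted before it ever leaves the company’s control, so the cloud provider only stores ciphertext and the enterprise keeps the key:

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # kept by the enterprise, never sent to the provider
    cipher = Fernet(key)

    record = b"customer data that must stay confidential"
    ciphertext = cipher.encrypt(record)  # protects the data at rest in the cloud

    # upload_to_cloud("backups/record.bin", ciphertext)   # hypothetical helper; use TLS in transit

    assert cipher.decrypt(ciphertext) == record           # only the key holder can recover it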

 

The cloud has several attributes that make it attractive to businesses beyond cost savings, including:

  • The ability to have highly redundant, geographically diverse systems helps companies handle disaster scenarios and enhances the customer experience.
  • The ability to quickly add more systems helps companies handle spikes in traffic.
  • Speed of deployment can also help a company to keep a competitive edge.
  • With the appropriate security controls in place, companies can have safe, secure systems that not only rival those they could have built within their own datacenters, but offer more features and security than traditional IT deployments.

 

Expanding into the cloud requires IT staff to think differently about security. Decisions made in the past may have provided enough security for information stored within your own datacenter, but when using the cloud, security and monitoring have to be reassessed and modified to account for the changes in risk boundaries.

 

David Lingenfelter is the Information Security Officer at Fiberlink.  David is a seasoned security professional with experience in risk management, information security, compliance, and policy development. As Information Security Officer of Fiberlink, David has managed projects for SAS 70 Type 2 and SOC 2 Type 2 certifications, and led the company through the audits to become the first Mobile Device Management vendor with FISMA authorization from the GSA.  Through working with Fiberlink’s varied customer base, David has ensured the MaaS360 cloud architecture meets requirements for HIPAA, PCI, SOX, and NIST.  He has been an instrumental part in designing Fiberlink’s cloud model, and is an active member of the CSA, as well as the NIST Cloud working groups.


Fiberlink is the recognized leader in software-as-a-service (SaaS) solutions for secure enterprise mobile device and application management. Its cloud-based MaaS360 platform provides IT organizations with mobility intelligence and control over mobile devices, applications and content to enhance the mobile user experience and keep corporate data secure across smartphones, tablets and laptops. MaaS360 helps companies monitor the expanding suite of mobile operating systems, including Apple iOS, Android, BlackBerry and Windows Phone. Named by Network World as the Clear Choice Test winner for mobile device management solutions and honored with the 2012 Global Mobile Award for “Best Enterprise Mobile Service” at Mobile World Congress, MaaS360 is used to manage and secure more than one million endpoints globally. For more information, please visit http://www.maas360.com.

Lock Box: Where Should You Store Cloud Encryption Keys?

March 12, 2012

Whether driven by regulatory compliance or corporate mandates, sensitive data in the cloud needs protection along with access control. This usually involves encrypting data in transit as well as data at rest in some way, shape or form, and then managing the encryption keys that control access to the data. The new conundrum for enterprises lies in encryption key management for data in the cloud.

When considering Software-as-a-Service (SaaS) or Platform-as-a-Service (PaaS) offerings, protection for data-at-rest typically rests in the hands of the cloud service provider. Digging into the terms of service or master subscription agreement reveals the security commitments of the SaaS/PaaS provider.  For example, Salesforce.com’s Master Subscription Agreement indicates “We shall maintain appropriate administrative, physical, and technical safeguards for protection of the security, confidentiality and integrity of Your Data.” For Infrastructure-as-a-Service (IaaS), the security burden typically falls primarily on the cloud consumer to ensure protection of their data.  Encryption is a core requirement for protecting and controlling access to data-at-rest in the cloud, but the issue of who should control the encryption keys poses new questions in this context.

When weighing where to maintain encryption keys, enterprises should consider issues including security of the key management infrastructure (a compromised key can mean compromised data), separation of duties (for example, imposing access controls so administrators can backup files but not view sensitive data), availability (if a key is lost, data is cryptographically shredded), and legal issues (if keys are in the cloud, law enforcement could request and obtain encrypted data along with the keys without the enterprise’s consent).

There are a variety of ways to protect data-at-rest in the public cloud, such as tokenization or data anonymization. The most commonly used approach is encrypting the data at rest. Whether encrypting a mounted storage volume, a file, or using native database encryption (sometimes referred to as “Transparent Data Encryption”, or TDE), all of these operations involve an encryption key. Where should that encryption key be stored and managed?  There are three primary options (with many variations on each).

Keys in Enterprise Datacenter:  Holding the keys in the datacenter offers maximum security and availability.  There is no risk of an external party being compromised (as in the RSA SecurID breach), and a high availability/disaster recovery configuration can be implemented to ensure keys are always available.   There are various deployment decisions, including whether to use a virtual appliance or a hardware appliance, depending on risk tolerance levels.
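One common pattern when the keys stay in the datacenter is envelope encryption: a per-volume or per-file data-encryption key is wrapped by a key-encryption key that never leaves the on-premises appliance. The sketch below (Python, using the third-party cryptography package; the on-premises appliance is only simulated here) shows the idea:

    from cryptography.fernet import Fernet

    # Key-encryption key (KEK): generated and held inside the enterprise datacenter.
    kek = Fernet(Fernet.generate_key())

    # Data-encryption key (DEK): generated per volume or file, used by the cloud workload.
    dek_bytes = Fernet.generate_key()
    ciphertext = Fernet(dek_bytes).encrypt(b"regulated record stored on a cloud volume")

    # Only the wrapped DEK is stored in the cloud alongside the ciphertext.
    wrapped_dek = kek.encrypt(dek_bytes)

    # To read the data later, the wrapped DEK goes back to the on-premises appliance
    # for unwrapping; the KEK itself never travels to the cloud.
    recovered = Fernet(kek.decrypt(wrapped_dek)).decrypt(ciphertext)
    assert recovered == b"regulated record stored on a cloud volume"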

SaaS Key Management: A second alternative is using a SaaS key management solution. This involves having a SaaS vendor take responsibility for the keys. While this approach takes advantage of cloud economics, there are risks. The SaaS key management vendor assumes responsibility for availability of the keys: if they experience an outage, the data could become unavailable, and if keys are somehow lost or corrupted, your data could be permanently unavailable. The vendor is also responsible for the security of the keys; any compromise of the SaaS infrastructure puts customer data at risk (the RSA SecurID episode again comes to mind).  There are also legal issues to consider if you do not hold the encryption keys: a cloud service provider (SaaS or IaaS) could be compelled to turn over encryption keys and data via the USA PATRIOT Act without the data owner being aware (a Forrester Research blog post by Andrew Rose provides a nice summary of the issue).

IaaS Manages Keys: A third option is to rely on tokenization or encryption services provided by your favorite IaaS vendor. This checks the box that data is encrypted, but it creates security and availability risks similar to those posed by the SaaS alternative (you are relying on the security and availability of your IaaS provider’s key management and effectively making the IaaS provider custodian of both the encryption keys and the encrypted data, which is not an ideal separation of duties).  Some IaaS providers offer encryption options that allow customers to choose whether they want to manage the keys themselves or have the vendor assume management responsibility. For example, Amazon’s S3 storage includes encryption options that let you either manage your own encryption keys or have Amazon hold the keys.
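As a rough sketch of the two ends of that spectrum (Python with the boto3 SDK and the third-party cryptography package; the bucket and object names are hypothetical, and the exact options S3 exposes have evolved over time), the difference is simply who ever sees the key:

    import boto3
    from cryptography.fernet import Fernet

    s3 = boto3.client("s3")
    data = b"cardholder data"

    # Option A: provider-managed keys. S3 encrypts the object with keys Amazon holds.
    s3.put_object(Bucket="example-bucket", Key="provider-managed.bin",
                  Body=data, ServerSideEncryption="AES256")

    # Option B: customer-managed keys. Encrypt before upload; Amazon only ever sees ciphertext.
    key = Fernet.generate_key()     # stored in your own key-management infrastructure
    s3.put_object(Bucket="example-bucket", Key="customer-managed.bin",
                  Body=Fernet(key).encrypt(data))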

The cloud may create new key management challenges, but the principles for choosing between the various alternatives remain the same. Enterprises must assess their risk tolerance and audit requirements before they can select a solution that best meets their encryption key management needs.

Todd Thiemann is senior director of product marketing at Vormetric and co-chair of the Cloud Security Alliance (CSA) Solution Provider Advisory Council.
