March 30, 2011
By Margaret Dawson
The “cloud” is one of the most discussed topics among IT professionals today, and organizations are increasingly exploring the potential benefits of cloud computing solutions for their businesses. It’s no surprise that Gartner predicts cloud computing will be a top priority for CIOs in 2011.
In spite of this, many companies and IT leaders remain skeptical about the cloud, with many simply not knowing how to get started or how to evaluate which cloud platform or approach is right for them. Furthermore, uncertainty and fears around cloud security and reliability continue to permeate the market and media coverage. And finally, there remains confusion around the definition of what the cloud is and what it is not, leading some CIOs to want to scrap the term “cloud” altogether.
My number one piece of advice to companies of all sizes is this: do not buy the cloud; buy the solution. Just as we have always done in IT, begin by identifying the challenge or pain that needs to be solved. In evaluating solutions that help address your challenge, include both on-premise and “as a service” solutions. Then use the same critical criteria to evaluate cloud solutions as you would any other, making sure each addresses your requirements around data protection, identity management, compliance, access control rules, and other security capabilities.
Also, do not get sucked into the hype. Below, I attempt to dispel some of the most common myths about cloud security today:
1. All clouds are created equal
One of the biggest crimes committed by the vendor community and media over the last couple of years has been talking about “the cloud” as if it were a single, monolithic entity. This mindset disregards the dozens of ways companies need to configure the infrastructure underlying a cloud solution, and the many more ways of configuring and running applications on a cloud platform.
Often people lump together established, enterprise-class cloud solutions with free services offered by social networks and similar “permanent beta” products. As a result, many organizations fear that cloud solutions could expose critical enterprise resources and valuable intellectual property to the public domain. This conflation does the cloud security discussion a fundamental disservice and only increases apprehension toward cloud adoption.
While the cloud can absolutely be as secure as or even more secure than an on-premise solution, all clouds are NOT created equal. There are huge variances in security practices and capability, and you must establish clear criteria to make sure any solution addresses your requirements and compliance mandates.
2. Cloud security is so new, there’s no way it can be secure
With all the buzz surrounding the cloud, there’s a misconception that cloud security is a brand new challenge that has not been addressed. What most people don’t understand is that while the cloud is already bringing radical changes in cost, scalability and deployment time, most of the underlying security concerns are, in fact, neither new nor unaddressable. It’s true that the cloud represents a brand new attack vector that hackers love to go after, but the vulnerabilities and security holes are the same ones you face in your traditional infrastructure.
Today’s cloud security issues are much the same as any other outsourcing model that organizations have been using for years. What companies need to remember is that when you talk about the cloud, you’re still talking about data, applications and operating systems in a data center, running the cloud solution.
It’s important to note that many cloud vendors leverage best-in-class security practices across their infrastructure, application and services layers. What’s more, a cloud solution provides this same industry-leading security for all of its users, often providing you with a level of security your own organization could not afford to implement or maintain.
3. All clouds are inherently insecure
As previously mentioned, a cloud solution is no more or less secure than the datacenter, network and application on which it is built. In reality, the cloud can actually be more secure than your own internal IT infrastructure. A key advantage to third-party cloud solutions is that a cloud vendor’s core competency is to keep its network up and deliver the highest level of security. In fact, most cloud service providers have clear SLAs around this.
In order to run a cloud solution securely, cloud vendors have the opportunity to become PCI DSS compliant, SAS 70 certified and more. Undergoing these rigorous compliance and security routes can provide organizations with the assurance that cloud security is top of mind for their vendor and appropriately addressed. The economies of scale involved in cloud computing also extend to vendor expertise in areas like application security, IT governance and system administration. A recent move towards cloud computing by the security-conscious U.S. Federal Government is a prime example of how clouds can be extremely secure, depending on how they are built.
The one area that folks often forget is the services piece of many cloud solutions. Beyond the infrastructure and the application, make sure you understand how the vendor controls access to your data by its services and support personnel.
Anxiety over cloud security is not likely to dissipate any time soon. However, focusing on the facts and addressing the market’s concerns directly – debunking cloud security myths, for example – will go a long way toward helping companies gain confidence in deploying the cloud. There are also an increasing number of associations and industry forums, such as the Cloud Security Alliance, that provide vendor-neutral best practices and advice. In spite of the jokes, cloud security is not an oxymoron, but in fact an achievable and real goal.
Margaret Dawson is Vice President of Product Management for Hubspan (www.hubspan.com). She’s responsible for the overall product vision and roadmap, and works with key partners in delivering innovative solutions to the market. She has over 20 years of experience in the IT industry, working with leading companies in the network security, semiconductor, personal computer, software, and e-commerce markets, including Microsoft and Amazon.com. She is a frequent speaker on cloud security, cloud platforms, and other cloud-related themes. Dawson has worked and traveled extensively in Asia, Europe and North America, including ten years working in the Greater China region, consulting with many of the area’s leading IT companies, and serving as a BusinessWeek magazine foreign correspondent.
March 29, 2011
By Eric Baize
For years, the security industry has been complacent, using complex concepts to keep security discussions isolated from mainstream IT infrastructure conversation. The cloud revolution is bringing an end to this security apartheid. The emergence of an integrated IT infrastructure stack, the need for information-centric security and the disruption brought by virtualization are increasingly making security a feature of the IT infrastructure. The industry consolidation, initiated by EMC’s acquisition of RSA in 2006 and now well on its way with the recent acquisitions of McAfee by Intel and ArcSight by HP, demonstrates that the security and IT infrastructure conversations are one and the same.
We, the security people, must follow this transition and lay out a vision that non-security experts can understand without having to take a PhD course in prime number computation.
Let me give it a try by using the video rental industry as an example of why security in the cloud will be different and more effective.
Video rental industry:
1 – You start with a simple need: Most families want to watch movies in their living room, a movie of their choosing, at a time of their choosing.
2 – A new market emerges: Video rental stores with chains such as Blockbuster in the U.S. Do you remember the late fees?
3 – Then comes a new business model. Instead of paying per movie and driving to the store, you pay a monthly subscription fee and movies are delivered directly to your home. Netflix* jumps in and makes the new delivery model work with legacy technology by sending DVDs through postal mail.
4 – Increase in network bandwidth makes video on demand possible on many kinds of end-user devices from cell phones to video game consoles. Netflix expands its footprint by embedding its technology into any video viewing device that makes it into your home: Game consoles, streaming players and smart phones.
5 – Blockbuster has filed for Chapter 11 bankruptcy. Netflix is uniquely positioned to help consumers transition from the old world of video viewing with DVDs to video on-demand. The customer wins with better movie choices delivered faster.
The Security Industry
The parallel with the security industry’s own evolution is striking:
1 – You start with a simple need from CIOs and CSOs: They want to secure their information.
2 – A new market emerges: IT security with early players focusing on perimeter security: Building firewalls around information and bolting on security controls on top of insecure infrastructure.
3 – Here comes the cloud, a different way of delivering, operating and consuming IT. IT is delivered as a service. Enterprises use virtualization to build private clouds operated by internal IT teams. The IT infrastructure is invisible and security is becoming much more information-centric. New security solutions emerge, such as the RSA Solution for Cloud Security and Compliance, that focus on gaining visibility over the new cloud infrastructure and on controlling information.
4 – Increase in bandwidth makes it possible to expand private cloud into hybrid clouds, using a cloud provider’s IT infrastructure to develop new applications or to run server or desktop workloads. Security is changing as controls are directly embedded in the new cloud infrastructure, making it security aware. The need for visibility expands to cloud provider’s IT infrastructure and new approaches such as the Cloud Security Alliance GRC Stack enable enterprises to expand their GRC platform to manage compliance of their cloud provider infrastructure.
5 – What will happen to the security industry? It must adapt and manage the transition from physical to virtual to cloud infrastructures. First, by dealing with traditional security controls in physical IT infrastructure; then, by embedding its control in the virtual and cloud infrastructure to build a trusted cloud; and finally by providing a consolidated view of risk and compliance across all types of IT infrastructure: physical or virtual, on-premise or on a cloud provider’s premises. The customer wins: IT infrastructures have become security-aware, making security and compliance more effective and easier to manage.
So, does this explanation work for you? I welcome all comments below!
* Netflix is a registered trademark of Netflix, Inc.
Eric Baize is Senior Director in RSA’s Office of Strategy and Technology, with responsibility for developing RSA’s strategy for cloud and virtualization. Mr. Baize also leads the EMC Product Security Office with company-wide responsibility for securing EMC and RSA products.
Previously, Mr. Baize pioneered EMC’s push towards security. He was a founding member of the leadership team that defined EMC’s vision of information-centric security, and which drove the acquisition of RSA Security and Network Intelligence in 2006.
Mr. Baize is a Certified Information Security Manager, the holder of a U.S. patent and an author of international security standards. He represents EMC on the Board of Directors of SAFECode.
March 29, 2011
By Anil Chakravarthy and Deepak Mohan
Over the past few years, the information explosion has inhibited organizations’ ability to effectively secure, manage and recover data. This complexity is only increasing as organizations try to manage data growth by moving it to the cloud. It’s clear that storage administrators must regain control of information to reduce costs and recovery times while complying with regulatory standards, including privacy laws.
Data growth is currently one of the biggest challenges for organizations. In a recent survey by Gartner, 47 percent of respondents ranked data growth as the biggest data center hardware infrastructure challenge for large enterprises. In fact, IDC says that enterprise storage will grow an average of 60 percent annually.
As a result, companies are turning to the cloud to help them alleviate some of the pains caused by these issues.
The Hype of the Cloud: Public, Private and/or Hybrid?
There is so much hype associated with cloud computing. Companies often struggle with defining the potential benefits of the cloud to their executives, and which model to recommend. In short, the cloud is a computing environment that can deliver on-demand resources in a scalable, elastic manner and is typically, but not necessarily, accessible from anywhere – through the internet (https://blog.cloudsecurityalliance.org/). The cloud encompasses the principle that users should have the ability to access data when, where and how they want – regardless of device.
In the public cloud model, a third party typically owns the infrastructure and operations and delivers a service to multiple private entities (e.g., cloud-based email or document sharing). While these services typically provide low-cost storage, the model has a few drawbacks: companies have limited control over implementation, security, and privacy. This can be less than ideal for some organizations.
We believe most enterprises will implement a private cloud over the next few years. A private cloud retains control by enabling the company to host data and applications on their own infrastructure and maintain their own operations. This gives them maximum control, protecting against unforeseen issues. Private clouds can be scalable and elastic (similar to public clouds), providing them the best online storage operations and options to improve performance, capacity, and availability as needed.
A hybrid approach enables organizations to combine the inexpensive nature of the public cloud with the additional control over management, performance, privacy and accessibility that a private cloud provides. For example, an organization may define a private cloud storage infrastructure for a set of applications and take advantage of public cloud storage for off-site copies and/or longer-term retention. This gives the organization the flexibility to deliver a service-oriented model to its internal customers.
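The placement decision in a hybrid model like this can be sketched as a simple policy function. This is an illustrative sketch only – the tier names, age threshold, and sensitivity flag are assumptions for the example, not any vendor’s API:

```python
# Hypothetical hybrid-cloud placement policy: active or sensitive data
# stays in the private cloud; older, non-sensitive copies move to
# public cloud storage for off-site retention.

def choose_tier(age_days: int, sensitive: bool,
                retention_threshold_days: int = 90) -> str:
    """Return the storage tier for a data set under a simple hybrid policy."""
    if sensitive:
        return "private"           # regulated data never leaves the private cloud
    if age_days > retention_threshold_days:
        return "public-archive"    # off-site copies / longer-term retention
    return "private"               # active working set stays on-premise

# Example: a 200-day-old, non-sensitive backup copy goes off-site.
print(choose_tier(200, sensitive=False))   # -> public-archive
```

A real policy engine would of course weigh more factors (cost per GB, recovery-time objectives, jurisdiction), but the service-oriented idea is the same: the application asks for storage, and policy decides where it lands.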
Deciding which model to use is crucial. Each organization ought to evaluate their application portfolio, determine the corporate risk tolerance, and look to an agile way to consume cloud services. For small and medium-sized enterprises the propensity for public cloud applications and infrastructure can be much greater than large enterprise organizations.
The Private Cloud and Virtualization: Tools to Minimize Data Growth
As companies look to private clouds, often leveraging server virtualization, to deliver applications to the business more efficiently, those same initiatives can also help them manage, back up, archive and discover information. Compounding the data-growth problem, IDC reports that companies often waste up to 65 percent of storage capacity, with disk utilization rates ranging from 28 to 35 percent. Cloud initiatives seem like the natural solution.
The private cloud is the clear answer. Combined with virtual environments, and if managed correctly, the cloud can help organizations save money, speed application delivery, increase application availability, and reduce risk.
As a best practice, organizations need to increase storage utilization within virtual infrastructures, as virtual machine deployments can often result in unused and wasted storage. Virtual desktops also carry performance implications: when a large number of users log into their desktops simultaneously, performance can suffer dramatically. Organizations can use new tools that address the storage challenges of virtual environments and integrate with the virtual machine management console for rapid provisioning of servers and virtual desktops. This includes the cloning and setup of the virtual machines, desktops, applications, and files needed across all virtual servers.
By having intelligent storage management tools, organizations can reduce the number of copies stored for virtual machines and desktops yet still deliver the same number of applications and desktops to the business. This enables administrators to utilize the appropriate storage (with the appropriate cost, performance, and availability characteristics). According to our own tests, this can eliminate as much as 70 percent of storage requirements and costs by storing only the differences between VM boot images.
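As a rough illustration of where savings like that come from, consider storing one shared base image plus per-VM differences instead of a full copy per machine. The sizes below are made-up numbers for the arithmetic, not measured results:

```python
# Illustrative arithmetic for difference-based VM image storage.
# All sizes are hypothetical, in gigabytes.

num_vms = 100
full_image_gb = 20      # size of a complete boot image
avg_diff_gb = 5         # average per-VM delta from the shared base image

naive_gb = num_vms * full_image_gb                # one full copy per VM
dedup_gb = full_image_gb + num_vms * avg_diff_gb  # one base + per-VM diffs

savings = 1 - dedup_gb / naive_gb
print(f"{naive_gb} GB -> {dedup_gb} GB ({savings:.0%} saved)")
# -> 2000 GB -> 520 GB (74% saved)
```

The actual percentage depends entirely on how much each VM diverges from the base image, which is why figures in this range are possible for largely homogeneous server and desktop fleets.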
In addition, by utilizing appropriate management tools that look across all environments – whether physical, virtual or cloud-based – organizations can drive down costs through a better understanding of how they are using storage, improving utilization and making better purchasing decisions. Furthermore, such centralized management tools help them automate tasks to improve services and reduce errors. This automation helps organizations deliver storage as a service (a key tenet of private cloud computing) with capabilities including on-host storage provisioning, policy-driven storage tiering and chargeback reporting.
Another example: when organizations back up applications within their virtual environments, they have traditionally performed two separate backups – one of the full image for system recovery, and one of the individual files within the environment for granular recovery later. Organizations can reduce this waste by implementing solutions that perform a single off-host backup, in the cloud, from which both kinds of recovery – the full image or individual files – are possible. This more effective implementation of deduplication keeps data volumes lower and allows for better storage utilization.
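The single-backup, dual-recovery idea can be sketched as follows. The class and method names are invented for illustration and do not correspond to any particular product:

```python
# Sketch of one image-level backup that also carries a file index,
# so either the whole image or a single file can be restored from it
# without keeping a second, file-level backup.

class ImageBackup:
    def __init__(self, image_blob: bytes, file_index: dict):
        self.image_blob = image_blob      # full VM image, stored once
        self.file_index = file_index      # catalog of files inside the image

    def restore_image(self) -> bytes:
        """Full-image recovery from the single stored backup."""
        return self.image_blob

    def restore_file(self, path: str) -> bytes:
        """Granular file recovery from the same backup -- no second copy."""
        return self.file_index[path]

backup = ImageBackup(b"<vm-image>", {"/etc/app.conf": b"port=8080"})
print(backup.restore_file("/etc/app.conf"))   # -> b'port=8080'
```

In a real product the file index would be built by scanning the guest file system inside the image at backup time; the point here is simply that one stored copy serves both recovery paths.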
Hybrid Cloud Solutions: Control of Storage Utilization and Archiving
Data protection and archiving environments within virtual and cloud environments tend to grow faster than anticipated. These environments will need to be managed closely to keep costs down. Luckily, there are software tools that address this quite effectively.
Implementing a hybrid model allows organizations to get storage offsite (through public cloud storage), eliminating tape rotations and other expenses associated with off-premise storage. However, organizations should be cautious of tools that don’t provide consistent management across physical, virtual, and cloud-based infrastructures.
Many organizations are examining cloud-based email such as Google’s Gmail and Microsoft Office 365. But as a best practice for this hybrid model, organizations can’t compromise corporate security and governance policies. This often means maintaining on-premise email archiving and discovery capabilities for information that resides in the cloud. In doing so, organizations have a consistent way to discover information in their private cloud as well as information hosted in the public cloud.
Of course, organizations that integrate tightly with major cloud storage partners will see the biggest benefit of this hybrid approach – especially if they need to quickly deploy a cloud implementation to meet rapid growth.
Moving Forward with an Eye to the Sky
IDC reports that 62 percent of respondents to a recent survey say that they will be investing in data archiving or retirement in 2011 to address the challenges associated with data growth. IT organizations are in the process of trying to re-architect their environments to meet these challenges. Private and hybrid clouds, combined with virtualization, seem key in addressing these challenges.
By implementing cloud solutions, storage administrators are regaining control of information, helping them to reduce storage costs, and better deal with tomorrow’s challenges.
Anil Chakravarthy, Senior Vice President, Storage and Availability Management Group and Deepak Mohan, Senior Vice President, Information Management Group, Symantec Corporation
March 22, 2011
By Allen Allison
The emerging Public Cloud versus Private Cloud debate is not just about which solution is best; it extends to the very definition of the cloud. I won’t pretend that my definitions of public cloud and private cloud match everybody else’s, but I would like to begin by establishing my point of reference for the differences between the two.
Public Cloud: A multi-tenant computing environment that can deliver on-demand resources in a scalable, elastic manner that is both measured and metered, and often charged, on a per-usage basis. The public cloud environment is typically, but not necessarily, accessible from anywhere – through the internet.
Private Cloud: A single-tenant computing environment that may provide similar scalability and over-subscription to the Public Cloud, but solely within the single tenant’s infrastructure. This infrastructure may exist on the tenant’s premises, or may be delivered in a dedicated model through a managed services offering. The private cloud environment is typically accessible from within the tenant’s infrastructure, though it may be necessary to enable external access via the internet or other connectivity.
It is commonly understood that a cloud environment, whether public or private, has several benefits including lower total cost of ownership (TCO). However, there are considerations that should be made when determining whether the appropriate option is a public or private cloud. Below are some key points to consider, as well as some perceptions, or misperceptions, of the benefits of each.
In a Private Cloud, the owner or tenant may have more flexibility in establishing policies and procedures for provisioning, usage, and security. If specific controls would otherwise impact other tenants in a shared environment, the organization may be given greater control within a dedicated environment.
In a Public Cloud, the tenant has less control over the shared resources, the security of the platform, and the compliance of the infrastructure. The tenant, however, may be able to leverage common security controls or compliance certifications that can inspire greater confidence in the use of a managed cloud offering. For example, if the public cloud infrastructure is included in the SAS 70 (soon to be replaced by SSAE 16) audit by a third party, the tenant may be in a position to offer those controls and that compliance as part of their own compliance program.
In a Private Cloud, the owner or tenant may be able to leverage the scalability and capacity management of a platform that is able to handle the over-subscription or provisioning processes of a multi-resource infrastructure. This allows for a consolidation of hardware and management resources, a potential reduction in administrative costs, and a scale that enables the use of idle resources (e.g. memory, CPU, etc.). However, these benefits may come with a significant capital expense, depending on the cost model.
In a Public Cloud, tenants enjoy greater scalability and capacity benefits because the costs of adding resources and managing the environment are not tied to a single tenant but spread over all tenants of the platform. Typically, in a public cloud, the tenant is billed only for the resources it uses. This allows for a lower initial expense and a growth in cost that matches utilization, which, in many cases, equates to growth in revenue for the hosted application. Likewise, when the need for resources is reduced, the total cost is also reduced. This can be especially helpful when the platform is used to support a seasonal business (e.g., an online merchant).
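A back-of-the-envelope comparison shows why metered billing helps a seasonal business. The monthly demand figures and unit prices below are invented for illustration; real pricing varies widely by provider:

```python
# Hypothetical cost comparison: fixed private-cloud capacity vs.
# pay-per-use public cloud for a seasonal workload (units are
# arbitrary "server-units" per month).

monthly_demand = [10, 10, 10, 10, 10, 10, 10, 10, 10, 40, 80, 80]
private_unit_cost = 100   # monthly cost per unit of owned capacity
public_unit_cost = 150    # metered price per unit actually consumed

# A private cloud must be sized for the December peak all year round.
private_annual = max(monthly_demand) * private_unit_cost * 12

# A public cloud bills only for what is consumed each month.
public_annual = sum(monthly_demand) * public_unit_cost

print(private_annual, public_annual)   # -> 96000 43500
```

Even at a 50 percent higher per-unit price, the metered model wins here because capacity sized for the peak sits idle most of the year; for a flat, always-busy workload the comparison can easily flip the other way.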
In a Private Cloud, the tenant has more control over maintenance schedules, upgrades, and the change-management process. This allows for greater flexibility in managing the platform to comply with specific requirements, such as FDA 21 CFR or NIST 800-53. Because the stringent requirements of these regulations impair the flexibility of cloud environments, it is easier to hold an entire dedicated cloud platform to these specific controls than to attempt to carve out exceptions in an otherwise multi-tenant cloud environment.
In a Public Cloud, the costs of shared security infrastructure can be spread over multiple tenants. For example, the cloud provider may enable the use of shared firewall resources for the inspection of traffic at the ingress of the cloud environment. Customers can share the costs of maintenance and management as well as of the shared hardware resources used to deliver those firewall services. This is important to note when those security resources include threat management and intrusion detection services, since the deployment and support of dedicated security infrastructure can be expensive. Furthermore, most shared security infrastructure can be tailored to comply with specific regulations or security standards, such as HIPAA, PCI DSS, and others.
It is important to understand how cloud providers deliver managed cloud services on a public cloud platform. Typically, the elastic environment is built on a robust, highly scalable platform with the ability to grow much larger than any individual private cloud environment, which implies significant economies of scale built into the common platform. This allows for the following benefits to the provider, with a trickle-down effect to each tenant:
- The per-unit cost of each additional resource is greatly reduced because a greater number of enhancements can be performed in a public cloud platform than in a private cloud platform.
- When a provider delivers security services in a public cloud environment, each tenant gains the benefits of security measures enforced for other clients. For example, if a specific, known vulnerability is remediated for one customer, the same remediation may easily be applied to all customers.
- The cloud provider’s reputation may work to the tenant’s advantage. A cloud provider may take better precautions, such as adding additional redundancy, adding capacity sooner, or establishing more stringent change-management programs, for a shared public cloud infrastructure than they may be willing to deliver in a dedicated private cloud. This may lend itself to better Service Level Agreements (SLA), greater availability, better flexibility, and rapid growth.
It is rare that a new cloud customer will require a dedicated cloud infrastructure. This is most often reserved for those in the government, servicing the government, or in highly regulated industries. For the rest, a public cloud infrastructure will likely provide the flexibility, growth, cost savings, and elasticity necessary to make the move from a dedicated physical environment to the cloud. Those who choose to move to the public cloud understand the benefits and are able to leverage their providers to deliver the service levels and manageability to make the cloud experience a positive one.
Allen Allison, Chief Security Officer at NaviSite (www.navisite.com)
During his 20+ year career in the information security industry, Allen Allison has served in management and technical roles, including the development of NaviSite’s industry-leading cloud computing platform; chief engineer and developer for a market-leading managed security operations center; and lead auditor and assessor for information security programs in the healthcare, government, e-commerce, and financial industries. With experience in the fields of systems programming; network infrastructure design and deployment; and information security, Allison has earned the highest industry certifications, including CCIE, CCSP, CISSP, MCSE, CCSE, and INFOSEC Professional. A graduate of the University of California, Irvine, Allison has lectured at colleges and universities on the subject of information security and regulatory compliance.
March 21, 2011
By Slavik Markovich, CTO of Sentrigo
The move to Cloud Computing brings with it a number of attributes that require special consideration when it comes to securing data. And since in nearly every organization, their most sensitive data will be stored either directly in a relational database, or ultimately in a relational database through an application, how these new risks impact database security in particular is worth considering. As users move applications involving sensitive data to the cloud, they need to be concerned with three key issues that affect database security:
1) Privileged User Access – Sensitive data processed outside the enterprise brings with it an inherent level of risk, because outsourced services bypass the physical, logical and personnel controls IT departments exert over in-house programs. Put simply, outsiders are now insiders.
2) Server Elasticity – One of the major benefits of cloud computing is flexibility, so aside from the fact that you may not know (or could have little control over) exactly where your data is hosted, the servers hosting this data may also be provisioned and de-provisioned frequently to reflect current capacity requirements. This changing topology can be an obstacle to some technologies you rely on today, or a management nightmare if configurations must be updated with every change.
3) Regulatory Compliance – Organizations are ultimately responsible for the security and integrity of their own data, even when it is held by a service provider. The ability to demonstrate to auditors that data is secure despite a lack of physical control over systems hinges in part on educating them, and in part on providing them with the necessary visibility into all activity.
Access control and monitoring of cloud administrators is a critical issue in ensuring sensitive data is secure. You likely perform background checks on your own privileged users and may have significant physical monitoring in place (card keys for entry to the datacenter, cameras, and even monitoring by security personnel). Even if your cloud provider does all of this too, it is still not your own process, and that means giving up some element of control. Yet these individuals may have nearly unlimited access to your infrastructure – something they need in many cases to ensure the performance and availability of the cloud resources for all customers.
So, it is reasonable to ask the cloud provider what kinds of controls exist on the physical infrastructure – most will have this well under control (run away, do not walk, if this is not the case). The same is likely true for background checks on administrators. However, you’ll also want to know if a malicious administrator at the cloud provider makes an unauthorized copy of your database, or simply connects directly to the database and changes records in your customer accounts. You can’t trust simple auditing solutions as they are easily bypassed by DBAs, and audit files can be doctored or deleted by System Administrators.
You have a number of ways to address this (encryption, tokenization, masking, auditing and monitoring), but in all cases you need to make sure the solution you deploy cannot be easily defeated, even by privileged users, and will also work well in the distributed environment of the cloud. This brings us to our next point.
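Of the options above, tokenization is the simplest to sketch: sensitive values are swapped for random tokens before the data ever reaches the cloud, and the token-to-value mapping never leaves your premises. This is a minimal illustration with invented names, not a production design (a real token vault needs durable, access-controlled, audited storage):

```python
import secrets

# Minimal tokenization sketch: the vault (token -> value mapping)
# stays on-premise; only opaque tokens are stored in the cloud database,
# so a privileged user at the provider sees nothing sensitive.

class TokenVault:
    def __init__(self):
        self._vault = {}

    def tokenize(self, value: str) -> str:
        """Replace a sensitive value with a random, meaningless token."""
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        """Recover the original value -- possible only with vault access."""
        return self._vault[token]

vault = TokenVault()
cloud_record = {"name": vault.tokenize("Alice Smith"),
                "card": vault.tokenize("4111-1111-1111-1111")}
# The cloud-side record holds only tokens; the mapping lives with you.
assert vault.detokenize(cloud_record["card"]) == "4111-1111-1111-1111"
```

Encryption and masking achieve a similar separation by different means; in every case the goal is that the secret needed to recover the data is held outside the provider's reach.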
Much has been written about how the location of your data assets in the cloud can impact security, but potentially even more challenging is the fact that the servers hosting this data are often reconfigured over the course of a day or week, in some cases without your prior knowledge. In order to provide high availability and disaster recovery capabilities, cloud providers typically have data centers in multiple locations. And to provide the elastic element of cloud computing, where you can expand capacity requirements in near real-time, additional resources may be provisioned as needed wherever capacity is available. The result is an environment that is simply not static, and unless you are hosting your own private cloud, you may have limited visibility into these physical infrastructure updates.
How does this impact security? Many of the traditional methods used to protect sensitive data rely on an understanding of the network topology, including perimeter protection, proxy filtering and network sniffing. Others may rely on physical devices or connections to the server, for example some types of encryption, or hardware-assisted SSL. In all of these cases, the dynamic nature of the cloud will render these models untenable, as they will require constant configuration changes to stay up-to-date. Some approaches will be impossible, as you will not be able to ensure hardware is installed in the servers hosting your VMs, or on specific network segments along with the servers.
To work in this model, you need to rethink database security and take a distributed approach: look for components that run efficiently wherever your data assets are located (locally on your cloud VMs), and that require minimal, if any, configuration as VMs are provisioned, de-provisioned, and moved.
Lastly, you will likely face a somewhat more challenging regulatory audit as you move regulated data to the cloud. This is not because the cloud is inherently less secure, but because it will be something different for most auditors, and to the majority of auditors, different is not usually a good thing (apologies up front to all those very flexible auditors reading this; why is it we never get you on our customer audits?). So, if the data you need for an application hosted in the cloud is subject to Sarbanes-Oxley, HIPAA/HITECH, PCI DSS, or other regulations, you need to make sure the controls necessary to meet compliance are in place, AND that you can demonstrate this to your auditor.
We’re seeing many cloud providers trying to allay these concerns by obtaining their own SAS 70 or even PCI DSS certifications for their general environment. While this is a nice touch and can even be helpful in your own audit, you are ultimately responsible for your own data and the processes related to it, and YOUR auditor will audit YOUR environment, including any cloud services. So you will need to be able to run reports on all access to the database in question and prove that in no case could an insider have gained access undetected (assuming your auditor is doing his or her job well, of course). The key here is to look for strong segregation of duties, including the ability for you (or a separate third party, NOT the cloud provider) to monitor all activity on your databases: if a privileged user touches your data, alerts go off, and if they turn off the monitoring altogether, you are notified in real time.
It is certainly possible to address these issues, and implement database security that is not easily defeated, that operates smoothly in the dynamic environment of the cloud, and provides auditors with demonstrable proof that regulatory compliance requirements have been satisfied. But, it very well may mean looking at a whole new set of security products, developed with the specific needs of cloud deployments in mind.
Slavik is CTO and co-founder of Sentrigo (www.sentrigo.com), a leading provider of database security for on-premises and cloud computing environments and corporate member of the Cloud Security Alliance (CSA). Previously, Slavik was VP R&D and Chief Architect at DB@net, a leading IT architecture consultancy, and led projects for clients including Orange, Comverse, Actimize and Oracle. Slavik is a recognized authority on Oracle and JAVA/JavaEE technologies, has contributed to open source projects and is a regular speaker at industry conferences. He holds a BSc in Computer Science.
March 10, 2011 | Leave a Comment
The Problem with Passwords
by Patrick Harding, CTO, Ping Identity
The average enterprise employee uses 12 userid/password pairs to access the many applications required to perform his or her job (Osterman Research, 2009). It is unreasonable to expect anyone to create, regularly change (a prudent security practice) and memorize a dozen passwords, yet this is common practice today. Users are forced to take shortcuts, such as using the same userid and password for all applications, or writing down their many strong passwords on Post-It notes or, even worse, in a file on their desktop or smartphone.
Even if most users could memorize several strong passwords, there remains risk to the enterprise when passwords are used to access cloud services (such as Google Apps or Salesforce.com) where they can be phished, intercepted or otherwise stolen.
The underlying problem with passwords is that they work well only in “small” spaces; that is, in environments that have other means to mitigate risk. Consider as an analogy the bank vault. Its combination is the equivalent of a strong password, and is capable of adequately protecting the vault’s contents if, and only if, there are other layers of security at the bank.
Such other layers of security also exist within the enterprise in the form of locked doors, receptionists, ID badges, security guards, video surveillance, etc. These layers of security explain why losing a laptop PC in a public place can be a real problem (and why vaults are never located outside of banks!).
Ideally, these same layers of internal security could also be put to use securing access to cloud services. Also ideally, users could then be asked to remember only one strong password (like the bank vault combination), or use just one method of multi-factor authentication. And ideally, the IT department could administer user access controls for cloud services centrally via a common directory (and no longer be burdened by constant calls to the Help Desk from users trying to recall so many different passwords).
One innovation in cloud security makes this ideal a practical reality: federated identity.
Federated Identity Secures the Cloud
Parsing “federated identity” into its two constituent words reveals the power behind this approach to securing the cloud. The identity is of an individual user, which is the basis for both authentication (the credentials for establishing the user is who he/she claims to be) and authorization (the cloud services permitted for use by specific users). Federation involves a set of standards that allows identity-related information to be shared securely between parties, in this case: the enterprise and cloud-based service providers.
The major advantage of federated identity is that it enables the enterprise to maintain full and centralized control over access to all applications, whether internal or external. Further, federated single sign-on (SSO) allows a user to log in once and then access all authorized cloud services via a portal or other convenient means of navigation. The IT department essentially controls how users authenticate, including whatever credentials may be required. A related advantage is that, with all access control provisions fully centralized, “on-boarding” (adding new employees) and “off-boarding” (terminating employees) become at once more secure and substantially easier to perform.
Identity-related information is shared between the enterprise and cloud-based service providers through security tokens; not the physical kind, but cryptographically signed documents (e.g. XML-based SAML tokens) that contain data about a user. Under this trust model, the good guys have good documents (security tokens) issued from a trusted source; the bad guys never do. For this reason, both the enterprise and the service providers are protected. These security tokens essentially replace the use of a password at each cloud service.
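The issue-and-verify flow can be sketched conceptually as follows. This is a deliberately simplified stand-in that assumes a shared HMAC key and a JSON payload; real SAML tokens are XML documents signed with the IdP's private key and verified against its certificate.

```python
import base64
import hashlib
import hmac
import json
import time

SHARED_KEY = b"idp-rp-shared-secret"  # stand-in for the IdP's signing key

def issue_token(subject, roles, ttl=300):
    """IdP side: sign an assertion about the user."""
    assertion = {"sub": subject, "roles": roles, "exp": time.time() + ttl}
    body = base64.urlsafe_b64encode(json.dumps(assertion).encode())
    sig = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return body.decode() + "." + sig

def verify_token(token):
    """RP side: accept the assertion only if the signature is valid and unexpired."""
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(SHARED_KEY, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("signature invalid: token not issued by a trusted IdP")
    assertion = json.loads(base64.urlsafe_b64decode(body))
    if assertion["exp"] < time.time():
        raise ValueError("token expired")
    return assertion

token = issue_token("alice@example.com", ["crm-user"])
print(verify_token(token)["sub"])  # alice@example.com
```

Note how no password ever crosses to the Relying Party; it only sees a signed, short-lived assertion.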
When enabling a federated relationship with different cloud services, there are always two parties: the Identity Provider (IdP) and the Relying Party (RP, also called the Service Provider). The Identity Provider (the enterprise) is the authoritative source of the identity information contained in the security tokens. The Relying Parties (the cloud service providers) establish relationships with one or more Identity Providers, and verify and trust the security tokens containing the assertions needed to govern access control.
The authoritative nature of, and the structured relationship between, the two parties are fundamental to federated identity. Based on the trust established between the enterprise and the cloud service, the Relying Parties have full confidence in the security tokens they receive.
As the popularity of cloud-based services continues to grow, IT departments will increasingly turn to federated identity as the preferred means for managing access control. With federated identity, users and IT staff benefit not only from greater security but also from greater convenience and productivity. Users log in only once, remembering only one strong password, to access all authorized cloud services.
To learn more about Identity’s role in Cloud Security, visit Ping Identity www.pingidentity.com.
Patrick Harding, CTO, Ping Identity
Harding brings more than 20 years of experience in software development, networking infrastructure and information security to the role of Chief Technology Officer for Ping Identity. Previously, Harding was a vice president and security architect at Fidelity Investments where he was responsible for aligning identity management and security technologies with the strategic goals of the business. An active leader in the Identity Security space, Harding is a Founding Board Member for the Information Card Foundation, a member of the Cloud Security Alliance Board of Advisors, on the steering committee for OASIS and actively involved in the Kantara Initiative and Project Concordia. He is a regular speaker at RSA, Digital ID World, SaaS Summit, Catalyst and other conferences. Harding holds a BS Degree in Computer Science from the University of New South Wales in Sydney, Australia.
March 8, 2011 | 1 Comment
Chris Wysopal, CTO, Veracode
Developers and IT departments are being told they need to move applications to the cloud and are often left on their own to navigate the challenges related to developing and managing the security of applications in those environments. Because no one should have to fly blind through these uncertain skies, it’s important to dispel the myths, expose the realities and establish best practices for securing cloud-based applications.
Whether we are talking about IaaS (Infrastructure as a Service), PaaS (Platform as a Service) or SaaS (Software as a Service), perceived security vulnerabilities in the cloud are abundant. A common myth is that organizations utilizing cloud applications should be most concerned about someone breaking in to the hosting provider, or an insider gaining access to applications they shouldn’t. This is an outdated, generic IT/infrastructure point of view. What’s more important and elemental is to examine whether the web application being used is more vulnerable because of the way it was built and then deployed in the cloud, rather than focusing on cloud security risks from an environmental or infrastructure perspective.
It’s imperative to understand the inherent (and less-publicized) threats facing applications in virtualized environments. Common vulnerabilities associated with multi-tenancy and cloud provider services, like identity and access management, must be examined from both a security and a compliance perspective. Obviously, in a multi-tenant environment, hardware devices are shared among other companies – potentially competitors and other customers, as well as would-be attackers. Organizations lose control over the physical network and computing systems; even local storage for debugging and logging is remote. Additionally, auditors may be concerned that the cloud provider has access to sensitive data at rest and in transit.
Inherent threats are present not only in the virtualized deployment environment, but also in the way applications for the cloud are developed in the first place. Consider the choices many architects and designers are forced to make when developing and deploying applications in the cloud. Because they now rely on external controls put in place by the provider, they may feel comfortable taking shortcuts when it comes to building in application security features. Developers can rationalize the time-to-market advantages of writing, and testing, less code. However, by handing external security controls to the provider, new attack surfaces quickly emerge related to VMs, PaaS APIs and the cloud management infrastructure.
Security – Trust No One
Security trust boundaries completely change with the movement of applications from internal networks or the DMZ to the cloud. As opposed to traditional internal application infrastructures, in the cloud the trust boundary shrinks to encompass only the application itself, with all the users and related storage, database and identity management systems becoming “external” to that application. In this situation, “trust no one” takes on great significance for the IT organization. With all these external sources wanting access to the application, how do you know which requests are legitimate? How can we make up for the lack of trust? It boils down to establishing an additional layer of security controls. Organizations must encrypt all sensitive data stored or transmitted and treat all environmental inputs as untrusted in order to protect assets from attackers and from the cloud provider itself.
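As a concrete illustration of treating environmental inputs as untrusted, one common pattern is allowlist validation at the application boundary. A minimal sketch; the input names and patterns here are hypothetical examples, not a prescribed policy:

```python
import re

# Allowlist validators for inputs that arrive from outside the application's
# shrunken trust boundary (query strings, headers, storage keys, queue messages).
VALIDATORS = {
    "account_id": re.compile(r"[0-9]{1,12}"),
    "report_name": re.compile(r"[A-Za-z0-9_\-]{1,64}"),
}

def require_valid(name, value):
    """Reject any environmental input that does not match its allowlist pattern."""
    pattern = VALIDATORS.get(name)
    if pattern is None or not pattern.fullmatch(value):
        raise ValueError(f"untrusted input rejected: {name}={value!r}")
    return value

print(require_valid("account_id", "42"))        # passes: 42
# require_valid("report_name", "../etc/passwd") # would raise ValueError
```

The same principle applies to values handed back by provider services: validate or encode them before use, rather than assuming the platform is on your side of the boundary.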
Fasten Your Seatbelts
Best practices aimed at building protection must be incorporated into the development process to minimize risks. How can you help applications become more secure? It starts with a seatbelt, in the form of application-level security controls that can be built into application code or implemented by the cloud services provider itself. Examples of these controls include encryption at rest, encryption in transit (both point-to-point and of message contents), auditing and logging, and authentication and authorization. Unfortunately, in an IaaS environment, having the provider manage these controls may not be an option. The advantage of using PaaS APIs to establish these controls is that in most cases the service provider has tested and debugged the API, speeding time to market for the application. SaaS environments offer no choice to the developer, as the SaaS provider is totally in control of how data is secured and how identity is managed.
Traditional Application Security Approaches Still Apply
Another myth that must be debunked is the belief that any approach to application security testing – perhaps with a slightly different wrapper on it – can be used in a cloud environment. While it is true that traditional application security issues still apply in the cloud, and that you still need to take advantage of established processes associated with requirements, design, implementation and testing, organizations can’t simply repackage what they know about application security. Applications in the cloud require special care. IT teams can’t be content to use mitigation techniques only at the network or operating system level anymore.
Security testing must be done at the application level, not the environmental level. Threat modeling and design phases need to take additional cloud environmental risks into account. And, implementation needs to use cloud security aware coding patterns in order to effectively eliminate vulnerability classes such as Cross-Site Scripting (XSS) and SQL Injections. Standards such as OWASP Top 10 and CWE/SANS Top 25 are still applicable for testing IaaS and PaaS applications, and many SaaS extensions.
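The two vulnerability classes named above map to well-known coding patterns: parameter binding for SQL and output escaping for HTML. A minimal sketch using only Python's standard library (the table and inputs are invented for illustration):

```python
import html
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

# SQL injection defence: bind user input as a parameter, never by string
# concatenation, so "' OR '1'='1" is treated as data rather than SQL.
user_input = "' OR '1'='1"
rows = conn.execute("SELECT email FROM users WHERE name = ?", (user_input,)).fetchall()
assert rows == []  # the injection attempt matches no user

# XSS defence: escape user-controlled values before placing them in HTML output.
comment = '<script>alert("xss")</script>'
safe = html.escape(comment)
print(safe)  # &lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;
```

These patterns are environment-independent, which is exactly why OWASP Top 10 and CWE/SANS Top 25 remain applicable in IaaS and PaaS deployments.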
Overall, dynamic web testing and manual testing are relatively unchanged from traditional enterprise application testing, but it’s important to get permission and notify your cloud provider if you plan to do dynamic or manual testing, especially on a SaaS extension you have written, so it doesn’t create the appearance that your organization is attempting an attack on the provider.
It’s also important to note that cloud design and implementation patterns are still being researched, with efforts being led by organizations like the Cloud Security Alliance and NIST. Ultimately, it would be valuable for service providers to come up with a recipe-like implementation for APIs.
After applications have been developed and application security testing has been performed according to the requirements of the platform, how do you know you are ready to deploy? Each environment, whether IaaS, PaaS or SaaS, requires its own checklist to ensure the applications are ready for prime time. For example, for an IaaS application, the organization must have taken steps such as securing inter-host communication with channel-level encryption and message-based security, and filtering and masking sensitive information sent to debugging and logging functions. For a PaaS application, threat modeling must have incorporated the platform APIs’ multi-tenancy risks. For SaaS, it’s critical to have reviewed the provider’s documentation on how data is isolated from other tenants’ data. You must also verify the SaaS provider’s certifications and the security processes in its SDLC.
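Filtering and masking sensitive information before it reaches remote debugging and logging functions can be done with a small logging filter. A sketch only; the masking patterns shown are illustrative, not an exhaustive or compliance-grade list:

```python
import logging
import re

CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")   # rough card-number shape
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")     # US SSN shape

class MaskingFilter(logging.Filter):
    """Mask card numbers and SSNs before log records leave the application."""
    def filter(self, record):
        msg = record.getMessage()
        msg = CARD_RE.sub("[CARD-MASKED]", msg)
        msg = SSN_RE.sub("[SSN-MASKED]", msg)
        record.msg, record.args = msg, None
        return True

logger = logging.getLogger("app")
logger.addFilter(MaskingFilter())
logger.warning("payment failed for card 4111 1111 1111 1111, ssn 123-45-6789")
# emitted as: payment failed for card [CARD-MASKED], ssn [SSN-MASKED]
```

Since the cloud provider can read your log storage, masking at the source, rather than at the aggregator, keeps sensitive values from ever leaving your trust boundary.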
One last myth: being prepared for a safe flight doesn’t mean the flight will be safe. Even with the best preparation and safety measures in place, there is no debating the nascent nature of this deployment environment, and much more research remains to be done. One effective approach is to use threat modeling to help developers better understand the special risks of applications in the cloud. For example, using this approach they can identify software vulnerabilities that can be exploited by a “pause and resume” attack, in which a virtual machine becomes temporarily frozen. A seemingly innocent halt to end-user productivity can actually mean a hacker has been able to enter a system to cause damage by accessing sensitive information or planting malicious code that can be released at a future time.
As a security community, with security vendors, cloud service providers, research organizations and end users who all have a vested interest in securely deploying applications in the cloud, we have the power to establish guidelines and best practices aimed at building protection into the development process to prevent deployment risks. Fasten your seatbelts, it’s going to be a fun ride.
As co-founder and CTO of Veracode (www.veracode.com), Chris Wysopal is responsible for the security analysis capabilities of Veracode technology. He’s a recognized expert and well-known speaker in the information security field. His groundbreaking work while at the company @stake was instrumental in developing industry guidelines for responsibly disclosing software security vulnerabilities. Chris is co-author of the award winning password auditing and recovery application @stake LC (L0phtCrack), currently used by more than 6,000 government, military and corporate organizations worldwide.
March 2, 2011 | Leave a Comment
Cloud computing has become an integral part of IT decision making today across industries and geographies, and the market is growing at a rapid pace. By 2014, IDC expects public cloud spending to rise to $29.5 billion, growing at 21.6 percent per year. At the same time, Forrester predicts the cloud security market will grow to $1.5 billion by 2015. This is good news, yet many CIOs are sitting on the fence, not jumping on the opportunity cloud computing presents, because they worry about the security of their data and applications. The figure below shows the results of a TechTarget survey in which top CIOs were asked about their top-of-mind concerns with using cloud services.
Loss of control, compliance implications, and confidentiality and auditing topped the results. Under these three themes, the issues they listed were:
- Finding it hard to trust cloud providers’ security models
- Managing the proliferation of user accounts across cloud application providers
- An extended enterprise boundary that complicates compliance
- Shared infrastructure: if the cloud gets hacked, so do you
- Audit log silos on proprietary cloud platforms
This blog post describes a potential solution to address these issues and more.
First, let’s look at the various layers required to secure cloud applications and data.
You need to protect applications and data for assurance and compliance, access control, and defense against malicious attacks at the perimeter. Yet the weakest link remains the client, as malware and phishing attacks can send requests as if they were coming from a human user. To achieve end-to-end security, you need to look holistically at how to provide “trusted client to cloud access”. You can watch a webinar on this topic I recently did with security expert Gunnar Peterson.
One solution to this problem is to have a trusted broker that provides the glue between client security and cloud security. It should be able to determine if cloud applications are being accessed from trusted and attested client devices or not, and block access from all non-trusted clients. One way to get client attestation is through Intel® Identity Protection Technology (IPT) which embeds 2nd factor authentication in the processor itself.
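The broker's gating decision reduces to an allow/deny check against a registry of attested devices. A deliberately simplified sketch; the registry contents and request shape are hypothetical, and real attestation involves cryptographic proof from the client hardware rather than a bare identifier:

```python
# Hypothetical registry of devices that have passed client attestation
# (e.g. hardware-backed second-factor enrollment).
ATTESTED_DEVICES = {"dev-1234", "dev-5678"}

def broker_allow(request):
    """Trusted-broker gate: only attested client devices reach cloud apps."""
    device_id = request.get("device_id")
    if device_id not in ATTESTED_DEVICES:
        return False, "blocked: unattested or unknown client"
    return True, "forwarded to cloud application"

print(broker_allow({"device_id": "dev-1234"}))  # allowed
print(broker_allow({"device_id": "unknown"}))   # blocked
```

The point is architectural: the check happens in one broker, in front of every cloud application, instead of being reimplemented per application.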
While a trusted broker enforces the above check, it should also be able to provide supplemental security on top of what cloud applications provide by offering:
- Federated Single Sign-On (SSO) using industry standards such as SAML, OAUTH and OpenID
- Two-factor strong authentication with convenient soft OTP token support
- Elevated authentication (a term for step-up authentication on a per-request basis, coined by Mark Diodati of Burton Group in his latest report on Authentication Decision Point Reference Architecture)
- Automated account provisioning and deprovisioning with automated identity attribute synchronization to ensure that all identity attributes across enterprise and cloud applications never go out-of-sync
- Centralized audit repository with common audit record across cloud applications
- Orphan account reporting to catch unauthorized account creation by administrators in cloud applications
- And, a single dashboard to get 360 degree visibility on how cloud applications are being accessed by users (aka user activity monitoring)
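The soft OTP tokens mentioned in the list above are typically time-based one-time passwords. A minimal TOTP sketch in the style of RFC 6238, using only the standard library; this is an illustration of the algorithm, not Intel's implementation:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, for_time=None, step=30, digits=6):
    """Time-based one-time password (RFC 6238 style, HMAC-SHA1)."""
    counter = int((for_time if for_time is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)              # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

secret = b"12345678901234567890"  # provisioned once per user
print(totp(secret, for_time=59))  # 287082 (RFC 6238 SHA-1 test vector)
```

Because the code changes every 30 seconds and derives from a per-user secret, a phished password alone is no longer sufficient to reach a cloud application behind the broker.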
Such “Trusted Broker” software can ensure that enterprises adopt cloud applications with confidence, providing the tools to achieve “Control, Visibility, and Compliance” when accessing them. View more on Intel’s solutions in this space.
The Cloud Security Alliance (CSA) is working feverishly to provide awareness and guidance, with reference implementations, to address some of the security concerns listed earlier in this blog post. At the CSA Summit 2011, held at the RSA Conference 2011, I presented a roadmap for the Trusted Cloud Initiative (TCI), one of the sub-groups of the CSA. In its reference architecture, TCI lists the following use cases for trusted access to the cloud.
TCI also published a whitepaper covering identity and access control for cloud applications.
While cloud application providers continue to enhance their security posture, it’s in the best interest of enterprises to supplement it with additional security controls using technologies such as “Trusted Broker” that enable end-to-end secure client to cloud access and provide 360 degree visibility and compliance into how various cloud applications are being accessed by enterprise users. One such implementation of a “Trusted Broker” is provided by Intel Expressway Cloud Access 360 product. Visit http://www.dynamicperimeter.com to learn more.
Vikas Jain, Director of Product Management for Application Security and Identity Products at Intel Corporation, has over 16 years of experience in the software and services market, with particular expertise in cloud security, identity and access management, and application architecture. Prior to joining Intel, Vikas held leadership roles in product management and software development at a wide range of technology companies including Oracle, Oblix, Wipro and Infosys. You can follow him on Twitter @VikasJainTweet.