Cloud Security: An Oxymoron?

November 29, 2011

Written by Torsten George, Vice President of Worldwide Marketing at Agiliance

 

Cloud computing represents today’s big innovation trend in the information technology (IT) space. Because it allows organizations to deploy quickly, move swiftly, and share resources, cloud computing is rapidly replacing conventional in-house facilities at organizations of all sizes.

 

However, the 2012 Global State of Information Security Survey, which was conducted by PwC US in conjunction with CIO and CSO magazines among more than 9,600 security executives from 138 countries, reveals that uncertainty about organizations’ ability to enforce cloud service providers’ security policies is still a major inhibitor to cloud computing. More than 30 percent of respondents identified their company’s uncertain ability to enforce their cloud providers’ security policies as the greatest security threat from cloud computing. With this in mind, is cloud security even achievable, or just an oxymoron?

 

In their eagerness to adopt cloud platforms and applications, organizations are neglecting to recognize and address the compliance and security risks that come with implementation. Often the ease of getting a business into the cloud – a credit card and a few keystrokes is all that is required – combined with service level agreements provides a false sense of security.

 

However, shortcomings in a cloud provider’s security strategy can trickle down to the organizations that leverage its services. Damages can range from power outages that impair business performance to data loss, unauthorized disclosure, data destruction, copyright infringement, and reputational harm.

 

Cloud Computing Vs. Cloud Security

 

A naturally risk-averse group, IT professionals are facing a strong executive push to harness the obvious advantages of the cloud (greater mobility, flexibility, and savings), while continuing to protect their organization against the new threats that appear as a result.

 

For organizations planning to transition their IT environment to the cloud, it is imperative to be cognizant of often overlooked issues such as loss of control and lack of transparency. Cloud providers may have service level agreements in place, but security provisions, the physical location of data, and other vital details may not be well-defined. This leaves organizations in a bind, as they must also meet contractual agreements and regulatory requirements for securing data and comply with countless breach notification and data protection laws.

 

Whether organizations plan to use public clouds, which promise an even higher return on investment, or private clouds, better security and compliance are needed. To address this challenge, organizations should institute policies and controls that match their pre-cloud requirements. In the end, why would you apply less stringent requirements to a third-party IT environment than to your own – especially if it potentially impacts your performance and valuation?

 

Recent cyber attacks and the associated data breaches at Google and Epsilon (a leading marketing services firm) are prime examples of why organizations need to think about an advanced risk and compliance plan that includes their third-party managed cloud environment.

 

Enabling Cloud Security

 

With most organizations beyond debating whether or not to embrace the cloud model, IT professionals should now re-focus their resources on managing the move to the cloud so that the risks are mitigated appropriately.

 

When transitioning your IT infrastructure to a cloud environment, you have to determine how far you can trust your cloud provider with your sensitive data. Practically speaking, you need the ability to assess security standards, trust security implementations, and prove infrastructure compliance to auditors.

 

As part of a Cloud Readiness Assessment, organizations should evaluate potential cloud service models and providers. Organizations should insist that cloud service providers grant visibility into the security processes and controls that ensure confidentiality, integrity, and availability of data. It is important not to rely on certifications (e.g., SAS 70) alone, but to document security practices (e.g., assessment of threat and vulnerability management capabilities, continuous monitoring, business continuity planning), compliance posture, and the ability to generate dynamic, detailed compliance reports that can be used by the provider, auditors, and an organization’s internal resources.

 

Considering that many organizations deal with a heterogeneous cloud ecosystem – comprised of infrastructure service providers, cloud software providers (e.g., cloud management, data, compute, file storage, and virtualization), and platform services (e.g., business intelligence, integration, development and testing, as well as database) – it is often challenging to gather this information manually. Thus, automating the vendor risk assessment might be a viable option.
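
To make the automation idea concrete, below is a deliberately simple sketch of how a vendor risk assessment roll-up might be scored in Python. The control names, weights, and answers are illustrative assumptions, not a published framework; a real program would map questionnaire items to something like the CSA Cloud Controls Matrix.

```python
# Minimal sketch: scoring cloud vendors against a security questionnaire.
# Control names and weights below are illustrative assumptions.

CONTROLS = {
    "documented_security_practices": 3,
    "continuous_monitoring": 3,
    "business_continuity_plan": 2,
    "compliance_reporting": 2,
    "third_party_certification": 1,  # e.g., SAS 70 attestation
}

def score_vendor(answers: dict[str, bool]) -> float:
    """Return a 0-100 score from yes/no questionnaire answers."""
    total = sum(CONTROLS.values())
    earned = sum(w for c, w in CONTROLS.items() if answers.get(c))
    return 100.0 * earned / total

# Hypothetical answers collected from one provider's assessment.
vendors = {
    "iaas-provider": {
        "documented_security_practices": True,
        "continuous_monitoring": True,
        "business_continuity_plan": False,
        "compliance_reporting": True,
        "third_party_certification": True,
    },
}

for name, answers in vendors.items():
    print(f"{name}: {score_vendor(answers):.0f}/100")
```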

 

Following the guidelines developed by the Cloud Security Alliance, a non-profit organization formed to promote the use of best practices for providing security assurance within cloud computing, organizations should not stop with the initial Cloud Risk Assessment, but continuously monitor the cloud operations to evaluate the associated risks.

 

A portion of the cost savings obtained by moving to the cloud should be invested into increasing the scrutiny of the security qualifications of an organization’s cloud service provider, particularly as it relates to security controls, and ongoing detailed assessments and audits to ensure continuous compliance.

 

If at all possible and accepted by the cloud service provider, organizations should consider leveraging monitoring services or security risk management software that provides:

 

  • Continuous compliance monitoring.
  • Segregation and virtualization provisioning management.
  • Automation of CIS benchmarks and secure configuration management integrations with security tools such as VMware vShield, McAfee ePO, and NetIQ SCM.
  • Threat management with automated data feeds from zero-day vendors such as VeriSign and the National Vulnerability Database (NVD), as well as virtualized vulnerability integrations with companies such as eEye Retina and Tenable Nessus.

 

Automated technology that enables a risk-based approach and continuous compliance monitoring is well suited to organizations seeking to protect and manage their data in the cloud.
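
As a taste of what an automated threat feed looks like in practice, here is a hedged sketch that polls the National Vulnerability Database mentioned in the list above and flags CVEs touching a watchlist of virtualization terms. The NVD 2.0 REST endpoint, its parameters, and the response shape are assumptions to verify against current NVD documentation; `requests` is a third-party package.

```python
# Sketch: pull recent CVEs from the NVD and flag ones that mention
# products in your cloud stack. Endpoint and JSON layout assumed.
import requests

NVD_URL = "https://services.nvd.nist.gov/rest/json/cves/2.0"
WATCHLIST = ("vmware", "hypervisor", "xen")  # illustrative product terms

resp = requests.get(NVD_URL, params={"resultsPerPage": 50}, timeout=30)
resp.raise_for_status()

for item in resp.json().get("vulnerabilities", []):
    cve = item["cve"]
    text = " ".join(d["value"] for d in cve.get("descriptions", [])).lower()
    if any(term in text for term in WATCHLIST):
        print(cve["id"], "-", text[:80])
```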

 

Many cloud service providers might be opposed to such measures, but the increasing number of cyber security attacks and associated data breaches gives providers a strong incentive to offer these capabilities to their clients – not only to establish trust, but also as a competitive advantage.

Cloud Security Considerations

November 14, 2011

Can a cloud be as secure as a traditional network?  In a word, yes!  I agree that some may find this statement surprising.  Depending on the network, that may be a low bar, but good security principles and approaches are just as applicable to cloud environments as they are to traditional network environments.  However, the key is to know how to extend a multi-layered defense into the cloud/virtualization layer.

 

One of the cloud security benefits frequently mentioned is standardization and hardening of VM images.  This can help reduce complexity and ensure that all systems start from a good security posture.  It also enables a rapid response to fix identified issues.  Some people claim that the complexity, or diversity, of systems in a traditional network environment is a security benefit because a single vulnerability cannot compromise all systems. In reality, it is usually more difficult to manage disparate systems because of the tools and expert resources required to maintain them.

 

Hardening is not only for VMs.  It has to be extended throughout the cloud environment to include the hypervisor, management interfaces, and all other virtual components, such as network devices.  This requires some time and expertise in understanding how to control functionality without losing productivity.  If you ask your service provider or internal team about hardening the virtualization layer and you get blank stares back, you may have a problem.  Also, you should not accept the default statement that “the hypervisor is essentially a hardened O/S” as a complete answer.  Securing the virtualization layer is one of the new and key areas to providing protection for cloud environments.

 

Strong authentication and authorization methods are critical to address, since this is an often neglected area in traditional networks.  It is important to do it right.  It is worth noting that the Verizon 2011 Data Breach Investigative Report cites “exploitation of default or guessable credentials” and “use of stolen login credentials” as some of the most used hacking attacks.  Whether a private or public cloud environment, there needs to be a solid layer of protection from unauthorized access.  Two-factor authentication is a must for remote and administrative access; it is a best practice to require two-factor authentication throughout the virtualized environment, wherever it is practicable.
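
Two-factor authentication usually means a time-based one-time password on top of the regular credential. The sketch below is a minimal, standard-library-only implementation of RFC 6238 TOTP, the algorithm behind most soft tokens; the 30-second step and 6 digits are the common defaults, and the Base32 secret is a throwaway demo value.

```python
# Sketch: RFC 6238 time-based one-time passwords using only the stdlib.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = struct.pack(">Q", int(time.time()) // step)  # moving factor
    mac = hmac.new(key, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                # dynamic truncation
    code = (struct.unpack(">I", mac[offset:offset + 4])[0]
            & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # demo secret; code rotates every 30 s
```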

 

Encryption should be utilized for both data in transit and data at rest.  In addition to providing confidentiality and integrity, encryption plays a critical role in protecting data in an environment where it may not be possible to destroy it by normal methods.  Once encrypted data is no longer needed, the encryption key for that data set can be destroyed. However, this requires that the organization, not the service provider, retain and manage the encryption keys.
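
The key-destruction idea above is sometimes called crypto-shredding. Here is a minimal sketch, assuming the third-party `cryptography` package: the organization holds the key, the ciphertext lives in the cloud, and destroying the key renders every replicated copy unreadable.

```python
# Sketch of crypto-shredding with the `cryptography` package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # org-held key, never given to provider
ciphertext = Fernet(key).encrypt(b"sensitive record 4711")

# The ciphertext can live in the cloud; only the key holder can read it.
print(Fernet(key).decrypt(ciphertext))

# When the data set is retired, destroying the key effectively destroys
# every copy of the ciphertext, wherever the provider replicated it.
key = None
```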

 

Encryption is also being used in innovative ways to create an isolated environment within a cloud.  This can be used to extend security and compliance controls from an organization’s traditional network into a cloud.  This can help overcome barriers to cloud security by enabling enterprises to run selected applications and maintain data in the cloud with the same protection and control available internally.

 

Summary

Clouds, like traditional network environments, require careful security planning, design, and operations.  The various types of clouds and delivery models have varying degrees of security and flexibility, some with the ability to layer in additional security controls.  This is why it is important to have a firm understanding of security and compliance requirements prior to moving to the cloud.

 

It is fortunate that good security practices are applicable to the cloud.  However, the virtualization layer is a new area – one that requires specialized attention, understanding, and proficiency when it comes to implementing security controls.  Hardening, access control, and encryption are three primary areas of focus in building a multi-layered defense in cloud environments.  Clouds can meet security and compliance requirements, but only if essential security practices are applied throughout them.

 

About the Author

Ken Biery is a principal security consultant with Terremark, Verizon’s IT services subsidiary, focused on providing governance, risk, and compliance counsel to enterprises moving to the cloud. With extensive knowledge in the area of cloud computing, he enables companies around the globe to securely migrate to the cloud and create more efficient IT operations.

Leveraging Managed Cloud Services to Meet Cloud Compliance Challenges

November 4, 2011

By Allen Allison

 

Regardless of your industry, customer base, or product, it is highly likely that you face regulatory compliance requirements.  If you handle Protected Health Information (PHI), the Health Insurance Portability and Accountability Act (HIPAA) – along with the HITECH enhancements – is a primary concern for your organization.  If you work with government agencies, you may need to be compliant with the Federal Information Security Management Act (FISMA) or National Institute of Standards and Technology (NIST) requirements.  In addition, most states have privacy laws protecting residents’ Personally Identifiable Information.

It is a common misunderstanding that these regulatory compliance requirements preclude many organizations from leveraging outsourced, managed cloud services.  Depending on the cloud services provider you choose, you may not only be able to meet your existing compliance obligations, but the provider is also likely to have controls and processes that improve your compliance program.

When HIPAA was enhanced by the Health Information Technology for Economic and Clinical Health (HITECH) Act, companies with PHI began to panic.  Not only were they expected to protect patient health information, but they had the added requirement of ensuring that third-party providers enabled the same stringent controls on the systems they support.  Furthermore, these organizations had the added responsibility of providing breach notification in the event of a loss of confidentiality.

If nothing else, HITECH gives us two things.  First, the heightened awareness of the sensitivity of each individual’s health information has driven stronger security programs and gives the public assurance that privacy is being protected.  Second, because no organization wants to be in the headlines for a security breach, HITECH spurs organizations to improve their information security, enhance their response services, and enable a platform to notify affected individuals if their information has been compromised.  I can, with all honesty, say that I do feel a bit more secure about my Protected Health Information.

I use HIPAA and HITECH as an example, not because it is the model information security regulation (it is not), but because it is a topic that everyone can relate to.  Similar security requirements stretch across most industries.  What HITECH has done for cloud service providers is enable them to build a common control platform, implement technologies that may be too expensive for some organizations to implement themselves, and leverage a world class security and compliance platform to ensure that the PHI, which is vital to the ongoing management of health care, remains secure, protected, and confidential.

When searching for a cloud provider, it is important to understand which of the controls the provider has built into the underlying platform are applicable to your compliance requirements.  I recommend asking these three questions:

  1. How many customers in my industry do you host on your cloud platform?
  2. May I see your most recent SSAE 16 SOC report or other applicable audit?
  3. What is the development lifecycle process your team undergoes to build cloud services and the underlying platform?

With a complete understanding of how ingrained security is in a cloud service provider’s technology and processes, you can begin to understand how it will deal with your sensitive data.

I would like to point out one pitfall.  Not all compliance programs apply to a cloud service provider’s customers.  For example, the SSAE 16 program is of great benefit to customers of cloud service providers, and customers to whom SSAE 16 extends can rely on the SOC report as part of their own internal controls and compliance.  On the other hand, a provider’s compliance with, for example, Safe Harbor does not extend to the customer; the customer must pursue Safe Harbor separately.

Remember: working with a reputable cloud service provider can be an excellent way to leverage expertise and processes you may not otherwise have in-house, and to mitigate some risk by assigning responsibility to a third party you can hold accountable for protecting your data.  The cloud is rapidly becoming the hosting platform of choice for highly regulated industries because more organizations are leveraging the expertise of these information-centric service providers.

 

Allen Allison, Chief Security Officer at NaviSite (www.navisite.com)

During his 20+ year career in information security, Allen Allison has served in management and technical roles, including development of NaviSite’s industry-leading cloud computing platform; chief engineer and developer for a market-leading managed security operations center; and lead auditor and assessor for information security programs in the healthcare, government, e-commerce, and financial industries. With experience in systems programming, network infrastructure design/deployment, and information security, Allison has earned the highest industry certifications, including CCIE, CCSP, CISSP, MCSE, CCSE, and INFOSEC Professional. A graduate of the University of California, Irvine, Allison has lectured at universities and spoken at industry shows such as Interop, RSA Conference, Cloud Computing Expo, MIT Sloan CIO Symposium, and Citrix Synergy.

 

Cloud Security: Confident, Fearful, or Surprised

November 4, 2011

By Ken Biery

 

This two-part guest blog series explores the topic of cloud security.  Part one of the series focuses on the questions enterprise IT decision makers should ask when considering moving business applications to a cloud-based computing environment.

 

 

There is no shortage of information about cloud security. There are those who say the cloud is inherently more secure because of its ability to create and maintain a more hardened, centralized environment.  Others claim that, because of multi-tenancy, virtual systems and data will never be even modestly secure.

 

The big surprise about cloud security may be that there are not really any big surprises.  The good security practices that work in a traditional network also work for cloud-based IT.  The key is understanding how to apply security practices to a cloud environment and to develop a security strategy that uses known and sound security foundations to address various cloud environments.

 

A more secure cloud is the product of careful planning, design, and operations.  This begins with understanding the type of cloud (public, private, hybrid) that is being used and then its model, whether it be software-as-a-service (SaaS), platform-as-a-service (PaaS), or infrastructure-as-a-service (IaaS). These two factors will determine the type and amount of security controls needed and who is responsible for them.

 

Public and Private Clouds

Public clouds typically have a limited number of security measures, providing a more open and flexible computing environment.  These clouds usually cost less since their security features are basic.  While this may be perfectly acceptable in some circumstances, such as non-critical or non-sensitive environments, it will not usually meet the requirements of most enterprise users.

 

Public clouds also generate the most concern about using a shared virtualized environment.  These concerns center mainly on how to properly segment systems and isolate processing resources.  Segmentation and isolation can be challenging to accomplish and to measure, especially for an auditor or assessor looking at these primary security control areas.  Another factor is that many public cloud providers do not, or cannot, sufficiently support the types of controls required by enterprises to meet security and compliance requirements.

 

When considering a public cloud, it is important to ask the provider about its security measures, such as segmentation, firewalls/intrusion protection systems, monitoring, logging, access controls, and encryption.  The provider’s responses and transparency about the details of its environment’s security measures speak volumes about what to expect.  Also, you may want to do some searches on the provider, as it may have a reputation for harboring “bad neighborhoods” that host botnets or malware sites.

 

Private clouds can be internally hosted or located at a service provider’s facility.  For internally hosted clouds, just as in traditional environments, the security design and controls can be highly customized and controlled by the organization.  If hosted at a service provider, the number of controls can vary considerably depending on the model selected.  This is not to say that a service provider cannot provide a good set of default and optional security controls.  Obviously, this is why having a good understanding of the provider’s cloud design and its features, as well as your own requirements, is crucial.

 

Multi-tenancy, Segmentation, and Isolation

Multi-tenancy is one of the major issues when it comes to security and compliance in the cloud.  In some cases, multi-tenancy may require that an environment’s controls be set to the lowest level to support the broadest set of requirements for the largest number of potential users.  One of the main concerns around multi-tenancy is that, due to the use of a shared pool of computing resources, one entity’s virtual machine (VM) could compromise another entity’s VM.  A lack of proper segmentation between the two entities’ environments could make this possible.

 

This lack of separation can also create compliance challenges for multi-tenant environments.  Assessors and auditors look for sufficient controls to help prevent information leakage between virtual environment components.  Improperly configured hypervisors, management interfaces, and VMs have the potential to become a leading cause of non-compliance and risk exposure.  In a traditional network, if a system is misconfigured, it can be compromised.  If a virtual environment is misconfigured, it can compromise all of the systems within it.

 

It is important to note that there has not been any major publicly disclosed compromise of a hypervisor.  However, it is only a matter of time.  The virtualization layer is too tantalizing a target for hackers not to pursue aggressively.

 

One of the cleanest ways to show separation within a virtualized environment is to have VMs with compliance or higher security requirements run on dedicated physical hardware.  Yes, this runs contrary to one of the benefits of cloud computing – until the effort and cost of compliance and robust security are considered.  This approach can be easier to establish and maintain, since only a smaller number of systems may need advanced protection.

 

Isolation needs to be enforced at the operating system (O/S) layer, and no operating system should be shared between VMs. Specifically, random-access memory (RAM), processor, and storage area network (SAN) resources should be logically separated, with no visibility into other client instances. From a network perspective, each entity is separated from the next by use of a private virtual local area network (VLAN).

 

The second part of this blog series will explore the cloud security best practices that can be employed to create a multi-layered defense for cloud-based computing environments.

 

About the Author

Ken Biery is a principal security consultant with Terremark, Verizon’s IT services subsidiary, focused on providing governance, risk, and compliance counsel to enterprises moving to the cloud. With extensive knowledge in the area of cloud computing, he enables companies around the globe to securely migrate to the cloud and create more efficient IT operations.

Test Accounts: Another Compliance Risk

October 7, 2011

By: Merritt Maximi

A major benefit associated with deploying identity management and/or identity governance in an organization is that these solutions can detect and remove orphan accounts.  Orphan accounts are active accounts belonging to a user who is no longer involved with the organization.  From a compliance standpoint, orphan accounts are a major concern because they mean that ex-employees and former contractors or suppliers still have legitimate credentials and access to internal systems.  Identity management and identity governance solutions can help identify potential orphan accounts, which the IT and audit teams can review to determine whether these accounts should be deleted.  By actively monitoring and managing orphan accounts, organizations can reduce IT risk and manage their users and entitlements more effectively.

However, there is another type of account that can present many of the same problems as orphans, yet is often overlooked during the certification and governance process: test accounts, which reside in almost every application.  Test accounts serve a very valuable function, especially as organizations prepare to move a new application or version from test to production; the test account is how IT verifies functionality.  Because of application requirements, most test accounts have full administrative privileges, meaning the account has access to every capability of the given application.  The challenge is that test accounts serve a valuable purpose and cannot simply be removed.

The preferred best practice is to have test accounts only in the test environment, or the staging environment at most, but never in production.

However, as is often the case in today’s highly complex, heterogeneous, and distributed IT environments, test accounts often end up in production.  Even worse, they frequently lie undetected within a large group of unaligned accounts.  And generally speaking, the longer an application has been in production, the greater the probability that test accounts reside within it.

So what is the best approach for managing test accounts?

  1. First, if a test account is to reside in a production environment (and there may be legitimate business reasons for this), make sure the account is assigned the fewest privileges possible.  This allows for some basic testing of the production system without exposing the entire application.
  2. Leave the full test accounts for the test and staging environments.
  3. Adopt an organization-wide common syntax for test accounts.  Name them all “test” or adopt another consistent convention.  This will make step 4 even easier.
  4. Conduct periodic audits of your production environments to identify potential test accounts (see the sketch after this list).  This can be laborious manual work, but your auditors (and others) will thank you in the long run.  The simplest way is to start by looking at the group of unaligned accounts (those not tied to any individual), as well as for syntax like “test” or “12345”, which developers often use to name test accounts.
  5. When conducting your periodic review of test accounts, you should also review test account activity.  This helps determine if anyone was using the test accounts for actual changes that could affect the production environment.  The impact could be indirect (e.g., policy changes made in a staging environment may impact production if, by error, those changed policies are automatically pushed into production as part of a bigger configuration rollout), and analyzing the activity can help prevent these issues.
  6. Consider utilizing privileged user password management (PUM) functionality to protect all your test accounts, especially those in production.  PUM solutions can help mitigate the risk of test accounts by securing them in a secure, encrypted vault, and can ensure appropriate access to the passwords based on a documented policy.  Doing this makes the users of the test accounts accountable and means that test account use is no longer anonymous, as all user actions are securely recorded.
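
Here is the sketch referenced in step 4: a small scan that flags accounts whose names match common test patterns or that have no owner on record. The record layout and the patterns are illustrative assumptions; in practice the account list would come from an application’s user export.

```python
# Sketch: flag likely test accounts and unaligned accounts for review.
import re

TEST_PATTERNS = re.compile(r"(test|demo|12345)", re.IGNORECASE)

accounts = [  # e.g., parsed from an application's user export
    {"name": "jsmith",     "owner": "Jane Smith"},
    {"name": "test_admin", "owner": None},
    {"name": "svc_batch",  "owner": None},
]

for acct in accounts:
    suspicious_name = bool(TEST_PATTERNS.search(acct["name"]))
    unaligned = acct["owner"] is None  # not tied to any individual
    if suspicious_name or unaligned:
        print("review:", acct["name"],
              "(unaligned)" if unaligned else "(test-like name)")
```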

In summary, test accounts are not the enemy, but they do represent a potential risk that every IT organization should manage.

When It Comes To Cloud Security, Don’t Forget SSL

September 30, 2011

By Michael Lin, Symantec

 

Cloud computing appears to be here to stay, bringing with it new challenges and security risks on one hand, while on the other boasting efficiencies, cost savings, and competitive advantage. Given the new security risks of the cloud and the mounting skill and cunning of today’s malicious players on the Web, Secure Sockets Layer (SSL) certificates are well placed to stand up to those risks. Using encryption and authentication, SSL certificates have long been established as a primary security standard of computing and the Internet, and a no-brainer for securely transferring information between parties online.


What is SSL?

SSL Certificates encrypt private communications over the public Internet. Using public key infrastructure, SSL consists of a public key (which encrypts information) and a private key (which deciphers information), with encryption mathematically encoding data so that only the key owners can read it. Each certificate provides information about the certificate owner and issuer, as well as the certificate’s validity period.

Certificate Authorities (CAs) issue each certificate – a credential for the online world – to only one specific domain or server.  When a browser connects, the server sends its identification information along with a copy of its SSL Certificate.  The browser verifies the certificate and sends a message to the server; the server returns a digitally signed acknowledgement to start an SSL-encrypted session, letting encrypted data flow between the browser and the server.
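
Since the handshake above can be hard to picture, here is a short, standard-library sketch of the verification step from the client side: it connects over TLS, lets Python validate the certificate chain against the system trust store, and prints the certificate’s issuer, subject, and expiry. The host is a placeholder.

```python
# Sketch: connect over TLS, verify the chain, inspect the certificate.
import socket, ssl

host = "www.example.com"            # illustrative endpoint
ctx = ssl.create_default_context()  # verifies hostname and CA chain

with socket.create_connection((host, 443), timeout=10) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        cert = tls.getpeercert()
        print("issuer: ", dict(x[0] for x in cert["issuer"]))
        print("subject:", dict(x[0] for x in cert["subject"]))
        print("expires:", cert["notAfter"])
```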


How does it secure data in the cloud?

If SSL seems a little old-school in comparison to the whiz-bang novelty of cloud computing, consider this:  since SSL offers encryption that prevents prying eyes from reading data traversing the cloud, as well as authentication to verify the identity of any server or endpoint receiving that data, it’s well-suited to address a host of cloud security challenges.

Where does my data reside, and who can see it? Moving to the cloud means giving up control of private and confidential data, bringing data segregation risks. Traditional on-site storage lets businesses control where data is located and exactly who can access it, but putting information in the cloud means putting location and access in the cloud provider’s hands.

This is where SSL swoops in to quell data segregation worries. By requiring cloud providers to use SSL encryption, data can securely move between servers or between servers and browsers. This prevents unauthorized interceptors from reading that data. And, don’t forget that SSL device authentication identifies and vets the identity of each device involved in the transaction, before one bit of data moves, keeping rogue devices from accessing sensitive data.

How can I maintain regulatory compliance in the cloud? In addition to surrendering control of the location of data, organizations also need to address how regulatory compliance is maintained when data lives in the cloud.  SSL encryption thwarts accidental disclosure of protected or private data according to regulatory requirements. It also provides the convenience of automated due diligence.

Will my data be at risk in transit? Putting data in the cloud usually means not knowing where it physically resides, as discussed earlier. The good news is that cloud providers using SSL encryption protect data wherever it goes. This approach not only safeguards data where it lives, but also helps assure customers that data is secure while in transit.

Another point to note here is that cloud providers using a legitimate third-party SSL CA will not issue SSL certificates to servers in interdicted countries, nor store data on servers located in those countries. SSL therefore further ensures that organizations are working with trusted partners.


Will any SSL do?

Recent breaches and hacks reinforce the fact that not all SSL is created equal, and neither are all CAs. Security is a serious matter and needs to be addressed as organizations push data to the cloud. Well-established best practices help those moving to the cloud make smart choices and protect themselves. Here are some things to keep in mind while weighing cloud providers:

  • Be certain that the cloud providers you work with use SSL from established and reliable independent CAs. Even among trusted CAs, not all SSL is the same, so choose cloud providers whose SSL certificates come from certificate authorities that:
  • Support at least AES 128-bit encryption, preferably stronger AES 256-bit encryption, based on the new 2048-bit global root
  • Require a rigorous, annual audit of the authentication process
  • Maintain military-grade data centers and disaster recovery sites optimized for data protection and availability

Who will you trust? That’s the question with cloud computing, and with SSL. Anybody can generate and issue certificates with free software. Partnering with a trusted CA ensures that it has verified the identity information on the certificate. Therefore, organizations seeking an SSL Certificate need to partner with a trusted CA.

SSL might not be the silver bullet for cloud security, but it is a valuable tool with a strong track record for encrypting and authenticating data online. Amid new and complex cloud security solutions, with SSL, one of the most perfectly suited solutions has been here all along.

 

Securing Your File Transfer in the Cloud

September 30, 2011

By Stuart Lisk, Sr. Product Manager, Hubspan Inc. 

File transfer has been around since the beginning of time. Ok, well maybe that is an exaggeration, but the point is that file transfer was one of the earliest uses of “network” computing, dating back to the early 1970s when IBM introduced the floppy disk. While we have been sharing files with each other for ages, the security of the data shared is often questionable.

Despite the File Transfer Protocol (FTP) being published in 1971, it took until the mid-80s for systems to catch up to the original vision of FTP, as LANs were beginning to find their way into the business environment. During this period, transferring files internally became easier, and the ability to move files externally by leveraging the client-server topology eliminated the “here’s the disk” approach. If you think about it, these were pretty confined environments, with the client and server having a true relationship. Securing the file in this scenario had more to do with making sure that no one could access the data, as opposed to worrying about protecting the transport itself. Centralized control and access was the way of the world in those “good ole days.”

Fast forward to the proliferation of the internet and the World Wide Web, and the concern of securing files in transit became top of mind. IT managers were ultimately concerned that anyone within a company could log on via the web and access a self-service, cloud-based file transfer application without IT’s knowledge, adding to the security risk of file transfer.

Performing file transfer over the internet, via the “cloud”, has provided major benefits over the traditional methods.  In fact, we’ve seen that the ability to quickly deploy and provision file transfer activities actually drives more people to the cloud. However, along with the quick on-boarding of companies and individuals comes the challenge of ensuring secure connectivity, managed access, reporting, adaptability, and compliance.

Having a secure connection is not as easy as it should be. Many companies still utilize legacy file transfer protocols that don’t encrypt traffic, exposing the payload to anyone who can access the network. While the FTP protocol is dated, the majority of companies still use it.  According to a file transfer survey conducted in March 2011, over 70% of respondents currently utilize FTP as their primary transport protocol, and over 56% of those responding stated that they use a mailbox or other email applications to transfer files.

For enterprises to move beyond FTP and ensure sensitive files are transferred securely, they must implement protection policies that include adherence to security compliance mandates – and do so with the same ease-of-use that exists with simple email. IT managers must be concerned with who is authorizing and initiating file transfers, as well as controlling what gets shared. Any time files leave a company without going through proper “file transfer” policy checks, the business is put at risk. Typical email attachments and ad-hoc, web-based file transfer applications make it easy for someone to share files they shouldn’t.

In today’s computing environment, securing file transfer in the cloud requires the use of protocols that integrate security during transit and at rest.  Common secure protocols include Secure FTP (SFTP), FTPS (FTP over SSL), AS2, and HTTPS, to name a few. Companies should be actively looking at one of these protocols, as it will encrypt data while minimizing risk.
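
As one concrete option among the protocols listed above, the sketch below uses Python’s standard ftplib to upload a file over FTPS rather than plain FTP, so both credentials and payload travel encrypted. Host, credentials, and file name are placeholders.

```python
# Sketch: replacing plain FTP with FTPS using only the standard library.
from ftplib import FTP_TLS

ftps = FTP_TLS("ftp.example.com")  # placeholder host
ftps.login("user", "password")     # credentials now travel over TLS
ftps.prot_p()                      # encrypt the data channel, not just control
with open("report.csv", "rb") as f:
    ftps.storbinary("STOR report.csv", f)
ftps.quit()
```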

When leveraging the cloud for file transfer, IT managers need to be sure that the application and/or vendor they are working with utilizes a proven encryption method. Encrypting the file when it is most vulnerable – in transit – is best. Additionally, IT managers would be wise to work with cloud vendors that have security built into their platform.  Built-in encryption, certification, and validation of data are vital to ensure safe delivery of files. While you may not have influence over what your partner implements as its transport, you can take steps to mitigate issues. In fact, today there are a number of file transfer applications that validate content prior to and after the file transfer occurs.

Another area of focus for IT managers when assessing file transfer security is access controls. Simply put: who has access, and to what data?  Companies must have a plan to control access to each file and the data stored there. Again, encrypting the file is the best way to mitigate a breach of access. As mentioned earlier, FTP does not protect credentials from predators. More than 30% of respondents to the March survey indicated that access controls are among the most important criteria for cloud-based transfers.

Receipt notification is yet another way for senders to ensure their confidential files are delivered to, and opened by, the right people.  Additionally, using file transfer applications that expire a file after a set time is a great way to mitigate unauthorized access.

As mentioned earlier, adhering to industry and corporate compliance policies is critical. Corporate governance regulations include, but are not limited to:

  • Sarbanes-Oxley Section 404: Requires audit trails, authenticity, record retention
  • HIPAA requirements: Record retention, privacy protection, service trails
  • 21 CFR Part 11: Record retention, authenticity, confidentiality, audit trails
  • Department of Defense (DOD) 5015.2: Record authenticity, protection, secure shredding

While there are many criteria to consider when deciding how to implement and leverage file transfer activities within your organization, there are really a few simple areas to focus on:

  • Choose a secure protocol
  • Implement data protection in-transit and at-rest
  • Utilize effective encryption technology
  • Maximize access controls
  • Leverage auditing and reporting functionality
  • Adhere to corporate and industry compliance policies

While that may seem like an endless number of steps, it can be easier than it sounds, as long as you evaluate and execute file transfer activity in a way that protects and secures your sensitive data.

 

Stuart Lisk, Senior Product Manager, Hubspan

Stuart Lisk is a Senior Product Manager for Hubspan, working closely with customers, executives, engineering and marketing to establish and drive an aggressive product strategy and roadmap.  Stuart has over 20 years of experience in product management, spanning enterprise network, system, storage and application products, including ten years managing cloud computing (SaaS) products. He brings extensive knowledge and experience in product positioning, messaging, product strategy development, and product life cycle development process management.  Stuart holds a Certificate of Cloud Security Knowledge (CCSK) from the Cloud Security Alliance, and a Bachelor of Science in Business Administration from Bowling Green State University.

 

CSA Blog: The “Don’t Trust Model”

September 14, 2011

By Ed King

The elephant in the room when it comes to barriers to the growth and adoption of Cloud computing by enterprises is the lack of trust in Cloud service providers.  Enterprise IT has legitimate concerns over the security, integrity, and reliability of Cloud-based services.  The recent high-profile outages at Amazon and Microsoft Azure, as well as security issues at Dropbox and Sony, only add to the argument that Cloud computing poses substantial risks for enterprises.

 

Cloud service providers realize this lack of trust is preventing enterprise IT from completely embracing Cloud computing.  To ease this concern, Cloud service providers have traditionally taken one or both of the following approaches:

  1. Cloud service providers, especially the larger ones, have implemented substantial security and operational procedures to ensure customer data safety, system integrity, and service availability.  This typically includes documenting the platform’s security architecture and data center operating procedures, and adding service-side security options like encryption and strong authentication.  On top of this, they obtain SAS 70 certification to provide proof that “we did what we said we would do.”
  2. Cloud service providers also like to point out that their security and operational technology and controls are no worse – indeed, are probably better – than the security procedures most enterprises have implemented on their own.

 

Both of these approaches boil down to a simple maxim: “trust me, I know what I am doing!”  This “Trust Me” approach launched the Cloud computing industry, but to date most large enterprises have not put mission-critical applications and sensitive data into the public Cloud.  As enterprises look to leverage Cloud technologies for mission-critical applications, the talk has shifted towards private Cloud, because fundamentally the “Trust Me” approach has reached its limit.

 

Cloud service providers must come to the realization that enterprises will never entrust them with business-critical applications and data unless enterprises have more direct control over security, integrity, and availability.  No amount of documentation, third-party certification, or on-site auditing can mitigate risks enough to replace the loss of direct control.  The sooner the industry realizes that we need solutions that hand Cloud controls back to the customer, the sooner enterprises and the industry will reap the true commercial benefits of Cloud computing.  The approach becomes: “you don’t have to trust your Cloud providers, because you own the risk-mitigating controls.”  Security professionals normally talk about best-practice approaches to implementing trust models for IT architectures; I like to refer to this self-enablement of the customer as the “Don’t Trust Model.”  Let’s examine how we can put control back into the customer’s hands so we can shift to a “Don’t Trust Model.”

 

Manage Cloud Redundancy

Enterprises usually dual-source critical information and build redundancy into their mission-critical infrastructures.  Why should Cloud-based services be any different?  When Amazon Web Services (AWS) experienced an outage on April 21, 2011, a number of businesses that used AWS went completely offline, but Netflix did not.  Netflix survived the outage with some degradation in service because it designed redundancy into its Cloud-based infrastructure, spreading it across multiple vendors.  Features like stateless services and fallback are designed specifically to deal with scenarios such as the AWS outage (see the technical discussion at Netflix’s Tech Blog).  Technologies like Cloud Gateways, Cloud Services Brokers, and Cloud Switches can greatly simplify the task of setting up, managing, monitoring, and switching Cloud redundancy.

 

For example, a Cloud Gateway can provide continuous monitoring of Cloud service availability and quality.  When service quality dips below a certain threshold, the Cloud Gateway can send out alerts and automatically divert traffic to back-up providers.
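
A minimal sketch of that health-check-and-divert behavior follows. The provider URLs and the /health convention are hypothetical, and `requests` is a third-party package; a real gateway would add sustained polling, alerting, and hysteresis rather than switching on a single failed probe.

```python
# Sketch: poll each provider's health endpoint; route to the first healthy one.
import requests

PROVIDERS = [
    "https://primary-cloud.example.com/health",   # hypothetical endpoints
    "https://backup-cloud.example.com/health",
]

def pick_endpoint() -> str:
    for url in PROVIDERS:
        try:
            if requests.get(url, timeout=5).status_code == 200:
                return url
        except requests.RequestException:
            continue  # provider unreachable; try the next one
    raise RuntimeError("no healthy provider available")

print("routing traffic via", pick_endpoint())
```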

Put Security Controls On-premise

Salesforce.com (SFDC) is the poster child of a successful Cloud-based service.  However, as SFDC expanded beyond the small and medium business sector to go after large enterprises, it found a more reluctant customer segment due to concerns over data security in the Cloud.  On August 26, 2011, SFDC bought Navajo Systems, acquiring a technology that puts security control back in the hands of SFDC customers: a Cloud Data Gateway that encrypts and tokenizes data stored in the Cloud.

 

A Cloud Data Gateway secures the data before it leaves the enterprise premises.  The Gateway monitors data traffic to the Cloud and enforces policies to block, remove, mask, encrypt, or tokenize sensitive data.  The technology has different deployment options: using a combination of Gateways at the Cloud service provider and Gateways on-premise, different levels of data security can be achieved.  By giving customers control over data security before the data leaves the premises, customers need not rely on the Cloud provider alone to ensure the safekeeping of their data.
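
To illustrate the tokenize-before-upload idea, here is a toy sketch: sensitive values are replaced with random tokens, and the token vault stays on-premise, so the copy stored with the provider is meaningless on its own. The field names, token format, and in-memory vault are invented for the example; a product would use a durable, access-controlled vault.

```python
# Sketch: tokenize sensitive fields before sending a record to the cloud.
import secrets

vault: dict[str, str] = {}  # on-premise mapping: token -> original value

def tokenize(value: str) -> str:
    token = "tok_" + secrets.token_hex(8)  # random, non-reversible token
    vault[token] = value
    return token

record = {"name": "Alice Doe", "ssn": "078-05-1120"}
cloud_record = {field: tokenize(v) for field, v in record.items()}

print(cloud_record)                # safe to store with the provider
print(vault[cloud_record["ssn"]])  # only the enterprise can detokenize
```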

Integrate Cloud With Enterprise Security Platforms

Enterprises have spent millions of dollars on security infrastructure, including identity and access management, data security, and application security.  The deployments of these technologies are accompanied by supporting processes such as user on-boarding, data classification, and software development lifecycle management.  These processes take years to rollout and provide critical controls to mitigate security risks.  These tools and processes will evolve to incorporate new technologies like Cloud computing and mobile devices, but for Cloud computing to gain acceptance within the enterprise, Cloud services must be seamlessly integrated into existing security platforms and processes.

 

Single sign-on (SSO) is a great example.  After years of effort deploying an enterprise access management solution like CA SiteMinder, Oracle Access Manager, or IBM Tivoli Access Manager to enable SSO – and finally training all users on how to perform a password reset – do you think IT has the appetite to let each Cloud service become a security silo?  From a user standpoint, SSO should simply be SSO, not “SSO, excluding Cloud-based services.”  Most major Cloud service providers support standards such as SAML (Security Assertion Markup Language) for SSO and provide detailed instructions on how to integrate with on-premise access management systems.  Usually this involves some consulting work and perhaps a third-party product.  A more scalable approach is to use technologies such as an Access Gateway (also known as a SOA Gateway, XML Gateway, or Enterprise Gateway) that provide integrated, out-of-the-box integrations with access management platforms.  Gateway-based solutions extend existing access policies and SSO processes to Cloud-based services, placing access control back with information security teams.

 

It’s clear that more needs to be done to place control back into the hands of the customer.  Cloud computing is a paradigm shift and holds great promise for cost savings and new revenue generation.  However, to accelerate the acceptance of Cloud computing by enterprise IT, we as an industry must change from a trust model to a “Don’t Trust” model way of thinking.

 

Ed King, VP Product Marketing, Vordel
Ed has responsibility for Product Marketing and Strategic Business Alliances. Prior to Vordel, he was VP of Product Management at Qualys, where he directed the company’s transition to its next-generation product platform. As VP of Marketing at Agiliance, Ed revamped both product strategy and marketing programs to help the company double its revenue in his first year of tenure. Ed joined Oracle as Senior Director of Product Management, where he built Oracle’s identity management business from a niche player to the undisputed market leader in just three years. Ed also held product management roles at Jamcracker, Softchain and Thor Technologies. He holds an engineering degree from the Massachusetts Institute of Technology and an MBA from the University of California, Berkeley.

Seven Steps to Securing File Transfer’s Journey to the Cloud

September 12, 2011

By Oded Valin, Product Line Manager, Cyber-Ark Software

 

“When it absolutely, positively has to be there overnight.”  There’s a lot we can identify with in FedEx’s famous slogan, especially as it relates to modern file transfer processes. When you think about sharing health care records, financial data, or law enforcement-related information, peace of mind is only possible when utilizing technology and processes that are dependable, trustworthy – and traceable.  Organizations that rely on secure file transfer to conduct business with partners, customers, and other third parties must maintain the same level of confidence that the slogan inspired.  Now, consider taking the transfer of sensitive information to the cloud.  Still confident?

 

In many ways, when you consider the number of USB sticks that have been lost in the past six-to-nine months due to human error or the number of FTP vulnerabilities that have been routinely exploited, it’s clear there must be a better way.

 

For organizations seeking a cost-effective solution for exchanging sensitive files that can be deployed quickly and with minimal training, it may be time to consider cloud-based alternatives.  But how can organizations safely exchange sensitive files in the cloud while maintaining security and compliance requirements, and remaining accountable to third-parties?  Following are seven steps to ensuring a safe journey for taking governed file transfer activities to the cloud.

 

For those organizations interested in starting off on the right foot for a cloud-based governed file transfer project, either starting from scratch or migrating from an existing enterprise program, here are important steps to consider:

 

  1. Identify Painful and Costly Processes: Examine existing transfer processes and consider the costs to maintain them. Do they delay the business and negatively impact IT staff? If starting from scratch, what processes must you secure and ensure are free from vulnerabilities in the cloud?  Typically, starting a file transfer program from scratch requires significant IT and administrative investments, ranging from setting up the firewall and VPN to engaging a courier service to handle files that are too large to be transferred electronically.  The elasticity of the cloud enables greater flexibility and scalability and significantly decreases the time and resources required to establish a reliable program.  Utilizing a cloud-based model, organizations can become fully operational within days or weeks rather than months, while reducing the drag on IT resources.  Ultimately, in cases like one healthcare provider that turned to the cloud to share images with primary MRI and CT scan providers, the services provided to the patient were more timely, more efficient, and less expensive.
  2. Define Initial Community: Who are the users – internal? external?  When exchanging files with third-party partners, particularly business users, it’s important to provide a file transfer solution that works the way they work.  User communities are increasingly relying on tablets and browser-based tools to conduct business, so the file transfer process and user-interface must reflect the community’s skill sets and computing preferences.  The ease of deployment and the level of customization made possible in cloud-based environments encourage adoption and effective use of file transfer solutions.
  3. Determine File Transfer Type: Do you need something scalable or ad-hoc? How important is automation?  Compared to manual file transfer process, a cloud computing environment can support centralized administration for any file type while also providing the benefits of greater storage, accommodation for large file transfers and schedule-based processes, all without negatively impacting server or network performance.
  4. Integrate with Existing Systems: Can you integrate your existing systems with a cloud-based file transfer solution? What automation tools are provided by the cloud vendor?  Many organizations believe that file transfer systems are stand-alone platforms that can’t be integrated with existing systems, like finance and accounting, for example.  Utilizing a flexible cloud-based solution with open APIs and out of the box plug-ins not only assists with secure integration with current databases and applications, but it can also be deployed very quickly with the flexibility to support the adoption of a hybrid cloud/on-premise model, should the organization decide that scenario worked best for its business.
  5. Define Workflows: Examine how business, operations and security are interrelated.  What regulations and transparency requirements need to be considered?  How are they different in the cloud?  Ensure segregation of duties between the operations and the content, between the content owners themselves.  Organizations seeking to adopt a cloud-based file transfer solution must make sure the service provider can support its user-defined workflows. It’s also important to ensure your cloud vendor goes “beyond the basics.”  Specifically, many file sharing services allow organizations to share data and information simply from Point A to Point B.  But, if you need to add additional functionality like automatically converting to a .pdf and adding a watermark for additional security, manage audit permissions, scan the file for viruses and other advanced features, an enterprise class cloud solution is necessary.
  6. Continuous Monitoring: Take steps to ensure file download activity is monitored, file exchange is validated, and transfers are smooth. Organizations must be able to verify when files arrived and know who opened them (see the sketch after this list). These actions are fully supported in a cloud environment and are governed file transfer best practices.
  7. Ongoing Operations: Is it quick and easy to add new partners or set up new file transfer processes? How reliable is the service in terms of high availability, disaster recovery, and automatic recovery of file transfer processes?  The cloud-based solution should provide an easy-to-use interface to empower the business user and encourage autonomy at the operations level without requiring IT involvement. Additionally, organizations should find a cloud provider that offers a simple pricing model.  For example, paying per email is not scalable and doesn’t align with typical business use.  Finally, you shouldn’t have to fly alone – be sure to take advantage of the consulting services and expertise your service provider offers to support ongoing operations without interruption.
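
To make the monitoring in step 6 concrete, here is a small, hypothetical sketch of a download audit: it logs who fetched which file and when, and refuses downloads once a link has expired. The record layout and the one-week expiry are illustrative assumptions, not a vendor’s API.

```python
# Sketch: audit file downloads and enforce a link-expiry window.
import time

AUDIT_LOG: list[dict] = []
EXPIRY_SECONDS = 7 * 24 * 3600  # illustrative: links expire after a week

def record_download(file_id: str, user: str, published_at: float) -> bool:
    expired = time.time() - published_at > EXPIRY_SECONDS
    AUDIT_LOG.append({"file": file_id, "user": user,
                      "at": time.time(), "allowed": not expired})
    return not expired

# Hypothetical usage: a recipient fetches a freshly shared scan.
if record_download("mri-scan-0042.dcm", "dr.jones@clinic.example", time.time()):
    print("download permitted and logged")
```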

 

To conclude, given the traditional reliance on antiquated technologies and unreliable processes, it’s absolutely time for organizations to consider adopting cloud-based approaches to governed file transfer activities.  Moving beyond the well-established cost and resource benefits of the cloud, for those companies with complex requirements or special file transfer needs, the flexibility and security that are possible in the cloud will ensure that high quality standards are continuously met and that the confidence and peace of mind necessary to secure your file transfer’s trip to the cloud are achieved. Rest assured.

 

Oded Valin is a Product Line Manager at Cyber-Ark Software (www.cyber-ark.com). Drawing on his 15 years of high-tech experience, Valin’s responsibilities include leading the definition and delivery of Cyber-Ark’s Sensitive Information Management product line, product positioning, and the overall product roadmap.

Five Ways to Achieve Cloud Compliance

August 26, 2011 | Leave a Comment

With the rapid adoption of cloud computing technologies, IT organizations have found a way to deliver applications and services more quickly and efficiently to their customers, incorporating the nearly ubiquitous, utility-like platforms of managed cloud services companies. The use of these cloud technologies is enabling the delivery of messaging platforms, financial applications, Software as a Service offerings, and systems consolidation in a manner more consistent with the speed of the business.

However, audit and compliance teams have been less aggressive in adopting cloud technologies as a solution of choice, for a variety of reasons: there may be a lack of understanding of what security components are available in the cloud; there may be a concern that the controls in the cloud are inadequate for securing data; or there may be a fear that control over the environment is lost when the application and data move to the cloud. And while these concerns are understandable, there is an ever-growing recognition of the security and compliance benefits available in managed cloud services, which is putting the minds of corporate audit and compliance teams at rest.

Here are five steps you can take to ensure that your audit and compliance team is comfortable with the cloud:

1. Understand and be able to relay the compliance requirements to your cloud service provider. I have worked with organizations in all industries, subject to a wide variety of regulations, and the most successful organizations adopting cloud come with a very in-depth understanding of what security controls and technologies are necessary to meet their own compliance requirements. For example, we had a large provider of healthcare services approach us with a request to move a portion of their environment to the cloud. This environment contained Protected Health Information (PHI), and the customer knew that, in order to pass their audit, they must be able to:

a) Enforce their own security policies in the new environment, including password policies, standard builds, change management, incident handling, and maintenance procedures.

b) Incorporate specific technologies in the environment, including file integrity monitoring, intrusion detection, encryption, two-factor authentication, and firewalls.

c) Integrate the security architecture into their already robust security operations processes for multisite event correlation, security incident response, and eDiscovery.

Because the cloud environment was architected from the very beginning with those controls in mind, the audit and compliance team had very little work to do to confirm that the new environment would be consistent with corporate security policies and achieve HIPAA compliance.
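As a purely illustrative aid, the same idea can be captured in code: record the required controls as data and run a gap analysis against what the provider offers before any environment is built. The control names below are drawn from the example above, but the structure is a hypothetical sketch, not a real provider’s API.

    # Hypothetical sketch: compare required controls against a provider's
    # advertised capabilities before architecting the environment.
    REQUIRED_CONTROLS = {
        "file_integrity_monitoring",
        "intrusion_detection",
        "encryption",
        "two_factor_authentication",
        "firewalls",
    }

    def gap_analysis(provider_controls: set[str]) -> set[str]:
        # Return the required controls the provider does not cover.
        return REQUIRED_CONTROLS - provider_controls

    # Example: a provider that lacks file integrity monitoring.
    gaps = gap_analysis({"intrusion_detection", "encryption",
                         "two_factor_authentication", "firewalls"})
    print(sorted(gaps))  # ['file_integrity_monitoring']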

2. Select a cloud provider with a history of transparency in security and in the policies built into the cloud platform. It is extremely important that the controls supporting the cloud infrastructure are consistent with those of your organization, or that the cloud provider has the flexibility to incorporate your controls into the cloud environment that will house your data. It is important to note that compliance is not one-size-fits-all. An example of this is the financial industry, where very specific controls must be incorporated into an IT infrastructure, such as data retention, data classification, business continuity, and data integrity. Be sure that the managed cloud services provider is able to incorporate those policies that differ from its standard policies. Key policies and services that are often adjustable for different industries include the following:

a) Data and backup retention

b) Data encryption at rest and in transit

c) Business resumption and continuity plans

d) eDiscovery and data classification policies

e) Data integrity assurance

f) Identity and access management

Most organizations maintain a risk management program. If your company has a risk assessment process, include your provider early to ensure that the controls you need are included. If your organization does not, there are several accessible questionnaires that you can tailor to suit your needs. Two great resources are the Cloud Security Alliance (https://cloudsecurityalliance.org) and the Shared Assessments program (http://www.sharedassessments.org).

3. Understand what the application, the data, and the traffic flow look like. It is not uncommon for a cloud customer not to understand exactly what data exists in the system and what controls need to be incorporated. For example, one of the early adopters of cloud services I worked with years ago did not know that the application they hosted processed credit card transactions on a regular basis. When they first came to us, they wanted to put their Software as a Service application in the cloud, not knowing that one of their customers used it to process credit cards in a high-touch retail model; the Payment Card Industry Data Security Standard (PCI DSS) was the furthest thing from their mind. After the end customer performed an audit, the gaps in security and policies were closed by incorporating the policies and technologies available in the cloud platform. Further, by understanding the transaction and process flow, the customer was able to reduce costs by segmenting the cardholder environment from the rest of the environment and implementing the more stringent security controls only where the cardholder data resided.
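To illustrate the scoping idea in the abstract (this is a generic sketch, not the provider’s actual configuration), segmentation amounts to default-deny rules about which zones may reach the cardholder environment. All zone names here are invented.

    # Generic illustration of network segmentation for PCI DSS scoping.
    # A real deployment would use the provider's firewall or security-group
    # tooling; this only models the default-deny rule set.
    ALLOWED_FLOWS = {
        ("web_tier", "app_tier"),
        ("app_tier", "cardholder_env"),    # only the app tier touches card data
        ("cardholder_env", "payment_gateway"),
    }

    def is_permitted(src_zone: str, dst_zone: str) -> bool:
        # Default-deny: a flow is allowed only if explicitly listed.
        return (src_zone, dst_zone) in ALLOWED_FLOWS

    print(is_permitted("web_tier", "cardholder_env"))  # False: blocked by default

The smaller the set of flows that can reach the cardholder environment, the smaller the audit scope, which is exactly how the customer above reduced costs.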

4. Clearly define the roles and responsibilities between your organization and the managed cloud services provider. Some of the roles and responsibilities in a hosted service clearly belong to the hosting provider, and some clearly belong to the customer. For example, in the cloud, the underlying cloud infrastructure, its architecture, its maintenance, and its redundancy are clearly the responsibility of the provider; likewise, the application (in many cases) and all of the data maintenance are clearly the responsibility of the customer. However, how an organization assigns roles and responsibilities for everything in between, and assigns responsibility for the ongoing compliance of those roles, is extremely important to the ongoing management of the compliance program. Remember that some of the controls and security technologies may be in addition to the cloud platform, and your requirements may result in additional services and scope.
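One way to keep that “everything in between” visible is to record the split explicitly, for instance as a simple matrix. The assignments below are hypothetical; your contract defines the real ones.

    # Illustrative shared-responsibility matrix; assignments are hypothetical.
    RESPONSIBILITY = {
        "cloud_infrastructure": "provider",
        "platform_maintenance": "provider",
        "redundancy": "provider",
        "application": "customer",
        "data_maintenance": "customer",
        "os_patching": "negotiated",           # the "everything in between"
        "vulnerability_scanning": "negotiated",
    }

    def owner(item: str) -> str:
        # Default to "negotiated" so unassigned items surface during review.
        return RESPONSIBILITY.get(item, "negotiated")

    print(owner("os_patching"))  # negotiated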

5. Gain an understanding of the certifications and compliance you can leverage from your managed cloud services provider. Your managed cloud services provider may have an existing compliance program that incorporates many of the controls your audit team will require when assessing the compliance of the cloud environment. In many cases, this compliance program, and the audited controls, can be adopted and audited as though they were those of your organization. For example, some cloud providers have included the cloud platform and customer environments in their SSAE 16 (formerly SAS 70) program. The SSAE 16 compliance program is audited by a third party and provides assurance that the controls and policies stated within the provider’s compliance program are in place and followed. By inclusion in that compliance program, you may provide your auditors with a quick path to assessment completion.

The most important thing to remember in moving your environment to the cloud is to have conversations early and often with your provider regarding your requirements and your specific expectations of the provider. The provider should be able to supply the information necessary to confirm that your environment includes all of the security and controls needed to achieve your company’s compliance and certifications.

 

Allen Allison, Chief Security Officer at NaviSite (www.navisite.com)

During his 20+ year career in information security, Allen Allison has served in management and technical roles, including the development of NaviSite’s industry-leading cloud computing platform; chief engineer and developer for a market-leading managed security operations center; and lead auditor and assessor for information security programs in the healthcare, government, e-commerce, and financial industries. With experience in systems programming, network infrastructure design/deployment, and information security, Allison has earned the highest industry certifications, including CCIE, CCSP, CISSP, MCSE, CCSE, and INFOSEC Professional. A graduate of the University of California, Irvine, Allison has lectured at universities and spoken at industry shows such as Interop, RSA Conference, Cloud Computing Expo, MIT Sloan CIO Symposium, and Citrix Synergy.

 
