When It Comes To Cloud Security, Don’t Forget SSL

September 30, 2011

By Michael Lin, Symantec

 

Cloud computing appears here to stay, bringing new challenges and security risks on one hand while boasting efficiencies, cost savings and competitive advantage on the other. Against the new risks of the cloud and the mounting skill and cunning of today’s malicious players on the Web, Secure Sockets Layer (SSL) certificates stand up well. Offering both encryption and authentication, SSL has long been established as a primary security standard for computing and the Internet, and a no-brainer for securely transferring information between parties online.


What is SSL?

SSL Certificates encrypt private communications over the public Internet. Using public key infrastructure, SSL consists of a public key (which encrypts information) and a private key (which deciphers information), with encryption mathematically encoding data so that only the key owners can read it. Each certificate provides information about the certificate owner and issuer, as well as the certificate’s validity period.

Certificate Authorities (CAs) issue each certificate, a credential for the online world, to one specific domain or server. When a browser connects, the server sends over its identification information along with a copy of its SSL Certificate. The browser verifies the certificate and sends a message to the server; the server returns a digitally signed acknowledgement to start an SSL-encrypted session, letting encrypted data move between the browser and the server.
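To make the handshake concrete, here is a minimal sketch in Python, using the standard ssl module, of a client verifying a server’s certificate against trusted CAs and reading the owner, issuer and validity period described above. The host name is a placeholder.

    import socket
    import ssl

    HOSTNAME = "www.example.com"  # placeholder server

    # A default context verifies the server's certificate chain
    # against the system's trusted CA store and checks the host name.
    context = ssl.create_default_context()

    with socket.create_connection((HOSTNAME, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=HOSTNAME) as tls:
            cert = tls.getpeercert()
            # Certificate owner, issuer and validity period.
            print("Subject:    ", dict(x[0] for x in cert["subject"]))
            print("Issuer:     ", dict(x[0] for x in cert["issuer"]))
            print("Valid from: ", cert["notBefore"])
            print("Valid until:", cert["notAfter"])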


How does it secure data in the cloud?

If SSL seems a little old-school in comparison to the whiz-bang novelty of cloud computing, consider this:  since SSL offers encryption that prevents prying eyes from reading data traversing the cloud, as well as authentication to verify the identity of any server or endpoint receiving that data, it’s well-suited to address a host of cloud security challenges.

Where does my data reside, and who can see it? Moving to the cloud means giving up control of private and confidential data, bringing data segregation risks. Traditional on-site storage lets businesses control where data is located and exactly who can access it, but putting information in the cloud means putting location and access in the cloud provider’s hands.

This is where SSL swoops in to quell data segregation worries. By requiring cloud providers to use SSL encryption, data can move securely between servers or between servers and browsers, preventing unauthorized interceptors from reading it. And don’t forget that SSL device authentication vets the identity of each device involved in the transaction before one bit of data moves, keeping rogue devices away from sensitive data.
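As a rough sketch of that device authentication, the server side of a mutual-TLS setup in Python might look like the following; the certificate and key file names are placeholders, and a real deployment would attach this context to an actual listening socket.

    import ssl

    # Require every connecting device to present a certificate signed
    # by our CA; unknown ("rogue") devices are rejected during the
    # handshake, before one bit of application data moves.
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    context.load_cert_chain(certfile="server.pem", keyfile="server.key")
    context.load_verify_locations(cafile="trusted_devices_ca.pem")
    context.verify_mode = ssl.CERT_REQUIRED  # mutual authentication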

How can I maintain regulatory compliance in the cloud? In addition to surrendering control of the location of data, organizations also need to address how regulatory compliance is maintained when data lives in the cloud. SSL encryption helps prevent the accidental disclosure of data that regulations require to be protected, and it adds the convenience of automated due diligence.

Will my data be at risk in transit? Putting data in the cloud usually means not knowing where it physically resides, as discussed earlier. The good news is that cloud providers using SSL encryption protect data wherever it goes. This approach not only safeguards data where it lives, but also helps assure customers that data is secure while in transit.

Another point to note here is that a legitimate third-party SSL CA will not issue SSL certificates to servers in interdicted countries, and cloud providers using such a CA will not store data on servers located in those countries. SSL therefore further ensures that organizations are working with trusted partners.


Will any SSL do?

Recent breaches and hacks reinforce the fact that not all SSL is created equal, and neither are all CAs. Security is a serious matter and needs to be addressed as organizations push data to the cloud. Well-established best practices help those moving to the cloud make smart choices and protect themselves. Here are some things to keep in mind while weighing cloud providers:

  • Be certain that the cloud providers you work with use SSL from established, reliable and independent CAs. Even among trusted CAs, not all SSL is the same, so choose cloud providers whose SSL certificates come from certificate authorities that:
  • Support at least AES 128-bit encryption, preferably stronger AES 256-bit encryption, based on the new 2048-bit global root (a quick spot-check of the negotiated strength is sketched after this list)
  • Require a rigorous, annual audit of their authentication process
  • Maintain military-grade data centers and disaster recovery sites optimized for data protection and availability
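For the encryption-strength point above, the negotiated cipher and the server’s key size can be inspected from any client. A minimal sketch, assuming Python with the third-party cryptography package and a placeholder host name:

    import socket
    import ssl

    from cryptography import x509
    from cryptography.hazmat.primitives.asymmetric import rsa

    HOSTNAME = "cloud.example.com"  # placeholder cloud provider

    context = ssl.create_default_context()
    with socket.create_connection((HOSTNAME, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=HOSTNAME) as tls:
            der = tls.getpeercert(binary_form=True)
            cipher_name, _, secret_bits = tls.cipher()

    # Public key size of the server certificate (e.g. 2048-bit RSA).
    key = x509.load_der_x509_certificate(der).public_key()
    if isinstance(key, rsa.RSAPublicKey):
        print("Server key size:", key.key_size, "bits")
    print("Cipher:", cipher_name, "with", secret_bits, "bit encryption")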

Who will you trust? That’s the question with cloud computing, and with SSL. Anybody can generate and issue certificates with free software, but only a trusted CA verifies the identity information on the certificate. Organizations seeking an SSL Certificate therefore need to partner with a CA they trust.

SSL might not be the silver bullet for cloud security, but it is a valuable tool with a strong track record for encrypting and authenticating data online. Amid a crowd of new and complex cloud security solutions, one of the best-suited tools has been here all along.

 

Securing Your File Transfer in the Cloud

September 30, 2011

By Stuart Lisk, Sr. Product Manager, Hubspan Inc. 

File transfer has been around since the beginning of time. OK, maybe that is an exaggeration, but the point is that file transfer was one of the earliest uses of “network” computing, dating back to the early 1970s when IBM introduced the floppy disk. While we have been sharing files with each other for ages, the security of the shared data is often questionable.

Despite the File Transfer Protocol (FTP) being published in 1971, it took until the mid-80s, as LANs were beginning to find their way into the business environment, for systems to catch up to the original vision of FTP. During this period, transferring files internally became easier, and the ability to move files externally over a client-server topology eliminated the “here’s the disk” approach. These were fairly confined environments, with the client and server having a true relationship; securing the file had more to do with making sure no one could access the data than with protecting the transport itself. Centralized control and access was the way of the world in those “good ole days.”

Fast forward to the proliferation of the Internet and the World Wide Web, and securing files in transit to their destination became top of mind. IT managers grew concerned that anyone within a company could log on via the web and use a self-service, cloud-based file transfer application without IT’s knowledge, adding to the security risk of file transfer.

Performing file transfer over the internet, via the “cloud”, has provided major benefits over the traditional methods.  In fact, we’ve seen that the ability to quickly deploy and provision file transfer activities actually drives more people to the cloud. However, along with the quick on-boarding of companies and individuals comes the challenge of ensuring secure connectivity, managed access, reporting, adaptability, and compliance.

Having a secure connection is not as easy as it should be. Many companies still utilize legacy file transfer protocols that don’t encrypt traffic, exposing the payload to anyone who can access the network. The FTP protocol is dated, yet the majority of companies still use it: according to a file transfer survey conducted in March 2011, over 70% of respondents currently utilize FTP as their primary transport protocol, and over 56% use a mailbox or other email applications to transfer files.

For enterprises to move beyond FTP and ensure sensitive files are transferred securely, they must implement protection policies that include adherence to security compliance mandates, and do so with the same ease of use as simple email. IT managers must be concerned with who is authorizing and initiating file transfers, as well as with controlling what gets shared. Any time files leave a company without going through proper “file transfer” policy checks, the business is put at risk. Typical email attachments and ad-hoc web-based file transfer applications make it easy for someone to share files they shouldn’t.

In today’s computing environment, securing file transfer in the cloud requires protocols that integrate security in transit and at rest. Common secure protocols are Secure FTP (SFTP), FTPS (FTP over SSL), AS2, and HTTPS, to name a few. Companies should be actively looking at one of these protocols, as each encrypts data in transit and minimizes risk.
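As an illustration of the first option, a minimal SFTP upload using the third-party paramiko library might look like the sketch below; the host, account and file names are placeholders.

    import os

    import paramiko

    HOST = "transfer.example.com"  # placeholder SFTP endpoint
    USER = "svc_transfer"          # placeholder service account

    # SFTP runs over SSH, so credentials and payload are encrypted
    # in transit, unlike plain FTP.
    client = paramiko.SSHClient()
    client.load_system_host_keys()  # trust only known host keys
    client.set_missing_host_key_policy(paramiko.RejectPolicy())
    client.connect(HOST, username=USER,
                   key_filename=os.path.expanduser("~/.ssh/id_rsa"))

    sftp = client.open_sftp()
    sftp.put("invoices_q3.csv", "/inbound/invoices_q3.csv")
    sftp.close()
    client.close()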

When leveraging the cloud for file transfer, IT managers need to be sure that the application and/or vendor they are working with utilizes a proven encryption method. Encrypting the file when it is most vulnerable, in transit, is best. Additionally, IT managers would be wise to work with cloud vendors that have security already built into their platform. Built-in encryption, certification and validation of data are vital to ensure safe delivery of files. While you may not have influence over what your partner implements as their transport, you can take steps to mitigate issues; in fact, a number of file transfer applications today validate content prior to and after the transfer occurs.
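One common way to validate content before and after a transfer is a cryptographic checksum. A minimal sketch in Python; the file name is a placeholder:

    import hashlib

    def sha256_of(path):
        """Stream the file through SHA-256 so large files never
        have to fit in memory."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # The sender computes this before upload; the receiver recomputes
    # it after download and compares, flagging corruption or tampering.
    print("SHA-256:", sha256_of("invoices_q3.csv"))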

Another area of focus for IT managers when assessing file transfer security is access controls. Simply put: who has access, and to what data? Companies must have a plan to control access to each file and to the data stored within it. Here again, encrypted methods of accessing the file are the best way to mitigate a breach; as mentioned earlier, FTP does not protect credentials from predators. More than 30% of the respondents to the March survey indicated that access control is one of the most important criteria for cloud-based transfers.

Receipt notification is yet another way for senders to ensure their confidential files are delivered to, and opened by, the right people. Additionally, file transfer applications that set an expiration time, limiting how long the file stays available, are a great way to mitigate unauthorized access.

As mentioned earlier, adhering to industry and corporate compliance policies is critical. Relevant corporate governance regulations include, but are not limited to:

  • Sarbanes-Oxley Section 404: Requires audit trails, authenticity, record retention
  • HIPAA requirements: Record retention, privacy protection, service trails
  • 21 CFR Part 11: Record retention, authenticity, confidentiality, audit trails
  • Department of Defense (DOD) 5015.2: Record authenticity, protection, secure shredding

While there are many criteria to consider when deciding how to implement and leverage file transfer activities within your organization, there are really a few simple areas to focus on:

  • Choose a secure protocol
  • Implement data protection in-transit and at-rest
  • Utilize effective encryption technology
  • Maximize access controls
  • Leverage auditing and reporting functionality
  • Adhere to corporate and industry compliance policies

While that may seem like an endless number of steps, it can be easier than it sounds, as long as you evaluate and execute file transfer activity in a way that protects and secures your sensitive data.

 

Stuart Lisk, Senior Product Manager, Hubspan

Stuart Lisk is a Senior Product Manager for Hubspan, working closely with customers, executives, engineering and marketing to establish and drive an aggressive product strategy and roadmap.  Stuart has over 20 years of experience in product management, spanning enterprise network, system, storage and application products, including ten years managing cloud computing (SaaS) products. He brings extensive knowledge and experience in product positioning, messaging, product strategy development, and product life cycle development process management.  Stuart holds a Certificate of Cloud Security Knowledge (CCSK) from the Cloud Security Alliance, and a Bachelor of Science in Business Administration from Bowling Green State University.

 

CSA Blog: The “Don’t Trust Model”

September 14, 2011

By Ed King

The elephant in the room when it comes to barriers to enterprise adoption of Cloud computing is the lack of trust in Cloud service providers. Enterprise IT has legitimate concerns over the security, integrity, and reliability of Cloud-based services. The recent high-profile outages at Amazon and Microsoft Azure, as well as security issues at Dropbox and Sony, only add to the argument that Cloud computing poses substantial risks for enterprises.

 

Cloud service providers realize this lack of trust is  preventing enterprise IT from completely embracing Cloud computing.  To ease this concern, Cloud service providers have traditionally taken one or both of the following approaches:

  1. Cloud service providers, especially the larger ones, have implemented substantial security and operational procedures to ensure customer data safety, system integrity, and service availability.  This typically includes documenting the platform’s security architecture and data center operating procedures, and adding service-side security options like encryption and strong authentication.  On top of this, they obtain SAS-70 certification to provide proof that “we did what we said we would do.”
  2. Cloud service providers also like to point out that their security and operational technology and controls are no worse, and are probably better, than the security procedures most enterprises have implemented on their own.

 

Both of these approaches boil down to a simple maxim: “trust me, I know what I am doing!”  This “Trust Me” approach launched the Cloud computing industry, but to date most large enterprises have not put mission-critical applications and sensitive data into the public Cloud.  As enterprises look to leverage Cloud technologies for mission-critical applications, the talk has shifted toward private Cloud, because the “Trust Me” approach has fundamentally reached its limit.

 

Cloud service providers must come to the realization that enterprises will never entrust them with business-critical applications and data unless customers have more direct control over security, integrity, and availability.  No amount of documentation, third-party certification, or on-site auditing can mitigate risk enough to replace the loss of direct control.  The sooner the industry realizes that we need solutions that hand Cloud controls back to the customer, the sooner enterprises and the industry will reap the true commercial benefits of Cloud computing.  The approach becomes: “you don’t have to trust your Cloud providers, because you own the risk-mitigating controls.”  Security professionals normally talk about best-practice approaches to implementing trust models for IT architectures; I like to refer to this self-enablement of the customer as the “Don’t Trust Model.”  Let’s examine how we can put control back into the customer’s hands and shift to a “Don’t Trust Model.”

 

Manage Cloud Redundancy

Enterprises usually dual-source critical information and build redundancy into their mission-critical infrastructures.  Why should Cloud-based services be any different?  When Amazon Web Services (AWS) experienced an outage on April 21, 2011, a number of businesses that used AWS went completely offline, but Netflix did not.  Netflix survived the outage with some degradation in service because it had designed redundancy into its Cloud-based infrastructure, spreading it across multiple vendors.  Features like stateless services and fallback are designed specifically to deal with scenarios such as the AWS outage (see the interesting technical discussion on Netflix’s Tech Blog).  Technologies like Cloud Gateways, Cloud Services Brokers and Cloud Switches can greatly simplify the task of setting up, managing, monitoring, and switching Cloud redundancy.

 

For example, a Cloud Gateway can provide continuous monitoring of Cloud service availability and quality.  When service quality dips below a certain threshold, the Cloud Gateway can send out alerts and automatically divert traffic to back-up providers.
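A minimal sketch of that monitor-and-divert logic in Python, with placeholder endpoints and an assumed latency threshold:

    import time
    import urllib.request

    PRIMARY = "https://api.primary-cloud.example/health"  # placeholder
    BACKUP = "https://api.backup-cloud.example/health"    # placeholder
    THRESHOLD = 2.0  # assumed quality threshold, in seconds

    def healthy(url):
        """Treat a provider as healthy if its health endpoint answers
        with HTTP 200 inside the threshold."""
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=THRESHOLD) as resp:
                ok = resp.status == 200
        except OSError:
            return False
        return ok and (time.monotonic() - start) < THRESHOLD

    def active_endpoint():
        """Route traffic to the primary provider; divert to the
        back-up when service quality dips."""
        return PRIMARY if healthy(PRIMARY) else BACKUP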

Put Security Controls On-premise

Salesforce.com (SFDC) is the poster child of a successful Cloud-based service.  However, as SFDC expanded beyond the small and medium business sector to go after large enterprises, it found a more reluctant customer segment due to concern over data security in the Cloud.  On August 26, 2011, SFDC bought Navajo Systems, acquiring technology that puts security control back in the hands of SFDC customers: a Cloud Data Gateway that encrypts and tokenizes data stored in the Cloud.

 

A Cloud Data Gateway secures data before it leaves the enterprise premises.  The Gateway monitors data traffic to the Cloud and enforces policies to block, remove, mask, encrypt, or tokenize sensitive data.  The technology has different deployment options: using a combination of Gateways at the Cloud service provider and Gateways on-premise, different levels of data security can be achieved.  By gaining control over data security before the data leaves the premises, customers do not have to trust the Cloud service provider and need not rely on the provider alone to ensure the safekeeping of their data.
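To show the idea (a toy sketch, not Navajo’s actual product), here is a small tokenizer in Python using the third-party cryptography package: sensitive values are swapped for random tokens before leaving the premises, while the key and token vault stay on-premise.

    import secrets

    from cryptography.fernet import Fernet

    # The key and vault never leave the premises; the Cloud sees only
    # opaque tokens, so it cannot read the sensitive fields.
    key = Fernet.generate_key()
    vault = Fernet(key)
    token_map = {}  # on-premise token vault

    def tokenize(value):
        """Swap a sensitive value for a random token, keeping the
        encrypted original on-premise for later lookup."""
        token = "tok_" + secrets.token_hex(8)
        token_map[token] = vault.encrypt(value.encode())
        return token

    def detokenize(token):
        return vault.decrypt(token_map[token]).decode()

    # Only `record`, with tokens in place of real data, goes to the Cloud.
    record = {"name": tokenize("Jane Doe"), "ssn": tokenize("000-12-3456")}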

Integrate Cloud With Enterprise Security Platforms

Enterprises have spent millions of dollars on security infrastructure, including identity and access management, data security, and application security.  The deployments of these technologies are accompanied by supporting processes such as user on-boarding, data classification, and software development lifecycle management.  These processes take years to roll out and provide critical controls that mitigate security risks.  These tools and processes will evolve to incorporate new technologies like Cloud computing and mobile devices, but for Cloud computing to gain acceptance within the enterprise, Cloud services must be seamlessly integrated into existing security platforms and processes.

 

Single sign-on (SSO) is a great example.  After years of effort deploying an enterprise access management solution like CA SiteMinder, Oracle Access Manager or IBM Tivoli Access Manager to enable SSO, and finally training all the users on how to perform a password reset, does IT have the appetite to let each Cloud service become a security silo?  From the user’s standpoint, SSO should simply be SSO, not “SSO, excluding Cloud-based services.”  Most major Cloud service providers support standards such as SAML (Security Assertion Markup Language) for SSO and provide detailed instructions on how to integrate with on-premise access management systems, though this usually involves some consulting work and perhaps a third-party product.  A more scalable approach is to use technologies such as an Access Gateway (also known as a SOA Gateway, XML Gateway, or Enterprise Gateway) to provide integrated, out-of-the-box connections to access management platforms.  Gateway-based solutions extend existing access policies and SSO processes to Cloud-based services, placing access control back with information security teams.
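As a deliberately simplified stand-in for what a SAML-capable gateway does, the sketch below has an identity provider issue a signed, time-limited assertion, which the gateway verifies before proxying a request on to the Cloud service. Real deployments use SAML assertions and XML signatures rather than this HMAC shorthand, and all names here are illustrative.

    import hashlib
    import hmac
    import time

    SECRET = b"rotate-me"  # placeholder key shared by IdP and gateway

    def issue_assertion(user, ttl=300):
        """Identity provider side: sign a 'user|expiry' assertion."""
        payload = f"{user}|{int(time.time()) + ttl}"
        sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
        return f"{payload}|{sig}"

    def gateway_admits(assertion):
        """Gateway side: verify signature and expiry before letting
        the request through to the Cloud service."""
        user, expiry, sig = assertion.rsplit("|", 2)
        expected = hmac.new(SECRET, f"{user}|{expiry}".encode(),
                            hashlib.sha256).hexdigest()
        return hmac.compare_digest(sig, expected) and time.time() < int(expiry)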

 

It’s clear that more needs to be done to place control back into the hands of the customer.  Cloud computing is a paradigm shift and holds great promise for cost savings and new revenue generation.  However, to accelerate the acceptance of Cloud computing by enterprise IT, we as an industry must change from a trust model to a “Don’t Trust” way of thinking.

 

Ed King, VP Product Marketing, Vordel
Ed has responsibility for Product Marketing and Strategic Business Alliances. Prior to Vordel, he was VP of Product Management at Qualys, where he directed the company’s transition to its next-generation product platform. As VP of Marketing at Agiliance, Ed revamped both product strategy and marketing programs to help the company double its revenue in his first year of tenure. Ed joined Oracle as Senior Director of Product Management, where he built Oracle’s identity management business from a niche player to the undisputed market leader in just 3 years. Ed also held product management roles at Jamcracker, Softchain and Thor Technologies. He holds an engineering degree from the Massachusetts Institute of Technology and an MBA from the University of California, Berkeley.

Seven Steps to Securing File Transfer’s Journey to the Cloud

September 12, 2011

By Oded Valin, Product Line Manager, Cyber-Ark Software

 

“When it absolutely, positively has to be there overnight.”  There’s a lot we can identify with when it comes to reciting FedEx’s famous slogan, especially as it relates to modern file transfer processes. When you think about sharing health care records, financial data or law enforcement-related information, peace of mind is only possible with technology and processes that are dependable, trustworthy – and traceable.  Organizations that rely on secure file transfer to conduct business with partners, customers and other third parties must maintain the same level of confidence that the slogan inspires.  Now, consider taking the transfer of sensitive information to the cloud.  Still confident?

 

In many ways, when you consider the number of USB sticks that have been lost in the past six-to-nine months due to human error or the number of FTP vulnerabilities that have been routinely exploited, it’s clear there must be a better way.

 

For organizations seeking a cost-effective solution for exchanging sensitive files that can be deployed quickly and with minimal training, it may be time to consider cloud-based alternatives.  But how can organizations safely exchange sensitive files in the cloud while maintaining security and compliance requirements, and remaining accountable to third-parties?  Following are seven steps to ensuring a safe journey for taking governed file transfer activities to the cloud.

 

For those organizations interested in starting off on the right foot for a cloud-based governed file transfer project, either starting from scratch or migrating from an existing enterprise program, here are important steps to consider:

 

  1. Identify Painful and Costly Processes: Examine existing transfer processes and consider the costs to maintain them. Do they delay the business and negatively impact IT staff? If starting from scratch, what processes must you secure and keep free from vulnerabilities in the cloud?  Typically, starting a file transfer program from scratch requires significant IT and administrative investments, ranging from setting up the firewall and VPN to engaging a courier service to handle files too large to be transferred electronically.  The elasticity of the cloud enables greater flexibility and scalability and significantly decreases the time and resources required to establish a reliable program.  Utilizing a cloud-based model, organizations can become fully operational within days or weeks rather than months, while reducing the drag on IT resources.  Ultimately, in cases like one healthcare provider that turned to the cloud to share images with primary MRI and CT scan providers, the services provided to the patient were more timely, more efficient and less expensive.
  2. Define Initial Community: Who are the users – internal? external?  When exchanging files with third-party partners, particularly business users, it’s important to provide a file transfer solution that works the way they work.  User communities are increasingly relying on tablets and browser-based tools to conduct business, so the file transfer process and user-interface must reflect the community’s skill sets and computing preferences.  The ease of deployment and the level of customization made possible in cloud-based environments encourage adoption and effective use of file transfer solutions.
  3. Determine File Transfer Type: Do you need something scalable or ad-hoc? How important is automation?  Compared to a manual file transfer process, a cloud computing environment can support centralized administration for any file type while also providing the benefits of greater storage, accommodation of large file transfers and schedule-based processes, all without negatively impacting server or network performance.
  4. Integrate with Existing Systems: Can you integrate your existing systems with a cloud-based file transfer solution? What automation tools are provided by the cloud vendor?  Many organizations believe that file transfer systems are stand-alone platforms that can’t be integrated with existing systems, like finance and accounting, for example.  Utilizing a flexible cloud-based solution with open APIs and out of the box plug-ins not only assists with secure integration with current databases and applications, but it can also be deployed very quickly with the flexibility to support the adoption of a hybrid cloud/on-premise model, should the organization decide that scenario worked best for its business.
  5. Define Workflows: Examine how business, operations and security are interrelated.  What regulations and transparency requirements need to be considered?  How are they different in the cloud?  Ensure segregation of duties between operations and content, and between the content owners themselves.  Organizations seeking to adopt a cloud-based file transfer solution must make sure the service provider can support their user-defined workflows. It’s also important to ensure your cloud vendor goes “beyond the basics.”  Specifically, many file sharing services simply move data from Point A to Point B.  If you need additional functionality, such as automatically converting a file to PDF and adding a watermark for additional security, managing audit permissions, or scanning the file for viruses, an enterprise-class cloud solution is necessary.
  6. Continuous Monitoring: Take steps to ensure file download activity is monitored, file exchange is validated and transfers are smooth. Organizations must be able to verify when files arrived and know who opened them. These actions are fully supported in a cloud environment, and they are governed file transfer best practices overall.
  7. Ongoing Operations: Is it quick and easy to add new partners or set up new file transfer processes? How reliable is the service in terms of high availability, disaster recovery and automatic recovery of file transfer processes (a simple sketch of such recovery logic follows this list)?  The cloud-based solution should provide an easy-to-use interface to empower the business user and encourage autonomy at the operations level without requiring IT involvement. Additionally, organizations should find a cloud provider with a simple pricing model; paying per email, for example, is not scalable and doesn’t align with typical business use.  Finally, you shouldn’t have to fly alone: take advantage of all the consulting services and expertise your service provider offers to support ongoing operations without interruption.
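On the automatic-recovery point in step 7, here is a minimal sketch of the kind of retry-with-backoff logic to look for; the send callable is a placeholder for whatever actually performs the transfer.

    import time

    def transfer_with_retry(send, attempts=5, base_delay=2.0):
        """Retry a failed transfer with exponential backoff, the sort
        of automatic recovery a file transfer service should offer."""
        for attempt in range(attempts):
            try:
                return send()
            except OSError as exc:
                if attempt == attempts - 1:
                    raise  # out of attempts; surface the failure
                delay = base_delay * (2 ** attempt)
                print(f"Transfer failed ({exc}); retrying in {delay:.0f}s")
                time.sleep(delay)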

 

To conclude: given the traditional reliance on antiquated technologies and unreliable processes, it is time for organizations to consider adopting cloud-based approaches to governed file transfer.  Beyond the well-established cost and resource benefits of the cloud, for companies with complex requirements or special file transfer needs, the flexibility and security possible in the cloud will ensure that high quality standards are continuously met, and that the confidence and peace of mind necessary to secure your file transfer’s trip to the cloud are achieved. Rest assured.

 

Oded Valin is a Product Line Manager at Cyber-Ark Software (www.cyber-ark.com). Drawing on his 15 years of high-tech experience, Valin’s responsibilities include leading definition and delivery of Cyber-Ark’s Sensitive Information Management product line, product positioning and overall product roadmap.
