A New Business Case for “Why IT Matters” in the Cloud Era

October 30, 2013

October 23, 2013

Author: Kamal Shah @kdshah 

Knowledge workers know that cloud services make our work lives easier, drive business agility and increase productivity. For instance, when colleagues need to share a file that’s too large to attach to an email message, they simply toss it into a cloud-based file sharing service and get back to work. It’s great that people find their own ways to team up, but can there be “too much of a good thing”?

Too much of a good thing?

Recently we analyzed the cloud services usage of more than 3 million end users across more than a hundred companies spanning a wide range of industries. We learned that the average enterprise uses 19 different file sharing & collaboration services. That’s right, 19.

Certainly there is benefit in groups and individuals using the services that best meet their needs, but for certain categories, like file sharing and collaboration, leaving usage unmanaged can actually impede collaboration and productivity.

How? The collaborative value of these services increases as more employees use the same services. So, there is a productivity value in standardization.

Think about this common scenario

Consider a cross-functional team working on Project Launchpad. At the kick-off meeting, they agree to use a file sharing site for the project. The marketing team recommends Dropbox, the engineering team recommends Hightail, the customer service team recommends Box, the finance team recommends Egnyte, and so on. Now multiply this across many projects, each requiring everyone to keep track of which files live in which service, and you can see what a problem it becomes for individuals and for the organization as a whole.

No company uses 19 different CRM services or 19 ERP services or 19 email services or 19 project management applications. Similarly, it likely doesn’t make sense to use 19 different file sharing services.

Beyond productivity to economics and risk management

Aside from the productivity benefits, there is also economic value in procuring enterprise licenses over thousands of individual licenses, as well as security benefits in managing data in a smaller number of third-party cloud service providers. This latter point is most important for organizations that must maintain good oversight of where their data is—and today that is every business, large or small.

CIOs should encourage employees to identify and test new services that can boost productivity. Our customer Brian Lillie, CIO of Equinix, says that the CIO’s job is to be the “chief enabler” of the business.

The new role of “Chief Enablement Officer”

Being the chief enabler means understanding not just which cloud services employees have purchased or signed up for, but which ones they actually use and get real value out of. Then it’s the CIO’s responsibility to evaluate those services for risks and benefits, and standardize on and promote the ones that best meet the organization’s needs.

In this way, the CIO can maximize the value of the services and help drive an organized and productive movement to the cloud.

To see the services most used by today’s enterprises, check out the 2013 Cloud Adoption & Risk Report below, which summarizes data across 3 million enterprise users.

SSH – Does Your “Cloud Neighbor” Have an Open Backdoor to Your Cloud App?

October 30, 2013

October 22, 2013

By Gavin Hill, Director, Product Marketing & Threat Research Center at Venafi

Secure Shell (SSH) is the de facto protocol used by millions to authenticate to workloads running in the cloud and transfer data securely. Even more SSH sessions are established automatically between systems, allowing those systems to securely transfer data without human intervention. In either case, this technology underpins the security of vital network communications. According to the Ponemon Institute, organizations recognize SSH’s role in securing network communication and rank threats to their SSH keys as the most alarming threat arising from failure to control trust in the cloud.

SSH authentication is only as strong as the safeguards around the authentication tokens: the SSH keys. Failure to secure and protect these keys can compromise the environment, breaking down the trust that SSH is meant to establish. Malicious actors take advantage of common mistakes in key management; the following are some of the pitfalls organizations fall prey to.
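Even basic key hygiene catches many of these mistakes. As a minimal sketch (assuming OpenSSH’s ssh-keygen is on the PATH, and using a hypothetical key path), the check below flags a private key that is readable by other users or stored without a passphrase:

```python
import os
import stat
import subprocess

def check_private_key(path):
    """Run two basic hygiene checks on an SSH private key file."""
    findings = []

    # The key file should be accessible only by its owner (e.g. mode 0600).
    mode = stat.S_IMODE(os.stat(path).st_mode)
    if mode & (stat.S_IRWXG | stat.S_IRWXO):
        findings.append(f"{path}: group/other permission bits set ({oct(mode)})")

    # `ssh-keygen -y` with an empty passphrase succeeds only for
    # unencrypted keys, so success here means the key is unprotected.
    result = subprocess.run(
        ["ssh-keygen", "-y", "-P", "", "-f", path],
        capture_output=True, text=True,
    )
    if result.returncode == 0:
        findings.append(f"{path}: private key is not passphrase-protected")

    return findings

for finding in check_private_key(os.path.expanduser("~/.ssh/id_rsa")):
    print(finding)
```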

The Weakest Link

Malicious actors often target SSH keys because SSH bypasses the authentication controls that typically regulate a system’s elevated privileges. In their efforts to exploit SSH, malicious actors naturally focus on compromising the weakest link in a highly secure protocol—human error and mismanagement of the private SSH keys.

The risks are real, and so are the costs. According to the Ponemon Institute, the average U.S. organization risks losing up to $87 million per stolen SSH key.

Lack of control

Less than 50% of organizations have a clear understanding of their encryption key and certificate inventory—let alone efficient controls to provision, rotate, track, or remove SSH keys. System administrators usually deploy keys manually, with different groups managing their own independent silos, leading to a fractured, distributed system. Without centralized monitoring and automated tools, system administrators cannot secure or maintain control of keys.

A report issued by Dell SecureWorks’ Counter Threat Unit revealed that one in every five Amazon Machine Images (AMIs) contains unknown SSH keys, each of which represents a door into the system to which an unknown party has access. As shocking as this fact seems, it is actually not surprising when you consider the ad-hoc management practices common in many organizations. In performing their jobs, application administrators copy their keys to multiple workloads but often fail to document the locations. As employees move on to new jobs, the keys linger, and the organization loses all ability to manage and assess its systems’ exposure to unauthorized access.
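A lightweight way to surface such unknown keys is to compare every authorized_keys entry against an allowlist of approved fingerprints. Here is a minimal sketch, assuming OpenSSH’s ssh-keygen is installed and a hypothetical allowlist file of SHA256 fingerprints (the /dev/stdin trick works on Linux):

```python
import subprocess
from pathlib import Path

# Hypothetical allowlist: one approved fingerprint (SHA256:...) per line.
APPROVED = set(Path("approved_fingerprints.txt").read_text().split())

def fingerprint(entry: str) -> str:
    """Return the SHA256 fingerprint of a single authorized_keys entry."""
    result = subprocess.run(
        ["ssh-keygen", "-lf", "/dev/stdin"],
        input=entry + "\n", capture_output=True, text=True, check=True,
    )
    # Output looks like: "2048 SHA256:abc... comment (RSA)"
    return result.stdout.split()[1]

for line in Path("/home/ubuntu/.ssh/authorized_keys").read_text().splitlines():
    line = line.strip()
    if not line or line.startswith("#"):
        continue
    fp = fingerprint(line)
    if fp not in APPROVED:
        print(f"UNKNOWN KEY: {fp} ({line.split()[-1]})")
```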

Injected elevated trust

An SSH server uses public-key cryptography to validate the authenticity of the connecting host. If the server simply accepts a public key without truly validating the identity of the connecting host, however, the server could easily give an attacker elevated access.

The mass-assignment vulnerability, which is still largely unpatched, offers one example of an injected elevated trust exploit. In secure networks, users require root or admin privileges to append their own SSH keys to the authorized keys file. Using the mass-assignment vulnerability, however, attackers create accounts that have the appropriate permissions. They then add their own SSH keys to gain the elevated privileges required to compromise the system.

Recycled rogue workloads

Cloud computing environments often reuse workloads. Amazon Web Services (AWS), for example, offers thousands of AMIs. However, you should exercise extreme caution when reusing a workload; educate yourself about the workload’s applications and configuration. In addition to rogue SSH keys, you may also find specific compromised packages.

For example, earlier this year hackers compromised thousands of web servers’ SSH daemons with a rootkit. The rootkit rendered companies’ key and password rotation policies futile: the SSH daemon simply yielded the new credentials to the attackers. The SSH rootkit completely replaced the ssh-agent and sshd binaries; only reinstalling SSH completely eliminated the threat.
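Reinstallation works because it replaces the compromised binaries; detecting the compromise in the first place requires knowing what the binaries should look like. A minimal sketch, assuming a hypothetical baseline file of known-good hashes recorded when the workload was built:

```python
import hashlib
import json
from pathlib import Path

# Hypothetical baseline recorded at build time, e.g.
# {"/usr/sbin/sshd": "3f6d...", "/usr/bin/ssh-agent": "9a1c..."}
BASELINE = json.loads(Path("ssh_binary_baseline.json").read_text())

def sha256(path: str) -> str:
    """Hash a file on disk so it can be compared with the baseline."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

for binary, expected in BASELINE.items():
    actual = sha256(binary)
    if actual != expected:
        print(f"POSSIBLE TAMPERING: {binary} hash {actual[:12]} "
              f"does not match baseline {expected[:12]}")
```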

BEST PRACTICES

Establish a baseline

Cloud computing has driven a proliferation of SSH keys, and administrative efforts have not kept pace. Yet, when you fail to understand the SSH deployment in your organization—which keys give access to which systems and who has access to those keys—you risk losing intellectual property and, worse, losing control of the workloads.

Inventory the entire organization on a regular basis to discover SSH keys on workloads running in the cloud and in the datacenter. Establish a baseline of normal usage so that you can easily detect any anomalous SSH behavior.
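The discovery step can start small. The sketch below (standard library only; the paths are typical Linux locations and may differ in your environment) walks home directories and counts key entries, the raw material for a baseline:

```python
import os

KEY_FILES = {"authorized_keys", "id_rsa.pub", "id_ecdsa.pub", "id_ed25519.pub"}

def discover_ssh_keys(root="/home"):
    """Walk home directories, reporting SSH key files and their entry counts."""
    inventory = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name not in KEY_FILES:
                continue
            path = os.path.join(dirpath, name)
            try:
                with open(path, errors="ignore") as f:
                    entries = [l for l in f if l.strip() and not l.startswith("#")]
            except OSError:
                continue  # skip unreadable files
            inventory.append((path, len(entries)))
    return inventory

for location, count in discover_ssh_keys():
    print(f"{location}: {count} key entries")
```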

Enforce policies

Frequent credential rotation is a best practice, and you should make no exception for SSH keys. Unfortunately, many organizations leave SSH keys on systems for years without any rotation. Although most cloud computing workloads are ephemeral, they are typically spun up from templates with existing SSH credentials, which are rarely rotated. Malicious actors can also crack vulnerable versions of SSH or SSH keys that use exploitable hash algorithms or weak key lengths.

To secure your environment, enforce cryptographic policies that prohibit the use of weak algorithms and key lengths, implement version control, and mandate key rotation.
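Such a policy is easy to spot-check. A minimal sketch, again assuming OpenSSH’s ssh-keygen is available and parsing its one-line fingerprint output (the minimums shown are illustrative, not a standard):

```python
import subprocess

# Illustrative policy: minimum bits per key type; None means disallowed.
MIN_BITS = {"RSA": 2048, "ECDSA": 256, "ED25519": 256, "DSA": None}

def check_key_policy(pubkey_path: str) -> list[str]:
    """Flag SSH public keys that violate the minimum-strength policy."""
    out = subprocess.run(
        ["ssh-keygen", "-lf", pubkey_path],
        capture_output=True, text=True, check=True,
    ).stdout.split()
    # Output looks like: "2048 SHA256:abc... comment (RSA)"
    bits, key_type = int(out[0]), out[-1].strip("()")
    if MIN_BITS.get(key_type) is None:
        return [f"{pubkey_path}: key type {key_type} is not allowed"]
    if bits < MIN_BITS[key_type]:
        return [f"{pubkey_path}: {key_type} key is only {bits} bits"]
    return []

print(check_key_policy("/etc/ssh/ssh_host_rsa_key.pub"))
```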

Scrutinize workload templates

If you choose to use prebuilt templates, implement an assessment process before the workload is used in production. Do not simply accept a prebuilt workload template created by someone you do not know. First carefully inspect the template; ensure that the applications are patched, that the workload configuration is secure, and that there are no rogue applications or keys that could be used as a backdoor.
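An assessment process like this can be partly automated. As a sketch only (the mount point is hypothetical, and a real audit would add many more checks, such as package patch levels), a pre-deployment gate might look like:

```python
import os

MOUNT = "/mnt/template-root"  # the template's filesystem, mounted read-only

def find_baked_in_keys(root):
    """authorized_keys files shipped inside the image are potential backdoors."""
    return [
        os.path.join(d, "authorized_keys")
        for d, _dirs, files in os.walk(root)
        if "authorized_keys" in files
    ]

def find_world_writable(root):
    """World-writable files suggest a sloppy or tampered configuration."""
    hits = []
    for d, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(d, name)
            try:
                if os.stat(path).st_mode & 0o002:
                    hits.append(path)
            except OSError:
                continue
    return hits

report = {
    "baked-in authorized_keys": find_baked_in_keys(MOUNT),
    "world-writable files": find_world_writable(MOUNT),
}
for check, findings in report.items():
    print(f"[{'FAIL' if findings else 'PASS'}] {check}: {len(findings)} finding(s)")
```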

Venafi Blog URL: http://www.venafi.com/ssh-an-open-backdoor-to-your-cloud-app/

Patching the Perpetual MD5 Vulnerability

October 18, 2013

October 17, 2013

By Gavin Hill, Director, Product Marketing & Threat Research Center at Venafi

Earlier this month, Microsoft updated the security advisory that deprecates the use of the MD5 hash algorithm for certificates issued by certification authorities (CAs) in the Microsoft root certificate program. The patch has been released so that administrators can test its impact before a Microsoft Update on February 11, 2014 enforces the deprecation. This is an important move in the fight against the cyber-criminal activity that abuses the trust established by cryptographic assets like keys and certificates.

For over 17 years, cryptographers have been recommending against the use of MD5. MD5 is considered weak and insecure; an attacker can easily use an MD5 collision to forge valid digital certificates. The most well-known example of this type of attack is when attackers forged a Microsoft Windows code-signing certificate and used it to sign the Flame malware. Although the move to deprecate weak algorithms like MD5 is most certainly a step in the right direction, there still are some questions that need to be addressed.

Why is the Microsoft update important?

Cryptographers have been recommending the use of hash algorithms other than MD5 since 1996, yet Flame malware was still successful in 2012. This demonstrates that security professionals have failed to identify a vulnerability in their security strategy. However, cyber-criminals have most certainly not missed the opportunity to use cryptographic keys and digital certificates as a new way into enterprise networks. That Microsoft will soon enforce the deprecation of MD5 indicates that vendors and security professionals are starting to take note of keys and certificates as an attack vector.

Research performed by Venafi reveals that 39% of the hash algorithms used by Global 2000 organizations are still MD5. Such widespread use is worrying on a number of levels: it clearly highlights that organizations either do not understand the ramifications of using weak algorithms like MD5 or simply have no idea that MD5 is being used in the first place. Research from the Ponemon Institute provides evidence that organizations simply don’t know that MD5 is being used—how could they, when more than half of them don’t even know how many keys and certificates are in use within their networks?
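Finding out is straightforward once certificates are collected in one place. A minimal sketch using the third-party Python `cryptography` package (the certificate directory is hypothetical):

```python
from pathlib import Path

from cryptography import x509

CERT_DIR = Path("/etc/pki/certs")           # hypothetical certificate store

for pem in CERT_DIR.glob("*.pem"):
    cert = x509.load_pem_x509_certificate(pem.read_bytes())
    algo = cert.signature_hash_algorithm    # None for some signature schemes
    name = algo.name if algo else "unknown"
    if name in ("md5", "sha1"):
        print(f"WEAK SIGNATURE: {pem} is signed with {name} "
              f"(subject: {cert.subject.rfc4514_string()})")
```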

What’s the impact of the security update?

Microsoft’s update is not to be taken lightly; this is probably why Microsoft has given organizations six months to test the patch. Once they have deployed the update, administrators will be able to monitor their environments for weak cryptography and take action to protect themselves from the vulnerabilities associated with MD5 hash algorithms or inadequate key sizes. Options available to administrators include the ability to block weak cryptographic algorithms, overriding default operating system settings.

However, if a business has critical certificates that use MD5, enforcing such a security policy could result in system outages that impact the business’s ability to service customer requests. For this reason, the update allows administrators to opt in to or out of each policy independently, or simply to log access attempts by certificates with weak algorithms without taking any action to protect the system. The update also allows policies to be set based on certificate type: all certificates, SSL certificates, code-signing certificates, or time-stamping certificates.

Although I understand that Microsoft is allowing customers to choose how wide a net they are able to cast on MD5, the choices system administrators have when a security event is triggered should be of concern. Instead of choosing to apply the security policy to “all certificates,” some companies, out of concern for system outages, may limit the enforcement to a subset of certificate types. After all, history has shown that organizations have neglected to do anything about the known MD5 vulnerability for many years; they might easily continue to postpone the requisite changes. As a result, some companies may leave a massive open door for cyber-criminals to exploit.

Are there other weaknesses in cryptography that should concern me?

MD5 is not the only cryptographic weakness that should concern IT security professionals—there are many. However, I am only going to focus on a few of the most common.

Insufficient key length: Since 2011, the National Institute of Standards and Technology (NIST) has deprecated encryption keys of 1024 bits or less. After December 31, 2013, the use of 1024-bit keys will be disallowed due to their insecurity. Despite this, Venafi’s research shows that 66% of the encryption keys used by Global 2000 organizations are still 1024-bit keys. Vendors and service providers like Google, Microsoft, and PayPal made the shift to 2048-bit keys earlier this year. If you have 1024-bit keys in use, now is the time to upgrade to 2048-bit keys.
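The upgrade itself is mechanical; the hard part is finding the weak keys. A minimal sketch with the third-party `cryptography` package (the file names are hypothetical, and the replacement is written unencrypted purely for brevity):

```python
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

with open("server_key.pem", "rb") as f:            # hypothetical key file
    key = serialization.load_pem_private_key(f.read(), password=None)

if key.key_size < 2048:
    print(f"Key is only {key.key_size} bits; generating a 2048-bit replacement")
    new_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    with open("server_key_2048.pem", "wb") as f:
        f.write(new_key.private_bytes(
            serialization.Encoding.PEM,
            serialization.PrivateFormat.TraditionalOpenSSL,
            serialization.NoEncryption(),          # encrypt at rest in practice
        ))
```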

Lack of visibility: The majority of organizations lack visibility into or understanding of their key and certificate population. Organizations simply don’t know how many keys and certificates are in use on the network, what access they provide to critical systems, who has access to them, or how they are used. Businesses without visibility into such a critical attack vector—and with limited or no ability to respond quickly—are an attacker’s dream. To mitigate these vulnerabilities, you must gain a complete understanding of your key and certificate population so that you know where your organization is vulnerable.

Inability to remediate: How can you defend something if you don’t know what you are defending? The lack of visibility has led to real vulnerabilities. Forrester Research found that 44% of organizations have already experienced an attack based on keys and certificates. Moreover, 60% of these businesses could not respond to the attacks, whether on SSH or SSL, within 24 hours. And the response, when it finally comes, usually involves a laborious manual process that often leaves some systems unchecked.
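Building that visibility can begin with a simple scan of the endpoints you already know about. A minimal sketch combining Python’s standard ssl module with the third-party `cryptography` package (the host list is a hypothetical inventory):

```python
import ssl

from cryptography import x509

def describe_endpoint(host, port=443):
    """Fetch a server's certificate and summarize what an inventory needs."""
    pem = ssl.get_server_certificate((host, port))
    cert = x509.load_pem_x509_certificate(pem.encode())
    algo = cert.signature_hash_algorithm
    return {
        "host": host,
        "subject": cert.subject.rfc4514_string(),
        "key_bits": cert.public_key().key_size,
        "signature_hash": algo.name if algo else "unknown",
        "expires": cert.not_valid_after.isoformat(),
    }

for host in ("www.example.com",):                  # hypothetical inventory
    print(describe_endpoint(host))
```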

What can I do to avoid these vulnerabilities?

To protect your business against attacks on keys and certificates, I recommend that you invest wisely in technologies that apply robust policies against the use of weak algorithms and poorly configured cryptography. At the same time, the technology should be able to detect anomalous behavior of keys and certificates and automatically respond, remediating any key- and certificate-based risks.

Safeguarding Cloud Computing One Step at a Time

October 17, 2013

by Manoj Tripathi, PROS

There’ve been a lot of conversations around the concept of “the cloud.” Cloud storage and cloud computing continue to emerge as significant technology and business enablers for organizations. In many cases, cloud computing is a preferred option – it’s fast to set up and affordable. However, with this cloud convenience can come questions about the security challenges of shared computing environments. It’s important that these cloud concerns are discussed and researched, to continue building momentum and increasing trust in cloud computing. The Cloud Security Alliance (CSA) was formed to do just that.

The CSA is a member-driven, nonprofit organization with the objective of promoting cloud security best practices. It promotes research, encourages open-forum discussions, and articulates findings about what vendors and customers alike need to safeguard their interests in the cloud and resolve cloud computing security concerns.

The current list of the CSA corporate members is impressive, with big name players in the technology biz including PROS partners Accenture, Deloitte, Microsoft, Oracle and Salesforce. PROS is pleased to announce that it has joined the ranks of the CSA.

PROS is dedicated to providing customers with secure and trusted cloud-based technology that adheres to security best practices and standards. The cloud is no longer a technology “playground.” It’s quickly becoming a mainstream solution, and with that increase in usage comes the obligation to ensure that data, systems and users are secure wherever they may reside. We’re excited to step up to the challenge of developing and improving the guidelines surrounding this progressive technology.

If you’re interested in discussing big data and security, I’ll be speaking at the Secure World Expo on Oct. 23 in Dallas and the Lonestar Application Security Conference 2013 on Oct. 25 in Austin.

Manoj Tripathi is a security architect at PROS, where he focuses on security initiatives, including strategy, architecture, controls, and secure development and engineering, for the company’s enterprise, product, and cloud functions. Prior to joining PROS, Tripathi worked as a software and security architect for CA Technologies’ Catalyst Platform. Throughout his career, he has worked in diverse roles ranging from architecture design and technical leadership to project leadership and software development. Tripathi is a Certified Information Systems Security Professional (CISSP). He earned a B.E. in Electronics Engineering from the Motilal Nehru National Institute of Technology in India.

Visit http://www.pricingleadership.com/safeguarding-cloud-computing-one-step-at-a-time for more details

Gone in 60 Months or Less

October 10, 2013

by Gavin Hill, Director, Product Marketing & Threat Research Center at Venafi

For years, cybercriminals have been taking advantage of the blind trust organizations and users place in cryptographic keys and digital certificates. Only now are vendors starting to respond to the use of keys and certificates as an attack vector.

Last month, for example, Google announced that as of Q1 2014 Google Chrome and the Chromium browser will not accept digital certificates with a validity period of more than 60 months. Certificates with a longer validity period will be considered invalid.[i] Mozilla is considering implementing the same restrictions; however, no decision has been announced yet. But are the responses from vendors enough in the constant battle against compromised keys and certificates as an attack vector?

The Certificate Authority/Browser (CA/B) Forum, a volunteer organization that includes leading Certificate Authorities (CAs) and software vendors, has issued baseline requirements for keys and certificates, which include reducing the certificate’s validity period. As of 1 April 2015, CAs should not issue certificates that have a validity period greater than 39 months.[ii] The CA/B Forum makes some—very few—exceptions whereby CAs are allowed to issue certificates that have a 60-month validity period.
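Checking your own inventory against these baselines takes only a few lines. A minimal sketch with the third-party `cryptography` package (the certificate file is hypothetical, and 30.44 is just the average month length):

```python
from cryptography import x509

MAX_MONTHS = 39                                    # CA/B Forum baseline

def validity_months(cert: x509.Certificate) -> float:
    """Approximate a certificate's validity period in months."""
    delta = cert.not_valid_after - cert.not_valid_before
    return delta.days / 30.44                      # average days per month

with open("cert.pem", "rb") as f:                  # hypothetical certificate
    cert = x509.load_pem_x509_certificate(f.read())

months = validity_months(cert)
if months > MAX_MONTHS:
    print(f"Validity period is {months:.1f} months (policy limit {MAX_MONTHS})")
```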

The National Institute of Standards and Technology (NIST) has disallowed the use of 1024-bit keys after 31 December 2013 because they are insecure. Rapid advances in computational power and cloud computing make it easy for cybercriminals to break 1024-bit keys. When a researcher from École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland cracked a 700-bit RSA key[iii] in 2007, he estimated that 1024-bit key lengths would be exploitable 5 to 10 years from then. Not even three years later, in 2010, researchers cracked a 1024-bit RSA key.[iv]

Last week, Symantec responded to NIST’s recommendation in a blog post, stating that on 1 October 2013 it would automatically revoke all certificates that have a key length of less than 2048 bits. The only exception is certificates that are set to expire before 31 December 2013. Symantec responded quickly because the company wants to help customers avoid potential disruptions to their websites and internal systems during the holiday period.[v]

Both the certificate’s validity period and the key’s length are paramount in any security strategy. The deprecation of vulnerable key lengths is the first step in mitigating keys and certificates as an attack vector, but reducing the validity period of certificates is an important second step. Longer validity periods offer an inviting open door to cybercriminals, who can take advantage of advances in computational power and cloud computing to launch more sophisticated attacks. No one knows when 2048-bit keys will be broken, but enforcing a 60-month validity period will help organizations adhere to best practices, rotating certificates on a regular basis and, in doing so, potentially replacing older certificates with ones that have better cipher strengths. Who knows? In 60 months, companies may need to move to 4096-bit keys to achieve adequate security.

Symantec’s move to revoke, on 1 October 2013, all 1024-bit certificates with expiration dates after 31 December 2013 is bold, and most certainly a step in the right direction. With such a short amount of time before the certificates become invalid, however, it will be very challenging for many organizations to replace the certificates in time. Most organizations (more than 50%) don’t have a clue how many keys and certificates they have in their inventory.[vi] Moreover, they manage their certificate inventories manually, making it difficult to respond quickly to new guidelines or actual attacks.

Cyber-attacks continue to advance in complexity and speed, and they increasingly target the keys and certificates used to establish trust—from the data center to the cloud. With the advances in technology, is a 60-month, or even a 39-month, validity period for certificates short enough to reduce risk? Perhaps certificates should be ephemeral, with a lifespan of only a few seconds. Reducing the lifespan of certificates to only a few seconds could drastically limit the exploitation of certificates as an attack vector.



[i] https://cabforum.org/pipermail/public/2013-August/002135.html

[ii] https://www.cabforum.org/Baseline_Requirements_V1.pdf

[iii] http://www.geek.com/news/rsa-1024-bit-encryption-only-has-a-few-years-left-565998/

[iv] http://news.techworld.com/security/3214360/rsa-1024-bit-private-key-encryption-cracked/

[v] http://www.symantec.com/connect/blogs/deadline-upgrade-2048-bit-ssl-certificates-sooner-you-might-think

[vi] www.http://venafi.com/ponemon

 

The Power of “Yes”

October 3, 2013

by Sanjay Beri, CEO of Netskope

Shadow IT is a big deal. The problem is clear: people want their apps so they can go fast, while IT needs to ensure that the company’s systems and data are secure and compliant.

Everybody seems to have a Shadow IT solution these days. The problem is they’re all focused on blocking stuff. But blocking is so old school. Nobody wants to be in the blocking business these days. It’s counter to the efficiency and progress that the cloud brings.

IT and security leaders are smarter than that. Many of you are at the forefront of cloud adoption and want to lead your organization through this strategic shift.

Rather than say “no”, we at Netskope recommend saying “yes.” More specifically, we recommend “yes, and.” It’s a pretty powerful concept!

Try it out:

“Yes, you can use that useful app, and I’m going to set a very precise policy that will mitigate our risk.”

“Yes, music company people, use Content Management apps to share content with your rock stars, promoters, and partners. And I’m going to make sure that only those authorized can share files outside of the company.”

“Yes, developer of oncology solutions, you can use that Collaboration tool that will help your projects run smoothly. And I’m going to alert our risk officer if anybody in the clinical division uploads images that may make us non-compliant with HIPAA.”

“Yes, major e-commerce company, you can use your CRM. And I’m going to make sure that our Call Center employees outside of the U.S. aren’t downloading customer information.”

You can say “yes” when you can deeply inspect apps and answer the key questions – who’s taking what action in what app, when, from where, sharing with whom, sending to where, downloading what content, uploading what content, editing what object or field…and whether any of it is anomalous.

And you can say “yes” when you can set policies at a very precise and granular level, and ENFORCE them in real time before bad stuff happens.
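To make the idea concrete, here is a purely hypothetical sketch of what such a context-aware rule might look like; every name in it is invented for illustration, and real products express policies in their own terms:

```python
from dataclasses import dataclass

@dataclass
class Event:
    user_group: str
    app_category: str
    action: str
    location: str

def allowed(event: Event) -> bool:
    """'Yes, use the CRM, and call-center staff outside the U.S.
    may not download customer records.'"""
    if (event.user_group == "call-center" and event.location != "US"
            and event.app_category == "CRM" and event.action == "download"):
        return False
    return True

print(allowed(Event("call-center", "CRM", "download", "IN")))   # False: blocked
print(allowed(Event("sales", "CRM", "download", "US")))         # True: allowed
```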

If you can do those things, you can take a “yes, and” approach. As a cloud-forward technology leader in your company, this is the most powerful statement you can make.

If you happen to be at Gartner Symposium and ITxpo, come see Netskope in the Emerging Technology Pavilion, booth ET19, to talk more about the power of “yes, and” and attend this session where you’ll hear not one, but three, of Netskope’s customers talk about how they are letting users go rogue and how they’re doing it safely so the business can go fast, with confidence.

And if you can’t see us at the event, be sure to check us out at www.netskope.com. Learn more about Netskope’s deep analytics and real-time policy enforcement platform. It lets you say “yes, and.”

 
