Assurance for Tomorrow’s Cloud

Cloud computing and Big Data are natural bedfellows.  Add critical infrastructure and consumers to that mix, and the need for greater assurance only increases.  We will soon witness the convergence of these technological advancements on a monumental scale, with previously disconnected systems becoming connected.

This degree of convergence is also blurring the lines between the physical and logical worlds.  Previously, IP-enabled devices simply allowed you to access the Internet or email.  In the future, however, and to a lesser extent even today, we are using IP networks to control our televisions, music systems and so on.   Expect to see greater autonomy within the home (e.g. smart meters, consumer appliances), in the car and at work, where building management systems will include IP-enabled Heating, Ventilation, and Air Conditioning (HVAC), security systems, lighting, fire monitoring, lifts, and so on.

This should of course change the assurance requirements.

A recent article in the New York Times suggested that “Providers of cloud services in Europe are having problems selling to some of their biggest potential customers: national governments”[i].  The reason ultimately came down to assurance; in other words, it was not clear to the end customer how data would be protected.  This is of course not entirely true, as there are many public sector customers using public cloud computing services, but the statement takes on an element of validity when we question the sensitivity of data being entrusted to third parties.

In the future, however, the level of assurance sought will only increase.  The implications of data loss are significant, but when a security incident could affect the availability of critical infrastructure, such as the energy grid, a once-a-year checklist compliance assessment is simply not enough.  This raises the importance of continuous assessment, and in particular an assessment that is capable of monitoring the actions undertaken by a third party.    It also underlines the imperative for the success of programmes such as CloudAudit, which allows “cloud computing providers to automate the Audit, Assertion, Assessment, and Assurance of their infrastructure (IaaS), platform (PaaS), and application (SaaS) environments and allow authorized consumers of their services to do likewise via an open, extensible and secure interface and methodology”.
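CloudAudit is designed to work over plain HTTPS: a provider publishes assertions and supporting evidence under a well-known namespace, and an authorized consumer retrieves them programmatically, which is what makes continuous rather than annual assessment practical. The sketch below illustrates the pattern in Python; the provider host, namespace path and control path are illustrative assumptions, not details taken from the CloudAudit specification.

```python
# Illustrative sketch of consuming CloudAudit-style assertions over HTTPS.
# The host, namespace layout and control path below are assumptions made
# for illustration; consult a provider's actual CloudAudit documentation.
import urllib.request

PROVIDER = "https://cloud.example.com"    # hypothetical provider
NAMESPACE = "/.well-known/cloudaudit"     # assumed assertion root

def fetch_evidence(control_path: str) -> bytes:
    """Retrieve the evidence a provider publishes for one control."""
    url = f"{PROVIDER}{NAMESPACE}/{control_path}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read()

# A continuous-assessment job would schedule calls like this one instead
# of relying on a once-a-year checklist review.
evidence = fetch_evidence("org/example/compliance/manifest.xml")
print(len(evidence), "bytes of evidence retrieved")
```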

Simply put, to support the Internet of Things and the explosion of IP-enabled devices, the need for greater assurance will only grow.  The assurance models, whilst entirely appropriate today, need to evolve to support the cloud of tomorrow.

By Raj Samani

Twitter: @Raj_Samani

Raj is the EMEA Strategy Advisor for the Cloud Security Alliance and EMEA CTO for McAfee.  He is the co-author of the upcoming Syngress book entitled Cyber Security for the Smart Grid, written with Eric Knapp (Twitter @edknapp), with technical edits by Joel Langill (@ScadaHacker).



[i] http://www.nytimes.com/2012/11/21/technology/european-governments-staying-out-of-the-cloud.html?_r=0

The Battle of the Titans: What it all means for IT managers caught in the middle

Adapt, accept and manage: a BYOD mantra for corporate IT

RIM and Apple: two firms with more contrasting current fortunes you could not wish to imagine. The once high-flying Canadian BlackBerry-maker, for so long the darling of IT managers and beloved of time-starved execs the world over, has lost its way as rivals from the consumer space start to eat into its core enterprise business. Then there’s the phenomenon that is Apple, the Cupertino giant molded into the slick, stylish consumer success story it is today by the late Steve Jobs. You’re probably as likely in many organizations to see staff using an iPhone for work as a BlackBerry today, which makes two recent announcements from the tech giants all the more interesting for what they say about the firms’ respective strategies and what it all means for IT managers caught in the middle.

Let’s take Apple first. A company whose primary aim is to make beautiful products at high margins, it was 100 per cent focused on the consumer when its iOS-based iPhone burst onto the scene back in 2007. Since then, the Cupertino firm has released several more models, as well as the market-leading tablet, the iPad, and slowly appears to be rolling more enterprise-friendly features into the platform.

Take, for example, volume purchases for businesses via the App Store – recently added capabilities designed to streamline the large-scale buying of applications for corporate users. Or how about the iPhone in Business and the iPad in Business web sites? Both are designed to attract the business user and showcase features which could appeal to those looking for a new corporate device. The latest much-touted announcement was the launch of the iPad Configurator: a new Mac app which enables administrators to configure up to 30 devices at a time according to corporate requirements – but not to manage them remotely.

Sounds great, but don’t let this slow creeping of iOS functionality into the enterprise fool you into thinking Apple has suddenly become a business-friendly company. Sure, it is providing more capabilities now in its devices to make them easier to use and manage in the corporate sphere, but it will always be a consumer-focused firm. It’s just that it has made its products so user-friendly that everyone who buys one now also wants to use it at work.

If you’re in any doubt as to Apple’s primary focus, consider the iPad Configurator. It enables management of only up to 30 devices – not practical for any but the smallest of organizations – and is primarily designed for the IT department which has purchased its devices and has yet to dole them out, rather than one faced with the problem of managing existing user-bought devices. Then let’s think about Apple the company. Does it have enterprise sales and support staff? An enterprise sales platform? Does it clearly communicate its product roadmap so large scale and long-term purchasing plans can be drawn up by its business customers? The answer to all of these questions is not really, although sources indicate that Apple may be acquiring some enterprise sales staff from a well-known corporate tech vendor.

Yet despite Apple’s lack of business credentials, IT managers must evolve to meet the increasingly demanding needs of their users and the changing requirements of the role. Put simply, this means that they can no longer procure from a single enterprise vendor – they need to open up to multiple providers and be ready to accept and manage consumer devices. The good news is that there are vendors who can help fill the growing security and management holes that have appeared in this new mobile computing environment. One of them, perhaps surprisingly, is that old friend of the IT department, Research In Motion.

Now RIM has seen its business stall thanks in large part to the success of the iPhone, as well as the obvious challenge from Android. Recent Forrester research in fact places the three as having a roughly equal share of the workplace market. Unfortunately, instead of sticking to what it does best – providing highly secure hardware and sophisticated management software – it tried to beat Google and Apple at their own game and entered the consumer space. The strategy hasn’t worked and the company lurches from one bad launch to another with profits and share price plummeting. However, it did something very smart in April – it launched an update to its BlackBerry Mobile Fusion server software which allows admins to manage iOS and Android devices as well as BlackBerry.

Unlike Apple, which is resolutely homogenous – you won’t be able to use the iPad Configurator for any non-Apple device, for example – RIM has taken the bold step of admitting not everyone in the enterprise will use a BlackBerry. This is a genuine move in the right direction – not only is a focus on the software side of its business better for its margins but it also plays to the firm’s biggest strength, its market leading security and mobile device management capabilities.

It should also serve as a firm reminder to any IT managers still not sure how to respond to the disruptive force of consumerization. If RIM can open itself up to interoperability with rival platforms, maybe they too should adopt a more open mindset when revising their corporate mobile device strategy.

The sands are rapidly shifting in enterprise IT, but quick-witted IT professionals will understand that they are no longer providers of technology for their company but brokers. It’s not for them to decide what mobile platforms to use but for their execs, line-of-business owners and end users to decide. IT’s new role is to engage as fully as possible with the requirements of the end users, find out where potential vulnerabilities lie and make it happen.

Adapt, accept and manage is the new Consumerization mantra for corporate IT.

As Vice President of Mobile Security at Trend Micro, Cesare Garlati serves as the evangelist for the enterprise mobility product line. Cesare is responsible for raising awareness of Trend Micro’s vision for security solutions in an increasingly consumerized IT world, as well as ensuring that customer insights are incorporated into Trend solutions. Prior to Trend Micro, Mr. Garlati held director positions within leading mobility companies such as iPass, Smith Micro and WaveMarket. Prior to this, he was senior manager of product development at Oracle, where he led the development of Oracle’s first cloud application and many other modules of the Oracle E-Business Suite.

Cesare has been frequently quoted in the press, including such media outlets as The Economist, Financial Times, The Register, The Guardian, Le Figaro, El Pais, Il Sole 24 Ore, ZDNet, SC Magazine, Computing and CBS News. An accomplished public speaker, Cesare has also delivered presentations and keynote speeches at many events, including the Mobile World Congress, Gartner Security Summits, IDC CIO Forums, CTIA Applications and the RSA Conference.

Cesare holds a Berkeley MBA, a BS in Computer Science and numerous professional certifications from Microsoft, Cisco and Sun. Cesare is the chair of the Consumerization Advisory Board at Trend Micro and co-chair of the CSA Mobile Working Group – Cloud Security Alliance.

The High Costs of Securing Identities: How to Fix the Problem Using the Cloud

Authored by: Dan Dagnall, Chief Technology Strategist at Fischer International Identity

 

Identity Management is well down the path to market maturity.  But I believe there is still one final, fundamental disconnect that is driving up your cost of deploying and maintaining an identity management solution, and that is programming and customization.

 

For example, one can appreciate the need to tailor your users’ experience within your organization to be the way that you want it, but the question must be asked: to what end?  Do you believe that identity management solutions should require your staff to write programming code in order to connect to your systems or to maintain custom user interfaces?  Should your IdM solution require a strategy for maintaining a code base, or simply a strategy to secure user access and their identifiers while increasing efficiencies across your organization?  These questions are important because, when we get down to brass tacks, they represent the primary drivers that can lead to insurmountable costs associated with maintaining and supporting your IdM solution.

 

“Fun factor” (and personal preference) aside, there is no reason why multiple industries should not be able to adopt similar identity management practices.  I can validate that personally: I’ve worked with multiple customers, in multiple industries, and all of them have many requirements in common. Your identity management requirements are not as unique or “custom” as you might think.  Specifically, you need password management, you need user provisioning, you need approvals, etc.  The fundamentals of deploying such services do not change across industries (or IdM vendors); it is the mechanics that change.  And certain mechanisms that enable IdM simply cost more: programming your way to a solution costs much more than simply configuring the solution without requiring a single programmer (yes, it’s possible, and available right now).
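To make the configuration-versus-programming distinction concrete, here is a minimal sketch of what a configuration-driven approach can look like: the provisioning rules live in declarative data that a generic engine interprets, so a policy change is an edit to configuration rather than to a code base. The engine interface, field names and policy values are all invented for illustration and do not depict any particular vendor’s product.

```python
# Hypothetical declarative provisioning policy. The rules are plain data
# interpreted by a generic IdM engine: adding an approver or a target
# system is a configuration change, not a programming task.
PROVISIONING_POLICY = {
    "new_hire": {
        "approvals": ["manager", "it_security"],
        "grant": {
            "email":     {"system": "exchange", "template": "standard"},
            "directory": {"system": "ldap", "groups": ["staff"]},
        },
        "password": {"self_service_reset": True, "min_length": 12},
    },
}

def provision(user: dict, engine) -> None:
    """A generic engine walks the policy; no per-customer glue code."""
    policy = PROVISIONING_POLICY[user["event"]]
    if engine.collect_approvals(user, policy["approvals"]):
        for account, spec in policy["grant"].items():
            engine.create_account(user, account, **spec)
```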

 

The cloud serves as the mechanism that enables configuration, as opposed to programmer-driven customization, to provide each and every industry with a predictable cost, a predictable path (with a real light at the end of the tunnel!) and a predictable result for solving identity management problems. In order to explain how a cloud service model can drastically reduce your overhead associated with identity management, I must first define what identity management IS and what it IS NOT.

 

What is Identity Management?

 

Identity management (IdM) describes the management of individual identifiers, their authentication, authorization, and privileges/permissions within or across system and enterprise boundaries with the goal of increasing security and productivity while decreasing cost, downtime, and repetitive tasks (http://en.wikipedia.org/wiki/Identity_management). This is the definition provided by Wikipedia, and for the most part it is accurate; however, it is the last half of the sentence that I’d like to focus on.

 

“…with the goal of increasing security and productivity while decreasing cost, downtime and repetitive tasks”

 

Perfect. That’s exactly what everyone who ever decided they needed an identity management solution hoped to achieve.  Unfortunately, the reality in many cases is the exact opposite, specifically for on-premise deployments where consultants stand up your solution and turn over the keys when the project is complete.  If you’ve procured a solution that requires constant care and feeding, that consultant may be needed again to ensure your solution continues to serve its purpose and doesn’t lag behind and eventually fall short of securing your identities into the future.

 

Sure, all identity management solutions should “increase security” (if they don’t, then what’s the point?), and they should all “increase productivity” (if repetitive processes are automated, productivity by default will increase), which on the surface appears to lead to “decreased cost.” But the cost decreases gained from efficiencies are quickly overtaken by the cost required to support the solution itself.  This is a direct result of the mechanism chosen to manage your solution (i.e., holding the customer hostage to programming code, as well as to the responsibility of maintaining that code after the production deployment).

 

What is NOT Identity Management?

 

First and foremost, writing programming code is NOT identity management.  Frankly, from a customer perspective, it should not enter into the equation, ever. In order to call yourself an identity management provider, you must provide full-scale, end-to-end identity management capabilities in a way that enables customers to input their local policy, define their workflow(s), connect to their downstream target applications, and use out-of-the-box end-user interfaces that are directly connected to those same policies and resources distinct to each organization, all without the requirement to write “glue code” to make it happen.  And by this I mean managing users’ identities, not managing and editing programming code that then leads to managing user identities.  I’m speaking of programming, and debugging, and more programming, and more waiting to leverage new functionality or a new process, and more… I could go on.

 

As an organization, the second you have to write programming code so your “solution” can actually provide value, you’ve lit the fuse that will eventually result in an explosion in overhead; specifically, the costs associated with maintaining what will essentially become a programmer’s playground, and the end of your “increased security,” “increased productivity,” and most importantly, your “decreased cost.”  When your identity management “solution” starts to take on the attributes of a software company, rest assured that is NOT the intent of identity management; in fact, the result will be the exact opposite.  Identity management products must enable you to focus on your policy, your data, and your business rules. They shouldn’t force you to focus on how to connect to your downstream target systems, or force you to be an expert computer programmer in order to solve your identity-related problems.  Managing identities does not have to be that way. You have other options to realize “increased security,” “increased productivity,” and “decreased cost” without programming at all.

 

So how can “the cloud” decrease my identity-related costs and overhead?

 

If your primary driver for procuring identity management is to “increase security,” “increase productivity,” and “decrease cost,” the cloud should be a strong contender as you vet potential solutions.  “The cloud,” as it has been coined, is definitely more than a potential cost-saving option at this point. It is THE most impactful method to lower your operating costs while maintaining or improving service levels to your user community.

 

First, let’s talk security…

 

Cloud-based identity management can be more secure than conventional, on-premise deployments. Storing sensitive user data in the cloud is the single biggest point of contention when we discuss cloud-based IdM, followed closely by questions about identity-related data being sent over the public Internet to get from the customer’s network to the cloud provider. For starters, data sent across the web is protected by web-services security, including PKI, so it is secure in transit. Second, we must consider the unpopular truth that, in many cases, a local datacenter is less secure than those of service providers. Also, most data breaches are caused by internal, often disgruntled, users. Externalizing the datacenter from the local premises helps address the issue of employees conspiring to remove sensitive information, while introducing a third party into the process adds a further level of data storage security.
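To illustrate the transport-security point, the sketch below shows the general mechanism: a client refuses to send identity data unless the provider presents a certificate that chains to a trusted CA. The endpoint and payload are placeholders invented for illustration; this is a generic sketch, not any vendor’s interface.

```python
# Sketch: identity data never travels to the cloud provider in the clear.
# Python's ssl module refuses the connection unless the provider presents
# a certificate that validates against a trusted CA (the PKI part).
# The host name and payload are placeholders.
import json
import ssl
import urllib.request

context = ssl.create_default_context()  # verifies certificate and hostname

payload = json.dumps({"user": "jdoe", "action": "password_reset"}).encode()
request = urllib.request.Request(
    "https://idm.example.com/api/v1/requests",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(request, context=context, timeout=10) as resp:
    print(resp.status)
```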

 

Finally, decreasing cost…

 

First, it’s a service, so it includes the entire software stack, which may include automated provisioning, role management, self-service portals, self-service [automated] password reset, as well as audit/compliance and governance controls.  Second, because it’s a service, you only have to subscribe to the services you want, as opposed to licensing an entire product suite when you only require a fraction of it to address your specific needs.  Simply outsourcing the administration of such a large stack of services can save you one to two FTEs (including help desk staff, as well as server administrators such as DBAs).  Once you consider the laundry list of infrastructure requirements to support the IdM stack, as well as the operational hours associated with managing and supporting the platform, you can begin to realize the significant cost savings your organization can achieve if you choose to secure your identities via an Identity as a Service model.  And let’s never forget the expensive staffing requirements to maintain any “glue code” that is required to actually provide value to your organization.  ALL OF IT goes away in the IaaS® model.

 

In closing, identity management is simply not scalable for your organization when finances are a factor and the mechanism in use requires you and your staff to maintain extensive “glue code” in order to keep your solution afloat and growing to meet your demands.

EMEA Congress Recap

The inaugural EMEA Congress in Amsterdam was an unqualified success, with hundreds of security visionaries in attendance and presentations from some of the leading voices across the cloud security landscape. What follows is just a sample of the discussions and some of the key takeaways from the two-day event:

EMEA Congress Presenters

  • Monica Josi, Microsoft’s Chief Security Adviser EMEA, presented on Microsoft’s compliance strategy, emphasizing the importance of a common mapping strategy to define compliance standards. Microsoft has mapped over 600 controls and 1,500 audit obligations onto the ISO 27001 framework and is using CSA’s CCM and ISO 27001 to certify its Dynamics CRM, Azure and Office 365 platforms. It has also published all relevant documentation on the CSA’s STAR repository.
  • Chad Woolf, Global Risk and Compliance Leader for Amazon Web Services, highlighted the difference between security IN the cloud and security OF the cloud. According to Chad, security IN the cloud presents a much greater risk, and he discussed some of the different assurance mechanisms provided by AWS.
  • Data security and privacy expert Stewart Room provided an update on some of the more pressing legal issues facing cloud security, including a plea for more realistic legislation (e.g. the subcontractor recommendations of the Art 29 Working Party).
  • Mark O’Neill, CTO of Vordel, gave an update on IdM standards, including OAuth 2.0 and OpenID Connect, and how they fit into the cloud ecosystem. OAuth 2.0 is now a stable standard which can be used to give granular, revocable access control; it is lighter than SAML and therefore more suitable for mobile/REST scenarios (see the sketch after this list).
  • Phil Dunkelberger made an impassioned call to arms for the industry to create a standard authentication protocol which would allow for the integration of appropriate authentication mechanisms into diverse services.
  • Jean-François Audenard, Cloud Security Advisor for Orange Business Services, presented their Secure Development Lifecycle, which covers security and legal obligations, mitigation plans, security reviews and ongoing operational security, and the roles of their security Advisors, Architects and Managers in the lifecycle.
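Mark’s point about OAuth 2.0 is easiest to see in code. The sketch below, in Python with the common requests library, shows a client obtaining a narrowly scoped bearer token, calling a REST API with it, and revoking it when access should end. The endpoint URLs, client credentials and scope are placeholders invented for illustration, not details from the talk.

```python
# Illustrative OAuth 2.0 flow: exchange a client credential for a scoped
# access token, use it against a REST API, then revoke it. Endpoints,
# client ID/secret and scope are placeholders.
import requests

AUTH_SERVER = "https://auth.example.com"
CLIENT = ("my-client-id", "my-client-secret")

# 1. Request a token limited to a narrow scope (granular access control).
token = requests.post(
    f"{AUTH_SERVER}/oauth/token",
    data={"grant_type": "client_credentials", "scope": "inventory:read"},
    auth=CLIENT,
).json()["access_token"]

# 2. Call a REST API with the bearer token. No heavyweight SAML envelope
#    is needed, which is why OAuth suits mobile/REST scenarios.
requests.get(
    "https://api.example.com/v1/items",
    headers={"Authorization": f"Bearer {token}"},
)

# 3. Revoke the token the moment access should end (revocable by design;
#    a revocation endpoint of this shape was later standardized in RFC 7009).
requests.post(f"{AUTH_SERVER}/oauth/revoke", data={"token": token}, auth=CLIENT)
```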

Panel Discussion Takeaways:

  • While Gartner has some 26 definitions for Cloud, according to Bruce Schneier it can be boiled down to the fact that it’s simply your data on somebody else’s hard disk that you access over the Internet!
  • Cloud provider specialization and reputation mean better security in many respects. As to the question of what can be more difficult in the cloud, forensics is a major issue (e.g., ‘freezing the crime scene’, confiscation of hardware, etc.).
  • As a customer, there is a lot you can and should do to monitor the cloud service provider (either independently and/or via executive dashboards). This also allows you to establish trust in smaller companies with less history.
  • Internal IT teams are not redundant. There are lots of security-related tasks that still need to be taken care of. This is especially true for IaaS providers (e.g. credential management). The cloud provides opportunities for many of these individuals to perform higher-value tasks (e.g., security training of staff, service monitoring, etc.).
  • Business is consuming technology quicker than IT can provide it; as a result, more internal business users are utilising external third-party and cloud vendors to process their information. For example, MARS Information Services is using a modified version of ISO 27001 (“ISO++”) and the CSA’s CCM to risk-assess its third-party vendors. As engagements move from IaaS to PaaS and SaaS, the level of risk increases as more of the controls are given over to the service provider.
  • Historically, organizations have been largely concerned with securing the network, not the information that resides on it. We now need to protect information based on the risk associated with the compromise of that data. As such, a risk-based approach to security requires data to be classified, at least at a high level.
  • Once data has migrated to the Cloud, access and authentication become key. Authentication is currently taken for granted (passport, room key, ID badge, airline ticket, cards), except online, where credentials are often re-used. If they are compromised, all systems using those credentials are vulnerable.
  • As data moves to the Cloud, there will be situations that require the data to be recovered in a forensically sound way. The use of multi-tenant environments across multiple jurisdictions introduces numerous e-discovery and chain-of-custody challenges that are yet to be solved.

 

“Great conference with a number of speakers that really provided up to date, timely and in-depth information” – Peter Demmink, Merck / MSD

“The CSA delivered an excellent intro to all the aspects of cloud security and compliance” – Albert Brouwer, AEGON

 

 

 

Context + Analytics = Good Security

Data [dey-tuh] noun: individual facts or statistics

 

Information [in-fer-mey-shuhn] noun: knowledge concerning a particular fact or circumstance

 

When does data become consumable information? When we correctly manage security, we integrate security devices into our infrastructure in a manner designed to support our privacy, security, and regulatory requirements. The problem is that good security can generate a lot of data. This is exacerbated by the desire to ensure that the data is actually consumable information – stuff we can use.

 

42

 

African or European?

 

Data is just “stuff,” while information is what that stuff means. Is “42” simply 6×7, or is it really the answer to life, the universe and everything? Are “African or European” just words to you, or do they have something to do with the airspeed of an unladen swallow? To make sense of these, you need the context of Douglas Adams and Monty Python. If you lack that context, that is not your fault. It just is.

 

Your management of security data follows the same rules. Data is more valuable if viewed in context. If you have an IDS reporting a port scan on IP 192.161.0.12, that is simply a piece of data. You still have to figure out what that data means to you. Is it important or is it noise?

 

Getting Context

 

Your organization uses data, and the security parts of your organization use security-relevant data.

 

For a non-security example, let’s use a 3000-piece puzzle. You have to put it together without looking at the picture on the box. You can look at a piece, and add context to that piece. Is it a corner piece, a side piece, or a middle piece? Does the piece have a part sticking out or does it have a hole? Is that something red and round on the piece? Is that something shiny? All of these observations add context to the pieces, as well as the puzzle as a whole.

 

When you add context to security information, it helps tell you how to build your entire security program. You go from supporting “data” to supporting “PCI data,” along with all that it means to be PCI compliant. You know that the environment that supports PCI data at BigBlueBank is going to receive more advanced security controls than the inventory control system at Joe’s Hat, Boot and Shoe Company. While the two data sets are both important to their respective companies, the specific regulatory requirements placed on the PCI data should result in enhanced controls at BigBlueBank. Even staff at Joe’s would agree that the number of size 10 boots in stock is not as sensitive as credit card data. PCI has elevated requirements for a variety of technical controls, including data segregation and encryption, as well as incident response, policy, procedure, and training. If you add St. Mary’s Hospital to the mix, you can imagine that their trauma center has stronger availability/resiliency requirements than they do at Joe’s Hat, Boot and Shoe Company. The context within which the data works shapes the entire environment.

 

The supporting information adds context to the raw security data. Your IDS alert that was previously just “data” gets a whole new meaning if you have the context to know whether 192.161.0.12 is the system that holds your credit card database or an internal website that has limited value. Without security context, you might know that you have an alert, and that you are being attacked. But with good context, you can tell that the server being attacked is named “Mordor,” is a Windows Server 2008 R2 SP1 machine running Oracle 11g Enterprise, sits in the Princeton, N.J., data center in row 3, rack A12, and holds all of your clinical patient records, so it falls under HIPAA and HITECH. That information, and context, should make a huge difference in how you manage and protect the information, as well as threats to it.
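As a hedged illustration of that enrichment step, the sketch below joins a raw alert against an asset inventory. The inventory record mirrors the Mordor example above, but the data structures and field names are invented rather than drawn from any particular product.

```python
# Hypothetical sketch of adding context to a raw IDS alert by joining it
# against an asset inventory. Entries and alert format are invented.
ASSET_CONTEXT = {
    "192.161.0.12": {
        "hostname": "Mordor",
        "os": "Windows Server 2008 R2 SP1",
        "application": "Oracle 11g Enterprise",
        "location": "Princeton NJ, row 3, rack A12",
        "data_classification": ["clinical records", "HIPAA", "HITECH"],
    },
}

def enrich(alert: dict) -> dict:
    """Turn 'port scan on 192.161.0.12' into actionable information."""
    context = ASSET_CONTEXT.get(alert["dest_ip"], {})
    alert["context"] = context
    # Alerts touching regulated data get escalated; everything else stays
    # low priority until analytics says otherwise.
    alert["priority"] = "high" if context.get("data_classification") else "low"
    return alert

print(enrich({"type": "port_scan", "dest_ip": "192.161.0.12"}))
```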

 

Advanced Analytics

Adding context to data gives you information. Analytics adds even more information by evaluating relationships between the various pieces.

You started sorting the puzzle pieces, adding context where you could. You might group pieces that have red on them, as well as pieces that are shiny, to see if you can find anything in common or see a pattern. You start assembling the frame of the puzzle by looking at the sides and corners.

When you look at how the pieces fit together, you are looking at the relationships between those pieces. That is analytics. Next, you look at the red pieces and see how they fit together. After you assemble three or four pieces, you recognize that the red is a clown nose. Analytics gives you even more information, since now you know that the puzzle has a clown in it. That piece of information improves the context you had previously assigned to every other puzzle piece. Then you assemble some shiny pieces and realize it is a shiny hubcap on a wheel. Analytics.

Better yet, you can match those larger pieces of information together and realize that the puzzle probably includes a clown car, which automatically adds new context to all of the other pieces in the puzzle. Analytics helps you to recognize the giant daisy that squirts water, and the huge green shoe sticking out of the trunk. You utilize analytics to assemble multiple clowns, and the car. Contextual information enabled you to start building, but it was analytics that actually let you make progress and eventually finish the puzzle.

Of course, the same rules apply with information security. The context is invaluable, and lets you understand what your event and alert information means. But the analytics applied to those events forms a bigger picture of what is happening in your environment, and is even more important.

Context and Analytics in Practice

How does this work in real life?

Joe’s Hat, Boot and Shoe Company has a relatively immature security management practice. They generally ignore an external port scan. When they get a series of login failures on an internal system, they probably ignore that also, unless a systems/security admin happens to realize that those failures came from a known “important” system. They effectively ignore a privileged database login, since they probably lack the context to see how important the system is, and their level of security paranoia is relatively low. If Joe’s sees the elevated traffic levels, it may be cause for concern, but for the most part it is simply one more in a flood of other events. Keep in mind that Joe’s did not get just these five events. Joe’s got these five events along with another 3,000 or so events that evening. Chances are that the IT staff at Joe’s is not alerted to anything.

Bob’s Big Box store couldn’t care less about the 17th port scan it saw that week. BBB may also not be terribly worried about a series of external login failures, but when those failures are immediately followed by a success, analytics kicks into action. Was this a user mistyping a username and/or password, or was this a successfully guessed password? At the very least, good analytics has this marked as “curious.” It is probably marked “curiouser and curiouser” when analytics checks back in time and sees BBB had been port scanned 10 minutes earlier. Suddenly, the port scan is not “just another port scan.” Can good analytics be applied to anything else interesting about the events? For example, did the port scan and login attempts come from the same IP address? This could lend additional context to the events.

BBB sees a series of internal login failures. Given that this followed shortly after the suspicious external logins, this is now marked with an elevated concern, more like “interesting.” Their internal systems report the privileged account logon as a matter of due course, and it is only really interesting if it falls within a reasonable time window in the series of events undergoing analytics. Elevated outbound traffic volume would be the last straw. Analytics considered 3,000 events, and picked out a series of five that it decided were related – that they fit together like the corner of a puzzle.
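A toy version of that correlation logic might look like the following. The event fields, type names and 30-minute window are invented for illustration, and a real SIEM would correlate on many more attributes than a shared source address.

```python
# Toy correlation rule in the spirit of the BBB example: pick the handful
# of related events out of thousands. Event shapes are invented.
from datetime import timedelta

WINDOW = timedelta(minutes=30)
ATTACK_CHAIN = {"login_failure", "login_success",
                "privileged_logon", "outbound_spike"}

def correlate(events):
    """Flag any source whose events, within the window following its
    port scan, complete the scan -> failures -> success -> privileged
    logon -> outbound spike chain."""
    incidents = []
    for scan in (e for e in events if e["type"] == "port_scan"):
        related = [e for e in events
                   if e["src"] == scan["src"]
                   and timedelta(0) <= e["time"] - scan["time"] <= WINDOW]
        if ATTACK_CHAIN <= {e["type"] for e in related}:
            incidents.append([scan] + related)
    return incidents
```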

What happens next depends on how BBB has defined their security profile. At the very least, an internal alert is issued, and if they are prepared, they would probably terminate outbound traffic at the firewall when the extra traffic was detected.

The five events are a dramatic oversimplification. So is the “five out of 3,000.” In reality, this could be thousands, and potentially millions, of events, depending on your environment. If your environment consists of six systems, and one IT guy knows them all, he may be able to accomplish all of your analytics. But if yours is an organization of any size, doing meaningful analytics manually is going to be more a matter of luck than skill.

 

————————————————————————————————————–

Jon-Louis Heimerl is Director of Strategic Security for Omaha-based Solutionary, Inc., a provider of managed security solutions, compliance and security measurement, and security consulting services. Mr. Heimerl has over 25 years of experience in security and security programs, and his background includes everything from writing device drivers in assembler to running a world-wide network operation center for the US Government. Mr. Heimerl has also performed commercial consulting for a variety of industries, including many Fortune 500 clients. Mr. Heimerl’s consulting experience includes security assessments, security awareness training, policy development, physical intrusion tests and social engineering exercises.

www.Solutionary.com

Red Hat Joins the Cloud Security Alliance

By: Cloud Computing Team

That user concerns about security and related matters are part and parcel of how and when cloud computing—whether on-premise, in public clouds or a hybrid—gets adopted isn’t news. Even if the risks are sometimes more about perception than reality, the fact remains that survey after survey puts “security” at or near the top of inhibitors to cloud adoption. And that makes understanding how to mitigate these risks an industry priority given the flexibility, agility and cost benefits that cloud computing can bring.

Many companies and groups are working to address security challenges in various ways. The Cloud Security Alliance (CSA), founded in 2009, is one of the most important of such initiatives because it’s arguably the organization taking the broadest view of the problem. It’s a not-for-profit organization whose mission is to promote the use of best practices for providing security assurance within cloud computing, and to provide education on the uses of cloud computing to help secure additional forms of computing.

Red Hat has been participating in the CSA community for nearly two years, and has been working to bring awareness and utilization to the tools built by the CSA to provide security to physical, virtual and hybrid cloud environments. Now, as an official corporate member of CSA, Red Hat will continue to drive a focus around open standards and security to protect enterprise workloads in the cloud.

The CSA has a broad membership with over 130 corporate members. This includes IT vendors like Red Hat who sell to a wide range of industries. But it also includes companies, such as healthcare technology supplier McKesson, that specifically work in industries that are highly regulated and significantly affected by data privacy requirements. It includes professional services firms with an interest in security and compliance issues, such as Ernst & Young and PwC. It includes government agencies such as the Department of Defense and suppliers to those agencies such as Raytheon. And it includes large technology end users such as eBay. The CSA also has nearly 40,000 individual members in its LinkedIn group.

One specific CSA initiative is its Cloud Controls Matrix (CCM). CCM is designed to provide fundamental security principles “to guide cloud vendors and to assist prospective cloud customers in assessing the overall security risk of a cloud provider.” The goal here is essentially to provide structure so that security can be evaluated in a systematic way. Specifically, in the CSA’s words, to provide:

“…organizations with the needed structure, detail and clarity relating to information security tailored to the cloud industry. The CSA CCM strengthens existing information security control environments by emphasizing business information security control requirements, reduces and identifies consistent security threats and vulnerabilities in the cloud, provides standardized security and operational risk management, and seeks to normalize security expectations, cloud taxonomy and terminology, and security measures implemented in the cloud.”

It’s important to be systematic in this way because security isn’t one thing. In fact, the CCM considers 98 distinct areas of control across 13 different domains, such as compliance, resiliency and information security. Each of these areas of control is then mapped to the area of IT architecture where it plays (e.g., networking, data or compute), its relevance to different cloud service delivery models (IaaS, PaaS and SaaS), and its relationship to a wide range of regulations. Even a quick scan of the detailed matrix gives a sense of the degree to which the CCM provides a very specific, practical framework that organizations can use. (A 2009 study by the European Network and Information Security Agency (ENISA) provides a framework in a somewhat similar vein.)
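One hedged way to picture a CCM entry is as a record mapping a control to those three dimensions. The sketch below is an invented representation for illustration only; the identifiers and field values are not quoted from the CCM itself.

```python
# Hypothetical representation of one CCM-style control entry, showing how
# the matrix maps a control to architecture layers, service models and
# regulations. All values are illustrative.
from dataclasses import dataclass, field

@dataclass
class Control:
    control_id: str
    domain: str                                          # one of 13 domains
    architecture: list = field(default_factory=list)     # networking/data/compute
    service_models: list = field(default_factory=list)   # IaaS/PaaS/SaaS
    regulations: list = field(default_factory=list)      # mapped frameworks

matrix = [
    Control("DG-04", "Data Governance",
            architecture=["data"],
            service_models=["IaaS", "PaaS", "SaaS"],
            regulations=["ISO 27001", "PCI DSS", "HIPAA"]),
]

# An assessor can then filter the matrix systematically, e.g. all
# SaaS-relevant controls that touch regulated data:
relevant = [c for c in matrix
            if "SaaS" in c.service_models and c.regulations]
```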

The CCM (or an alternative document called the Consensus Assessments Initiative Questionnaire) can be used by cloud users to structure their own evaluations of cloud providers. However, these documents are also inputs to another CSA initiative called the CSA Security, Trust & Assurance Registry (STAR), a free, publicly accessible registry that “documents the security controls provided by various cloud computing offerings.” Cloud providers can submit self-assessment reports that document their compliance to CSA-published best practices. The CSA’s goal is to make it easier and faster for cloud users to do their due diligence and generally move to an environment where security practices are more transparent and even used as a differentiator among different cloud providers.

The CSA also conducts research into cloud computing. Most recently, on September 27, it released the results of a Cloud Market Maturity study, a collaborative project with ISACA, intended to provide business and IT leaders with insights into the maturity of cloud computing. While the report found many positive indicators, it also identified a number of areas in which the survey respondents had less confidence in cloud computing. We were particularly interested to note that a number of these—such as exit strategies, longevity and credibility of suppliers, integration with internal systems and contract lock-in—speak directly to the need for an open, hybrid approach to cloud computing. That’s why we at Red Hat firmly believe that open and hybrid are essential elements of a cloud strategy, as we discuss in this whitepaper.

You might not always know it from the predictable breathless headlines one sees whenever there are reports of a provider’s service outage or security breach, but cloud security discussions are moving beyond the naïve “is it safe?” stage. They always have been, really, among knowledgeable security practitioners. They understand that cloud security is part of a broader IT governance discussion and that security exists in the context of the many tradeoffs that are always being made with IT systems. But those nuanced analyses are becoming more mainstream. And one of the important reasons this is happening is that organizations such as the CSA are helping to codify best practices and make them easier to consume.

Learn more about Red Hat’s work in the cloud computing space here.

Removing Cloud Barriers in Europe

No one is immune to the ever-changing technology forecast, but one constant (at least for the near future) appears to be global cloud cover. Cloud computing is arguably the most dominant theme on every enterprise IT list, but in Europe it is being met with some key challenges.  The European Commission acknowledges that Europe must become more “cloud active” to stay competitive in the global economy, but public cloud adoption is fragmented and lags behind the US by some three to five years.

 

So what’s stopping cloud adoption in Europe? The major cumulative barriers to adoption are concerns surrounding legal jurisdiction and data security. Cloud computing and IT security companies know all too well that data privacy laws vary greatly around the world – a key challenge for global enterprises as they seek to adopt the cloud. Each country or geography they operate in has specific data regulations that must be met. And each country or geography in which they store and process data, which may be different from where they physically operate, also has specific data laws that must be followed. To complicate matters, these rules and regulations are very likely to change over time, particularly as technological advances emerge and government regulators fine-tune their policies.

 

In a recent study entitled “Cloud in Europe: Uptake, Benefits, Barriers, and Market Estimates,” research firm IDC surveyed European business users and consumers.  IDC’s research uncovered 12 key obstacles, ranging from cloud data residency and security issues to slow performance and limited tax incentives for capital spending. But the majority of survey respondents (62.2 percent) cited four specific barriers, primarily related to data control:

 

  1. Legal jurisdiction: Where does the service reside? Where does the data reside? What if I don’t want my data stored in a specific country?

  2. Security and data protection: Who is responsible for security, data protection, and backups? What happens if something goes wrong?

  3. Trust: How do I tell which services are reliable? Who guarantees data integrity and availability?

  4. Data access and portability: Once I sign a contract, how much interoperability will I have? Can I interact with different services and move my data from one service provider to another?

 

Data control is the common denominator, and Europe must take steps to empower data controllers if it wants to maximize cloud adoption benefits. Those surveyed offer clear guidance on what the EU could do, including enacting specific rules on service provider accountability; guaranteeing application and data portability between services; implementing an EU-wide cloud security certification program; clarifying and harmonizing data residency and legal jurisdiction regulations; and fostering EU-wide standardization of cloud services.

 

These are great suggestions. But aside from regulatory policy changes that could take a long time to deliver, the group also states that demonstrated current success by peers and strong evidence of cloud benefits would greatly enhance adoption. A solution to these challenges that gives companies downstream flexibility is critical. This type of success is possible today with a cloud data protection gateway that allows European cloud users to control their data completely when using cloud SaaS applications – regardless of geographic location of the cloud service provider’s data centers. When researching gateways, keep the right questions about your data at top of mind:

 

  • What sensitive data needs to remain private and protected?
  • What level of protection is required?
  • Who needs access to the data?
  • What laws and jurisdictions govern the information, and are they likely to change over time?

 

Be sure to look for a solution that allows data controllers to configure their cloud systems with the appropriate data protection protocols, overcoming the primary residency and security obstacles holding Europe back. The sketch below illustrates the core idea.
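As a hedged illustration of how such a gateway keeps control with the data controller, the snippet below encrypts sensitive fields on-premise before a record is forwarded to a SaaS application, using the third-party Python cryptography package. The field names and record layout are invented, and no particular gateway product is depicted.

```python
# Minimal sketch of the data protection gateway idea: sensitive fields are
# encrypted on-premise before the record ever reaches the SaaS provider,
# so the enterprise keeps the keys regardless of where the provider's
# data centers sit. Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice: held in an on-premise HSM/KMS
gateway = Fernet(key)

SENSITIVE_FIELDS = {"customer_name", "iban"}

def outbound(record: dict) -> dict:
    """Encrypt sensitive fields before forwarding to the SaaS application."""
    return {k: gateway.encrypt(v.encode()).decode() if k in SENSITIVE_FIELDS
            else v for k, v in record.items()}

def inbound(record: dict) -> dict:
    """Decrypt on the way back so local users see clear text."""
    return {k: gateway.decrypt(v.encode()).decode() if k in SENSITIVE_FIELDS
            else v for k, v in record.items()}

masked = outbound({"customer_name": "Alice", "iban": "NL00TEST0123456789",
                   "city": "Amsterdam"})
assert inbound(masked)["customer_name"] == "Alice"
```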

 

David Stott is senior director, product management, at PerspecSys, where he leads efforts to ensure products and services meet market requirements.

Assessing Your IT Environment and Evaluating Cloud

by John Howie, COO, CSA

In many conversations with IT leaders today, we have discovered a common problem: they need a simple way to understand their systems, processes, and current policies and procedures, and to be able to evaluate how the cloud may help them realize lower IT security costs and improve best practices – and, perhaps most importantly, to communicate that to their management team.  After all, a move to the cloud needs to be a strategic one.

Today at RSA Europe, Microsoft announced a new free Cloud Security Readiness Tool that helps organizations better understand and improve their IT state, identify relevant industry regulations based on selected industries, and evaluate whether cloud adoption will meet their business needs.  The tool can speed the assessment process for anyone considering cloud services.

The CSA Security, Trust and Assurance Registry (STAR) is our publicly accessible registry that documents the security controls provided by cloud service offerings.  CSA STAR is open to all cloud providers and allows them to submit self-assessment reports that document compliance with CSA-published best practices. The searchable registry allows potential cloud customers to review the security practices of providers, accelerating their due diligence and leading to higher-quality procurement experiences. CSA STAR represents a major leap forward in industry transparency, encouraging providers to make security capabilities a market differentiator. CSA STAR currently contains 13 entries from 11 vendors. I encourage companies to check out this tool to help assess the benefits of adopting a STAR service.

 

 

Riding the Consumerization Wave

Rather than resist it, organizations should embrace Consumerization to unlock its business potential. This requires a strategic approach, flexible policies and appropriate security and management tools.

The Consumerization of IT is the single most influential technology trend of this decade. Companies are already well aware of it, as they wrestle with the growing influx and influence of smartphones, tablets, Facebook, Twitter and on and on.  This “Bring Your Own Device” (BYOD) movement is very reminiscent of the early days of PCs in the late 1970s and early 1980s, when workers bought and brought their own Apple II or IBM PC to work to handle spreadsheets (using VisiCalc or Lotus 1-2-3, respectively) so they could get data processed immediately rather than wait in line for the IS department to process punchcards, tapes, or whatever else the I/O was.  Ultimately, IS heads had to stop resisting and start accepting the PC wave, and you know the rest of that story.

While this new BYOD growth does bring risks, too many companies make the mistake of trying to resist the influx of consumer IT. So what are the solutions and best practices for a company to turn Consumerization into a competitive advantage?

One: Have a plan. Take a strategic approach to Consumerization and develop a cross-organizational plan. IT cannot do this in a vacuum and will have to engage executives, line of business owners (marketing, sales, HR, product development) as well as customers, partners, and internal early adopters. While planning to adopt new consumer technology, IT managers should survey their most innovative users to discover what devices and applications they like and what they find most useful in their work activities. In this way IT will pull from users’ experience rather than pushing IT views to their base.

Two: Say yes – but not to everything for everyone. Develop a set of policies that clearly define which devices and applications are considered corporate-standard (fully supported by IT) vs. tolerated (jointly supported with the user) vs. deprecated (full user liability). In addition, IT should profile the global workforce based on relevant attributes such as role, line of business and location, and then map technologies to user profiles and define SLAs for each intersection, as the sketch below illustrates.
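As a hedged sketch of that mapping exercise, the snippet below expresses the three support tiers and the profile-to-technology-to-SLA intersections as simple data. All profiles, devices and SLA values are invented for illustration.

```python
# Illustrative BYOD policy mapping: support tiers, and SLAs at the
# intersection of user profile and technology. All names are invented.
SUPPORT_TIERS = {
    "corporate-standard": "fully supported by IT",
    "tolerated": "jointly supported with the user",
    "deprecated": "full user liability",
}

POLICY = [
    # (user profile, technology, tier, SLA)
    ("sales-emea",  "corporate iPhone",   "corporate-standard", "4h response"),
    ("sales-emea",  "personal Android",   "tolerated",          "best effort"),
    ("engineering", "jailbroken devices", "deprecated",         "none"),
]

def sla_for(profile: str, technology: str) -> str:
    """Look up the SLA at the intersection of a profile and a device."""
    for p, tech, tier, sla in POLICY:
        if p == profile and tech == technology:
            return f"{tier} ({SUPPORT_TIERS[tier]}): {sla}"
    # Anything unlisted defaults to the most restrictive tier.
    return "deprecated (full user liability): none"

print(sla_for("sales-emea", "personal Android"))
```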

Three: Put the right infrastructure in place. Deploy appropriate IT tools specifically designed to secure and manage consumer technology in the enterprise. Be aware that while some solutions have already materialized along the lines of specific product segments – e.g. Mobile Device Management – no single vendor can provide one solution covering all functional requirements across all platforms. As vendors enter the Consumerization space with solutions initially developed for adjacent product segments, most solutions tend to offer overlapping core functionality and to lack the cross-platform support critical to protecting and managing the full spectrum of consumer technologies. Therefore, IT will have to integrate multiple offerings across different product categories: security solutions for Internet content security, mobile anti-malware and mobile data protection; Mobile Device Management tools for system provisioning and application management; and Telecom Expense Management providers for procurement, support and cost control of voice and data services.

Companies that are questioning whether or not to allow workers to bring personal devices into the workplace should just stop asking: It’s clear that you can get a competitive edge when you put the right precautions in place. The BYOD phenomenon gives companies that allow it a competitive advantage as it enhances innovation and creativity in the workplace while reducing overall costs for the entire organization. The key to not being overwhelmed by this trend is that all these devices need to be secured by implementing the proper BYOD policies and procedures.

The lack of a strategic approach to Consumerization creates security risks, financial exposure and a management nightmare for IT. Rather than resist it, organizations should embrace Consumerization to unlock its business potential. This requires a strategic approach, flexible policies and appropriate security and management tools.

Consumerization and BYOD are disruptive and inevitable. But many IT leaders are slow to realize it. Like dinosaurs of a previous IT era, they are headed for extinction.

 

[BIO] As Vice President of Mobile Security at Trend Micro, Cesare Garlati serves as the evangelist for the enterprise mobility product line. Cesare is responsible for raising awareness of Trend Micro’s vision for security solutions in an increasingly consumerized IT world, as well as ensuring that customer insights are incorporated into Trend solutions. Prior to Trend Micro, Mr. Garlati held director positions within leading mobility companies such as iPass, Smith Micro and WaveMarket. Prior to this, he was senior manager of product development at Oracle, where he led the development of Oracle’s first cloud application and many other modules of the Oracle E-Business Suite.

Cesare has been frequently quoted in the press, including such media outlets as The Economist, Financial Times, The Register, The Guardian, Le Figaro, El Pais, Il Sole 24 Ore, ZDNet, SC Magazine, Computing and CBS News. An accomplished public speaker, Cesare has also delivered presentations and keynote speeches at many events, including the Mobile World Congress, Gartner Security Summits, IDC CIO Forums, CTIA Applications and the RSA Conference.

Cesare holds a Berkeley MBA, a BS in Computer Science and numerous professional certifications from Microsoft, Cisco and Sun. Cesare is the chair of the Consumerization Advisory Board at Trend Micro and co-chair of the CSA Mobile Working Group – Cloud Security Alliance.

 

 

[AWARDS]

*** Nominated “Top 10 Consumerization Thought Leaders” 2011 http://blog.matrix42.com/content/top-10-consumerization-thought-leaders-part-two

Cesare Garlati – Cesare’s daily duties as Senior Director of Consumerization at Trend Micro might have been enough to get him on this list, but his blog leaves no doubt. At BringYourOwnIT.com, Cesare writes about consumerization and everything else that’s causing disruption in IT. In an SC Magazine article earlier this year, Cesare suggested organizations approach consumerization in a tactical way: “(Embracing CoIT) is the optimal approach. Create a plan that spans the whole organization; say yes for some but not for everyone by determining a group of users and figure out what technology is allowed; and figure out what tools are needed and put the right infrastructure in place.”

 

BLOG: http://BringYourOwnIT.com    TWITTER: http://twitter.com/CesareGarlati

 

SPEAKING ENGAGEMENTS:

 

  • RSA Conference Europe 2012, October 9-11, 2012 – London, UK: “Smartphone Security Winners & Losers”
  • Mobile 2.0 Conference, September 11, 2012 – San Francisco, CA: “Mobile Enterprise/Consumerization of IT”
  • RSA Conference China 2012, August 28-29, 2012 – Chengdu, CN: “Smartphone Security Winners & Losers”
  • DIRECTION EXPO 2012, August 7-8, 2012 – Tokyo, JP: “Mobile Security”
  • European Association for e-Identity and Security, July 5, 2012 – Slough, UK: “Securing Mobile Devices”
  • IET Mobile Security Summit, June 20, 2012 – London, UK: “Security for Mobile Devices”
  • Ingram Micro Cloud Summit, June 4, 2012 – Phoenix, AZ: “How secure is your smartphone?”
  • BCS – The Chartered Institute for IT, May 16, 2012 – London, UK: “Consumer Mobile Technology in the Enterprise: A leap of faith?”
  • Mobile Convention Amsterdam, May 8, 2012 – Amsterdam, NL: “Consumer Mobile Technology in the Enterprise”
  • Tablet Strategy Conference, April 27, 2012 – New York, NY: “Secrets of a good corporate app”
  • ISSA/AIPSI – Associazione Italiana Professionisti Sicurezza Informatica, April 5, 2012 – Milano, Italy: “Roundtable: Consumerization, Millennials and Mobile”
  • Information Assurance Advisory Council, March 13, 2012 – London, UK: “Education and training in security awareness”
  • Mobile World Congress 2012, February 27 – March 1, 2012 – Barcelona, Spain: Mobile Security Forum, “Consumer Mobile Technology in the Enterprise: A Leap of Faith?”
  • IDC Enterprise mobileNext Forum, November 30 – December 1, 2011 – San Francisco, USA: Mobility Management & Security – A Customer Panel
  • CTIA Enterprise Mobility Boot Camp, October 10-13, 2011 – San Diego, USA: “Consumerization Report 2011”
  • Gartner Security & Risk Management Summit 2011, September 19-20, 2011 – London, UK: “Embrace Consumerization. Unlock Opportunity”
  • Channel Link 2011, September 14-16, 2011 – Los Angeles, USA: “Embrace Consumerization. Unlock Opportunity”
  • IDC CIO Summit 2011, July 28-29, 2011 – Singapore: “The Consumerization of IT: Embrace Consumerization, Unlock Opportunity”
  • Mobile Computing Summit 2011, June 28-30, 2011 – San Francisco, USA: “Mobile Landscape Security Risks and Opportunities”
  • Gartner Security & Risk Management Summit 2011, June 20-23, 2011 – Washington DC, USA: “Virtualization, Consumerization, Security: Three Worlds Collide?”
 

 

VIDEOS/PODCASTS – http://www.youtube.com/user/BringYourOwnIT

 

  • RSA Conference 2012 – Podcast
  • Mobile Convention Amsterdam 2012
  • Mobile World Congress 2012 – Mobile Security Forum
  • Consumerization and BYOD – What are the Security Risks?
  • BYOD and Mobile Security: Remote working during the Olympics
  • Video interview at CITE 2012 – Consumerization of IT in the enterprise
  • Video interview at Mobile World Congress 2012 – Barcelona
  • Financial Times Podcast – The downsides of bringing your own device to work
  • Consumerization 101: How to bypass the iPad password in 5 seconds
  • Embracing Consumerization in the Enterprise
  • The Consumerization of IT – Trailer. Full video available upon request

 

QUOTES:

“Mobile security fact: Android is the #1 mobile platform in the world. It is also the most vulnerable to attack – and in fact the most exploited.”

“Contrary to common perception, Apple mobile devices are not immune to security flaws. In fact, they are less secure than Android if users jailbreak their devices – a jailbroken iPhone is not a secure phone.”

“[Mobile] Consumer technology is sexy, convenient and easy to use. When it comes to security and data protection, however, consumer technology still has a long way to go.”

“[There is a] total lack of education out there, especially in the consumer sector. Consumers need to be told that there is a real and serious threat in terms of security on your mobile phone and it’s an economic threat.”

“No matter what type of smartphone you own, you are in danger. Every single platform is exposed to this; no platform is immune. Some are safer than others, but none are immune.”

“The [security] problem [with mobile devices] is not with the phone itself breaking or being stolen, but with the data on the phone getting into the wrong hands – including bank details and passwords. By exposing your personal information, you are exposing yourself, your financial situation and your family situation.”

“[BYOD: Bring Your Own Device] Besides preserving data security and managing a myriad of personal devices, companies must also consider a new set of legal and ethical issues that may arise when employees are using their own devices for work.”

“[BYOD: Bring Your Own Device] Many employees don’t understand the implications of using their personal devices for work. Many companies don’t understand that they are in fact liable for the consequences.”

“Consumerization and Cloud are in fact two faces of the same coin: the epochal change of the role of corporate IT – from technology provider to technology broker.”

“Consumerization, BYOD and Cloud are disruptive and inevitable. But many IT leaders are slow to realize it. Like dinosaurs of a previous IT era, they are headed for extinction.”

“The lack of a strategic approach to Consumerization creates security risks, financial exposure and a management nightmare for IT.”

“Rather than resist it, organizations should embrace Consumerization to unlock its business potential. This requires a strategic approach, flexible policies and appropriate security and management tools.”

“My advice for organizations facing an increasingly consumerized IT world is to realize that Consumerization is happening and they can’t stop it – and in fact they shouldn’t. I strongly recommend our customers embrace Consumerization to unlock its business potential.”

“Embracing [Consumerization] is the optimal approach. Create a plan that spans the whole organization; say yes for some but not for everyone by determining a group of users and figure out what technology is allowed; and figure out what tools are needed and put the right infrastructure in place.”

“Companies that are questioning whether or not to allow workers to bring personal devices into the workplace should just stop asking: it’s clear that you can get a competitive edge when you put the right precautions in place. The BYOD phenomenon gives companies that allow it a competitive advantage, as it enhances innovation and creativity in the workplace while reducing overall costs for the entire organization. The key to not being overwhelmed by this trend is that all these devices need to be secured by implementing the proper BYOD policies and procedures.”

 

 

PRESS TALKING POINTS / CONTROVERSIAL STATEMENTS:

 

The dark side of BYOD: privacy, personal data loss and other bad things. Many employees don’t understand the implications of using their personal devices for work. Many companies don’t understand that they are in fact liable for the consequences. The things you always wanted to know about BYOD but were too afraid to ask.

 

How secure is your smartphone? Mobile security facts: Android is the #1 mobile platform in the world. It is also the most vulnerable to attack and in fact the most exploited. Contrary to common perception, Apple mobile devices are not immune to security flaws; in fact, they are less secure than Android if users jailbreak their devices to escape Apple’s suffocating control.

 

Consumerization is happening to corporate IT, rather than being driven by corporate IT. The business and the employees are dictating the IT agenda. Consumerization is therefore inevitable, but many IT leaders are slow to embrace it. Like dinosaurs of a previous IT era, they are headed for extinction.

The Impact of Computing Power on Cryptography

Advanced technology is a beautiful thing. Not only has it enabled the creation of new, more efficient methods of application delivery and data storage (the Cloud is a prime example), but it has also helped propel the development of more sophisticated solutions for data protection (think tokenization, encryption). That said, there is a challenge that accompanies this evolution of technology – the savvy cybercriminal. Determined to keep pace with, or even ahead of, each advance in data protection, these criminals pose a huge threat to corporations and governments worldwide. Professor Fred Piper, a renowned cryptographer from Royal Holloway, University of London, recently shared his views on the issue, and they included a sobering assertion: cybercriminals – with the assistance of anticipated future breakthroughs in computing (known as quantum computing) – could theoretically be able to decipher encryption algorithms. Imagine the consequences.

 

But quantum computing is not a reality yet, and it will take even longer to reach the hands of cybercriminals – so why worry, right? It turns out that while Piper was focused on the impact of quantum computing, he was actually helping shine a light on another threat: the increased access to supercomputing made possible by the cloud. While evangelists of cloud-based supercomputer access tout the ease with which Small and Medium Enterprises (SMEs) can now utilize computing power to run things such as fluid dynamics models, this computing power can also be used to attack computer security systems, and weak cryptography in particular. Just knowing that cybercriminals will be able to harness this sort of computing power creates yet another reason for enterprises to make sure they use the strongest cryptographic approaches available when encrypting their most sensitive data. Any sub-standard encryption is much more likely to be cracked using the tools now available via the cloud.
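
To make the point concrete, here is a minimal sketch of what “strong cryptography” looks like in application code, using AES-256 in authenticated GCM mode via Python’s cryptography package. This is an illustration only, not a recommendation from the original article, and the record contents and context label are invented:

```python
# Minimal sketch: AES-256-GCM authenticated encryption with the Python
# "cryptography" package (pip install cryptography). Illustrative only.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_record(key: bytes, plaintext: bytes, context: bytes) -> bytes:
    """Encrypt one record; the 96-bit nonce is prepended to the ciphertext."""
    nonce = os.urandom(12)  # never reuse a nonce under the same key
    return nonce + AESGCM(key).encrypt(nonce, plaintext, context)

def decrypt_record(key: bytes, blob: bytes, context: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, context)  # raises if tampered

key = AESGCM.generate_key(bit_length=256)  # a 256-bit key, per the advice above
blob = encrypt_record(key, b"4111 1111 1111 1111", b"billing-db")
assert decrypt_record(key, blob, b"billing-db") == b"4111 1111 1111 1111"
```

The authenticated mode matters as much as the key length: GCM detects tampering and refuses to decrypt, rather than silently returning garbage.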

 

This strong security approach needs to be applied to data that is being transferred (“in flight”) to the cloud, being processed in the cloud, or being stored (“at rest”) in the cloud. To help the industry stay ahead of these issues, organizations such as the National Institute of Standards and Technology (NIST) have issued standards such as the Federal Information Processing Standards (FIPS) for use across the Federal government in the United States. The FIPS 140-2 standard is an information technology security accreditation program for validating that cryptographic modules produced by private sector companies meet well-defined security standards. FIPS 140-2 has also been adopted as a minimum standard within many industries, including finance and healthcare.

 

Case closed, right? Wrong. While standards are extremely valuable, they have to be applied correctly. And, regrettably, confusion has been caused in the market by some players attaching terms such as “military grade encryption” to a technique known as “Functionality Preserving Encryption” (which has lesser validation than FIPS 140-2). Organizations should carefully consider the strength of the encryption being used to safeguard their information and avoid proprietary, “closed” approaches that have not been published or peer reviewed. There may also be industry or regulatory mandates to use a certain type of encryption depending on the business realm(s) in which the organization operates. And if preserving the functionality of SaaS applications, such as searching and sorting, is important to the organization, it should ensure this remains possible when implementing the level of encryption it wants (or is required) to use.

 

The challenge with encryption is that once attackers obtain the key, it is effectively broken because they can decipher all the data encrypted with that key. Weak cryptography that can be broken using newly available supercomputing power poses a serious risk to organizations that face criminal charges, civil liabilities, and brand damage should a data breach occur. It is therefore imperative that organizations use the strongest encryption they can to prevent accusations that slipshod security, especially when tied to cost-saving efforts, contributed to the breach.

 

Enterprises should also strongly consider tokenization as an option for obfuscating sensitive information. Tokenization is a process by which a data field, such as a primary account number (PAN) from a credit or debit card, is replaced with a surrogate value called a token. Only the token value is transmitted to the cloud, and the real value is securely stored inside the enterprise network. (De-tokenization is the reverse process of redeeming a token for its associated original value, and the process must occur within the enterprise firewall.)

 

While there are various approaches to creating tokens, they typically are simply randomly generated values that have no mathematical relation to the original data field. Herein lies the inherent security of the approach – it is practically impossible to determine the original value of the sensitive data field by knowing only the surrogate token value. The best you can do is guess. This means that if a criminal got access to the token in the cloud, they could not use even a supercomputer to “reverse” the token into its original value, because there is simply no path back to the original. (Even a quantum computer could not decipher it back into its original form.) The true data value never leaves the safety of the organization’s firewall.
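
A toy sketch makes the mechanics clear. The vault below is a stand-in for a secure store that never leaves the enterprise network; the class and field names are mine, not any vendor’s implementation:

```python
# Toy tokenization sketch: random surrogates with no mathematical
# relation to the original value. Names are illustrative only.
import secrets

class TokenVault:
    """Stands in for a secure token store kept behind the firewall."""

    def __init__(self):
        self._token_to_value = {}

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(16)  # random; no path back to the value
        self._token_to_value[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Redemption must happen inside the enterprise network.
        return self._token_to_value[token]

vault = TokenVault()
record_for_cloud = {"pan": vault.tokenize("4111 1111 1111 1111")}
print(record_for_cloud)  # only the random token ever travels to the cloud
# Later, behind the firewall: vault.detokenize(record_for_cloud["pan"])
```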

 

Some companies determine that they don’t even have a choice in the matter since legal requirements in certain jurisdictions mandate that data physically resides within country borders at all times. Even with strong encryption, these restrictions had previously blocked cloud computing solutions from even being considered. But tokenization technology provides a workable solution in these instances and overcomes the strict data residency rules enforced in many countries, satisfying both the need to capitalize on the latest breakthroughs in cloud computing, as well as ensuring the security and compliance of any sensitive information.

 

So, while advanced technology and computing models – coupled with increasing threats from hackers, code-breakers and cyber criminals – are forcing the creation of new innovations in cloud security, companies should know they have solid options here and now. Strong encryption is a requirement for any organization putting sensitive data in the cloud. Tokenization – often overlooked as a data protection method – offers one of the most compelling options to secure sensitive data, ensure application functionality, and enable regulatory compliance.

 

 

Eric Hay is PerspecSys’ worldwide director, field engineering. Eric and his team are responsible for deploying PerspecSys solutions for enterprise customers needing to secure sensitive information when using Cloud applications.  A software industry veteran, Eric has specialized in computer security throughout his career at companies like Netegrity, Credant Technologies and Invincea.

Managing consumer technology in the enterprise – Why IT needs to change mindset to better support the business.

Talking regularly about the consumerization of IT can often make one sound like a broken record, but the economic, security and management challenges it throws up for enterprises are too important to ignore.

The problems boil down to a lack of control, which can be described in two key ways. IT departments of course are built on policies, planning and predictability, but the introduction of technology from the consumer sphere, even when purchased centrally by IT teams for use in the enterprise, creates its own problems. It’s sexy and easy to use, but it’s certainly not built with security and manageability in mind and will usually fall short of IT’s typical expectations. Products from the likes of Google and Apple, for example, whose respective mobile platforms Android and iOS now account for the lion’s share of the market, are great at serving the needs of consumers but have been extremely slow at embracing enterprise requirements. There is no enterprise sales or support culture with these vendors and there is little transparency with product roadmaps, which takes corporate IT managers completely out of their comfort zone.

The second problem is that, whether consumer-focused tech or not, applications and devices are being brought into the corporate world via the individual employee rather than being mandated from IT, which is the complete opposite of what normally happens. Most IT teams simply aren’t set up to work in this way, and it will require a fundamental change of thinking to ensure consumerization is handled properly.

Rather than adopt the classic head-in-the-sand approach of old, CIOs and IT bosses need to embrace consumerization and take a proactive, strategic approach built around flexible policies and the right security and management tools. Firstly, BYOD policies can’t be created in a vacuum – IT leaders need to sit down with line of business managers in all parts of the organization to figure out what their employees would like to use and how to make that possible. Thus IT is taking the initiative and reaching out in an inclusive, proactive manner.

Secondly, policies must be drawn up to be more flexible and fluid. In a world where everyone in the organization from the CEO down needs to be managed, there can’t be a one-size-fits-all approach to policy making. IT needs to think carefully and map technology and policies to the various user groups. Finally, they need the right infrastructure technologies to help enable all of this.

Companies that are questioning whether or not to allow workers to bring personal devices into the workplace should just stop asking: It’s clear that you can get a competitive edge when you put the right precautions in place. The Consumerization phenomenon gives companies that allow it a competitive advantage as it enhances innovation and creativity in the workplace while reducing overall costs for the entire organization. The key to not being overwhelmed by this trend is that all these devices need to be secured by implementing the proper BYOD policies and procedures.

Consumerization of IT is disruptive and inevitable. But many IT leaders are slow to realize it. Like dinosaurs of a previous IT era, they are headed for extinction.

 

NEXT: BYOD Best Practices – Three pitfalls you can’t afford to ignore.

 

Post based on a podcast produced by the Financial Times featuring Cesare Garlati, head of Mobile Security at Trend Micro, on some of the downsides of bringing your own device to work. Listen to the FT Connected Business podcast at http://podcast.ft.com/index.php?pid=1398

More on Consumerization, BYOD and Mobile Security at http://BringYourOwnIT.com

 

Cesare Garlati, Vice President Consumerization and Mobile Security, Trend Micro

 

As Vice President of Consumerization and Mobile Security at Trend Micro, Cesare Garlati serves as the evangelist for the enterprise mobility product line. Cesare is responsible for raising awareness of Trend Micro’s vision for security solutions in an increasingly consumerized IT world, as well as ensuring that customer insights are incorporated into Trend solutions. Prior to Trend Micro, Mr. Garlati held director positions within leading mobility companies such as iPass, Smith Micro and WaveMarket. Prior to this, he was senior manager of product development at Oracle, where he led the development of Oracle’s first cloud application and many other modules of the Oracle E-Business Suite.

 

Cesare has been frequently quoted in the press, including such media outlets as The Economist, Financial Times, The Register, The Guardian, Le Figaro, El Pais, Il Sole 24 Ore, ZD Net, SC Magazine, Computing and CBS News. An accomplished public speaker, Cesare also has delivered presentations and highlighted speeches at many events, including the Mobile World Congress, Gartner Security Summits, IDC CIO Forums, CTIA Applications and the RSA Conference.

 

Cesare holds a Berkeley MBA, a BS in Computer Science and numerous professional certifications from Microsoft, Cisco and Sun. Cesare is the chair of the Consumerization Advisory Board at Trend Micro and co-chair of the CSA Mobile Working Group.

 

You can follow Cesare at http://BringYourOwnIT.com and on Twitter at http://twitter.com/CesareGarlati

 

 

 

 

7 Steps to Developing a Cloud Security Plan

By David Grimes, Chief Technology Officer, NaviSite

 

In IT, the easiest way to stop a new technology or solution from being implemented is to raise a security red flag. As soon as someone mentions concerns about a new IT solution not being “secure,” the project can come to a screeching halt. So as cloud infrastructure and cloud computing have begun to enter enterprise IT conversations, concerns around the security of the cloud have quickly become the biggest barrier to adoption.

 

Just like security for any other technology solution being used – past, present, or future – creating a security strategy and plan must be one of the first considerations for enterprise IT organizations. And while partnering with a service provider with strong security procedures and services in cloud computing is an important step, enterprises need to continue to take an active role in their own security and risk management. With that in mind, NaviSite has compiled 7 basic steps based on our experiences helping hundreds of companies secure enterprise resources. By following these steps, any business can rely on a proven methodology for leveraging cloud services cost-effectively and securely, gaining the business advantages of the cloud without compromising the security of enterprise applications.

 

  1. Review Your Business Goals: It is important that any cloud security plan begins with a basic understanding of your specific business goals. Security is not a one-size-fits-all proposition and should focus on enabling technologies, processes, and people. Additionally, gaining executive input is essential not only to ensure that assets are protected with the proper safeguards, but also to ensure that all parties understand the strategic goals.
  2. Maintain a Risk Management Program: Develop and maintain a risk management program centrally, and view it holistically. An effective cloud computing risk management program is important for reducing overall risk to the organization. It is also key for prioritizing the utilization of resources and for providing the business with a long-term strategy.
  3. Create a Security Plan that Supports Your Business Goals: Develop goals with measurable results that are consistent with supporting the growth and stability of the company. These goals should include a specified completion date, verification of achievement, and a measurable expected result. Security professionals are encouraged to regularly conduct careful analysis to develop responsible programs and build in the necessary controls and auditing capabilities to mitigate threats and maintain a reasonable security program that protects organizational assets.
  4. Establish Corporate-Wide Support: Gain approval of your cloud computing security plan not only from executive management but also from the general workforce. Organizations need to establish levels of security that meet business goals and comply with regulatory requirements and risk management policies, but that can be centrally managed and conveniently implemented across the organization with minimal negative impact on productivity. Gaining this acceptance streamlines adoption throughout the organization.
  5. Create Security Policies and Procedures: With input from a variety of business units, establish a set of guidelines to ensure that all compliance measures are identified. Cloud services are a major advantage for growing organizations that have not yet embedded established policies and procedures into the company: the enterprise can rely on the best practices the service provider has developed over years of experience in similar environments.
  6. Audit and Review Often: Review the security plan on a regular basis, report on achievement of goals, and audit the organization’s compliance with the security policies and procedures. Understanding the auditing requirements for your business and the frequency of your audits is essential not only for ensuring compliance but also for maintaining best practices for securing enterprise resources.
  7. Continuously Improve: Annually review your cloud computing security plan with senior management and your cloud services provider. Many companies believe that once they have solid policies and procedures in place they do not need to revisit them – but your industry and your business will change over time, and the technology available to support your security plan will evolve. Understanding the dynamic nature of your business and constantly evaluating your security requirements are the foundation of a successful continuous improvement strategy.

 

Cloud computing provides compelling cost and strategic benefits, including scalability with reduced capital expenditure, more efficient use of IT resources, and the ability for an organization to focus on its core competency. Many well-established security technologies and procedures can be applied to cloud computing to provide enterprise-class security. The steps outlined above will help organizations structure security and compliance programs to capture the economic advantages of managed cloud services while meeting organizational security and compliance objectives.

 

Properly managed cloud infrastructure can provide better security than most enterprise data centers, applications, and IT infrastructure, and it allows companies to deploy scarce technical personnel more efficiently. Enterprise security, including cloud security, should not be taken lightly, but it doesn’t have to be a major roadblock either. These seven steps are meant to serve as a framework to guide companies as they develop a secure cloud computing plan. For the complete checklist of the above seven steps, download the white paper titled 7 Steps to Developing a Cloud Security Plan.

 

Can You Be Sued for Using the Cloud?

We all know that adopting the Cloud comes with some risks – security, reliability and scalability have, to date, been the most common complaints. But now we can add a new one to the mix: litigation. Case in point: companies doing business in Australia, known for its strict privacy laws, have been warned that the risk of litigation should be factored into their due diligence when selecting a cloud vendor.

 

The Acting Victorian Privacy Commissioner recently spoke at the 2012 Evolve Cloud Security Conference in Australia, which focused on privacy concerns related to widespread cloud adoption. In his speech, he advised cloud users to scrutinize service provider security policies thoroughly before jumping into an arrangement based primarily on cost savings and scalability. Why? Because in Australia, as in other regulated jurisdictions, cases of information misuse will be investigated and prosecuted.

 

And more often than not, the cloud user will be the target of the litigation. As highlighted in the Cloud Computing Information Sheet, if a business can’t answer basic questions about where its data is located, who owns and controls the service provider organization, and what happens to data when contracts terminate, the business is directly at risk.

 

Preserving functionality in particular can prove a challenge when it comes to cloud data security. A cloud service provider may in fact offer the ability to encrypt data sufficiently to meet privacy laws, but doing so risks complicating data access and SaaS application usability. In that case, a secure cloud application may not seem worth the hassle, and the company may opt for an on-premise alternative.

 

It is important to carefully investigate statements made by cloud providers about legal compliance or other security credentials. International vendors in particular may not know the details of the regulations that an individual enterprise needs to adhere to, let alone those of a specific geographic region or the specific policies of an industry group. Should data become compromised, they are not liable in most cases.

 

Striking fear in the hearts of enterprises seeking to exploit technological innovation may prevent some data mishandling. But it doesn’t help address the long-term issue of how companies can successfully and legally implement the cloud into their IT strategies. Cloud advantages have simply become too valuable to ignore. If companies want to stay competitive, they must find ways to meet the privacy and residency restrictions enforced in countries like Australia, Switzerland, China and others while making the move to the cloud.

 

The Privacy Commissioner also warned against “haphazard” approaches to “de-identifying” personally identifiable information (PII). Permanently removing the personally identifiable information is often not a valid option because it destroys the data’s intrinsic business value. Industry-approved approaches should be explored instead, such as encryption using strong algorithms (i.e., FIPS 140-2 validated) or tokenization, which replaces PII with randomly generated tokens that bear no relation to the original information.

 

Tokenization, in particular, should be looked at very carefully as it helps to solve data control, access, and location issues because the data controllers themselves maintain the system and the original data.  With tokenization, all sensitive information can be kept in-house – what travels to the cloud are random tokens vs. actual data – making information undecipherable should it be improperly accessed. So, companies can adopt cloud applications (public or private) with added assurance about their position relative to data residency, privacy and compliance. And employees accessing the protected cloud data can enjoy application functionality and the same user experience, such as searching and sorting, on encrypted or tokenized data, with the standard cloud SaaS application – all while staying within the legal lines.

 

Bottom line: Data control is becoming a key legal requirement in many countries and jurisdictions – and it is one that will clearly be enforced. Are you and your organization covered or do you need to prepare for a legal battle in the Cloud?

 

 

Gerry Grealish leads the Marketing & Product organizations at PerspecSys Inc., a leading provider of cloud data security and SaaS security solutions that remove the technical, legal and financial risks of placing sensitive company data in the cloud. The PerspecSys Cloud Data Protection Gateway accomplishes this for many large, heavily regulated companies by never allowing sensitive data to leave a customer’s network, while simultaneously maintaining the functionality of cloud applications.

 

Is crypto in the cloud enough?

Box.net, Dropbox, iCloud, SkyDrive, Amazon Cloud Drive… the list of convenient cloud storage options goes on. Some have had a security incident; the rest will. All implement some form of protection against accidental exposure, with varying degrees of rigor. Are these sufficient, and in the ones claiming cryptographic isolation, is that isolation truly implemented well enough for more than sharing pictures of the kids with Aunt Betty? We’ll examine the technologies, architectures, risks and mitigations associated with cloud storage and the cryptographic techniques employed.

Even with the promise of the cloud, all of the providers are looking to monetize their services. For the past couple of years, the draw of “unlimited” plans used to build up user counts has been adjusted downwards. Mozy was one of the first, discontinuing its unlimited backup service in 2011. Microsoft’s SkyDrive cut its free tier in April 2012 from 25 GB down to 7 GB. Why did providers serve up free access, and what’s moving them in a different direction?

Online Storage Drivers

There are three components driving requirements for each of these services: privacy/security, locale and good old-fashioned cost. They all intertwine into a mishmash of designs and constraints.

Privacy/Security

Some governments/organizations require that, for security, data remain within their borders, regardless of encryption – the locale aspect.  A judge or government may compel a Cloud Service Provider to disclose requested data when they hand down a legal order or sign a search warrant.  Most of the Providers write into their use policies that they will comply with law enforcement requests.

This sort of blatant disregard for a user’s privacy scares European Union citizens. The entire purpose of the EU’s Data Protection Directive (Directive 95/46/EC), and of its antithesis, the US PATRIOT Act, surrounds who can access what private data. Some of the security and privacy aspects may be answered through cryptography. A full treatment of encryption as a service may be found on the Cloud Security Alliance’s web site.

Location

Locale is the easiest to address and the hardest to guarantee. Various laws require that data stay within the government’s borders. If data migrates past those borders, the service provider is subject to fines. The details vary between countries, trust reciprocation agreements, and what sorts of protections are or are not considered adequate grounds for exemption from said provisions. In some cases, segregation through cryptography suffices to comply with location-based laws.

Costs

The last storage driver is cost (although it might be first from a provider’s perspective). The business efficiencies expected of Storage as a Service – and the reason the above providers thought they could turn a profit – hinge on the type of data de-duplication seen in the enterprise. Separate copies of, for instance, a popular MP3 file or a PowerPoint presentation are not individually stored; instead, a single pointer to that file exists that all of the service’s users may access. The benefits are huge: enterprises see as much as a 50-90% reduction in required storage space. This efficiency, however, requires that storage vendors have access to the data they are storing in order to compare it.
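
The economics are easy to see in code. This toy sketch of content-addressed de-duplication (my illustration, not any provider’s actual design) stores each unique file once, keyed by its SHA-256 digest, and turns every subsequent identical upload into a pointer:

```python
# Toy content-addressed store: identical files are kept once and shared.
import hashlib

blocks = {}       # digest -> file bytes, stored exactly once
user_files = {}   # (user, filename) -> digest, i.e. a pointer per user

def upload(user: str, filename: str, data: bytes) -> None:
    digest = hashlib.sha256(data).hexdigest()
    if digest not in blocks:               # only the first copy costs storage
        blocks[digest] = data
    user_files[(user, filename)] = digest  # later copies are just pointers

upload("alice", "song.mp3", b"...the same mp3 bytes...")
upload("bob",   "song.mp3", b"...the same mp3 bytes...")  # no new storage used
assert len(blocks) == 1
```

Note the tension this creates with privacy: if every user encrypted files under a personal key before upload, identical files would no longer look identical, and the de-duplication savings would evaporate.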

Compromise

How do you balance these three?  Which aspects allow you to meet your privacy/security/regulatory policies without jeopardizing your bottom line?  Let’s dissect the solutions:

Underlying technology – Cost is a mighty significant factor in designing an on-demand storage service. Many of the most popular solutions were created on a shoestring budget. What better way to operate under tight fiscal constraints than to use the power of the cloud and scale up or down with workload? It turns out that at least a couple of the more popular services (currently) use Amazon’s S3 (Simple Storage Service). S3 includes built-in cryptography, where key material resides not on Amazon’s servers, but within the application making the S3 API calls. What the services do with the key material is up to them. For simplicity, some services allow Amazon to manage the keys, as discussed later.

Cryptographic algorithms – With few exceptions, everyone uses 256-bit SSL/TLS for data-in-transit protection and 256-bit AES when encrypting data at rest. These are today’s de facto standards, and there are easier ways to breach security than brute-force attacks on 128-bit or longer keys.

Key Material – In Server Side cryptography, the service provider manages both the keys and your data.  This limits the complexity of the environment and allows for the de-duplication aspects mentioned earlier while still providing user to user data isolation.  If a user deletes a file, it may be recovered without much fuss.  Crypto hygiene takes place without issue: Keys may be rotated appropriately, split into separate locations and put into Highly Available clusters.

So what are the risks?

Put simply, storing key material with the information it is designated to protect is akin to leaving the vault door unlocked at a bank. As long as no one is trying to get in, you might get away with it – for a while. The service provider may be compelled, against your wishes, to produce the key material and data under warrants in the US and similar government requests in other countries. Most privacy policies actually document their compliance with these requests (see table). Trusted insiders can poke around and access keys, and thereby data. Programming and operational mistakes may come to light, as was evidenced in the Dropbox disclosure incident.

Client Side Cryptography

There really is no one you can trust besides yourself. Rich Mogull from Securosis makes a couple of duct-tape-style suggestions for sharing within an insecure environment using various forms of encryption. Newer providers Jungle Disk and Spider Oak label their services as inaccessible to anyone without permission – you have a password which decrypts your keys, and all sharing and use operations occur from there. Jonathan Feldman, by contrast, makes the case that secure sharing defeats the purpose of cloud file sync and is simply the wrong approach.
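
Roughly, the client-side pattern works as sketched below: a key derived from the user’s password wraps a data key, so the provider stores only material it cannot read. This uses PBKDF2 and Fernet from Python’s cryptography package as stand-ins; the actual products’ formats and KDF choices differ:

```python
# Client-side ("zero knowledge") pattern, roughly sketched: the provider
# holds only a salt, a wrapped key and ciphertext - none readable by it.
import base64
import os
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def key_from_password(password: bytes, salt: bytes) -> bytes:
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(password))

salt = os.urandom(16)  # stored with the account; need not be secret
wrapping_key = Fernet(key_from_password(b"correct horse battery", salt))

data_key = Fernet.generate_key()                   # encrypts the actual files
wrapped_data_key = wrapping_key.encrypt(data_key)  # what the provider stores

ciphertext = Fernet(data_key).encrypt(b"contents of tax-return.pdf")
# Losing the password means losing the data: there is no recovery path,
# which is exactly the trade-off these services advertise.
```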

 

Service                      | Underlying Technology | Release to Law       | Key Material Access
Amazon Cloud Drive           | S3                    | Yes – Privacy Policy | Server Side
Box.com (formerly box.net)   | S3                    | Yes – Privacy Policy | Server Side
Dropbox                      | S3                    | Yes – Privacy Policy | Server Side
Google Drive                 | Google App Engine     | Yes – Privacy Policy | Server Side
iCloud                       | iDataCenter (EMC)     | Yes – Will disclose  | Server Side
SkyDrive (Microsoft)         | Microsoft Azure       | Yes – Not Secured    | In Transit Only
Spider Oak                   | Proprietary           | No – Zero Knowledge  | Client Side Password
Jungle Disk                  | S3                    | No – No Access       | Client Side Password

This is far from an exhaustive list. All of the products listed have their place and should be used according to your specific application, playing to their strengths and avoiding their weaknesses.

For a very in-depth treatment of cloud storage security, with a special emphasis on one of the most privacy-paranoid countries in the world (Germany), please see the Fraunhofer Institute for Secure Information Technology’s Cloud Storage Technical Report.

Jon-Michael C. Brook is a Sr. Principal Security Architect within Symantec’s Public Sector Organization.  He holds a BS-CEN from the University of Florida and an MBA from the University of South Florida.  He obtained a number of industry certifications, including the CISSP and CCSK, holds patents & trade secrets in intrusion detection, enterprise network controls, cross domain security and semantic data redaction, and has a special interest in privacy.  More information may be found on his LinkedIn profile.

 

Your Cloud Provider is a Partner… Not a One-Night Stand

“We programmatically interface with Cloud Providers to manage our customer data, so we can rely on them for securing our services right?” Wrong!

 

The moment you start interfacing with a Cloud Provider, you immediately inherit the risks associated with their deployment, development, and security models – or lack thereof, in many cases. You are still responsible for the secure development of your business’s applications and services, but with the caveat that you are now sharing that responsibility with a Cloud Provider. Unfortunately, most Cloud Providers do not provide sufficient visibility into the maturity of security activities within their software development lifecycle.

 

Below, we’ll take a brief walk through a secure buy-cycle for a Cloud Provider, looking at how you are affected by interfacing with Cloud Providers and what you can do to ensure consistent adherence to secure programming patterns and practices.

Gaining Visibility into Security Activities

 

Gaining visibility into the security posture of a Cloud Provider requires a large amount of discussion and documentation review. There are several common security activities that I look for when evaluating a Cloud Provider. If I were to evaluate your security capabilities as a Cloud Provider, some of my very first questions would be:

 

Do you centralize application security initiatives?

 

As a user of your Cloud Provider services, I need assurance that your development team and management staff are enabled by a centralized security team to produce fully secured products. Show me that you have a centralized security team or standards committee. I want to see a team that is responsible for defining application security practices and standards, as well as defining and recommending security activities within the organization. Don’t run your application security program like the Wild West!

Do you enforce an application security-training curriculum?

 

As a user of your Cloud Provider services, I need assurance that your development team and management staff are aware of the latest secure programming vulnerabilities and their mitigation strategies. Before you can begin addressing application security risks, your team needs to understand those core risks!

Do you facilitate secure development through automation?

 

As a user of your Cloud Provider services, I need assurance that your development team and management staff have the tooling necessary to streamline challenging security activities for quick remediation. This is simply a matter of scalability; humans alone are not a viable option for finding and fixing every problem in your codebase. Technologies such as Static Analysis Security Testing (SAST) and Dynamic Application Security Testing (DAST) help scale code review and penetration testing by focusing on a common set of application security problems, while human resources apply more specialized techniques to the business-contextual components of your services.
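
As a toy illustration of the kind of check a SAST tool automates – mine, not any particular product’s – the sketch below greps Python source for string-built SQL, a classic injection smell. Real tools parse code and track data flow; this crude heuristic only hints at the idea:

```python
# Toy "SAST" check: flag SQL built via concatenation or interpolation.
import pathlib
import re

SQL_SMELL = re.compile(r"execute\(.*(\+|%|format\()")  # deliberately crude

def scan_tree(root: str):
    """Yield (file, line number, source line) for each suspicious line."""
    for pyfile in pathlib.Path(root).rglob("*.py"):
        text = pyfile.read_text(errors="ignore")
        for lineno, line in enumerate(text.splitlines(), 1):
            if SQL_SMELL.search(line):
                yield str(pyfile), lineno, line.strip()

if __name__ == "__main__":
    for path, lineno, line in scan_tree("."):
        print(f"{path}:{lineno}: possible SQL injection: {line}")
```

The value of automation is exactly what the article argues: this runs on every commit, not once a year.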

 

I do not want to hear that you “perform penetration tests on a yearly basis using a 3rd party firm and/or 3rd party tool.” This type of process is not continuous, does not enable developers, does not scale, and leaves too many open problems.

Do you have incident response for dealing with security vulnerabilities?

 

As a user of your Cloud Provider services, I need assurance that you have a process in place to respond to vulnerabilities identified in production applications. I’m looking for a standardized process that is well understood by the key stakeholders in your business and the applicable business unit.

 

Show me the turn-around time for fixing vulnerabilities. Give me an understanding of compensating controls used to reduce exposure of exploitable vulnerabilities. Most importantly, show me who did what, when, and how. I cannot make educated and well-informed decisions for my business if you do not provide me with enough information from your end.

How do you ensure confidentiality and integrity of sensitive data?

 

As a user of your Cloud Provider services, I need assurance that you have sufficient controls in place to protect my sensitive data throughout the service lifecycle. Tell me the protections you have in place when sensitive data is being entered into the application, when the sensitive data is transmitted across the wire, when the sensitive data is at rest, and when the data is presented to end users.

 

Key security controls that I am looking for in this regard include using FIPS 140-2 compliant cryptographic modules, masking of sensitive fields, use of Transport Layer Security (TLS) for network transmission, use of strong encryption and message digest algorithms for persistence, and a key management strategy that incorporates key rotation and processes to minimize disclosure. The last thing I’d want is you storing the cryptographic key in a database column adjacent to the encrypted data!
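
To make that last warning concrete, here is a small sketch – entirely my own illustration – contrasting the key-next-to-data anti-pattern with key separation and field masking; the key-service URI and field names are hypothetical:

```python
# Sketch of the anti-pattern vs. key separation. Names are hypothetical.
from cryptography.fernet import Fernet

key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"555-55-5555")

# Anti-pattern: the key sits in a column next to the data it protects,
# so one SELECT (or one stolen backup) exposes both.
bad_row = {"ssn_encrypted": ciphertext, "ssn_key": key}

# Better: the database holds only ciphertext plus a key *identifier*;
# the key itself lives in a separate key-management service, and key
# rotation is handled by re-wrapping data under a new key version.
good_row = {"ssn_encrypted": ciphertext, "key_id": "kms://keys/ssn/v7"}

def mask(ssn: str) -> str:
    """Field masking for display to end users, as described above."""
    return "***-**-" + ssn[-4:]

print(mask("555-55-5555"))  # ***-**-5555
```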

How can my team make use of your services securely?

 

As a user of your Cloud Provider services, I need assurance that my development team will have all the support they need to systematically interface with your exposed API in a secure fashion. Show me clear and concise documentation of the security features and security characteristics of your exposed functionality. My development teams need to understand your authentication and identity management workflow along with guidance on how to manage those identity tokens.

 

My development teams also need to understand any security-relevant assumptions you place on your exposed API. For example, are you expecting my development team to verify that the user is authorized to access a database record by querying the UserEntitlements endpoint prior to querying the DatabaseRecord endpoint? Or have you encapsulated the authorization logic within the DatabaseRecord endpoint so that my development team only has to make one API call? (Both patterns are sketched below.) I definitely don’t want to be responsible for disclosing my users’ information because you did not provide me guidance on how to securely interact with your service.
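
Sketched in Python with the requests library, the two patterns look like this; the endpoints are the hypothetical ones named above, and the base URL is invented:

```python
# The two API interaction patterns described above. The UserEntitlements
# and DatabaseRecord endpoints are hypothetical, as is the base URL.
import requests

BASE = "https://api.cloud-provider.example.com"

def fetch_record_explicit_authz(session: requests.Session,
                                user_id: str, record_id: str) -> dict:
    # Pattern A: the consumer must check entitlements itself, first.
    entitlements = session.get(f"{BASE}/UserEntitlements/{user_id}").json()
    if record_id not in entitlements.get("records", []):
        raise PermissionError("user is not entitled to this record")
    return session.get(f"{BASE}/DatabaseRecord/{record_id}").json()

def fetch_record_encapsulated_authz(session: requests.Session,
                                    record_id: str) -> dict:
    # Pattern B: authorization lives inside the endpoint; the provider
    # answers 403 itself when the caller is not entitled.
    response = session.get(f"{BASE}/DatabaseRecord/{record_id}")
    response.raise_for_status()
    return response.json()
```

Pattern A is easy for a consuming team to skip by accident, which is precisely why such assumptions must be documented.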

Verify Security Claims and Assertions

 

While simply hammering your potential Cloud Provider with application security questions like the above helps provide visibility into their security posture, it in no way verifies that they’re doing what they claim. In an ideal partnership, it is prudent for you to require your potential Cloud Provider to “get tested” by an application security team before moving the relationship forward. Whether an internal team or a 3rd party carries out the assessment, the goal of the effort would be to gain confidence that the Cloud Provider is properly adhering to and implementing their security claims and assertions.

 

The assessment should cover not only a code review and penetration test of the target services, but should also evaluate the capability of the Cloud Provider to implement their security activities throughout their Software Development Lifecycle. Use the vulnerabilities from the code review and penetration test to assist in the evaluation of their security activity effectiveness. Ask them:

 

  1. What vulnerabilities in this report are known and unknown?
  2. How long have you been working on remediating the known?
  3. Why do you believe the unknown were not previously identified?
  4. How long will it take to fix these vulnerabilities?

 

You can roughly estimate which security activity failed based on evidence from a combined code review and penetration test. If the vulnerabilities indicate a complete lack of security control(s), then there is likely a serious problem with the Cloud Provider’s planning and requirements phases. If the appropriate security controls exist but were not used correctly, or there are various implementations of the same security control, then there is likely a problem in the design and implementation phases. If a vulnerability is substantial and was unknown, then there is likely a serious problem with the Cloud Provider’s secure coding enforcement strategies. Finally, if a vulnerability is substantial and has been known for an extended period of time, then there is likely a serious problem with the Cloud Provider’s incident response strategies.

 

Conclusion

 

There is a very common problem facing consumers of Cloud Providers today; they simply fail to dig deep enough in the selection process and settle for what looks good on the surface – a surefire way to build a short-lived relationship. You must realize that you inherit the risk of your Cloud Provider the moment you leverage their services. The risks are further compounded when sensitive information is passed through these Cloud Provider services. When you evaluate your future Cloud Providers, ensure that you gain visibility into their application security activities and you verify security assertions and claims through penetration tests and code reviews. After all, your Cloud Provider is a Partner… not a One-Night Stand!

 

Eric Sheridan – Chief Scientist, Static Analysis

 

Eric Sheridan is responsible for the research, design, implementation, and deployment of core static analysis technologies embedded within WhiteHat Sentinel Source. Mr. Sheridan brings more than 10 years of application security expertise to WhiteHat Security with a focus on secure programming patterns and practices. This experience has allowed Mr. Sheridan to infuse WhiteHat Security with the ability to create static analysis strategies and technologies that actually target the correct problem domain thus enabling developers to produce more secure code. In addition to his static analysis expertise, Mr. Sheridan has enormous experience in defining, integrating, and executing security activities throughout the software development lifecycle.

 

Prior to joining WhiteHat Security, Mr. Sheridan co-founded Infrared Security, a company specializing in application security consultation and the development of next-generation static analysis technologies ultimately used within WhiteHat Sentinel Source. Aside from providing professional consultation services to organizations in both the government and private sectors for more than 6 years, Mr. Sheridan frequently contributes to the Open Web Application Security Project (OWASP). Mr. Sheridan led the creation of the CSRFGuard and CSRF Prevention Cheat Sheet projects while contributing to WebGoat, CSRFTester, and Stinger.

 

Avoiding Storms In The Cloud – The Critical Need for Independent Verification

By Chris Wysopal, Co-founder and CTO of Veracode

Last year, Forrester predicted that cloud computing would top $240 billion in 2020. Market Research Media came up with a more aggressive forecast of $270 billion in 2020. None of this data is particularly surprising, as cloud technology is clearly here to stay – particularly if cloud providers are able to maintain secure environments for their customers. As companies adapt to the shifting cloud paradigm to address cost, scalability, and ease-of-delivery issues, there continues to be a growing concern about the safety of data in the cloud, and whether cloud security can ever be as robust as enterprise security.

The dangers associated with storing information in the cloud are regularly highlighted in well-publicized breaches and security flaws experienced by some of the world’s most well-known brands. Cloud businesses such as Amazon, Yahoo, LinkedIn, eHarmony and Dropbox have all been attacked in just the last few months, but the problem is not exclusive to consumer-facing businesses. B2B organizations that offer cloud-based solutions, like my company Veracode, are facing their own set of security requirements from business customers that need to ensure data is protected.

The answer to why cloud security has become such a fast growing concern for enterprise organizations today can be found in a perfect storm of current trends.

First, the reporting of security breaches has skyrocketed, in part because hacktivists love the publicity, but also because crime typically occurs where there is value, and in our digital economy the value resides in various forms of intellectual property.

Second, today’s cloud computing environments often distribute corporate intellectual property across many different infrastructures while promising authorized users ready access to that information, which means the value can be found in many places.

Third, enterprise organizations rarely use just one cloud-based service. If one were to count the number of Salesforce.com customers that have integrated the service with other cloud-based marketing automation or accounting solutions, it would be a very high number. With all of this corporate information and intellectual property now residing in so many interconnected places in the cloud, hackers that are actively looking for weaknesses can abuse those connections and wreak havoc for cloud customer and provider alike.

What most enterprise organizations are looking for from prospective cloud-based solution providers is transparency in the provider’s security mechanisms and IT processes. Companies want to know what security mechanisms are being used to keep their information confidential and secure, particularly while it is in transit to and from the provider’s datacenter, but also while it is in use in the datacenter, while it is at rest in a disaster recovery site, and ultimately, how the information is finally deleted. Customers are also concerned about the security mechanisms used to authenticate company users that will be accessing and updating the information. Sure, the goal of most cloud-delivered services is to provide fast, easy, ready access to corporate information – but only to the appropriate people.

In terms of process transparency, companies need (and want) to know that a provider’s IT procedures do not allow for corporate information to be exposed to members of the provider’s workforce, even during routine maintenance or updates to infrastructure or service software. They also want to know whether the service infrastructure and software is continually being hardened against attack, and that the incident response procedures are well known and appropriately followed.  Many breaches have been tied to vulnerabilities, such as SQL injection, in the custom software developed by the service provider.  Customers are beginning to seek evidence that this software was developed and tested for security.

This brings us to the impact cloud security concerns are having on solution providers.  While customers are certainly asking more questions about their providers’ security, they are also increasingly expecting independent proof of the answers. This is a good thing.

One example that we recently encountered at Veracode was during an RFP process, which asked that we answer the checklist questions published in Gartner’s September 2011 research note titled “Critical Security Questions to Ask a Cloud Service Provider.” The checklist is designed to arm customers with the necessary security questions to ask of their cloud-based solution providers as part of their due diligence. We provided those answers, but the customer went further, asking for our SysTrust report and proof that our hosting provider was certified as an SSAE 16 facility. SysTrust certification requires Ernst & Young audits every January and February that review process documentation, include personnel interviews, and review activity logs to determine whether effective platform controls existed to protect information during the previous year. The hosting provider also goes through a similar process with its auditors, providing an added layer of third-party security validation.

Ultimately the burden of security should fall on both the cloud solution provider and the customer. As Greg Rusu, general manager of PEER 1 Hosting’s public cloud division Zunicore stated in a recent InfoSecurity article, “the burden of security lies with both the cloud provider and the customer. No matter how secure the cloud provider makes the infrastructure…what we see in practice is that security is a partnership.”

After all, at the end of the day it is the customer’s duty to protect its intellectual property and corporate information. Taking assurances from cloud solution vendors, even in writing, only provides a certain level of comfort, which is why calling for third-party validation is so critical. This level of third-party inspection is no different from the advice we give our own customers about securing their applications – trust is good, but independent validation is much better.

BIO

Chris Wysopal, co-founder and chief technology officer of Veracode, is responsible for the security analysis capabilities of Veracode technology. He is recognized as an expert in the information security field, and his opinions on Internet security are highly sought after. Wysopal has given keynotes at computer security events and has testified on Capitol Hill on the subjects of government computer security and how vulnerabilities are discovered in software. He also has spoken as the keynote at West Point, to the Defense Information Systems Agency (DISA) and before the International Financial Futures and Options Exchange in London. Wysopal’s groundbreaking work in 2002 while at the company @stake was instrumental in developing industry guidelines for responsibly disclosing software security vulnerabilities. He is a founder of the Organization for Internet Safety, which established industry standards for the responsible disclosure of Internet security vulnerabilities.

Big Data, Big Cloud, Big Problem

By Todd Thiemann

Big data presents a big opportunity for businesses to mine large volumes of data from a variety of sources to make better, higher-velocity decisions. Since big data implementations are practically always deployed in a cloud environment, be it a private or public cloud, this poses a major security challenge. That’s because some of that “Big Data” will inevitably be sensitive: intellectual property covered by corporate security mandates, cardholder data affected by PCI DSS, or Personally Identifiable Information (PII) affected by state or national data breach laws.

For the purposes of this article, our definition of Big Data refers to non-relational storage and processing technologies, including NoSQL tools such as Hadoop, MongoDB, Cassandra and CouchDB. These offerings comprise the bulk of “Big Data” deployments and share similar security challenges. For example, the Hadoop Distributed File System (HDFS) is used to store data that needs to be analyzed. Software frameworks such as MapReduce or Scribe process large amounts of data in parallel on large clusters of commodity compute nodes. Tasks are distributed and processed in a completely parallel manner across the cluster. The framework sorts the map output, which is then used as input to the reduce tasks. Typically both the input and the output of the job are stored across the cluster of compute nodes.
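
For readers new to the model, the toy, single-process sketch below walks the same map / sort-shuffle / reduce flow just described; Hadoop’s contribution is running these phases in parallel across a cluster, not the logic itself:

```python
# Toy single-process MapReduce: word count. Hadoop runs the same three
# phases - map, sort/shuffle, reduce - in parallel across a cluster.
from collections import defaultdict

def map_phase(records):
    for line in records:                  # mappers emit (key, value) pairs
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    groups = defaultdict(list)            # the framework groups and sorts by key
    for key, value in pairs:
        groups[key].append(value)
    return sorted(groups.items())

def reduce_phase(groups):
    for key, values in groups:            # reducers aggregate each key's values
        yield (key, sum(values))

lines = ["big data big cloud", "big problem"]
print(dict(reduce_phase(shuffle(map_phase(lines)))))
# {'big': 3, 'cloud': 1, 'data': 1, 'problem': 1}
```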

The ability to perform complex ad-hoc queries against massive disparate datasets can unlock tremendous value for enterprises. In order to tap this intelligence, companies are using distributed file systems such as Hadoop. This is primarily because the volume of data has increased beyond the performance capabilities of relational database systems.

While traditional relational databases use the concept of a data container, this is absent in the Big Data world. Instead of a datafile associated with a database, NoSQL implementations scatter files across hundreds or thousands of nodes.  As a result, sensitive data that requires protection is no longer in one compact tablespace on a single system, but can be scattered among a multitude of nodes in the cloud.

One of the key challenges posed by NoSQL tools is that while they are great at crunching massive volumes of data, they have virtually zero built-in security or access control capabilities. If a Big Data deployment includes or will include sensitive data, it’s imperative to put data security and access controls in place. Operating a Big Data infrastructure without some form of security is a very high risk endeavor.

The following threats and how to mitigate them are important considerations in Big Data environments:

Privileged User Abuse – keeping system administrators from accessing or copying sensitive data.

Unauthorized Applications – preventing rogue application processes from touching your Big Data.

Managing Administrative Access – While system administrators should not be allowed to access data, they may need access to the directory structure for maintenance operations and performing backups.

Monitoring Access – Understanding who is accessing what data in a Big Data repository allows for necessary auditing and reporting.

When it comes to protecting and controlling access to Big Data, encryption combined with key management are central elements of a layered security approach. Here are some important considerations when securing Big Data environments:

  • Classify data & threats – This is one of the biggest challenges for any data security project – knowing what is sensitive, where is it located, what are the potential threats. If no sensitive data is in scope, data protection may not be necessary. If sensitive data is stored in the Big Data environment, it needs to be protected.  Talking to the Big Data development team about the nature of the data is a first step.
  • Encryption & Key Management – Taping the key to the front door just above the door knob is not a security best practice. In the same vein, storing encryption keys within the data environment they are protecting is also not a best practice.
  • Separation of Duties – this has many implications, but one is that encryption keys should never be under the control of IT administrators.
  • Costs – Minimizing silos of encryption and key management typically reduces costs and minimizes scalability, audit, and total cost of ownership issues.
  • Performance – Enterprises are embracing Big Data for its potential to enable faster decision making. By the same token, encryption and key management should not significantly slow down Big Data system performance
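As a minimal illustration of keeping keys out of the data environment, the sketch below uses envelope encryption: each sensitive field is encrypted with its own data key, only the wrapped (encrypted) data key is stored alongside the data, and the key-encryption key never leaves an external key manager. It relies on the open-source `cryptography` package; the ExternalKeyManager class is a hypothetical stand-in for a real KMS.

```python
from cryptography.fernet import Fernet

class ExternalKeyManager:
    """Hypothetical stand-in for a key-management service that lives
    outside the Big Data cluster; only it ever holds the
    key-encryption key (KEK)."""
    def __init__(self):
        self._kek = Fernet(Fernet.generate_key())

    def wrap(self, data_key: bytes) -> bytes:
        return self._kek.encrypt(data_key)

    def unwrap(self, wrapped_key: bytes) -> bytes:
        return self._kek.decrypt(wrapped_key)

kms = ExternalKeyManager()

# Encrypt a sensitive field with a fresh data key...
data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(b"4111-1111-1111-1111")
# ...and persist only the wrapped key next to the ciphertext.
wrapped_key = kms.wrap(data_key)
del data_key  # the plaintext data key never persists in the cluster

# An authorized node asks the KMS to unwrap the key when needed.
plaintext = Fernet(kms.unwrap(wrapped_key)).decrypt(ciphertext)
assert plaintext == b"4111-1111-1111-1111"
```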

Big Data promises to be the proverbial goose that lays golden eggs. Understanding the data security and privacy risks associated with a Big Data environment early in the development process, and taking appropriate steps to protect sensitive information, will prevent that goose from getting cooked.

Todd Thiemann is senior director of product marketing at Vormetric and co-chair of the Cloud Security Alliance (CSA) Solution Provider Advisory Council.

Best Practices to Secure the Cloud with Identity Management

Authored by: Dan Dagnall, Director of Pre-Sales Engineering at Fischer International Identity


What is the “cloud identity”?   The “cloud identity” begins at the birth of the user’s digital identity and includes the attributes that define who you are.  “Cloud identity” is not a new term to those in the industry, but it has definitely taken hold as the way to define “you” in the cloud.  Much focus has been placed on how to enable a secure authentication event (through mechanisms like ADFS or Shibboleth), which is a key component of securing the transaction between Identity Providers (“IdP”) and Service Providers (“SP”). However, too little focus has been placed on the fundamental component required to ensure the integrity of the transaction; by “integrity,” I mean that the person is right, the attributes are right, and the values are right.  The integrity of a “cloud identity” transaction can only be secured by sound identity management practices, with a razor-sharp focus on attribute management and policy enforcement.


Competent attribute management is the foundation of securing the “cloud identity.”  It is the attribute and its corresponding value that ultimately determine the digital identity of an individual (or entity).  When you consider the level of accuracy required in a cloud-centric world (if your true goal is the validity of the transaction), you will concede the importance of properly representing the user in the cloud.  Viewed in this context, it becomes clear why identity management (IdM) is the epicenter of securing the cloud identity.


Attribute management is much more than “just a middleware component”; it is identity management at a fundamental level.  This fundamental level must not be overlooked as our industry begins discussing the large-scale initiatives to create a common “ecosystem” through which cloud identities will travel.


Two key components of the IdM stack provide for the integrity I’m describing: automation and policy management/enforcement.


Best Practice #1: Automation

Sound identity management practices must include automation, comprising event detection and downstream provisioning: the system automatically detects when a user, along with data associated with that user, is added or modified within the system of record, and then automatically provisions the user and the required attributes to downstream systems. Detecting changes to key attributes specific to the user’s identity (ideally in real time) ensures the validity of the attribute value: that the value is correct, that it is placed in the proper location, and that its placement was authorized.
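As a minimal sketch of this flow, the Python fragment below models an attribute change detected in a system of record and provisioned to downstream targets. The event shape, system names, and provision method are invented for illustration; a production IdM suite would supply real connectors.

```python
from dataclasses import dataclass

@dataclass
class AttributeChange:
    """An event detected in the system of record (e.g. HR)."""
    user_id: str
    attribute: str
    new_value: str

class DownstreamSystem:
    """Hypothetical target system (directory, SaaS application, etc.)."""
    def __init__(self, name, managed_attributes):
        self.name = name
        self.managed_attributes = managed_attributes

    def provision(self, user_id, attribute, value):
        # A real connector would call the target system's API here.
        print(f"[{self.name}] set {attribute}={value!r} for {user_id}")

def propagate(event, targets):
    """Push a detected change only to systems authorized to hold that attribute."""
    for target in targets:
        if event.attribute in target.managed_attributes:
            target.provision(event.user_id, event.attribute, event.new_value)

targets = [
    DownstreamSystem("corp-directory", {"email", "department"}),
    DownstreamSystem("crm-saas", {"email"}),
]
propagate(AttributeChange("jdoe", "department", "Finance"), targets)
# Only corp-directory is updated; crm-saas does not manage 'department'.
```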


Manual modification of users on downstream target systems, including manual entry of attribute/value pairs, is not a secure approach unless identity management has authorized both the actions and the user performing them.  Manual approaches can undermine data integrity, leave the user (whose identity and sensitive information will be floating around the cloud) at a major disadvantage, and lead to improper representation of their identity in the cloud, not to mention the inherent risk for the user and the organization as a whole.  This is a scary reality for some, unless of course IdM has been properly deployed to ensure that malicious events are either immediately detected or thwarted beforehand.


Automated event detection eliminates the need for manual interaction with the user’s attribute set, which, as I’ve discussed, is the single most important aspect of securing one’s identity in the cloud.  When coupled with attribute management, automated event detection enables the proper enforcement of the organizational policies put in place to protect the user.


Best Practice #2: Policy Management & Enforcement

Once automation is introduced, securing the remaining aspects of the cloud identity shifts to policy management and enforcement.  Policy management is the layer of IdM that defines who is authorized and what level of access will be granted to downstream target systems.  Whether bound by regulation (most often the case) or by the requirement to comply with a set of standards and practices in order to participate in global federations (i.e. attribute management processes that meet certain criteria), policy definition is the key to successfully securing the cloud identity.
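As a minimal sketch of such policy definition and enforcement, the fragment below derives entitlements on downstream systems purely from authoritative attributes, with no manual override path. The rules, system names, and attribute names are invented for illustration.

```python
# Policy: map authoritative attributes to access levels on target systems.
# In a real deployment these rules would be versioned and auditable.
POLICY = {
    "student-portal": lambda user: "read" if user.get("affiliation") == "student" else None,
    "finance-app": lambda user: "write" if user.get("department") == "Finance" else None,
}

def entitlements(user_attributes):
    """Evaluate the policy; by design there is no human override here."""
    grants = {}
    for system, rule in POLICY.items():
        level = rule(user_attributes)
        if level is not None:
            grants[system] = level
    return grants

print(entitlements({"affiliation": "student", "department": "Finance"}))
# {'student-portal': 'read', 'finance-app': 'write'}
```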


Securing this layer cannot be accomplished by allowing unchecked human decisions to overrule the policy, because doing so has a direct effect on how the user is represented in the cloud.  As a user, I’d sleep much better knowing that automated policy enforcement is managing my cloud identity, abiding by organizational and regulatory guidelines, and by guidance from bodies like the CSA, to keep my identity safe and properly represented in the cloud.


In conclusion, someone with direct access to my data (because there is no automation), who can manipulate my attribute values without authorization (because there is no policy definition and enforcement), could compromise the representation of my “cloud identity” and call into question the integrity of the entire transaction.

So before you consider cloud-based transactions, specifically those where identity data is required, it is in your best interest to solidify your IdM practices and introduce the components I’ve outlined.  Only then can you truly secure the cloud for your users and your organization.

Application-Aware Firewalls

You may have heard this term recently and wondered what it meant. When it comes to security, everyone thinks of firewalls, proxies, IPS, IDS, honeypots, VPN devices, email security and even Web security, but most people don’t think in terms of application-level security unless they are the developer, admin, or user of a specific service – or perhaps a hacker. When your traditional network boundaries disappear, you can’t carry all of those devices with you. When you move out of your traditional boundaries towards the cloud, you trust the cloud provider to deliver those features, but you can’t do the same with application-level security.  That is because those devices work at levels below the Application Layer (Layer 7 in the ISO OSI model). Those lower-layer standards are very well defined and established, whereas, to an extent, the application layer is still evolving – from COBOL to APIs, everything is fair game.

There is a reason why enterprises are looking for devices that can do it all. I was reading a security research report the other day which suggested that attackers are moving up the stack to the application layer, since it is so easy to hack into applications nowadays – especially with applications moving to the cloud, introducing new vectors of attack, including a whole layer of API/XML threats (if you are still bound to XML/SOAP and can’t free yourself). Most of the organizations I see don’t have the same solid security at the application level as they do at the network level. This discrepancy developed over the last few years as more and more applications came out using new technologies, exposing themselves to newer threats. And there is no unified standard among developers for building application-level security.

The network security we have today is not “application aware.” This means that API/XML and other application-level threats go right through the regular network defenses you’ve built up over the years. Many people think that if they use REST or JSON they are not as prone to attacks as those using SOAP/XML/RPC, which is a misconception.

Add to this the fact that when your applications move beyond your enterprise boundary to the cloud, they are exposed 24×7 to attackers.  This leaves you subject not only to direct attacks on your application, but also to attacks that bounce off other applications hosted in the same multi-tenant environment. So your new “firewall” should be able to inspect and analyze application traffic and identify threats. But the issue doesn’t stop there; you also need to analyze messages (and their attachments) for viruses, malware and “intention” as they pass through. The problem with most firewalls inspecting traffic is that they look at where information is going (a port and perhaps an IP address), but not at what the message is intended to do. There is a reason why injection attacks such as SQL injection, XSS and XPath injection became so popular.
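To illustrate the difference between port-level filtering and message-level inspection, here is a deliberately naive Python sketch that scans an API payload for a few common injection patterns. A real application-aware firewall parses and validates traffic far more deeply; these regular expressions are illustrative, not a usable rule set.

```python
import re

# A few signatures for common injection attempts; a production
# firewall would use full parsing, schema validation and threat feeds.
SUSPICIOUS_PATTERNS = [
    re.compile(r"('|\")\s*(or|and)\s+\d+\s*=\s*\d+", re.IGNORECASE),  # SQL injection
    re.compile(r"<\s*script\b", re.IGNORECASE),                       # cross-site scripting
    re.compile(r"\.\./"),                                             # path traversal
]

def inspect_payload(payload: str) -> bool:
    """Return True if the message body looks malicious.
    A port-level firewall never looks this far into the message."""
    return any(pattern.search(payload) for pattern in SUSPICIOUS_PATTERNS)

print(inspect_payload('{"user": "bob\' OR 1=1"}'))  # True
print(inspect_payload('{"user": "alice"}'))         # False
```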


Now there is another issue, and it relates to the way applications are built nowadays. In the olden days you controlled the client, the server, and even, to an extent, the communication between them. Now we expose APIs and let others build interfaces, middleware, and usage models as they see fit. Imagine a rookie or an outsourced developer writing sub-standard code and putting it out there for everyone to poke and prod for weaknesses.  As we all know, a chain is only as strong as its weakest link, and the problem is that it is hard to figure out which is your weakest link. Application-aware firewalls can not only inspect, analyze and control traffic to applications, but also use inherent knowledge of those applications to work at a deeper level.

This gives you the freedom to move application-level security out of your applications, services and APIs to a centralized location, so your developers can concentrate on what they are supposed to do – develop the services that matter to your organization – and leave the other nuances to the experts.

Andy Thurai — Chief Architect & CTO, Application Security and Identity Products, Intel

Andy Thurai is Chief Architect and CTO of Application Security and Identity Products with Intel, where he is responsible for architecting SOA, Cloud, Governance, Security, and Identity solutions for their major corporate customers. In his role, he is responsible for helping Intel/McAfee field sales, technical teams and customer executives. Prior to this role, he held technology architecture leadership and executive positions with L-1 Identity Solutions, IBM (Datapower), BMC, CSC, and Nortel. His interests and expertise include Cloud, SOA, identity management, security, governance, and SaaS. He holds a degree in Electrical and Electronics Engineering and has more than 20 years of IT experience.


He blogs regularly at http://cloudsecurity.intel.com/ on Security, SOA, Identity, Governance and Cloud topics. You can find him on LinkedIn at http://www.linkedin.com/in/andythurai.

Consumerization 101 – Employee Privacy vs. Corporate Liability

Mary D. joined MD&M Inc. in 2009. Being an Apple enthusiast, she was quite excited to learn that the company offered an innovative BYOD program that allows employees to use their own iPhone for work. As part of the new hire package, Mary signed the acceptable use policy and was granted access to corporate email on the go.

Mary started having performance problems in her second year, and her manager put her on notice. After six months, Mary was terminated. When her manager clicked the ‘terminate’ button within the company’s HR system, a series of automated tasks was initiated, including the remote wipe of all information on Mary’s iPhone.

As it turned out, Mary had been performing poorly because her son John was dying of cancer. Just a few weeks before Mary was terminated, her husband used Mary’s iPhone to take a picture of her with their son. It was the last photo Mary had of her son, and MD&M Inc. unknowingly destroyed it. Mary sued the company for damages.

Just how much is the last photo of a mother and son worth? Attorneys and expert witnesses sought to answer that question. They arrived at $5 million.

Three pitfalls your BYOD program can’t afford to ignore.   

While Mary’s story is a fictitious case debated last year by the International Legal Technology Association (ILTA), it’s just a matter of time before stories like this become mainstream reality. A recent survey by Trend Micro clearly shows that a majority of companies already allow employees to use their personal devices for work-related activities – 75% of organizations in the U.S. offer BYOD programs.

Besides preserving data security and managing a myriad of personal devices, companies must also consider a new set of legal and ethical issues that may arise when employees are using their own devices for work. Here are just three pitfalls to consider:

Pitfall #1: Remote deletion of personal data: Under what circumstances (if any) should the company have the right to remove non-work-related content from an employee-owned device? (A selective-wipe sketch follows this list.)

Pitfall #2: Tracking individual location: Which corporate applications might ‘track’ the location of an employee-owned device? Is the employee aware that this is possible?

Pitfall #3: Monitoring Internet access: Should access to questionable websites be restricted when an employee is using a personal device partly for work?
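One common mitigation for the first pitfall is a selective wipe that removes only corporate-managed data and leaves personal content untouched. The Python sketch below models such an offboarding task; the device inventory and its tags are hypothetical and not drawn from any particular mobile device management product.

```python
# Hypothetical device inventory: each item is tagged with its origin.
device_data = {
    "corp-email-cache": {"managed": True},
    "corp-vpn-profile": {"managed": True},
    "family-photos": {"managed": False},
    "personal-music": {"managed": False},
}

def selective_wipe(data):
    """Remove only corporate-managed items on termination.
    A full-device wipe at this step is what destroyed Mary's last photo."""
    to_wipe = [name for name, meta in data.items() if meta["managed"]]
    for name in to_wipe:
        del data[name]
    return to_wipe

print(selective_wipe(device_data))  # ['corp-email-cache', 'corp-vpn-profile']
print(sorted(device_data))          # ['family-photos', 'personal-music']
```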


NEXT: BYOD Best Practices – Three pitfalls you can’t afford to ignore.


Cesare Garlati, Vice President Consumerization and Mobile Security, Trend Micro


As Vice President of Consumerization and Mobile Security at Trend Micro, Cesare Garlati serves as the evangelist for the enterprise mobility product line. Cesare is responsible for raising awareness of Trend Micro’s vision for security solutions in an increasingly consumerized IT world, as well as for ensuring that customer insights are incorporated into Trend solutions. Prior to Trend Micro, Mr. Garlati held director positions at leading mobility companies such as iPass, Smith Micro and WaveMarket. Before that, he was senior manager of product development at Oracle, where he led the development of Oracle’s first cloud application and many other modules of the Oracle E-Business Suite.


Cesare has been frequently quoted in the press, including such media outlets as The Economist, Financial Times, The Register, The Guardian, Le Figaro, El Pais, Il Sole 24 Ore, ZDNet, SC Magazine, Computing and CBS News. An accomplished public speaker, Cesare has also delivered presentations and featured speeches at many events, including Mobile World Congress, Gartner Security Summits, IDC CIO Forums, CTIA Applications and the RSA Conference.


Cesare holds a Berkeley MBA, a BS in Computer Science and numerous professional certifications from Microsoft, Cisco and Sun. Cesare is the chair of the Consumerization Advisory Board at Trend Micro and co-chair of the CSA Mobile Working Group.


You can follow Cesare at http://BringYourOwnIT.com and on Twitter at http://twitter.com/CesareGarlati.