An In-House Security Approach for Cloud Services That Won’t Drive Your IT Department Insane

July 11, 2016

By Jane Melia, VP/Strategic Business Development, QuintessenceLabs

“If your security sucks now, you’ll be pleasantly surprised by the lack of change when you move to cloud.” — Chris Hoff, Former CTO of Security, Juniper Networks

Chances are, almost everyone in your organization loves the convenience of the cloud for data storage and for collaborative workflow needs. And why wouldn’t they, when documents and files are now easily accessible to all team members, whether down the hall, in another state, or even on another continent? From a cost and operations perspective, cloud storage is certainly pretty compelling. However, “almost everyone” might not include CIOs, CISOs, and their teams, who often harbor concerns about the security of data in the cloud, particularly where sensitive data is involved. I have similar misgivings. I’m not saying that we should not use the cloud, but I do believe that we can improve how we secure sensitive data stored on it.

Blue Skies or Dark Clouds Ahead?
In a recent report titled “Blue Skies Ahead? The State of Cloud Adoption,” Intel Security found that IT decision makers are warming to the cloud along with the rest of us, with 77 percent saying they trusted the cloud more than they did a year ago. That headline number hides a darker reality: only 13 percent of respondents voiced full trust in the public cloud, and only 37 percent fully trusted their private cloud. Yet a full 40 percent of respondents claim to process sensitive data in the cloud, indicating that there is both room and a real need for cloud security improvement.

Adding Peace of Mind to Cloud Storage
When I hand over data to a third party, I want to be sure that they are not only contractually obliged to look after it properly but actually equipped to do so. This means protecting it from accidental loss, malicious attacks, and silent subpoenas, among other threats. Logging and multi-factor authentication are part of the toolkit that can be implemented, as is encryption. There is an existing (and growing) awareness of the importance of encryption, which is why most cloud service providers offer encryption options of one kind or another. But too frequently the third-party vendor is doing the encrypting and holding the keys, which isn’t very reassuring, to say the least.

Fundamentally, the best way to ensure data is safe and well managed is to pre-encrypt it before it is sent to the cloud. Coupled with a policy of keeping key management in house, these precautions should allow for several hours of blissful sleep each night for members of the IT security team, whether the cloud is public, private, or a hybrid of the two! Other approaches include splitting the storage solution across two or more vendors: one vendor manages the keys while the other manages the storage itself. Key wrapping is another way to reduce risk: the end customer manages master keys that in turn wrap the document keys, giving you some assurance of isolation between your data and that of other customers stored on the same cloud, as well as control over document access. Through these approaches, you can provide a significantly higher level of protection for data stored in the cloud.
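
To make the pattern concrete, here is a minimal sketch of that envelope (key-wrapping) approach in Python, assuming the open-source cryptography package. The function names are illustrative, and in practice the master key would come from an in-house key manager or HSM rather than living in memory:

```python
# Minimal sketch: client-side envelope encryption before upload.
# The master key never leaves the house; only ciphertext and the
# wrapped (encrypted) per-document key are handed to the cloud.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

MASTER_KEY = AESGCM.generate_key(bit_length=256)  # assume: fetched from your in-house KMS/HSM

def encrypt_for_cloud(plaintext: bytes) -> dict:
    doc_key = AESGCM.generate_key(bit_length=256)  # fresh data key per document
    nonce = os.urandom(12)
    ciphertext = AESGCM(doc_key).encrypt(nonce, plaintext, None)
    # Wrap the document key under the in-house master key.
    wrap_nonce = os.urandom(12)
    wrapped_key = AESGCM(MASTER_KEY).encrypt(wrap_nonce, doc_key, None)
    # Everything in this dict is safe to store with the cloud provider.
    return {"ciphertext": ciphertext, "nonce": nonce,
            "wrapped_key": wrapped_key, "wrap_nonce": wrap_nonce}

def decrypt_from_cloud(blob: dict) -> bytes:
    doc_key = AESGCM(MASTER_KEY).decrypt(blob["wrap_nonce"], blob["wrapped_key"], None)
    return AESGCM(doc_key).decrypt(blob["nonce"], blob["ciphertext"], None)

blob = encrypt_for_cloud(b"quarterly financials")
assert decrypt_from_cloud(blob) == b"quarterly financials"
```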

Encryption is the best tool we have for protecting sensitive information, so we need to use it to support and enable our expansion to the cloud. As seen above, the devil is in the details of how we do it, but keeping control of the keys is fundamental. Of course, there is also the question of how strong the keys you are using actually are, but that is a topic for another day.

No More Excuses – Time to Get a Grip On Your Cloud Security

July 7, 2016

Rolf Haas, Enterprise Technology Specialist/Network Security and Content Division, Intel Security

Cloud use continues to grow rapidly in the enterprise and has unquestionably become a part of mainstream IT – so much so that many organizations now claim to have a “cloud-first” strategy.

That’s backed up by a survey* we commissioned here at Intel Security that questioned 1,200 cloud security decision makers across eight countries. One of the most startling findings: 80% of respondents’ IT spend will go to cloud services within just 16 months.

Even if that outlook overestimates cloud spend, it still shows a dramatic shift in mindset, and it’s often the business, rather than the IT department, that is driving the shift. In today’s digital world the pull of the cloud and its benefits of flexibility, speed, innovation, cost, and scalability are now too great to be dismissed by the usual fears. To compete today, businesses need to rapidly adopt and deploy new services, scale up or down in response to demand, and meet the ever-evolving needs and expectations of employees and customers.

This new-found optimism for the cloud inevitably means more critical and sensitive data is put into cloud services. And that means security is going to become a massive issue.

If we look at our survey results, the picture isn’t great when it comes to how well organizations handle cloud security today. Some 40% are failing to protect files in SaaS applications with encryption or data loss prevention tools, 43% do not use encryption or anti-malware on their private cloud servers, and 38% use IaaS without encryption or anti-malware.

Many organizations have already been at the sharp end of cloud security incidents. Nearly a quarter of respondents (23%) report cloud provider data losses or breaches, and one in five reports unauthorized access to their organization’s data or services in the cloud. The reality check here is that the most commonly cited cloud security issues were actually around migrating services or data, high costs, and lack of visibility into the provider’s operations.

Trust is growing in cloud providers and services, but 72% of decision makers in our survey point to cloud compliance as their greatest concern. That’s not surprising given the current lack of visibility around cloud usage and where cloud data is being stored.

The wider trend to move away from the traditional PC-centric environment to unmanaged mobile devices is another factor here. Take a common example: an employee wants to copy data to their smartphone from a CRM tool via the Salesforce app. The problem is they have the credentials to go to that cloud service and access that data, but in this case are using an untrusted and unmanaged device. Now multiply that situation across all an organization’s cloud services and user devices.

There is clearly a need for better cloud-control tools across the stack. Large organizations may have hundreds or even thousands of cloud services in use by employees – some of which they probably don’t even know about. It is impossible to implement separate controls and policies for each of them.

To securely reap the benefits of cloud while meeting compliance and governance requirements, enterprises will need to take advantage of technologies and tools such as two-factor authentication, data leakage prevention, and encryption, on top of their cloud services and applications.

Increasingly, organizations are also investing in security-as-a-service (SECaaS) and other tools that can help orchestrate security across multiple providers and environments. These help tackle the visibility issue and ensure compliance needs are met. That’s why I believe we are starting to see the rise of so-called “broker” security services. These cloud access security brokers (CASBs) enable consolidated enterprise security policy enforcement between the cloud service user and the cloud service provider. That view is backed up by Gartner, which has picked out CASBs as a high-growth spot in the security market and predicts that by 2020, 85% of large enterprises will use a CASB for their cloud services, up from fewer than 5% today.

This will all be driven by the rapid growth in enterprise cloud adoption and the need for a new model of security that enables the centralized control or orchestration of the myriad cloud services and apps employees use across the enterprise. Cloud security is now a critical element of any business, and it needs to be taken seriously from the boardroom right down to the end users.

*Blue Skies Ahead? The State of Cloud Adoption
The survey of 1,200 IT decision makers with responsibility for cloud security in their organizations was conducted by Vanson Bourne in June 2015. Respondents were drawn from Australia, Brazil, Canada, France, Germany, Spain, the UK, and the US across a range of organizations, from those with 251 to 500 employees to those with more than 5,000 employees.

Shock Treatment: Combatting Infosec Negligence

July 6, 2016

By Peter Wood, Cyber Security Consultant, Code42

Boring training videos, box-ticking to meet regulations, blacklisting software at the expense of productivity: large enterprises have relied on these methods of “cyber security control” for too long. They are outdated and don’t work. Cyber criminals don’t follow the steps outlined in a training video from 2006—they innovate, manipulate, penetrate and steal information in many different ways and by many different means.

Internally, employees can also represent a real and significant danger to corporate information, whether by accident or by design: they are the insider threat. Think about it this way: Dropbox might be an easy way to transfer a file to a client—but has it been sanctioned by IT? Ask every knowledge worker in a company that question, and you can guarantee you won’t get a single, clear-cut answer. In fact, according to Code42’s 2016 Datastrophe Study, 22% of knowledge workers surveyed said their IT department doesn’t know they use third-party cloud sharing solutions.

So in 2016, what are the right ways to educate your employees about data security from both an internal and external perspective?

Shock therapy
We briefly covered that training videos and generic presentations don’t work that well. Within 10 minutes, staff will have switched off and words will be going in one ear and out the other—unless you’ve invited Snowden himself to present the training.

To encourage employees to take responsibility and ownership of sensitive corporate data, a more direct approach is needed. Fortunately, cybersecurity consultancy and threat-based penetration testing are areas we’re well versed in at First Base Technologies, and we’d recommend the following to drive employee awareness:

  • Faking data loss—by targeting specific departments (or even the entire company) with a well-designed program of phishing attacks, you can easily demonstrate the real risk to the business and start the process of education. No information is actually compromised, and the affected employees are told it’s been a simple training exercise. I can guarantee that over time, with the right messages, it’ll hammer home the importance of double-checking whether to click that link, install that file, or respond to that unknown request in the future. Think of it as the cyber security equivalent of regular fire drills. (A minimal sketch of the click-tracking side of such a drill appears after this list.)
  • Physical penetration testing—this involves hiring third-party security consultants to visit an office disguised as “help-desk” computer engineers, visitors or even cleaners. In actuality, they are penetration testers evaluating both the physical security of an organization and its network infrastructure, with the goal of demonstrating unauthorized access to sensitive information. The resulting report, often accompanied by video footage of the exercise, provides valuable guidance on security weaknesses and remediation. Staff are briefed on what happened and the potential gravity of the situation—providing another important lesson as a result.
  • Company-wide warnings—as information security professionals, we are well versed in the latest threats and the results of high-profile breaches. And thanks to the recent media agenda, that awareness does seem to be filtering down to non-IT folk too. According to Datastrophe, 74% of knowledge workers say that IT staff’s ability to protect corporate and customer data is very important to their company’s brand and reputation. To communicate these facts to the remaining 26% of employees, breach and security risk information should be regularly delivered to staff at all levels.
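
To illustrate the phishing-drill idea from the first bullet, here is a minimal, hypothetical sketch of the click-tracking side in Python using Flask. The route, tokens, and log format are illustrative assumptions, not any vendor’s tooling; the email-sending half is elided:

```python
# Minimal sketch: log who clicked a phishing-drill link, then show a
# training page instead of anything harmful. Tokens map to recipients.
import csv
import datetime
from flask import Flask

app = Flask(__name__)
RECIPIENTS = {"a1b2c3": "alice@example.com", "d4e5f6": "bob@example.com"}

@app.route("/drill/<token>")
def clicked(token):
    email = RECIPIENTS.get(token, "unknown")
    with open("clicks.csv", "a", newline="") as f:
        csv.writer(f).writerow([datetime.datetime.utcnow().isoformat(), email])
    return ("This was a phishing drill. No data was taken, but a real attacker "
            "would now control your account. Please review the security guide.")

if __name__ == "__main__":
    app.run(port=8080)
```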

Education. It really is the most important weapon in the IT and security professional’s arsenal. It’s a fact that in 2016 and beyond, organizations are under attack pretty much constantly, and if employees aren’t wise to this, the insider threat they present will be realized, with devastating results. With Datastrophe highlighting that 36% of knowledge workers think the business they work for may be at risk of a public data breach in the next year, it seems people are fortunately starting to understand the threat. And if IT and senior management enact some of the training methods above, knowledge workers will become well versed in information security practices too.

FedRAMP High Baseline Requirements Published

July 1, 2016

By Abel Sussman, Director, TAAS–Public Sector and Cyber Risk Advisory, Coalfire

The Federal Risk and Authorization Management Program (FedRAMP) Project Management Office officially released its High baseline for high impact-level systems. This baseline is at the High/High/High categorization level for confidentiality, integrity, and availability in accordance with FIPS 199, and is mapped to the security controls from the NIST SP 800-53, Rev. 4 catalog. Previously, the FedRAMP authorization process was designed only for low and moderate impact systems. The number of controls for each of the FedRAMP-defined impact levels is presented below:

(Figure: control counts for the FedRAMP Low, Moderate, and High baselines.)

The release culminates several months of work by the FedRAMP PMO, numerous agencies, cloud service providers, and other key stakeholders, who established the draft baseline, collected industry and federal comments, and completed pilot programs.

FedRAMP High Baseline
The establishment of the FedRAMP High Security baseline is critical for federal agencies to migrate more high-impact level data to the cloud. The High baseline is the strongest FedRAMP level to date, covering sensitive, unclassified data. According to FedRAMP Director Matt Goodrich, most of the information to be covered under the High baseline will be law enforcement data and patient health records. This should cover the needs of several civilian agencies, the Department of Defense (DoD), and the Department of Veterans Affairs (VA).

FedRAMP High Baseline Authorized Cloud Service Providers
The three Infrastructure-as-a-Service (IaaS) providers who participated in the FedRAMP High baseline pilot program and achieved Authorization are:

  • Microsoft’s Azure GovCloud
  • Amazon Web Services GovCloud
  • CSRA / Autonomic Resources’ ARC-P

Federal agencies can review these vendors’ security packages through OMB MAX and begin to use these services immediately.

Coalfire was one of the earliest Third Party Assessment Organizations (3PAOs) in FedRAMP, providing FedRAMP assessment or advisory services to cloud service providers in pursuit of their FedRAMP P-ATO or Agency ATO. If you’d like to talk to one of our staff about the new FedRAMP High baseline or have questions about the FedRAMP process, please contact us.

Microsoft Azure Closes IaaS Adoption Gap with Amazon AWS

June 29, 2016

Percentage of Enterprise Computing Workloads in the Public Cloud Expected to Reach 41.05% This Year

By Cameron Coles, Director of Product Marketing, Skyhigh Networks

Industry analyst firm Gartner predicts that the infrastructure as a service (IaaS) market will grow 38.4% in 2016 to reach $22.4 billion by the end of the year. A new report from the Cloud Security Alliance (download a free copy here) finds that Microsoft is quickly catching up with industry leader Amazon in the race to tap this growing market. Amazon, Google, and Microsoft collectively own 82.0% of the IaaS market today. Even at companies that have a strict “no cloud” philosophy, IT leaders admit that nearly one fifth of their computing workloads will be in the public cloud this year versus their own data centers.

Amazon remains the dominant IaaS provider, but Microsoft is closing the gap in market share. IT professionals at 37.1% of companies indicated that Amazon AWS is the primary IaaS platform at their organization. Microsoft Azure is a close second at 28.4%, followed by Google Cloud Platform at 16.5%. Enterprises using the public cloud benefit in many ways, including greater agility, lower cost of ownership, and faster time to market. IaaS providers, meanwhile, are also benefiting. In April 2016, Amazon reported that AWS is its most profitable division and is growing 64% annually.

IaaS adoption trends
Enterprises are increasingly relying on public cloud infrastructure providers such as Amazon, Microsoft, and Google for their computing resources rather than managing their own data centers. A plurality of organizations (45.1%) have a “hybrid cloud” philosophy, another 25.1% prefer private cloud, and 21.5% take a predominantly public cloud approach. Just 8.2% of enterprises have a “no cloud” philosophy. Today, 31.2% of an enterprise’s computing resources come from infrastructure as a service (IaaS) providers, and IT professionals expect that number to grow rapidly to 41.0% of computing workloads in the next 12 months.

Not surprisingly, companies with a “public cloud” philosophy have more computing in the public cloud. At these companies, nearly one half (47.8%) of computing resides in the public cloud today and IT professionals at these organizations expect a majority of their computing (56.5%) will reside in the public cloud 12 months from now. Even companies with a “no cloud” philosophy estimate that 14.6% of their computing nevertheless resides in the public cloud, and they expect that number will grow to 18.8% in the next 12 months. There is a sizable amount of computing in public cloud IaaS even for organizations that are philosophically opposed to cloud.

There is a clear correlation between company size and IaaS adoption. Companies with fewer employees rely on public IaaS platforms for more of their computing today. Companies with 1-1,000 employees have the largest share of computing workloads in the public cloud (37.1%) versus companies with more than 10,000 employees (22.3%). However, in the next 12 months, companies with more than 10,000 employees are anticipating growing their use of IaaS to 32.9%, which would eclipse companies with 5,000-10,000 employees and would put them roughly on par with companies with just 1,000-5,000 employees. Public IaaS appears to be reaching an inflection point in the enterprise.

Barriers to IaaS projects
Despite the rapid growth of public cloud infrastructure, there are still barriers holding back IaaS adoption. The most common barrier reported by IT professionals is concern about the security of the IaaS platform itself (62.1% of respondents). The next most common roadblock is also security related: 40.5% of respondents indicated that concern about the ability to secure applications deployed on IaaS platforms is a barrier to adoption. The third most common barrier, reported by 37.9% of respondents, is the inability to store data within their country to comply with data privacy laws (e.g., the EU General Data Protection Regulation).

Despite concerns, overall confidence in cloud
Despite concerns about security, a solid majority (61.6%) of IT leaders believe that, generally speaking, custom applications they deploy on IaaS platforms are as secure, if not more secure, than applications they deploy in their own datacenter. That may be due in part to the significant investments cloud providers have made in their own security, and in achieving compliance certifications such as ISO 27001 and 27018 to demonstrate those investments. It could also be due to a growing sentiment that cloud companies such as Amazon, Microsoft, and Google can dedicate far more resources to IT security than the average company, where IT is not the core business.

Little Bits of Security – Micro-Segmentation in Clouds

June 27, 2016

By Darren Pulsipher, Enterprise Solution Architect, Intel Corp.

Cloud environments have made some things much easier for development teams and IT organizations. Self-service portals have cut down the amount of “hands on” intervention needed to spin up environments for new products. Provisioning new infrastructure has moved from weeks or days to minutes. One thing that has barely changed through this transformation is security. But new techniques and tools are starting to emerge that move security to the next level in the cloud. One of these technologies is called micro-segmentation.

Traditional datacenter security
To understand micro-segmentation, let’s first look at current datacenter security philosophy. Most security experts focus on creating a hardened outer shell around the datacenter. Nothing gets in or out without being logged, encrypted, and locked down. Firewall rules slow malicious hackers trying to get into the datacenter. And with ever more devices connected to the datacenter, security experts are looking at ways to secure, control, and authenticate all of those connections.

Inside the datacenter, security measures are put into place to make sure that applications do not introduce security holes. Audit logs and incident alerts are analyzed to detect intrusions, notifying security analysts so they can lock things down. Security policies and procedures are created to mitigate human error and protect vital data. All of this creates a veritable fortress, with multiple layers of protection against a myriad of attacks.

Micro-segmentation adds a hardened inner shell
Wouldn’t it be nice if you could create a hardened shell around each one of the applications or services within your datacenter, opening access to each application only through its own firewalls and segmented networks? That would make your security even more robust. If the outer datacenter walls were breached, hackers would encounter a set of additional security walls, one for each service or application in your IT infrastructure. The best way to envision this is to think of a bank with safety deposit boxes inside the safe. Even if you break into the safe, there is nothing to take—just a set of secure boxes that also need to be cracked.

One of the benefits of this approach is that when someone hacks into your datacenter, they get access to at most one application, and they need to breach each remaining application one by one. This extra layer of protection gives security experts a very powerful tool for slowing down hackers bent on wreaking havoc on your infrastructure. The downside is that it takes time and resources to set up the segmented networks, firewalls, and security policies.

SDI (software-defined infrastructure): increased risk or increased security?
Now I want you to imagine that you have given developers or line-of-business users the ability to create infrastructure through a self-service portal. Does that scare you? How are you going to enforce your security practices? How do you make sure that new applications are not exposing your whole datacenter through poorly architected solutions? Have you actually increased the attack surface of your datacenter? All of these questions keep security professionals up at night. So shouldn’t a good security officer be fighting against SDI and self-service clouds?

Not so fast. There are some great benefits to SDI. First off, you can programmatically provision infrastructure: storage, compute, and yes, network elements. That last one, software-defined networking, gives you flexibility around security that you might not have had in the past. You can create security policies enforced through software and templates, increasing security around both your applications and the datacenter’s outer shell.

Software-defined infrastructure enabling micro-segmentation
Now take the benefits of SDI and micro-segmentation together. Imagine that you put together templates and/or scripts that create a segmented network, set up firewall rules and routers, and manage SSH keys for each application that is launched. Now when a user creates a new application or set of applications, a micro-segmented “hardened shell” is created with it. Even if your application developers are not following good security practices, you are only exposed for that one application.
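
As a concrete illustration, here is a hedged sketch of such a per-application “hardened shell” using AWS security groups via boto3. AWS is only one example of an SDI API, and the application name, VPC ID, and CIDR range below are assumptions for illustration:

```python
# Minimal sketch: give each newly launched application its own
# security group that admits only the traffic it actually needs.
import boto3

ec2 = boto3.client("ec2")

def shell_for_app(app_name: str, vpc_id: str, allowed_cidr: str, port: int) -> str:
    """Create a micro-segment (security group) for one application."""
    sg = ec2.create_security_group(
        GroupName=f"{app_name}-shell",
        Description=f"Micro-segment for {app_name}",
        VpcId=vpc_id,
    )
    # Ingress is limited to one port from one CIDR; everything else
    # bounces off this inner wall even if the outer shell is breached.
    ec2.authorize_security_group_ingress(
        GroupId=sg["GroupId"],
        IpPermissions=[{
            "IpProtocol": "tcp", "FromPort": port, "ToPort": port,
            "IpRanges": [{"CidrIp": allowed_cidr}],
        }],
    )
    return sg["GroupId"]

# e.g. only the web tier (10.0.1.0/24) may reach this app on port 443:
# shell_for_app("billing-api", "vpc-0abc1234", "10.0.1.0/24", 443)
```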

The beginnings of micro-segmentation are available in some form on all of the major SDI platforms. The base functionality, and the most prevalent across those platforms, is the ability to provision a network, router, and firewall in your virtual infrastructure, through both templates and programmable APIs. So there is some work to be done by security teams, and enforcing the use of these templates is always a battle. The key is to make them easy to consume.

Don’t ignore the details
One thing that SDI does bring to your infrastructure is the propagation of bad policies and tools: if you make something easy to use, people will use it. Pay attention to the details. Set up the right policies and procedures and then leverage SDI to implement them. Don’t be like the banker who writes the combination to the safe on a piece of paper, tapes it to the top of his desk, and then photocopies it and shares it with everyone in the office.

SDI can make micro-segmentation a viable tool in the security professional’s toolkit. Just like any tool, make sure you have established the processes and procedures before you propagate them to a large user community. Otherwise you are just making yourself more exposed.

Verizon DBIR Says You Can’t Stop the Storm—But You Can See It Coming

June 22, 2016

By Susan Richardson, Manager/Content Strategy, Code42

The 2016 Verizon Data Breach Investigations Report (DBIR) paints a grim picture of the unavoidable enterprise data breach. But accepting the inevitability of breaches doesn’t mean accepting defeat. It’s like severe weather: you can’t prevent a tornado or hurricane. But with the right visibility tools, you can recognize patterns and mitigate your risk.

Likewise with data security, visibility is critical. “You cannot effectively protect your data if you do not know where it resides,” says Verizon.

Most enterprises plagued by poor data visibility
The report shows that most organizations lack the data visibility tools needed for effective breach remediation. Hackers gain access more easily than ever, with 93 percent of attacks taking just minutes to compromise the enterprise ecosystem. Yet without the ability to see what’s happening on endpoint devices, four in five victimized organizations don’t catch a breach for weeks—or longer.

Here’s a look at how data visibility solves many of the major threats highlighted in the 2016 DBIR:

Phishing: See when users take the bait
The report showed users are more likely than ever to fall for phishing: one in ten users clicks the link, and only three percent end up reporting the attack. Instead of waiting for the signs of an attack to emerge, IT needs the endpoint visibility to know what users are doing—what they’re clicking, what they’re installing, whether sensitive data is suspiciously flowing outside the enterprise network. The “human element” is impossible to fix, but visibility lets you “keep your eye on the ball,” as Verizon put it, catching phishing attacks before they penetrate the enterprise.

Malware and ransomware: Encryption + endpoint backup
With laptops the most common vector for the growing threats of malware and ransomware, Verizon stresses that “protecting the endpoint is critical.” The report urges making full-disk encryption (FDE) “part of the standard build” to gain assurance that your data is protected if a laptop falls into the wrong hands. Continuous endpoint backup is the natural complement to FDE. If a device is lost or stolen, IT immediately has visibility into what sensitive data lived on that device, and can quickly restore files and enable the user to resume productivity. Plus, in the case of ransomware, guaranteed backup ensures that you never truly lose your files—and you never pay the ransom.

Privilege abuse: “Monitor the heck” out of users
Authorized users employing their credentials for illegitimate purposes “are among the most difficult to detect.” There’s no suspicious phishing email. No failed login attempts. No signs of a hack. And for most organizations, there is no way of knowing a breach has occurred until the nefarious user and your sensitive data are long gone. Unless, of course, you have complete visibility into the endpoint activities of your users. Verizon urges enterprises to “monitor the heck out of authorized daily activity,” so you can see when a legitimate user is breaking from their usage pattern and exfiltrating sensitive data.
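
As a toy illustration of what that monitoring can look like, here is a minimal Python sketch that flags a user whose daily data movement deviates sharply from their own baseline. The z-score model, threshold, and field layout are simplistic assumptions; real endpoint-visibility products use far richer signals:

```python
# Minimal sketch: flag users whose latest daily egress volume is a
# statistical outlier against their own recent history.
import statistics

def flag_anomalies(daily_mb_by_user: dict, z_threshold: float = 3.0) -> list:
    alerts = []
    for user, history in daily_mb_by_user.items():
        *baseline, today = history          # all prior days vs. the latest day
        if len(baseline) < 7:
            continue                        # not enough history for a pattern
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline) or 1.0
        z = (today - mean) / stdev
        if z > z_threshold:
            alerts.append((user, today, round(z, 1)))
    return alerts

# MB moved per day; the final entry is today's total.
history = {"jdoe": [40, 55, 38, 61, 47, 52, 44, 49, 2100]}
print(flag_anomalies(history))  # -> [('jdoe', 2100, ...)], worth a look
```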

Forensics: Skip the hard part for big cost savings
The most costly part of most enterprise data breaches—accounting for half of the average total cost—involves figuring out what data was compromised, tracking down copies of files for examination, and other forensic tasks required for breach reporting and remediation. Most often, an organization must bring in legal and forensic consultants—at a steep price. If you have complete visibility of all enterprise data to begin with, including endpoint data, you can skip much of the hard work in the forensics phase. With continuous and guaranteed backup, all your files are securely stored and easily searchable. Modern endpoint backup solutions go a step further, offering robust forensic tools that make it easy and cost-effective to conduct breach remediation, forensics, and reporting tasks without eating up all of IT’s time or requiring an expensive ongoing consultant engagement.

See your data, understand your patterns, mitigate your risk
The whole point of the DBIR is to shed light on data to see the patterns and trends in enterprise data security incidents—to mitigate risk through greater visibility. So read the report. Understand the common threats. But make sure you apply this same methodology to your own organization. With the right data visibility tools in place, you can see your own patterns and trends, learn your own lessons, and fight back against the inevitable data breach.

Download The Guide to Modern Endpoint Backup and Data Visibility to learn more about selecting a modern endpoint backup solution in a dangerous world.

Why You Need a Multi-Layer Approach to Public Cloud Security

June 20, 2016

By Scott Montgomery, Vice President & Chief Technical Strategist, Intel Security Group

Would you hand your house keys to a total stranger and then go away on vacation for two weeks? Probably not, but that’s precisely what some businesses do when they move applications and data to the public cloud.

Security has long been the principal fear that weighs on cloud investments. While perceptions are improving, Intel Security’s recent State of Cloud Adoption study found that data breaches remain the biggest concern of companies deploying Software as a Service (SaaS), Infrastructure as a Service (IaaS), and even private cloud models. A 2015 survey by Crowd Research Partners found that nine in 10 security professionals worry about cloud security.

These concerns, however, are not stopping enterprises from investing in the cloud.

(Figure: Intel Security study findings on enterprise cloud investment.)

While the survey shows that confidence in cloud security is increasing, only one-third of respondents believe their senior executives understand the security risks.

Investments in cloud security should be commensurate with the level of migration to cloud services. But budgeting for security in the public cloud is distinctly different from planning for on-premises prevention. One fundamental shift is that cloud providers use a “shared responsibility model” that spreads risks between vendor and customer. Another difference: customers don’t buy the same mix of products and equipment to secure the cloud that they do in the data center.

Budgeting for security in the public cloud begins by considering which applications and infrastructure components will live there. Some, like website hosting and document serving, are relatively low risk and don’t demand the most stringent safeguards. Also consider the consumption models you’ll use. SaaS providers generally assume responsibility for security at the application and system levels. However, IaaS providers tend to cede those responsibilities to the customer. What’s more, no public cloud provider is likely to assume responsibility for user access and data protection, although there are measures they can take to support your own efforts.

There are three levels of security to consider as you build out your public cloud strategy:

System-level security for IaaS
This is secured plumbing: system-level components such as operating systems, networks, virtual machines, management utilities and containers. Here, you want to invest in cloud providers that make it easy for you to keep your systems current with the latest patches and updates. The service provider should also offer thorough visibility into your cloud instances so that you can see everything that is running. One of the challenges of the public cloud is that it’s so convenient to spin up new VMs and containers that you may forget to shut them down later. These so-called “zombies” are latent security threats because they present potential attack vectors into more business- or mission-critical systems.
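
As one example of regaining that visibility, here is a hedged boto3 sketch that hunts for likely zombies: running AWS instances past a certain age that carry no ownership tag. AWS is just one example platform, and the tag convention and age cutoff are illustrative assumptions:

```python
# Minimal sketch: list long-running, untagged instances for review.
import datetime
import boto3

def find_zombies(max_age_days: int = 30) -> list:
    ec2 = boto3.client("ec2")
    cutoff = (datetime.datetime.now(datetime.timezone.utc)
              - datetime.timedelta(days=max_age_days))
    zombies = []
    pages = ec2.get_paginator("describe_instances").paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}])
    for page in pages:
        for reservation in page["Reservations"]:
            for inst in reservation["Instances"]:
                tags = {t["Key"]: t["Value"] for t in inst.get("Tags", [])}
                if inst["LaunchTime"] < cutoff and "owner" not in tags:
                    zombies.append((inst["InstanceId"], inst["LaunchTime"].date()))
    return zombies

for instance_id, launched in find_zombies():
    print(f"Review or terminate: {instance_id} (running since {launched})")
```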

If you plan to use containers, as a growing number of enterprises are, be diligent about the level of security protection they offer. The market for containers is still immature, and security – while improving – is considered one of the technology’s weakest areas.

Remember, you are responsible for system-level security in your Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) instances. Integrating these security controls and their reporting with your on-premises systems will create efficiencies. Be sure to include the appropriate controls for the type of server employed. These may include tools such as intrusion prevention, application control, advanced anti-malware solutions and threat detection. They should all be centrally managed for visibility and compliance, in addition to sharing policy and threat intelligence with your on-premises infrastructure.

Application-level security
This level is primarily about identity and access management. Your best investment here isn’t financial; it’s a policy that limits the ability of users to deploy cloud applications without IT’s knowledge.

After ensuring policies are in place that offer IT visibility, the next step is to invest in multifactor authentication and identity management. The first approach uses two or more devices or applications to permit access. For example, a verification code can be sent to a phone or email address to ensure that a stolen password isn’t a critical failure point.
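
A minimal sketch of that verification-code flow appears below, with delivery by SMS or email elided. The storage model, code length, and expiry window are illustrative assumptions, not any particular product’s API:

```python
# Minimal sketch: issue a short-lived, single-use verification code as a
# second factor; store only its hash and verify within an expiry window.
import hashlib
import hmac
import secrets
import time

PENDING = {}  # user -> (code_hash, expires_at)

def issue_code(user: str, ttl_seconds: int = 300) -> str:
    code = f"{secrets.randbelow(10**6):06d}"  # e.g. "042917"
    PENDING[user] = (hashlib.sha256(code.encode()).hexdigest(),
                     time.time() + ttl_seconds)
    return code  # hand this to the SMS/email sender; never log it

def verify_code(user: str, attempt: str) -> bool:
    code_hash, expires_at = PENDING.pop(user, (None, 0))
    if code_hash is None or time.time() > expires_at:
        return False
    attempt_hash = hashlib.sha256(attempt.encode()).hexdigest()
    return hmac.compare_digest(code_hash, attempt_hash)

code = issue_code("alice")
assert verify_code("alice", code)       # correct code, inside the window
assert not verify_code("alice", code)   # single use: a second try fails
```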

Identity management locks down application access by requiring users to authenticate against a secure resource such as LDAP or Active Directory. If your organization already uses a directory, consider investing in cloud brokering software that supports single sign-on, so users can authenticate to all their cloud services through their local directory. This gives IT complete visibility and shifts access control from the cloud service to your own IT organization. Consider also investing in a secure VPN tunnel so sessions are never exposed to the public Internet.
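
For illustration, here is a hedged sketch of that directory-backed check using the open-source ldap3 package. The hostname and DN layout are placeholders for your own LDAP or Active Directory, not a real deployment:

```python
# Minimal sketch: authenticate a user by attempting an LDAP bind, so the
# application itself never stores or compares passwords.
from ldap3 import Server, Connection

def authenticate(username: str, password: str) -> bool:
    server = Server("ldaps://directory.example.com")  # TLS-protected LDAP
    user_dn = f"uid={username},ou=people,dc=example,dc=com"
    conn = Connection(server, user=user_dn, password=password)
    try:
        return conn.bind()  # True only if the directory accepts the credentials
    finally:
        conn.unbind()

if authenticate("jdoe", "correct horse battery staple"):
    print("grant session to cloud services via the single sign-on broker")
```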

Data-level security
This level of protection involves securing the data itself. No cloud provider will take responsibility for your data, but there are solutions you can purchase to help.

Many cloud providers, for example, offer encryption as a standard option, but you may be surprised at how many do not, or encrypt data only part of the time. Anything less than 256-bit encryption is considered inadequate these days.

More important is that you have full control of the encryption keys. If a cloud provider insists on owning them, you have no guarantees that your data will be safe. Seek another provider.

In addition, make sure your data is unencrypted only when in use. Some providers require that data be transmitted to their facilities in plain-text format. That’s a security risk.

As noted in the Cloud Security Primer, none of these levels should be secured in isolation. Cloud security, the primer states, is “an end-to-end challenge whereby the solutions must be built into the overall IT environment and not tacked on as an afterthought.”

Whatever cloud provider you adopt, make sure their security guarantees are spelled out in the contract and SLA. A good contract should spell out exactly what procedures will be employed, along with any penalties the provider will face for non-compliance, how they will report on it, and how you can audit to ensure your contractual terms are being met. A strong SLA ensures that you don’t simply toss the keys to your cloud provider as you’re walking out the door.

Confident Endpoint Visibility Responds to Modern Data Protection Problems

June 17, 2016

By Joe Payne, President and CEO, Code42

Consumer tech adoption has outpaced tech evolution in business for more than ten years. SaaS and cloud solutions, new apps and devices are at the disposal of empowered workers, making it very easy for employees to get what they need to work anywhere or—despite policies forbidding it—take career-making IP as they exit one company for the next. Legacy backup can neither unlock nor disarm these threats.

At the same time, data has become the new currency: cyber-crime syndicates have boomed with new variations on stealing or disabling data, particularly spear phishing and ransomware targeted at employees. As for breach, the headlines say it’s not a matter of if. It’s when. Legacy backup, long rejected by workers, simply cannot address these threats.

Finally, encrypted data moving through the network has made the intelligence it houses opaque—even to its stewards. A CISO recently shared with us that more than 75% of his network traffic is encrypted, making it nearly impossible to identify the threats facing his organization.

While it’s safe to say encryption is a must, it also means the focus of security must shift to the endpoints to mitigate risk and regain control.

Modern endpoint backup sees what you can’t
Modern endpoint backup gives IT and InfoSec the ability to see, monitor movement of and recover data housed on every employee device.

It neutralizes the threat of ransomware by making up-to-the-minute data recovery simple and fast. It decreases the cost of litigation by leveraging a complete dataset for legal holds, and it supports rapid response and remediation of breach via data attribution—with or without the device. From a productivity perspective, modern endpoint backup makes everyday challenges like data migration a lighter lift for IT and end users.

In response to modern data security problems, more than 39,000 businesses—including ten of the most recognized brands in the world, seven of the top 10 technology brands, and seven of the eight Ivy League schools—have adopted Code42 to regain visibility and mitigate risk.

In 2008, Code42 launched its enterprise endpoint backup software—knowing it was time for backup to catch up. Now approaching its sixth-generation platform, Code42 provides visibility into all of that data through a single console, along with the real-time recovery and security tools the enterprise needs to be more resilient, more accountable, and more defensible.

Modern endpoint backup imparts the right to “Be Certain” in the face of modern data protection and security problems. We invite you to find out how.

More Than One-Fourth of Malware Files “Shared”

June 15, 2016

By Krishna Narayanaswamy, Chief Scientist, Netskope

Last week, Netskope released its global Cloud Report as well as its Europe, Middle East and Africa version highlighting cloud activity from January through March of 2016. Each quarter we report on aggregated, anonymized findings such as top used apps, top activities, top policy violations, and other cloud security findings from across our customers using the Netskope Active Platform, including by industry.

This report picked up where we left off last quarter with our cloud malware research, in which we found that 4.1 percent of enterprises had at least one sanctioned cloud app laced with malware. This quarter that number has risen to 11.0 percent, nearly triple last quarter’s figure. This is before counting unsanctioned apps, which we are researching and will incorporate into future reports; when we do, we expect these numbers to increase dramatically. Beyond sharing the volume of detections, this quarter’s report breaks the malware down into the following observed categories, several of which are known to be used to distribute or propagate ransomware:

  1. JavaScript exploits and droppers
  2. MS Office macros
  3. Backdoors
  4. Mobile malware
  5. Spy- and Adware
  6. Mac malware

We also rated discovered malware in terms of its severity based on the extent to which it affects user privacy and computer security and causes damage to files, computers, or networks. 73.5 percent of detected malware this quarter ranks “high” in terms of severity, with 8.3 percent “medium,” and 18.2 percent “low.”

Perhaps the most shocking finding is that 26.2 percent of discovered malware files had been shared, either internally (with one or more people inside the organization), externally (with one or more people outside the organization), or publicly (with a publicly accessible link). Sync and share, two important capabilities that characterize the cloud, become liabilities when it comes to malware, because malware can use them to propagate rapidly between users and devices. That is why we dubbed this issue the cloud malware fan-out effect.

What do we recommend to combat the fan-out? Five things:

  1. Back up versions of your critical content in the cloud. Enable your app’s “trash” feature and set the default purge to a week or more. This is one of your best bets for preserving your data should you become infected with data destructing malware such as ransomware.
  2. Use your CASB to scan for and remediate cloud malware in your sanctioned apps. Make sure to check for files that have fanned out to other users through sync and share. Integrate your CASB with, and share detections across, your existing security infrastructure, such as your sandbox and endpoint detection and response (EDR) tools, so you can stop malware wherever it’s propagating in your environment.
  3. Detect malware incoming via sanctioned and unsanctioned apps.
  4. Detect anomalies in your sanctioned and unsanctioned cloud apps, such as unusual file upload activity or other out-of-the-norm behaviors.
  5. Monitor uploads to sanctioned and unsanctioned cloud apps for sensitive data, which can indicate exfiltration in which malware is communicating with a cloud-based command-and-control server. (A minimal sketch of such an upload check follows.)
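
To ground that last recommendation, here is a minimal sketch of a sensitive-data check that could run on outbound upload payloads. The patterns shown (U.S. Social Security numbers and Luhn-valid card numbers) are standard, but where the hook sits in your environment (forward proxy, CASB API, ICAP) is left as an assumption:

```python
# Minimal sketch: flag sensitive-data patterns in an upload payload.
import re

SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_ok(number: str) -> bool:
    """Standard Luhn checksum, to cut false positives on card-like digits."""
    digits = [int(d) for d in re.sub(r"\D", "", number)][::-1]
    total = sum(digits[0::2]) + sum(sum(divmod(2 * d, 10)) for d in digits[1::2])
    return total % 10 == 0

def findings(payload: str) -> list:
    hits = [f"SSN: {m}" for m in SSN.findall(payload)]
    hits += [f"card: {m}" for m in CARD.findall(payload) if luhn_ok(m)]
    return hits

print(findings("invoice for 4111 1111 1111 1111, employee 123-45-6789"))
# -> ['SSN: 123-45-6789', 'card: 4111 1111 1111 1111']
```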