Six archetypes of insider exfiltration

July 2, 2015

By Susan Richardson, Manager/Content Strategy, Code42

With all the talk about insider threats and the potentially dangerous brew of nomadic employees and data-to-go, there’s no time like the present to identify behaviors that come before a data leak.

Here are the top six:

  1. The Ship Jumper: Frequent absences, unexplained disappearances or unexpected medical appointments point to an employee who’s unhappy, distracted or looking to jump ship. Workers who have accepted a new job are the most likely to give data to a competitor. In what must be the most common insider threat scenario, a sales representative leaves the company for a competitor, taking sales opportunities with him. Concern over defectors leaving with data is prevalent in organizations of all industries and sizes, especially in competitive markets. Stealing customer data and leads is not only difficult to detect, because it occurs in unsanctioned applications, but also incredibly detrimental to the business.
  2. The Unhappy Camper: An employee who has been reviewed poorly or put on a performance improvement plan may seek revenge. When a bad performance review has been delivered, HR and IT should communicate so both can heighten monitoring. In one case involving a disgruntled IT employee, the hosting service Code Spaces was forced out of business when an attacker gained access to its Amazon Web Services (AWS) control panel and deleted customer data and backups.
  3. The Spendthrift: When an employee talks excessively about money, gets calls from collection agencies or takes a second job, it may be a clue that he or she is experiencing financial problems. Be wary: these folks may steal data or sabotage company systems for personal gain.
  4. The Angler: When employees engage in “atypical” computer behaviors like taking their computer home for the first time, attempting to exfiltrate CRM data, changing their computer configurations, repeatedly trying to access privileged folders on the intranet or shared drive, or suddenly using external drives to back up data, it may be a tell that company data is being exfiltrated.
  5. The Uploader: If employees are using personal clouds, it’s highly likely they’re uploading files to take home (or elsewhere). Also, if the free space on an employee’s computer increases, he or she may be deleting files to cover their tracks. (A toy sketch of scoring these signals appears after this list.)
  6. The Ex: When office romance goes bad, some scorned lovers may seek to access personnel files or other personal information to “stalk” ex-lovers. Watch for increased failed password attempts. Other acts of revenge may be far more serious, like this one reported in the Harvard Business Review:

A manager complained to his superior about the person in question—a systems administrator who had been sending him flowers at work and inappropriate text messages and had continually driven past his home. Once clearly rejected, the attacker corrupted the company’s database of training videos and rendered the backups inaccessible. The company fired him. But knowing that it lacked proof of his culpability, he blackmailed it for several thousand euros by threatening to publicize its lack of security, which might have damaged an upcoming IPO. This costly incident—like most other insider crimes—went unreported.
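The “atypical” behaviors described under The Angler and The Uploader lend themselves to simple automated flagging. Below is a minimal, hypothetical scoring sketch; every field name and threshold is an invented assumption for illustration, not any vendor’s real telemetry schema.

```python
# Hypothetical sketch: score daily endpoint telemetry for exfiltration signals.
# Field names and thresholds are illustrative assumptions, not a product schema.
from dataclasses import dataclass

@dataclass
class DailyTelemetry:
    user: str
    gb_uploaded_to_personal_cloud: float   # The Uploader: personal cloud uploads
    new_external_drives: int               # The Angler: sudden external drives
    privileged_access_denials: int         # The Angler: repeated privileged-folder attempts
    free_space_delta_gb: float             # The Uploader: rising free space may mean deleted files

def exfiltration_score(t: DailyTelemetry) -> int:
    """Return a crude 0-4 risk score; anything >= 2 might merit human review."""
    score = 0
    if t.gb_uploaded_to_personal_cloud > 1.0:
        score += 1
    if t.new_external_drives > 0:
        score += 1
    if t.privileged_access_denials >= 5:
        score += 1
    if t.free_space_delta_gb > 10.0:
        score += 1
    return score

if __name__ == "__main__":
    sample = DailyTelemetry("jdoe", 3.2, 1, 0, 12.5)
    print(sample.user, exfiltration_score(sample))  # jdoe 3
```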

It’s common sense to remove terminated employees from systems, yet a 2014 infosec survey showed that 13 percent of respondents still had access to previous employers’ systems using their own credentials. It is critical to void passwords, privileges and user accounts immediately—and to document and adhere to “stand down” procedures to protect the enterprise.
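A “stand down” procedure is easiest to document and adhere to when it is scripted. The sketch below shows the general shape such automation might take; the directory, app_registry, and audit_log objects are hypothetical stand-ins for whatever identity store, application inventory, and logging system an organization actually runs, so every call on them is an assumption.

```python
# Minimal offboarding sketch. The directory, app_registry and audit_log objects
# are hypothetical stand-ins for a real identity store (e.g., LDAP/SSO), an
# application inventory, and an audit trail; the method names are illustrative.
from datetime import datetime, timezone

def stand_down(user_id: str, directory, app_registry, audit_log) -> None:
    """Revoke access immediately on termination and leave an audit trail."""
    directory.disable_account(user_id)          # block interactive logins
    directory.revoke_sessions(user_id)          # kill existing tokens/VPN sessions
    directory.reset_password(user_id)           # void the credential itself
    for app in app_registry.apps_for(user_id):  # SaaS and internal apps alike
        app.remove_user(user_id)
    audit_log.record(
        event="stand_down",
        user=user_id,
        at=datetime.now(timezone.utc).isoformat(),
    )
```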

Cloud Security Open API: The Future of Cloud Security

June 29, 2015

Today, CipherCloud announced that the Cloud Security Alliance (CSA) is launching a Cloud Security Open API Working Group, co-led by CipherCloud. The charter of the working group is to provide guidance for enterprises and cloud service providers on the operation and interoperability of cloud security functions, with a specific goal to protect PII and sensitive data across multiple clouds. Current members of the working group include Deloitte, Intel Security, CipherCloud, SAP, Symantec, Infosys, and a few others.

Unlike many API efforts where the APIs typically allow access to a particular solution provider’s core code base, this effort aims to span multiple cloud services and bridge the gap between proprietary cloud environments.

Why Focus on Cloud Security Open APIs?
As cloud deployments become more extensive in the enterprise, the ecosystem surrounding them is becoming more and more complex. The number of places that touch personal data, company intellectual property, and other confidential information is quickly ballooning out of control. The conceptual diagram below illustrates the cloud ecosystem of an enterprise. As it shows, personal data from the enterprise could go into CSP1, CSP2, and CSP3. In addition, partner app 1 and partner app 2 may process personal data, as may the ISVs that help with integration and customization efforts.

[Figure: conceptual diagram of an enterprise’s cloud ecosystem]

For an enterprise to retain complete control over its security- and compliance-sensitive data in such an environment requires a monumental effort. Not only must the enterprise have complete visibility into the entire ecosystem, including partner applications outside of the clouds with which it works directly; it must also exercise gate-keeping functions at each integration point, which quickly becomes unscalable.

The Cloud Security Open APIs provide a layer of abstraction through which cloud users and third-party technology providers can access and integrate with the core functions of cloud services. This common layer of abstraction across clouds lets end-user organizations build standard integrations with ease, eliminating the need for costly one-off custom development efforts. Ultimately, this will accelerate the pace of cloud adoption and innovation.
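Since the working group’s deliverables were still forthcoming when this was written, the interface below is purely speculative: a sketch of what a common security abstraction across clouds could look like, with method names invented for illustration rather than drawn from any published specification.

```python
# Speculative sketch of a cross-cloud security abstraction layer. The method
# names are invented for illustration; the working group's actual API may differ.
from abc import ABC, abstractmethod

class CloudSecurityAPI(ABC):
    """One interface, many clouds: integrate once instead of per provider."""

    @abstractmethod
    def find_sensitive_data(self, profile: str) -> list:
        """Locate data matching a DLP profile such as 'PII' or 'PCI'."""

    @abstractmethod
    def apply_policy(self, resource_id: str, policy: dict) -> None:
        """Attach an encryption or access policy to a resource."""

    @abstractmethod
    def audit_events(self, since_iso: str) -> list:
        """Pull security-relevant activity for compliance reporting."""

def sweep_for_pii(clouds: list) -> list:
    """The payoff of a common layer: one loop covers every provider."""
    findings = []
    for cloud in clouds:  # CSP1, CSP2, CSP3 behind the same interface
        findings.extend(cloud.find_sensitive_data("PII"))
    return findings
```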

An analogy to the Cloud Security Open APIs is the Automated Clearing House (ACH) network in the banking industry. ACH is a widely adopted industry standard across different financial institutions and clearing houses. A bank can switch from one clearing house to another without changing the way it does funds transfers and payment processing. This is possible because the clearing houses and the banking institutions all adhere to the ACH standards. In a way, the Cloud Security Open APIs are to cloud security operations what ACH is to banking.

Benefits of Cloud Security Open APIs
Expedite cloud deployments: A well-known and standard API layer will give enterprise developers the ability to leverage core cloud functions quickly, thus expediting the pace of cloud deployments.

Foster cross-cloud innovations: With the Cloud Security Open APIs, developers now have a way to write cross-cloud functions without having to custom-integrate with each cloud they touch. This may open up breakthrough innovations and new economic avenues, new ways of doing business for cloud users and providers alike.

Extend cloud services reach to new functionality: From the perspective of a cloud service provider (CSP), the Cloud Security Open APIs will allow a much larger set of developers (than those within the CSP’s own company) to leverage the CSP’s core code base/data and deliver adjacent functionality.  Sometimes this model can lead to entirely new and unexpected user experiences and technology advances, which can make the service much more appealing to end users.

What Will the Working Group Produce and What Does It Mean for You?
Today the business drivers for the Cloud Security Open APIs are about eliminating business and technology frictions when organizations move to embrace cloud applications. With this in mind, the working group will execute this roadmap going forward:

  1. Define a set of concrete security use cases covered by the Open APIs
  2. Produce the Cloud Security Open API framework
  3. Generate a reference architecture that implements the API framework
  4. Produce industry guidance and white papers

If you are a cloud service provider, participating in the Open API program will allow you to go beyond just a service and become a platform for innovation. If you are a technology provider to the cloud environment, being part of the Open API will make your offering more agile and more appealing to a broad set of partners and users. If you are an end-user organization, the Cloud Security Open APIs really aim to make your life easier and should represent what you want to see in the security ecosystem. Your input therefore is extremely important.

The CSA Working Group and ways to participate can be found here. Get involved and get your voice heard!

The Future of Cybersecurity

June 23, 2015

In 2013, President Obama issued an Executive Order to protect critical infrastructure by establishing baseline security standards. One year later, the government announced the Cybersecurity Framework, a voluntary how-to guide for strengthening cybersecurity; meanwhile, the Senate Intelligence Committee voted to approve the Cybersecurity Information Sharing Act (CISA), moving it one step closer to a floor debate.

Most recently, President Obama unveiled his new Cybersecurity Legislative Proposal, which aims to promote better cybersecurity information-sharing between the United States government and the private sector. As further support, the White House hosted a Summit on Cybersecurity and Consumer Protection at Stanford University in Palo Alto on February 13, 2015, which convened key stakeholders from government, industry and academia to advance the discussion on how to protect consumers and companies from mounting network threats.

No doubt we have come a long way, but looking at the front-page headlines today reminds us that we’ve still got a long way to go. If the future is going to be different and more secure than today, we have to do some things differently.

I recently participated on a panel titled “The Future of Cybersecurity” at the MetricStream GRC Summit 2015, where I was joined on stage by some of today’s leading thinkers and experts on cybersecurity: Dr. Peter Fonash, Chief Technology Officer, Office of Cybersecurity and Communications, Department of Homeland Security; Alma R. Cole, Vice President of Cyber Security, Robbins Gioia; Charles Tango, SVP and CISO, Sterling National Bank; Randy Sloan, Managing Director, Citigroup; and moderator John Pescatore, Director of Emerging Security Trends, SANS Institute.

The purpose of this panel was to convene a diverse group of experts who believe in a common and shared goal – to help our customers, companies, governments and societies become more secure. This panel followed on the heels of a keynote address by Anne Neuberger, Chief Risk Officer of the NSA, who spoke about a simple challenge that we can all relate to: operations. Speaking from her experience at the NSA, Neuberger explained that a lot of security problems can be traced back to operations: ‘we know what to do, but we just weren’t doing it well’ or ‘we had the right data, but the data wasn’t in the right place.’

Moderator John Pescatore from SANS Institute did an exceptional job asking the questions that needed to be asked, and guiding a very enlightening discussion for the audience. For one hour on stage, we played our small part in advancing the discussion on cybersecurity, exploring the latest threats and challenges at hand, and sharing some of the strategies and solutions that can help us all become more secure.

Here are the five key takeaways that resonated most.


Topic 1: Threat information sharing tends to be a one-way street. There is an obvious desire from the government to get information from private industry, but a lot more needs to be done to make this a two-way street.

According to Dr. Peter Fonash, Chief Technology Officer at the Office of Cybersecurity and Communications at the Department of Homeland Security, the DHS is looking to play a more active role in threat information sharing. To that end, the DHS is actively collecting a significant amount of information, and even paying security companies for information, including reputation information on IP addresses. However, the government faces challenges in sharing that threat information: first, getting the information as “unclassified as possible,” and second, the many lawyers involved in making sure that everything shared is shared legally. Dr. Fonash stressed that the government faces another challenge: private industry thinking that government is in some way an adversary or industry competitor when it comes to threat information, which is simply not the case.

Topic 2: There are lots of new tools, the rise of automation, big data mining – but the real challenge is around talent.

Simply stated, our organizations need more skilled cybersecurity professionals than the current supply offers. For cybersecurity professionals, it is a great time to be working in this field (job security for life), but it is a bad time if you are charged with hiring for these roles. Automation and big data mining tools can definitely help when they are optimized for your organization, with the right context and analysts who can review the results of those tools. According to Alma R. Cole, Vice President of Cyber Security at Robbins Gioia, in the absence of the skill sets you aren’t able to find, look internally. Your enterprise architecture, business analysis, or process improvement leaders can directly contribute to cybersecurity outcomes without themselves having a PhD in cybersecurity. While cybersecurity experts are needed, we can’t rely on the experts alone. Cole makes the case that, as part of the solution, organizations are building security operations centers outside of larger city centers like New York and DC, where salaries aren’t as high and there isn’t as much competition for these roles. Some organizations are also experimenting with virtual security operations centers, which give employees flexibility, the ability to work from anywhere, and improved quality of life, while providing the organization with the talent it needs.

Topic 3: We are living and doing business in a global economy – we sell and buy across the world and we compete and cooperate with enemies and business partners around the world. We are trying to make our supply chains more secure but we keep making more risky connections.

According to Charles Tango, SVP and CISO at Sterling National Bank, this might be a problem that gets worse before it gets better. We’ve seen a dramatic increase in outsourcing, and many organizations have come to realize that the weakest link in the chain is oftentimes a third party. At this moment, as an industry, banks are largely reactive, layering processes, people and tools to identify and manage different risks across the supply chain. The industry needs a new approach, wherein banks can start to tackle the problem together. According to Tango, we won’t be able to solve the challenge of managing our third and fourth parties on an individual, bank-by-bank basis; we have to tackle it collaboratively as an industry.

Topic 4: No doubt, the future of applications is changing dramatically, and evolving everyday – just look at the space of mobile computing.

According to Randy Sloan, Managing Director at Citigroup, from a dev-ops automation perspective, if you are introducing well-understood components and automation such as pluggable security, you are way out in front and will be able to tighten things up to increase security. More challenging from an app-dev perspective is the rapidness: the rapid development and agile lifecycles that you have to keep up with. The goal is always to deliver software faster and cheaper, but that does not always mean better. Sloan advocates for balance: investing the right time in IS architecture, putting the right security testing processes in place, and, rather than focusing on speed alone, slowing things down and doing things more thoughtfully.

Topic 5: We’ve got dashboards, and threat data, and more sharing than ever before. But what we need now are more meaningful approaches to analytics that aren’t in the rear view mirror.

I believe that over the next few years, organizations will become more analytics-driven, leveraging artificial intelligence, automation, machine learning and heuristic-based mechanisms. The challenge is figuring out how to sustain that. This is the value of an ERM framework, where you can bring together different technologies and tools to get information that can be distilled and reported out. This is about managing and mitigating risk in real time, and about intercepting threats and preventing them from happening rather than doing analysis after the fact.

We live in an increasingly hyper-connected, socially collaborative, mobile, global, cloudy world. These are exciting times, full of new opportunities and technologies that continue to push the boundaries and limits of our wildest imaginations. Our personal and professional lives are marked by very different technology interaction paradigms than just five years ago. Organizations and everyone within them need to focus on pursuing the opportunities that such disruption and change brings about, while also addressing the risk and security issues at hand. We must remember that the discussions, strategies, and actions of today are helping to define and shape the future of cybersecurity.

By Vidya Phalke, CTO, MetricStream

11 Advantages of Cloud Computing and How Your Business Can Benefit From Them

June 22, 2015

HOW COMPANIES USING THE CLOUD GROW 19.3% FASTER THAN THEIR COMPETITORS

While their motivations vary, businesses of all sizes, industries, and geographies are turning to cloud services. According to Goldman Sachs, spending on cloud computing infrastructure and platforms will grow at a 30 percent compound annual growth rate (CAGR) from 2013 through 2018, compared with 5 percent growth for overall enterprise IT. Cloud adoption is accelerating faster than previously anticipated, leading Forrester to recently revise its 2011 forecast of the public cloud market size upward by 20 percent. Whether you’re looking at Software-as-a-Service (SaaS), Infrastructure-as-a-Service (IaaS), or Platform-as-a-Service (PaaS), the predictions are the same: fast growth of the workloads placed in the cloud and an increased percentage of the total IT budget going toward cloud computing.
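To make those growth rates concrete, here is the compounding arithmetic behind a 30 percent CAGR over the 2013-2018 window; the base value is indexed to 100 because the source’s dollar figure isn’t quoted here.

```python
# Compound annual growth: value_n = value_0 * (1 + rate) ** years.
# The 2013 base value is indexed to 100 for illustration, not an actual figure.
base_2013 = 100.0

for rate, label in [(0.30, "cloud infrastructure/platforms"), (0.05, "overall enterprise IT")]:
    value_2018 = base_2013 * (1 + rate) ** 5  # 2013 through 2018 = 5 compounding years
    print(f"{label}: 100 -> {value_2018:.0f}")

# cloud infrastructure/platforms: 100 -> 371  (nearly 4x in five years)
# overall enterprise IT: 100 -> 128
```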

[Figure: Forrester public cloud market sizing]

According to a study by the Cloud Security Alliance, 33% of organizations have a “full steam ahead” attitude toward cloud services and 86% of companies spend at least part of their IT budget on cloud services. IT leaders at 79% of companies receive regular requests from end users each month to buy more cloud applications with file sharing and collaboration, communication, social media, and content sharing topping the list of the most-requested cloud services.

Numerous factors are driving cloud adoption, according to a study conducted by the market research company Vanson Bourne. “The Business Impact of the Cloud” report compiles insights from interviews with 460 senior decision-makers within the finance functions of various enterprises. The report summarizes 11 drivers of cloud adoption, along with quantifiable improvements these companies have achieved by deploying cloud services to improve productivity, lower costs, and improve time to market.

Though they aren’t in IT positions, the majority of these financial executives are actively involved in their organizations’ discussions about cloud strategy. Their perspective on cloud computing includes benefits to the business as a whole. Companies that adopted cloud services experienced a 20.66% average improvement in time to market, an 18.80% average increase in process efficiency, and a 15.07% reduction in IT spending. Together, these benefits led to a 19.63% increase in company growth.

[Figure: measurable business impact of cloud adoption]

The Vanson Bourne report identified eleven advantages of cloud computing that organizations are experiencing today, leading to quantifiable improvements in their businesses:

1. Fresh Software

With SaaS, the latest versions of the applications needed to run the business are made available to all customers as soon as they’re released. Immediate upgrades put new features and functionality into workers’ hands to make them more productive. What’s more, software enhancements are typically released quite frequently. This is in contrast to homegrown or purchased software that might have major new releases only once a year or so and take significant time to roll out.

2. Do more with less

With cloud computing, companies can reduce the size of their own data centers — or eliminate their data center footprint altogether. Reducing the number of servers, the software cost, and the number of staff can significantly reduce IT costs without impacting an organization’s IT capabilities.

3. Flexible costs

The costs of cloud computing are much more flexible than traditional methods. Companies only need to commission – and thus only pay for – server and infrastructure capacity as and when it is needed. More capacity can be provisioned for peak times and then de-provisioned when no longer needed. Traditional computing requires buying capacity sufficient for peak times and allowing it to sit idle the rest of the time.
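A back-of-the-envelope comparison illustrates the difference; every price and load figure below is invented for illustration.

```python
# Back-of-the-envelope: peak-provisioned capacity vs. pay-per-use elasticity.
# All numbers are invented for illustration.
HOURS_PER_MONTH = 730
price_per_server_hour = 0.10  # hypothetical hourly rate

peak_servers = 50        # capacity sized for the busiest hour
avg_servers_needed = 12  # average actual load across the month

traditional = peak_servers * HOURS_PER_MONTH * price_per_server_hour
elastic = avg_servers_needed * HOURS_PER_MONTH * price_per_server_hour

print(f"peak-provisioned: ${traditional:,.0f}/month")  # $3,650/month
print(f"pay-per-use:      ${elastic:,.0f}/month")      # $876/month
```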

4. Always-on availability

Most cloud providers are extremely reliable in providing their services, with many maintaining 99.99% uptime. The connection is always on, and as long as workers have an Internet connection, they can get to the applications they need from practically anywhere. Some applications even work offline.
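It helps to translate an uptime percentage into a concrete downtime budget, as the quick calculation below does.

```python
# What an uptime SLA allows: downtime budget per year and per month.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

for uptime in (0.999, 0.9999):
    downtime_min = MINUTES_PER_YEAR * (1 - uptime)
    print(f"{uptime:.2%} uptime -> {downtime_min:.0f} min/year "
          f"({downtime_min / 12:.1f} min/month)")

# 99.90% uptime -> 526 min/year (43.8 min/month)
# 99.99% uptime -> 53 min/year (4.4 min/month)
```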

5. Improved mobility

Data and applications are available to employees no matter where they are in the world. Workers can take their work anywhere via smart phones and tablets—roaming through a retail store to check customers out, visiting customers in their homes or offices, working in the field or at a plant, etc.

6. Improved collaboration

Cloud applications improve collaboration by allowing dispersed groups of people to meet virtually and easily share information in real time and via shared storage. This capability can reduce time-to-market and improve product development and customer service.

7. Cloud computing is more cost effective

Because companies don’t have to purchase equipment and build out and operate a data center, they don’t have to spend significant money on hardware, facilities, utilities and other aspects of operations. With traditional computing, a company can spend millions before it gets any value from its investment in the data center.

8. Expenses can be quickly reduced

During times of recession or business cut-backs (like the energy industry is currently experiencing), cloud computing offers a flexible cost structure, thereby limiting exposure.

9. Flexible capacity

The cloud is a flexible facility that can be turned up, down or off depending upon circumstances. For example, a sales promotion might be wildly popular, and capacity can be added quickly to avoid crashing servers and losing sales. When the sale is over, capacity can shrink to reduce costs.

10. Facilitate M&A activity

Cloud computing accommodates faster changes so that two companies can become one much faster and more efficiently. Traditional computing might require years of migrating applications and decommissioning data centers before two companies are running on the same IT stack.

11. Less environmental impact

With fewer data centers worldwide and more efficient operations, we are collectively having less of an impact on the environment. Companies that use shared resources improve their ‘green’ credentials.

Despite these benefits, the Cloud Security Alliance has identified several barriers holding back cloud adoption. At 73% of companies, the security of data is the top concern holding back cloud projects. That’s followed by concern about regulatory compliance (38%), loss of control over IT services (38%), and knowledge and experience of both IT and business managers (34%). As organizations address their security and compliance concerns by extending corporate policies to data in the cloud and invest in closing the cloud skills gap, they can more fully take advantage of the benefits of cloud services.

Written by:


CAMERON COLES: Sr. Product Marketing Manager at Skyhigh. Interested in data that reveals the promise and peril of the cloud economy.

6 Security Tips From the Gartner Security & Risk Management Summit

June 18, 2015

Posted by Christopher Hines

As I was sitting in the keynote session here at Gartner’s Security & Risk Management Summit, listening to the analysts speak about what’s necessary for greater enterprise security, it became clear that one word was ever-present in each of the Gartner analysts’ speeches. That word was resiliency. The analysts made the point that the ability to absorb hits and accept risk while focusing on the overall success of the company is a must for all organizations in today’s breach-prone world.

They spoke about the need to move from prevention to detection and response. They called for security professionals to stop thinking as pure defenders and start thinking like business facilitators. Most importantly, they argued that IT security leaders must “seize the opportunity” given to them by the massive headline breaches we see each week, as perverse as that may sound.

One analyst used the Netherlands’ water control system as a prime example of resiliency, citing how the system opens and closes based on the level of the water, allowing ships to pass through once they are safely able to do so while also maintaining safe water levels and controlling the currents as the water nears the shores of the Netherlands. In short, the technology is resilient. Enterprises must be able to do the same.

He mentioned how access control and authentication, if not used properly, can cause extra steps for employees, slowing down core business functions in the process, while posing only a minor setback for cyber criminals attempting to steal corporate data. Companies must be able to roll with the punches, and accept risk as a part of the security landscape.

The analyst then went on to speak about each of the 6 core principles of resiliency that all enterprise securers should abide by in order to gain the trust of the C-suite. Here’s the breakdown of each principle he mentioned during his presentation:

  1. Don’t just check boxes; think in risk-based terms
  2. Move from a technology focus to an outcome-driven focus
  3. Shift from being the defender of data to being the facilitator of core business functions
  4. Don’t just control information; understand its flow in order to secure it more effectively
  5. Shift from a pure technology focus to more of a people-and-purpose focus, and work to gain trust
  6. Move from prevention to detection and response so that you can react faster and limit damage

It was refreshing to hear the analyst speak about the need for resiliency and break down the 6 core principles and what they mean to enterprise security teams. These principles can now act as a guide for all enterprises still questioning the need for a new approach to security, something some securers haven’t yet embraced because they are stuck in the traditional security mindset.

IT securers now have unprecedented power within their organizations. The massive breaches we have all grown too familiar with continue to pile up, forcing security to become a boardroom-level discussion. The C-suite is now turning to the IT security team for the answer to the question of how to protect data while also enabling the business to function and grow. You, as the securer, must be prepared for the task. You must be able to speak in terms that the C-suite is familiar with and is willing to listen to with an open mind.

Keep these 6 principles in mind, and seize the opportunity.

Chris Hines

Product Marketing Manager | Bitglass

Cloud Security Alliance and Palo Alto Networks Release Security Considerations for Private vs. Public Clouds

June 17, 2015


By Larry Hughes, Research Analyst, Cloud Security Alliance

Cloud computing has the potential to enhance collaboration, agility, scale and availability, and provides opportunities for cost reduction through optimized and efficient computing. The cloud trend presents a momentous opportunity to revisit not only how we think about computing, but also how we think about information security.

The Cloud Security Alliance (CSA) recently teamed up with Palo Alto Networks to produce a new whitepaper titled “Security Considerations for Private vs. Public Clouds.” For purposes of definition, a public cloud deployment occurs when a cloud’s entire infrastructure is owned, operated and physically housed by an independent Cloud Service Provider. A private cloud deployment consists of a cloud’s entire infrastructure (e.g., servers, storage, network) owned, operated and physically housed by the tenant business itself, generally managed by its own IT infrastructure organization.

While the title of the paper implies a primary focus on security, we took the opportunity to expand the conversation and incorporate a wider set of considerations including:

  • Business and legal topics, including contracts, service level agreements, roles and responsibilities, and compliance and auditing. We touch on the importance of establishing principal business and legal feasibility early on in the process, before investing too much in technical requirements.
  • Physical and virtual attack surface considerations including a look at vulnerabilities that are accessible to would-be attackers.
  • Operational issues, including data migration; change management; logging, monitoring and measuring; and incident management and recovery, and the roles they play in determining which cloud deployment makes the most sense for an organization.

Cloud security is one of the most critical considerations, regardless of whether the deployment is public or private. But security is not black and white, and no two companies looking to deploy a cloud infrastructure do so for exactly the same reasons. Wise organizations will take the long view and invest in security accordingly. As Thomas Edison once said, “Opportunity is missed by most people because it is dressed in overalls and looks like work.”

On Tuesday, June 23, Matt Keil, Palo Alto Networks Director of Product Marketing for Data Center, and I will be hosting a webinar to discuss the white paper in depth and look at security considerations for public and private clouds. For more information and to register for the webinar, click here.

For more information on Palo Alto Networks, please visit www.paloaltonetworks.com.

 

Google leads the way out of the castle to the cloud

June 11, 2015

By Mike Recker, Manager – Corporate Systems Engineers, Code42

Traditional IT infrastructure is built to centralize data and prevent intrusion. Like a bank vault or a defended castle in medieval times, valuables are kept in one repository and fortified to keep intruders out. In this scenario, the queen can behold all that she owns and keep her enemies at bay.

The centralized storage model worked well until the early 2000s. But the world has changed. A mobilized workforce no longer toils inside the castle walls, and workers demand streamlined workflows from everywhere, which makes tunneling into the castle (via VPN connection) to use the tools of their trade not just inefficient but irrelevant.

Accepting the brave new world of security
As people, applications, e-mail servers, databases and virtual computing move outside the corporate firewall, and companies accept the necessity of shifting their security practices, big questions arise. How should applications be delivered? Where should data be protected? Google has an idea:

Virtually every company today uses firewalls to enforce perimeter security. However, this security model is problematic because, when that perimeter is breached, an attacker has relatively easy access to a company’s privileged intranet. As companies adopt mobile and cloud technologies, the perimeter is becoming increasingly difficult to enforce. Google is taking a different approach to network security. We are removing the requirement for a privileged intranet and moving our corporate applications to the Internet.

Google gets it. “The perimeter is no longer just the physical location of the enterprise, and what lies inside the perimeter is no longer a blessed and safe place to host personal computing devices and enterprise applications.” In fact, Google holds that the internal network (those drafty stone rooms inside the castle) is as dangerous as the Internet. And Google should know.

Let down the drawbridge; beef up the secret handshake
Google’s BeyondCorp initiative depends on device and user credentials—regardless of a user’s network location—to authenticate and authorize access to enterprise resources.

As a result, all Google employees can work successfully from any network, and without the need for a traditional VPN connection into the privileged network. The user experience between local and remote access to enterprise resources is effectively identical, apart from potential differences in latency.
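The essence of this model can be reduced to an access decision that consults user identity and device trust and never consults network location. The sketch below is an illustrative reduction of that idea, not Google’s actual implementation.

```python
# Illustrative reduction of the BeyondCorp idea: the access decision depends on
# who the user is and what device they're on, never on network location.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_authenticated: bool   # e.g., SSO plus a second factor
    device_certified: bool     # device in inventory with a valid certificate
    device_patched: bool       # device meets the health policy
    resource_sensitivity: str  # "low" or "high"

def authorize(req: AccessRequest) -> bool:
    """Note what is absent: no check of source IP or 'internal network' flag."""
    if not (req.user_authenticated and req.device_certified):
        return False
    if req.resource_sensitivity == "high" and not req.device_patched:
        return False
    return True

print(authorize(AccessRequest(True, True, False, "low")))   # True, from any network
print(authorize(AccessRequest(True, False, True, "high")))  # False, even on the LAN
```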

Most companies will balk at the idea of enabling workers to access enterprise apps and data from anywhere without a VPN connection—much less store the data they produce outside the firewall, on the endpoint and in the cloud.

Change is hard, inevitable and here
The idea of enabling workers to store data on the endpoint with cloud backup goes against “mature” information security policies. IT will point to rules that require users to back up to the central file server where data can be monitored and protected. When the employee fails to follow policy and loses data as a result of everyday disasters such as file overwrite, malware, ransomware, device loss or theft, IT can shrug it off because the employee ignored the policy. Or can it?

The biggest mistake IT makes is assuming the data is where it should be because people were told to put it there. When “process” fails, like it did at Sony, Target, Anthem and Home Depot, what should IT do to save face?

First, dust off the resume. Sadly, people lost jobs and in some cases, careers, because they believed the perimeter approach to collecting and securing data still worked.

Second, stop looking for a stronger firewall; secure the data where it lives on servers, desktops, laptops and mobile devices.

Third, understand that the enemy is outside and inside the castle. Make sure data is collected, visible and auditable so it can be restored to a known good state from a secure copy. In cases of breach and leakage, protecting every device with a backup assures faster inventory and remediation and substantial cost and productivity savings during data recovery.

Six data security practices for the brave new world
Plainly, data centers surrounded by defensive measures have failed to keep data secure. What does work is a security approach in which the data on every device is protected and backed up—whether or not the device is on the corporate network. The only thing missing in Google’s “trust but verify” approach is clear guidance on data backup and management.

That’s where we come in: We recommend these modern, proven data security practices for endpoints:

  1. Secure every device with full disk encryption (FDE) to disable access to data should the device be lost or stolen—inside or outside the organization.
  2. Deploy automatic, continuous backup of every device, every file and every version so data is recoverable in any event. (A toy sketch of such an agent appears after this list.)
  3. Enable workers to work the way they do. Abandon processes that require antiquated behaviors and replace them with automated agents that work lightly and quietly in the background.
  4. Keep encryption keys on premises to prevent unauthorized access from anyone and any agency.
  5. Trust but verify every user and device before enabling access to the network and data.
  6. Implement data governance tools that enable data visibility and analytics for auditing, data tracing and fast remediation.
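Production backup agents are event-driven, deduplicated and encrypted, but the core of practice #2 can be sketched in a few lines. The toy below simply polls a folder and keeps a timestamped copy of every changed file; the watched and archive paths are assumptions, and versions are keyed by file name only for brevity.

```python
# Toy sketch of practice #2: continuous, versioned endpoint backup. A real
# agent is event-driven; this illustrative loop just polls for changed files.
import hashlib
import shutil
import time
from pathlib import Path

WATCHED = Path.home() / "Documents"  # assumption: the folder to protect
ARCHIVE = Path.home() / ".backup"    # assumption: local stand-in for cloud storage

def backup_pass(seen: dict) -> None:
    """Copy every new or modified file to a timestamped version."""
    for f in WATCHED.rglob("*"):
        if not f.is_file():
            continue
        digest = hashlib.sha256(f.read_bytes()).hexdigest()
        if seen.get(f) != digest:                 # new or changed since last pass
            seen[f] = digest
            ARCHIVE.mkdir(exist_ok=True)
            dest = ARCHIVE / f"{f.name}.{int(time.time())}"
            shutil.copy2(f, dest)                 # every version kept, none overwritten

if __name__ == "__main__":
    seen: dict = {}
    while True:                                   # "continuous": run forever
        backup_pass(seen)
        time.sleep(60)
```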

When you live by these security practices in the brave new world, you’ll sleep better at night—even when the drawbridge is down.

Three Quick Cloud Security Wins for Enterprise IT

June 10, 2015

By Krishna Narayanaswamy, Chief Scientist, Netskope

Today we released our Cloud Report for Summer 2015 – global as well as Europe, Middle East and Africa versions. Whereas in prior reports, we shared our top findings about usage, activities, and policy violations across enterprises’ cloud apps, in this report (and going forward!) we are matching those findings with a set of “quick wins,” or recommendations for how to mitigate cloud risk and protect data.

This season’s report focuses heavily on cloud data loss prevention (DLP). In our cloud, we identify policy violations for DLP profiles, including personally identifiable information (PII), payment card industry information (PCI), protected health information (PHI), source code, profanity, and “confidential” or “top secret” information, both at rest in and en route to or from cloud apps.

Two of the most dramatic findings in this report concern content at rest in sanctioned cloud storage apps: 17.9 percent of files violated a DLP policy, and of those files, more than one in five (22.2 percent) were exposed publicly or shared with at least one person outside of the corporate domain. Those are both huge numbers, and easily fixable. This leads us to quick win #1: Discover sensitive content in your sanctioned apps and eliminate public access. Don’t forget to notify internal collaborators.
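Quick win #1 boils down to two checks per file: does the content match a DLP profile, and is the file shared outside the corporate domain? A minimal illustration follows; the regexes are far cruder than a real DLP engine’s detectors, and the domain is an assumption.

```python
# Minimal illustration of quick win #1: flag files that both match a DLP
# profile and are exposed outside the corporate domain. Real DLP detection is
# far more sophisticated than these toy regexes.
import re

DLP_PATTERNS = {
    "PII (SSN)": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PCI (card)": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}
CORPORATE_DOMAIN = "example.com"  # assumption: your sanctioned domain

def audit_file(content: str, shared_with: list) -> list:
    """Return (profile, external_collaborators) pairs needing remediation."""
    external = [a for a in shared_with
                if not a.endswith("@" + CORPORATE_DOMAIN)]  # includes "public"
    hits = [name for name, pat in DLP_PATTERNS.items() if pat.search(content)]
    return [(h, external) for h in hits if external]

print(audit_file("SSN 123-45-6789 attached", ["bob@example.com", "eve@gmail.com"]))
# [('PII (SSN)', ['eve@gmail.com'])]
```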

For DLP violations in content at rest and en route, we looked at category and activity. The vast majority (90 percent) of these violations occurred in the Cloud Storage category, and primarily in the activities “upload” and “download.” The other categories that have DLP violations include Webmail, Social Media, and CRM, and top DLP-violating activities vary depending on the category, e.g., “post” in social and “download” in CRM. This brings up quick win #2: Enforce your cloud DLP policies on data-compromising activities in apps containing sensitive data. Start where most violations occur: uploads and downloads in Cloud Storage.

For the first time since we’ve been releasing this report, we noticed a decline in the average apps per enterprise. They went from 730 in our last report to 715. Anecdotally, our customers are getting more serious about consolidating apps and standardizing on their corporate-sanctioned ones. They’re doing this through policy, education, and user coaching. We believe the decline is a direct result of this effort, which leads us to quick win #3: Consolidate on popular apps that are also enterprise-ready. Use app discovery as a guide, and get there with user coaching.

We also have a global and an EMEA version of our infographic available on our website.

Are you missing the most versatile endpoint security tool?

June 8, 2015

By the Integrated Marketing Manager, Code42

Lots of companies have endpoint security strategies. We know, because we’ve asked them. We’re using Backup Awareness Month to help businesses evaluate the obvious and hidden benefits of backup within a larger security plan.

Hardware fails. It’s inevitable.
12,000 hard drives will fail this week. Anyone in IT knows hardware failure and retirement are inevitable in technology. With the right backup, there will never come a time when you have to spend thousands of dollars on data recovery or tell users there’s no hope of restoring files.

 

2015 called and “there’s an app for that”—continuous endpoint backup.

Don’t play by the ransomer’s rules.
New malware is born every 4 seconds. Yeah. It’s a cruel world online. You’ll never be 100% impenetrable, but you won’t have to play by the ransomer’s rules if you get hit. Just restore destroyed files to the last known good state—prior to infection.
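Restoring to the last known good state amounts to picking the newest version saved before the infection time. Here is a minimal sketch, assuming backups are stored as timestamped versions:

```python
# Minimal sketch of restore-to-last-known-good: given versioned backups, pick
# the newest copy saved before the infection time. Storage layout is assumed.
from datetime import datetime

def last_known_good(versions: list, infected_at: datetime):
    """versions: (saved_at, blob) pairs; return the newest blob prior to infection."""
    clean = [(t, blob) for t, blob in versions if t < infected_at]
    return max(clean)[1] if clean else None

versions = [
    (datetime(2015, 6, 1, 9, 0), "report-v1"),
    (datetime(2015, 6, 2, 9, 0), "report-v2"),
    (datetime(2015, 6, 2, 14, 0), "encrypted-by-ransomware"),
]
print(last_known_good(versions, datetime(2015, 6, 2, 13, 30)))  # report-v2
```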

Back it up before you lock it down.
Without reliable, up-to-date backups, full disk encryption is a bit too secure. You’ve succeeded in restricting unwelcome access to corporate data, but you’ve risked locking your own users out of their files in the process. Think about it: if the computer gets damaged, data may become unreadable—even to those with permission to view it.

That’s why so many enterprises mandate that IT back up endpoints before locking them down with full disk encryption software.

To err is human. To recover is divine.
Users make mistakes—they modify read-only files, forget to save to the shared drive, and spill on, drop, lose, misplace and misuse their devices. You know the saying, “you can’t teach an old dog new tricks”? It applies to the modern workforce and the way they work. If it can happen, it will. Continuous endpoint backup makes files available to restore when user errors happen.

The most powerful tool in the box saves data, money, time and a lot more.
Endpoint backup is a lot bigger than a copy in the cloud. It gives the enterprise assurances that it can recover from known threats. With the right backup solution, you’ll have the antidote when data loss strikes. Now that’s a lot to love.

CSA Establishes Cloud Data Governance Working Group and Releases Governance Framework

June 4, 2015

By J.R. Santos, Vice President/Research and Member Services, Cloud Security Alliance

It is becoming increasingly difficult to protect customer data in the cloud, which in turn is causing more and more cloud providers and cloud-consuming organizations to embrace data governance strategies. To address this need, the Cloud Security Alliance (CSA) recently created the Cloud Data Governance 2.0 working group.

The Cloud Data Governance working group has been created to design a universal set of principles and map to emerging technologies and techniques for ensuring the privacy, confidentiality, availability, integrity and security of data across private and public clouds. The group has recently released a data governance framework to ensure the privacy, availability, integrity and overall security of data in different cloud models. These will feed into the GRC stack and can be implemented as controls across CSA’s CAIQ, CCM and STAR.

The Cloud Data Governance working group will look to develop thought leadership materials to promote CSA’s leadership across the spheres of data privacy, data protection and data governance. One key issue is that the over-emphasis on technology controls often leads to underlying weaknesses in processes. The group will work to harmonize data privacy regulations to a set of data protection principles that can help cloud consuming organizations and cloud service providers meet new data privacy requirements in a more efficient and proactive manner.

Chaired by Evelyn de Souza of Cisco, the group comprises representatives from across the industry, with collaboration between key industry leaders from different verticals, academia, industry analyst associations and vendor subject matter experts.

The Governance Framework is tied to the CSA Cloud Controls Matrix and examines the three phases to govern:

  1. Plan (Plan & Organize)
  2. Do (Acquire and Implement, Deliver and Support)
  3. Check, Act (Monitor and Evaluate)

The Cloud Data Governance working group has some exciting research coming up later in 2015, including a review and streamlining of the values of security risk management, going from ad hoc to optimal. Research on data privacy, measuring changing perceptions via a data heat index, is also scheduled for release.

If you are planning to attend Cloud Expo in New York, you are invited to attend a presentation by Evelyn that will focus on how to set up a cloud data governance program, spanning from establishing an executive board to ensuring the availability, integrity, security and privacy of cloud data through its lifecycle.

To learn more about the Cloud Data Governance 2.0 working group, please join the LinkedIn group: CSA Cloud Data Governance Working Group or join the mailing list.

 

 
