Which Security Topics Are AWS Users Most Interested In?

May 26, 2016

By David Lucky, Director of Product Management, Datapipe

We hope this blog provides an insightful dive into topics like cloud computing, managed services, products, and ways to improve your business strategy. Of course, our partners have great things to say, as well. One of those partners is AWS, and they’ve been kind enough to highlight the most popular security posts on their blog from the past year. There is some great info here; below is our take on just a few of these posts.

Privacy and Data Security
Security has always been a concern for the enterprise. Initially, it was a major barrier to entry for migrating to the cloud, but over the past few years, a greater number of businesses have realized that, like us, AWS takes security very seriously. This post talks about some of the best practices of the company.

Perhaps the biggest is protecting the privacy of its customers. AWS doesn’t disclose customer information unless required to do so to comply with a legally valid and binding order. And, if they do have to disclose information, they’ll notify customers beforehand. AWS also offers strong encryption as one of many standard security features, and gives organizations the option of managing their own encryption keys. That’s one of the driving forces behind our Datapipe Access Control Model for AWS (DACMA) offering – you get to hang onto the keys to your system, and maintain complete control of your virtual infrastructure and your data. What’s more, DACMA requires two-factor authentication, and all system access and activities are tied back to unique user names, without the hassle of managing an exhaustive list of AWS users. This added layer of security and accountability ensures your business is protected and meeting compliance requirements.

Receiving Alerts
It’s never a bad idea to have an extra layer of security within your infrastructure. As an AWS administrator, you can be notified of any security configuration change. Changes are to be expected, but with alerting in place, no change to your AWS Identity and Access Management (IAM) configuration is made without you being made aware, and anything out of the norm stands out quickly.

This post from AWS goes into detail on the steps you can take to stay on top of all that’s going on within your AWS environment: using CloudWatch filter patterns, monitoring changes to IAM, and generating metrics and alarms, all of which help ensure nothing gets past your watchful eye. Once everything is set up, you’ll receive an alert via email through an SNS topic.
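As a rough sketch of the kind of detection those filter patterns perform (the event names and pattern below are illustrative assumptions, not taken from the AWS post):

```python
# Sketch: flagging IAM configuration changes in CloudTrail events, the kind
# of detection a CloudWatch Logs metric filter performs. The event names and
# filter pattern below are illustrative assumptions, not from the AWS post.

# A selection of CloudTrail event names that indicate an IAM change.
IAM_CHANGE_EVENTS = {
    "CreateUser", "DeleteUser", "AttachUserPolicy",
    "PutUserPolicy", "CreateAccessKey", "DeleteAccessKey",
}

# Roughly equivalent CloudWatch Logs filter pattern; in a real setup this
# string would be passed to put_metric_filter.
FILTER_PATTERN = '{ ($.eventSource = "iam.amazonaws.com") }'

def is_iam_change(event: dict) -> bool:
    """Return True if a CloudTrail event records an IAM configuration change."""
    return (event.get("eventSource") == "iam.amazonaws.com"
            and event.get("eventName") in IAM_CHANGE_EVENTS)

print(is_iam_change({"eventSource": "iam.amazonaws.com",
                     "eventName": "CreateAccessKey"}))  # True
```

In a live deployment, the metric filter would feed a CloudWatch alarm whose action publishes to an SNS topic, which in turn delivers the email notification.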

 

PCI Compliance in the AWS Cloud
Payment Card Industry (PCI) compliance is important for just about any business. However, one of the more complex aspects of cloud hosting is deciding which party is responsible for PCI requirements. The PCI Compliance workbook provides a guide on where AWS can cover compliance requirements, and which areas a business must cover itself.

There are twelve top-level PCI requirements in all, and they are quite complex. It can be easy to miss certain requirements or to fall behind on audits. It’s important to note that you can’t just arbitrarily ignore a PCI requirement: all of them must be met. Not every requirement necessarily applies to your business, though, so a PCI assessor is helpful for clarifying which do and which do not. We were one of the first hosting providers in the world to achieve PCI DSS Level 1 service provider status (the highest, most rigorous status in the industry) and are happy to work with enterprises to ensure they set up and maintain a compliant AWS environment.
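As a purely hypothetical sketch of how such a division of responsibility might be tracked (the requirement subset and assignments below are invented for illustration; an actual split depends on your workload and should be confirmed with a PCI assessor):

```python
# Hypothetical sketch of a PCI DSS responsibility matrix: which top-level
# requirements are covered by the provider, by the customer, or shared.
# Only a few of the twelve requirements are shown, and the assignments are
# illustrative only; confirm the real split with a PCI assessor.

RESPONSIBILITY = {
    "1. Install and maintain a firewall configuration": "shared",
    "3. Protect stored cardholder data": "customer",
    "9. Restrict physical access to cardholder data": "provider",
    "12. Maintain an information security policy": "customer",
}

def open_items(matrix: dict, party: str) -> list:
    """Requirements the given party must still address itself (shared counts)."""
    return [req for req, owner in matrix.items() if owner in (party, "shared")]

print(open_items(RESPONSIBILITY, "customer"))
```

The point of such a matrix is simply that every requirement has an explicit owner, so nothing falls through the gap between provider and customer.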

As a business, it’s refreshing to know your provider has your best interests in mind. For more information, check out our previous posts on AWS security.

The Best Security KPIs Are the Ones That Matter to Your C-Suite

May 25, 2016

By Susan Richardson, Manager/Content Strategy, Code42

What information security KPIs are you tracking? Are they tied specifically to your organization’s business goals? If not, consider that using predictive business performance metrics could help increase your organization’s profitability—by as much as 20% over three years, according to one Gartner study.

To help you develop more relevant security performance indicators, here are some suggestions from the experts:

Make them meaningful to executives
Start by considering what matters most to executives:

  • Meeting organizational goals
  • Maintaining efficient, uninterrupted operational processes
  • Fostering a positive public image
  • Complying with regulations and contractual obligations
  • Managing risks

Don’t focus on cost metrics
“Security guys are always talking about cost,” said Steve Durbin, managing director of the Information Security Forum (ISF), in a CIO magazine interview. “If we realign this, the security guys can now go to the business and say, ‘Look, if this is what is important to you, this is the role I can play in helping you protect that, but I don’t have the funding for a variety of reasons.’ The business can then make the call as to whether to find the funding for that problem. It’s no longer the security guy’s problem, it’s the business’s problem.”

Use leading vs. lagging metrics
A lagging indicator measures actual results, or outputs, so it’s too late to make corrections or improvements. A leading indicator looks at the activities necessary to achieve your goals; these are essentially inputs that provide the information needed to intervene and change course for the better. For example, the number of viruses reported after a new software implementation is a lagging indicator, whereas the number of virus definition updates applied before the rollout shows action taken to drive launch success and improve user productivity.
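The distinction can be sketched in code using the virus example above (the event data is made up for illustration):

```python
# Sketch of leading vs. lagging indicators using the virus example above.
# The event data is invented purely for illustration.

events = [
    {"type": "virus_definition_update", "phase": "pre-launch"},
    {"type": "virus_definition_update", "phase": "pre-launch"},
    {"type": "virus_reported", "phase": "post-launch"},
]

# Leading indicator: preventive actions taken before launch, while there is
# still time to intervene and change course.
leading = sum(1 for e in events
              if e["type"] == "virus_definition_update"
              and e["phase"] == "pre-launch")

# Lagging indicator: incidents counted after the fact, when it is too late
# to correct anything for this launch.
lagging = sum(1 for e in events if e["type"] == "virus_reported")

print(leading, lagging)  # 2 1
```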

Evaluate the effectiveness of your proposed metrics
Thankfully, there’s a tool for that. The ASIS Foundation sponsored a major security metrics research project, and one of the outcomes was a Security Metrics Evaluation Tool that security managers can use to assess the quality of specific security metrics. The written tool helps you analyze the effectiveness of a metric against nine criteria, including its relevance to the organization’s strategic mission, how easily it can be communicated, and its reliability. The tool is in the Appendix of the research report, “Persuading Senior Management with Effective, Evaluated Security Metrics.”

Download The Guide to Modern Endpoint Backup and Data Visibility to learn more about selecting a modern endpoint backup solution in a dangerous world.

Cloud Security Alliance Opens Call for Presentations for EMEA Congress 2016

May 24, 2016

The Cloud Security Alliance has opened the call for papers for the 2016 CSA EMEA Congress, to be held November 15 at the Circulo de Bellas Artes in Madrid, Spain. CSA EMEA Congress is Europe’s premier cloud security event and is designed around the CSA’s core mission: promoting the use of best practices for providing security assurance within cloud computing, and providing education on the uses of cloud computing to help secure all other forms of computing.

The one-day conference will include two parallel tracks. Papers can be submitted online at: https://easychair.org/conferences/?conf=csaemea2016

The call for papers closes on August 1, and speakers will be notified by September 1.

The following topics are of key interest:
Current and emerging trends

  • Containerization and microservice security
  • Software Defined Perimeter (SDP)
  • Blockchain
  • Cloud-enabled vs cloud-centric applications
  • Multi-factor authentication
  • Cloud-based solutions for Small and Medium Enterprises (SMEs)
  • Threat landscape for cloud computing
  • IoT and Cloud, e.g. Cloud and smart cities, smart transport
  • Big Data security

Privacy in the Cloud

  • The impact of the European General Data Protection Regulations
  • How to manage legal and security compliance in a multinational environment
  • Cyber Security Laws and Regulation in Europe
  • The impact of the right to be forgotten in the cloud
  • Privacy and Security by design
  • Encryption solutions and trends

Risk Management, Certification and Standards for the Cloud

  • The role of standards within organisations
  • Case studies on the adoption of cloud certification
  • How to manage Governance, Risk and Compliance in the Cloud
  • Risk Profiling
  • Security and Privacy Service Level Agreements
  • Cloud Security Automation: how to develop and implement a framework for automated risk calculation and response
  • SaaS Governance

Incident Management in the Cloud

  • Incident Information Sharing
  • Leveraging Big Data for Threat intelligence in the Cloud
  • Cloud Forensics
  • SIEM in the Cloud
  • Cloud Security Gateways vs SIEM
  • Privacy Breaches Reporting
  • Reporting Security Breaches: the state of the art in Europe
  • Cloud Disaster Recovery

Cloud Computing in critical sectors

  • Finance sector
  • eHealth
  • eGovernment
  • Energy
  • Transport

General guidelines

  • Proposals cutting across the above topics are also encouraged.
  • Proposals, presentations, panels, or sessions must be in English and should provide a learning opportunity for the conference attendees.
  • If a proposal is accepted, the author (or one of the authors) must attend the conference.
  • Proposals that focus on marketing or promoting a product or service will not be considered.
  • Proposals from marketing or PR professionals (external or internal) will not be considered.

For general inquiries and speaking opportunities, please contact [email protected].
For sponsorship enquiries, please contact [email protected].
For media credentials, please contact [email protected].

Bridging the Divide Between CISOs and IT Decision Makers

May 20, 2016

By Rick Orloff, Chief Security Officer, Code42

In a large organization, leaders create a vision and strategy for the business and employees work to achieve that vision. At the business unit level in information technology, CIOs, CSOs and CISOs define their strategies while other IT decision makers work to implement them. The key to success is a team working in unison with effective strategies and KPIs. But this might be a case of theory vs. practice.

When we surveyed 400 IT decision makers (ITDMs) for our 2016 Datastrophe Study, we discovered that CISOs, CIOs and other IT decision makers often diverge in the real world in terms of everyday data security implementation and addressing real-world issues such as BYOD policy administration, reputation management and insider threats. That’s the scary reality of the unseen divide: when the people who are meant to protect the enterprise do not agree, the CXOs need to step up and lead.

The Datastrophe Study reveals several specific drivers contributing to the disconnect between C-level and other IT decision makers and ways in which businesses can bridge the gap.

Image issues
Data breaches are hitting organizations left, right and center, and there is little doubt that brands’ reputations are at stake. CISOs, with their executive hats on, spend their time on risk mitigation: more than half of CISOs/CIOs (53%) say their ability to protect corporate and customer data is vital to their company’s brand and reputation. However, only 43% of ITDMs share that focus.

While the Datastrophe Study reveals a 10% difference between leaders and decision makers, when it comes to sensitive data, even a little complacency can lead to security failures. This may be an issue of operational efficiencies being developed without using a secure framework. Data security needs to be part of the design starting with strategy at the CXO (horizontal) level and vertically with tactical execution.

In order to ensure that risk and the potential of reputational damage is reasonably mitigated, C-level and ITDMs need to work in concert. ITDMs have the clearest view of incumbent systems and employee behaviors—and should not be afraid to speak up. Equally, C-level executives need to take this information on board, if not back to the Board, in order to help ITDMs fulfill the vision of building a secure enterprise.

The insider threat is very real
All security professionals will agree that the insider threat is a reality in any business. But it seems that CISOs, CIOs and other ITDMs have not aligned on the scope and magnitude of the threat or the threat vectors. Sixty-four percent of CISOs and CIOs believe that insider data security threats will increase in the next twelve months. Only 50% of other ITDMs agree with them.

Is the view from the top, with its focus on protecting the organization and brand, skewing reality? Or, given the day-to-day liaison between ITDMs and employees, could it simply be that ITDMs lack the proactive (rather than traditional detective) tools required to provide real-time situational awareness? Either way, if they haven’t aligned on the threat vectors, the probability is very high that ITDMs aren’t aligned on what to measure or monitor. There is, today, a tendency for both parties to underestimate threats. A study by Forrester reported that 70% of data breaches could be traced to employee negligence. To overcome the insider threat, the C-level and all other ITDMs have to agree on the best strategic course forward. More importantly, both parties need to engage employees and help educate them on behaviors that could lead to a data breach. For example, C-level execs could use a workshop format to explain to employees the costs and damage caused by employee negligence, while ITDMs can provide practical tips and examples of how to actively avoid behaviors that put data at risk.

Anomaly at the endpoint
In an increasingly mobile workplace, BYOD is a key driver for adoption of policies to manage employee-owned devices connected to organizational networks. But things are never as simple as they seem. Among the normally skeptical CISO/CIOs, 87% believe their companies have clearly defined BYOD policies in place. Meanwhile, only 65% of ITDMs say their organizations have defined BYOD policies. To add more contention to the mix, 67% of knowledge workers (employees who think for a living and engage with mobile devices daily), believe their companies have no apparent BYOD policies.

This disconnect is a major cause for concern: CISOs/CIOs believe that 47% of corporate data is held on endpoint devices, as opposed to the more moderate estimation of 43% by other ITDMs. It’s clear that C-level and ITDMs need to work collaboratively to clarify, communicate and implement well-defined BYOD policies.

Ultimately
The simple solution to bridging the gap? Better communication. CISO/CIOs need to talk to their teams and their teams need to talk back. Better alignment and integration between the vision and the reality will go a long way to building more secure enterprises.

Addressing Cloud Security Concerns in the Enterprise

May 18, 2016

By David Lucky, Director of Product Management, Datapipe

Businesses want to move to the cloud, they really do. And more than ever, they’re starting to make the switch: A Cloud Security Alliance (CSA) study that polled more than 200 IT professionals found that 71.2 percent of companies now have a formal process for users to request new cloud services.

That CSA study also found that nearly two-thirds of IT professionals trust the security of cloud computing equally or even more than their on-premises systems. About a third of respondents cited better security capabilities as a benefit of the cloud. However, almost 68 percent of respondents noted the ability to enforce their corporate security policies remains a barrier to cloud adoption.

Companies know there’s top-notch security in the cloud, yet security remains the biggest hurdle in getting over to the cloud. Kind of a catch-22, huh? Fortunately, there are a few things you can do to help assuage these fears.

Cloud security is something everyone in a company should be concerned with, not just the IT department or decision-makers. And while the tools we use are improving and more people are starting to better understand cloud computing, people still play a big part in security. Your team of security professionals should get the correct training early in their tenure, and ongoing training will allow them to keep their skills sharp.

Outside of security professionals, all employees within a company should know their role in maintaining a secure environment. Having a proactive approach to security risks is the first step, which is something that 82.2 percent of companies have. However, fewer than half of the companies that responded have a complete incident response plan. With real concerns like loss of reputation or trust, financial loss, and destruction of data, it’s imperative to have a plan in place to combat any potential security issues head-on, rather than reacting after the fact.

To help with the development of that plan, some businesses have turned to a managed service provider (MSP). Naturally, there are concerns surrounding that as well: the CSA report notes 87.3 percent of companies cite access control as an important aspect of cloud security. Our Datapipe Access Control Model for AWS (DACMA) addresses this concern by letting a business securely delegate access to Datapipe while retaining control of its credentials. DACMA’s role-based access and accountability elements also ensure the right people within an enterprise are accessing certain data. And with 24/7/365 security monitoring, you’ll be on top of things should an issue arise.

Whether or not you choose to partner with an MSP to assist with security, there are plenty of reasons to develop a cloud security strategy that works within your enterprise. There’s no one right method, but there is a wrong approach: not doing anything about it. To learn more about first steps you can take, visit our Managed Security page.

Cloud Computing: A Little Less Cloudy

May 16, 2016

By Christina McGhee, Manager/FedRAMP Technical Lead, Schellman

Today, consumers have an increasing interest in implementing cloud solutions to process and store their data. They are looking to take advantage of the benefits provided by cloud computing, including flexibility, cost savings, and availability. Fortunately, there are many cloud solutions available to consumers, touting cloud computing features such as multi-tenancy, virtualization, or increased collaboration. But is it really a cloud service?

With the rapid growth of these types of solutions, consumers and other interested organizations want to identify whether a service is actually a cloud service.

In actuality, there is such a thing as a cloud service. It has a definition, and we have seen federal agencies require cloud service providers to justify why their service is considered a cloud service.

The five essential cloud characteristics are based on the National Institute of Standards and Technology’s (NIST) definition of cloud computing in Special Publication (SP) 800-145. Here, NIST defines cloud computing as a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.

According to NIST SP 800-145, a cloud service employs all of the following five characteristics:

  1. On-demand self-service – A consumer can unilaterally provision computing capabilities, such as server time and network storage, as needed automatically without requiring human interaction with each service provider.
  2. Broad network access – Capabilities are available over the network and accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, tablets, laptops, and workstations).
  3. Resource pooling – The provider’s computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to consumer demand. There is a sense of location independence in that the customer generally has no control or knowledge over the exact location of the provided resources but may be able to specify location at a higher level of abstraction (e.g., country, state, or datacenter). Examples of resources include storage, processing, memory, and network bandwidth.
  4. Rapid elasticity – Capabilities can be elastically provisioned and released, in some cases automatically, to scale rapidly outward and inward commensurate with demand. To the consumer, the capabilities available for provisioning often appear to be unlimited and can be appropriated in any quantity at any time.
  5. Measured service – Cloud systems automatically control and optimize resource use by leveraging a metering capability at some level of abstraction appropriate to the type of service (e.g., storage, processing, bandwidth, and active user accounts). Resource usage can be monitored, controlled, and reported, providing transparency for both the provider and consumer of the utilized service.

Whether you are a cloud service provider, consumer, or other interested party, it is important to identify how the cloud service offering meets each of the five essential characteristics. For example, cloud service providers in the FedRAMP authorization process usually document how their service meets each of the five essential cloud computing characteristics in their System Security Plan (SSP).

It goes without saying that regardless of whether or not a service meets the definition of a cloud service, the cloud service provider and consumer must always plan and prepare for the security risks associated with providing or using the cloud service and the types of data the cloud service will consume. The cloud service provider is responsible for selecting a security program framework to implement security controls specific to cloud environments and the data protection requirements of their customers. Equally, the consumer must be fully aware of the data they plan to process and/or store with the cloud service and their responsibilities to protect that data.

 

Providing Trust and Assurance Through Cloud Certification and Attestation: A Complimentary CSA STAR Program Webinar by Schellman

May 12, 2016

 

By Avani Desai, Executive Vice President, Schellman

In the last 24 months, the Cloud Security Alliance (CSA) has made great strides in enhancing their CSA Security, Trust and Assurance Registry (STAR) Program. In brief, the STAR Program is a publicly available registry designed to recognize assurance requirements and maturity levels of cloud service providers (CSPs). Prior to issuing the guidance for STAR Certification and STAR Attestation, a CSP could only perform a self-assessment, which meant completing the Consensus Assessments Initiative Questionnaire (CAIQ) and making the responses publicly available on the CSA registry. The CAIQ was completed in several different ways and the content varied from short answers to full-page responses. It was relevant information but not independently validated. This created a path for the STAR Certification and STAR Attestation programs.

Join Schellman during a complimentary webinar titled “CSA STAR Program: Attestation and Certification”.  The webinar will be held on May 13th from 12:00pm EST to 1:00pm EST and will provide one (1) hour of CPE.  Debbie Zaller, Schellman Principal, and Ryan Mackie, Practice Leader, STAR Program, will provide an in-depth discussion on the opportunities to undergo third party assessments, through the CSA STAR Programs, to validate maturity level or control activities.

“Organizations, specifically cloud service providers, are continuously working to provide confidence to their customers regarding the security and operating effectiveness of their controls supporting the cloud, and the STAR Certification and STAR Attestation options provided by the CSA allow these organizations to further establish confidence in the market,” said Ryan Mackie. “This webinar is a practical introduction to the STAR Level 2 offerings, outlining their benefits, requirements, and process, and how these types of third-party validation can clearly complement a cloud provider’s governance and risk management system.”

This informative webinar will provide:

  • An overview and journey of the CSA STAR Programs
  • A definition of the CCM framework
  • An overview of the Certification and Attestation purpose and scope
  • The process and preparations
  • A discussion of the common challenges and benefits

For more information and to register for the webinar, click here. The event will also be recorded and available for on-demand viewing.

ABOUT THE SPEAKERS
Debbie Zaller leads Schellman’s CSA STAR Attestation and SOC 2 services practice, where she is responsible for internal training, methodology creation, and quality reporting. Debbie has performed over 150 SOC 2 assessments and also holds a Certificate of Cloud Security Knowledge (CCSK).

Ryan Mackie leads Schellman’s CSA STAR Certification and ISO 27001 certification services practice where he is an integral part of the methodology creation and the planning and execution of assessments.  Ryan has performed over 100 ISO 27001 assessments and is a certified ISO 27001 Lead Auditor trainer.

 

Outdated Privacy Act Close to Getting an Upgrade

May 12, 2016

By Susan Richardson, Manager/Content Strategy, Code42

The outdated Electronic Communications Privacy Act (ECPA) may finally get a much-needed upgrade, but the reform can’t come soon enough for Microsoft, other cloud providers and privacy advocates. Here’s what you need to know:

The issues:
The ECPA was enacted in 1986, as electronic communication started to become more prevalent. The intent was to extend federal restrictions on government wiretaps from telephones to computer communications. But as we created other electronic communication devices and moved content to the cloud, the Act became outdated. The primary gripes are that it:

  • Allows government agencies to request emails more than 180 days old with just an administrative subpoena, which the agency itself can issue, vs. having to get a warrant from a judge.
  • Doesn’t require notifying affected customers when their data is being requested, giving them a chance to challenge the data demand. In fact, the Act includes a non-disclosure provision that can specifically prohibit providers from notifying customers.

The lobbying and lawsuits:
Plenty of wide-ranging groups have been advocating for ECPA reform, including the American Civil Liberties Union, the Center for Democracy & Technology, the Electronic Frontier Foundation, the Digital Due Process Coalition, the Direct Marketing Association and even the White House, in its 2014 Big Data Report.

On April 14, Microsoft added a little more weight to its argument. The company filed a lawsuit against the U.S. Justice Department, suing for the right to tell its customers when a federal agency is looking at their email. The lawsuit points out that the government’s non-disclosure secrecy requests have become the rule vs. the exception. In 18 months, Microsoft was required to maintain secrecy in 2,576 legal demands for customer data. Even more surprising, the company said, was that 68 percent of those requests had no fixed end date—meaning the company is effectively prohibited forever from telling its customers that the government has obtained their data.

The reform:
Two weeks after Microsoft filed its suit, the U.S. House voted 419-0 in favor of the Email Privacy Act, which would update the ECPA in these key ways:

  • Requires government representatives to get a warrant to access messages older than 180 days from email and cloud providers.
  • Allows providers to notify affected customers when their data is being requested, unless the court grants a gag order.

The last step in the process is for the Senate to turn the reform bill into law. While no timeline has been given, the Senate is getting a lot of pressure to act quickly.

Download The Guide to Modern Endpoint Backup and Data Visibility to learn more about selecting a modern endpoint backup solution in a dangerous world.

How to Reduce Costs and Security Threats Using Two Amazon Tools

May 10, 2016

By David Lucky, Director of Product Management, Datapipe

Have you ever gone to see a movie that would have been amazing if not for one person? The plot was engaging, the dialogue was well-written, and there were strong performances from most of the cast. But there was just that one actor who simply didn’t live up to the rest of the film, making every scene he was in that much worse. Simply put, that actor was bad, and brought down the whole operation.

That idea of the “bad actor” can be applied to Internet clients, as well. Fortunately, you’re not hurting any feelings by sussing them out: the bad actors are usually automated processes that can harm your systems. The two most common forms are content scrapers, which dig into your content for their own profit, and bad bots, who will misrepresent who they are to get around any restrictions stopping them.

We’d all like to believe that everyone accessing content will use it appropriately. Unfortunately, we can’t always assume the best, and being proactive in dealing with these bad actors will reduce security threats to your infrastructure and apps.

Even better, blocking bad actors will also lower your operating costs. When these bots access your content, you’re serving the traffic to them, whether you want to or not. That adds more to your overall costs. By blocking them, you’re restricting traffic from a number of undesired sources. Luckily, AWS has a pair of tools you can combine to say goodbye to these bad actors: Amazon CloudFront with an AWS web application firewall (WAF).

With AWS WAF, you can define a set of rules known as a web access control list (web ACL). Every rule contains a set of conditions plus an action. Any request received by CloudFront gets handed to AWS WAF for inspection; if the request matches a rule’s conditions, that rule’s action is applied. If the request doesn’t match the conditions of any rule, the default action of the web ACL is taken. These conditions can remove quite a bit of unwanted traffic, as you can filter by source IP address, strings of text, and a whole lot more. As for the actions, you can count the request for later analysis, allow it, or block it.
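A minimal sketch of that evaluation logic (this models the flow described above, not the actual AWS WAF API; the IPs and user-agent markers are invented examples, and real COUNT rules don’t stop evaluation the way this simplification does):

```python
# Sketch of web ACL evaluation: each rule has conditions and an action; the
# first matching rule wins, otherwise the default action applies. This
# models the logic described above, not the AWS WAF API. The IPs and
# user-agent markers are invented examples, and the COUNT behavior is
# simplified (real COUNT rules do not terminate evaluation).

BAD_BOT_IPS = {"198.51.100.7"}        # example source-IP condition
SCRAPER_MARKERS = ("scrapy", "curl")  # example string-match condition

RULES = [
    {"name": "block-bad-ips",
     "matches": lambda req: req["ip"] in BAD_BOT_IPS,
     "action": "BLOCK"},
    {"name": "count-scrapers",
     "matches": lambda req: any(m in req["user_agent"].lower()
                                for m in SCRAPER_MARKERS),
     "action": "COUNT"},
]
DEFAULT_ACTION = "ALLOW"

def evaluate(request: dict) -> str:
    """Return the action for a request: first matching rule, else default."""
    for rule in RULES:
        if rule["matches"](request):
            return rule["action"]
    return DEFAULT_ACTION

print(evaluate({"ip": "198.51.100.7", "user_agent": "Mozilla"}))    # BLOCK
print(evaluate({"ip": "203.0.113.5", "user_agent": "Scrapy/2.0"}))  # COUNT
print(evaluate({"ip": "203.0.113.5", "user_agent": "Mozilla"}))     # ALLOW
```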

Perhaps the best attribute of the WAF is that you can smoothly integrate it within your existing DevOps, and automate workflows to react. Since bad actors are always switching their methods to mask their actions, your proactive detection methods must constantly change, as well. Having those automations in place is immensely helpful in finding bad actors and restricting their access.

There’s a great step-by-step walkthrough of how to set up this solution on the AWS Security Blog. Feel free to check it out for more information, or get in touch with us if you have any additional questions. And for AWS customers that need even more than what the AWS WAF has to offer, there are complementary services that provide enhanced protection for business-critical applications on AWS. You won’t even need to thank the Academy when all of those bad actors are removed.

DoD Updates Government Security Requirements for Cloud, But What Does That Really Mean?

May 6, 2016

By Brian Burns, Bid Response Manager/Government Affairs, Datapipe

IT officials from the Department of Defense (DoD) have released an update to the Cloud Computing Security Requirements Guide (CC SRG), which establishes security requirements and other criteria for commercial and non-Defense Department cloud providers to operate within DoD. These kinds of updates are not uncommon. In fact, they are encouraged through an interesting use of a DevOps-type methodology, as the DoD explains:

DoD Cloud computing policy and the CC SRG is constantly evolving based on lessons learned with respect to the authorization of Cloud Service Offerings and their use by DoD Components. As such the CC SRG is following an “Agile Policy Development” strategy and will be updated quickly when necessary.

The DoD offers a continuous public review option and accepts comments on the current version of the CC SRG at all times, moving to update the document quickly and regularly to address the constantly changing concerns of an evolving technology like public and private cloud infrastructure. The most recent update includes administrative changes and corrections, along with expanded guidance on previously instated requirements; the main focus of the updates is to clarify standards set in version one and alleviate confusion and any potential inaccuracy.

If you are interested, you can read through the entire CC SRG revision history online.

What is particularly interesting here is the DoD’s acknowledgment that management of cloud environments is constantly evolving, security requirements and best practices need to be iterative, and updates need to be made regularly to ensure relevancy. It’s also important to note that the CC SRG is only one of many government policies put in place to help government agencies securely and effectively implement cloud infrastructures. There are also guidelines like NIST SP 800-37 Risk Management, NIST 800-53, FISMA and FedRAMP to consider. All of these provide a knowledge base for cloud computing security authorization processes and security requirements for government agencies.

What the DoD’s updates to the CC SRG should reinforce for agencies is that they need to have a clear cloud strategy in place in order to ensure compliance and success in the cloud. Determining the best implementation of these guidelines for your needs is difficult in and of itself. Add to that the ongoing management and updates required to keep up with ever-evolving guidelines and an IT team can find itself struggling.

By partnering with systems integrators and software vendors, or working directly with a managed service provider, like Datapipe, government agencies can more easily develop a long-term cloud strategy to architect, deploy, and manage high-security and high-performance cloud and hosted solutions, and stay on top of evolving government policies and guidelines.

For example, Microsoft Azure recently announced new accreditation for their Government Cloud, Amazon AWS has an isolated AWS region designed to host sensitive data and regulated workloads called AWS GovCloud, and you can learn more about our new Federal Community Cloud Platform (FCCP), which meets all FISMA controls and FedRAMP requirements, and all of our specific government cloud solutions on the Datapipe Government Solutions section of our site.