CSA Federal Cloud Security Symposium Hosted by MITRE (McLean, VA)

Published: 05/17/2010

By Dov Yoran

On August 5th, 2009, the Cloud Security Alliance Federal Cloud Security Symposium was hosted by the MITRE Corporation. This full-day event gave government personnel access to leading commercial cloud security experts. Throughout the day, perspectives on cloud computing, its benefits, and its security implications were discussed with respect to the public sector.

The day began with Jim Reavis, CSA’s executive director, providing an overview of CSA’s organization, mission, and goals to the 200-strong audience. He spoke about how the economics of cloud computing will create transformational change as organizations move significant budget from capex to opex. He foresees that the economic pressures are so compelling that businesses will bypass IT and governance altogether if those functions don’t become part of the solution in cloud adoption.

The day continued with Peter Mell from NIST providing a carefully articulated definition of cloud computing. He spoke about the challenges of composing this definition - not being able to please everyone, but putting forth something that everyone as a whole can understand. He discussed the potential threat exposure of large-scale cloud environments, then turned to the idea of micro clouds - structures that might carry less threat while still reaping economic benefits. As one can imagine, Peter’s guidance was to employ different levels of clouds for different security concerns.

Next, Jason Witty from Bank of America, Glenn Brunette of Sun, and Ward Spangenberg of IO Active discussed the cloud threat model in a panel session addressing initial concerns from an attack perspective. Glenn believes that social engineering is still the weakest link in a security provider’s arsenal. He continued by saying that even if the provider exposes some of its technologies, it really shouldn’t matter, because defense-in-depth strategies should be employed. Jason reaffirmed the social engineering weakness, but also noted that the same userid/password compromise can now potentially give one access to massive amounts of information and resources - a much bigger threat if compromised.

When asked about trusting the cloud provider, all three universally agreed that the insider threat exposure can be mitigated by compartmentalizing data and ensuring segregation of duties among cloud personnel. Jason commented on the importance of data classification, discussing how the government is ahead of the private sector in this arena. The first step should be identifying data and then defining its appropriate risk exposure.

All three addressed the uniqueness of cloud computing, commenting that it can be leveraged by businesses as well as by the bad guys. Ward spoke about how the concentration of risk in applications and systems is greater due to their interdependency. But the traditional risks are still alive in the cloud and need to continue to be addressed.

The next panel, Encryption and Key Management in the Cloud, focused on the underlying challenge of dispersion of data and operations. The panel debated the success of PCI compliance and how lessons learned can be applied to the cloud. Jon Callas from PGP thought it was successful simply for pushing security into the business world in a non-overbearing manner while still having “a little bit of teeth.” Pete Nicolleti from Terremark agreed on the effectiveness of the continuous compliance that PCI drives; however, he feels it didn’t go far enough, believing it should have more stringent consequences for those that fail.

The afternoon began with a panel on the legal ramifications of cloud computing. The discussion jumped right into the inherent conflicts of SLAs: on one hand the provider needs to achieve consistency, on the other hand the client needs flexibility. Dan Burton from Salesforce outlined that most clients are fine with the standard online click-through agreements. But he also recognized the needs of large financial companies and government organizations with sensitive data; in reality, however, there is only so far the provider can go, and the ultimate decision rests with the customer as to what data they are comfortable entrusting to a provider.

Dan passionately reminded the room that market forces are so powerful they will take the lead on cloud computing. Government and legal will have to follow simply because the business transformation is moving so fast. Jeffrey Ritter from Waters Edge noted that governing law is behind the times - it was not written with a global framework of information sharing, manufacturing, and cloud computing in mind, with its rapid data exchanges across borders. Legislation has not even really begun to consider these implications in a legal framework.

The afternoon continued with the Incident Response and Forensics panel, led by the ever energetic Pam Fusco. One of the key issues discussed was the investigation process. Wing Ko from Maricom Systems described how personnel used for investigations should have previous courtroom experience. David Ostertag from Verizon noted that the investigator doesn’t necessarily have to know the underlying business itself, but they do need knowledge of the specific regulations that apply to the client at hand. The focus needs to be on the data itself (whether in motion or at rest) - understanding its location, protection, etc.

This led to a lively debate on responsibility for the chain of custody, for which Dave stated that the physical owner of the server is responsible, so it depends on the business model - fully managed, co-lo, etc. This concept is particularly interesting when investigations are conducted on a virtual machine. Dave explained that the registry exists on the virtual machine, so if it goes down, the information will be lost. There was widespread disagreement from the audience, with participants suggesting taking snapshots of the image, as the log files are persistent for a period of time even if they don’t last forever. The discussion was further ignited by the idea of confiscating a physical server even when doing so affects several independent companies on that one server: if law enforcement needs to come and get the information, the client’s site will go down. All agreed that customers need to be made aware of these legal terms.

Next, Glenn Brunette took the room through a detailed presentation on virtualization hardening. He reminded us not to overlook the traditional issues - for example, the physical connections (making sure network cables are connected, and looking to redundancy and better protection by not placing all servers in the same rack). The usual basics of patching, hardening, and clearly defined role-based access control with least privilege were all presented. These are especially important in the cloud environment, where the user will not have access to the hypervisor, only to the virtual machine image.
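As a rough illustration of that least-privilege point (not taken from Glenn’s presentation), the sketch below models tenant users who may operate only on their own virtual machine images while hypervisor operations remain reserved for the provider’s operators; the role and permission names are hypothetical.

```python
# Hypothetical permission model: tenants get VM-level rights only;
# hypervisor operations are reserved for the provider's operators.
ROLE_PERMISSIONS = {
    "tenant_admin": {"vm.start", "vm.stop", "vm.snapshot"},
    "provider_operator": {"vm.start", "vm.stop", "vm.snapshot",
                          "hypervisor.patch", "hypervisor.configure"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Grant nothing by default; allow only what the role explicitly lists."""
    return permission in ROLE_PERMISSIONS.get(role, set())

# A tenant can manage its VM image but cannot touch the hypervisor.
assert is_allowed("tenant_admin", "vm.snapshot")
assert not is_allowed("tenant_admin", "hypervisor.patch")
```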

Another suggested measure of protection was tokenizing the data, i.e., passing it through a filter so that certain fields are not exposed, thereby protecting the data from the provider. Glenn also spoke about the basics: using vetted, certified hardware and software providers and emphasizing the use of open standards. He concluded with somber concern about the administrative challenge of keeping pace with the scale of virtual machine and cloud processing (detecting, imaging, shutting down, deleting, etc.).
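A minimal sketch of the tokenization idea described above (not from the talk): sensitive fields are swapped for opaque tokens before a record is handed to the provider, and the token-to-value mapping never leaves the organization. The field names and token format here are illustrative assumptions.

```python
import secrets

# Fields we never want the cloud provider to see in the clear (hypothetical list).
SENSITIVE_FIELDS = {"ssn", "account_number"}

# Local token vault: token -> original value. In practice this mapping
# would live in a protected datastore inside the organization.
token_vault = {}

def tokenize(record: dict) -> dict:
    """Return a copy of the record with sensitive fields replaced by tokens."""
    safe = dict(record)
    for field in SENSITIVE_FIELDS & record.keys():
        token = "tok_" + secrets.token_hex(8)
        token_vault[token] = record[field]
        safe[field] = token
    return safe

def detokenize(record: dict) -> dict:
    """Restore original values for any tokens this organization issued."""
    restored = dict(record)
    for field, value in record.items():
        if isinstance(value, str) and value in token_vault:
            restored[field] = token_vault[value]
    return restored

if __name__ == "__main__":
    customer = {"name": "Alice", "ssn": "123-45-6789", "balance": 1000}
    outbound = tokenize(customer)      # safe to hand to the provider
    print(outbound)
    print(detokenize(outbound))        # only possible inside the organization
```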

The day concluded with an Interoperability and Application panel led by George Reese of Stradis. The spirited debate was sparked by John Willis, who claimed interoperability doesn’t even matter right now because we’re at such an early stage of the cloud explosion - we don’t even know where it’s going to be in two years. To the contrary, Dan Burton argued that interoperability is extremely important. No one knows where the innovation is going to come from, and a company loses this benefit if it is not interoperable. He believes that customers are driving toward interoperability, not wanting to rely on just one provider; if they become locked in, they will vote with their feet. He spoke about Facebook’s integration with Salesforce via an API to port public data, noting that no one would have imagined that a few short years ago. However, there was some pushback from the audience in that Salesforce could not ultimately vouch for the authenticity of the data; that responsibility lies with the end user.

Ultimately, this last discussion epitomizes the juxtaposition of cloud computing benefits and challenges. The inherent economic efficiencies, speed to market, ease of adoption, and growth implications are obvious. The security concerns must also be addressed to mitigate the vulnerabilities and exploits that accompany the rapid adoption of any new technology, especially one as universal as cloud computing.
