So you’ve probably heard by now that Facebook will be creating a crypto-currency called “Project Libra.” If you haven’t, well, now you know.
So first, let’s cover what is good about this. Facebook has announced Project Libra as a stablecoin: its value will be pegged to a basket of stable “real world” currencies (I’m guessing something like a mix of USD, Euro, and Yen), so speculation won’t really be a thing. Facebook has clearly learned lessons from other stablecoin launches. This one will use open-source technology, and it will actually be “owned” by the “Libra Foundation,” which is headquartered in Switzerland. We already have the typical mix of white papers covering the Libra blockchain and the on-chain software that will enforce chain governance, rules, smart contracts, and so on. As is typical, there’s no actual running production instance yet, just the test network, and the software hasn’t yet been formally audited or put through a formal verification process, but it will be. Essentially, Facebook is using every signal possible to show this as a legitimate and trustworthy crypto-currency that can be used for payments.
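To make the basket-peg idea concrete, here's a minimal sketch of how a coin's value could be derived from a weighted currency basket. The currencies, weights, and exchange rates below are purely my own illustrative guesses; Facebook has not published the actual basket composition.

```python
# Sketch of how a basket-pegged stablecoin's value could be computed.
# The weights and FX rates below are illustrative guesses, not Libra's
# actual (unpublished) basket composition.

def basket_value_usd(weights: dict, usd_rates: dict) -> float:
    """Value of one coin in USD: the sum of each currency's weight
    (in currency units) times that currency's USD exchange rate."""
    return sum(weights[ccy] * usd_rates[ccy] for ccy in weights)

# Hypothetical basket: fixed amounts of USD, EUR, and JPY per coin.
weights = {"USD": 0.50, "EUR": 0.27, "JPY": 24.0}
# Hypothetical spot rates: USD per 1 unit of each currency.
usd_rates = {"USD": 1.0, "EUR": 1.12, "JPY": 0.0092}

print(round(basket_value_usd(weights, usd_rates), 4))
```

Because the basket holds fixed currency amounts rather than a fixed dollar target, the coin's USD value drifts slightly as exchange rates move, which is exactly why speculation on it is uninteresting.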
To be honest, the technology and governance structure look fine. There’s nothing really new or significantly different, which I think is a good thing: Project Libra is designed to provide a stablecoin that can be used as a payment system, something you don’t really need or want a lot of new surprises and excitement in.
So are there any real downsides to Project Libra? Probably the biggest one is that Facebook is pushing this forward. Despite setting up an association with a goal of 100 major participants (companies, banks, NGOs, etc.), this project is still heavily tied to Facebook, and many people have a love-hate relationship with Facebook.
There’s nothing really ugly about Libra either, but one aspect I’m curious to see play out is how tradable digital assets sold via Libra will handle price discrimination. Many companies would rather sell digital assets (like in-game skins) at a discount in developing countries than not sell anything at all. For digital assets that can be exchanged or traded in game, this presents an arbitrage opportunity for end users, and secondary markets may develop. As we’ve seen, companies often hate this: secondary markets are often lucrative (and frustrating for users, since opportunities for fraud abound).
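To see why the arbitrage incentive exists, here's a toy calculation. All the prices and fees are hypothetical numbers I made up to illustrate the mechanic, not figures from any real marketplace.

```python
# Toy illustration of the arbitrage opportunity described above: a tradable
# digital asset priced differently by region. All prices and fees here are
# hypothetical, purely to show why secondary markets emerge.

def arbitrage_profit(buy_price: float, sell_price: float,
                     marketplace_fee_pct: float) -> float:
    """Profit from buying in the cheap region and reselling on a
    secondary market that charges a percentage fee on the sale."""
    return sell_price * (1 - marketplace_fee_pct) - buy_price

# An in-game skin sold at $2 in a developing market but trading at
# $10 on a secondary market that takes a 15% cut of each sale.
profit = arbitrage_profit(buy_price=2.00, sell_price=10.00,
                          marketplace_fee_pct=0.15)
print(f"profit per unit: ${profit:.2f}")
```

As long as the regional discount exceeds the resale friction (fees, fraud risk, account bans), the profit per unit stays positive and a secondary market will form around it.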
But there is one thing that Facebook brings to the crypto-currency table that almost nobody else can (apart from maybe LinkedIn or Google): KYC.
KYC stands for Know Your Customer: literally knowing who the account holder(s) are, their identity, location, address, which jurisdiction they are in, and so on. This helps prevent things like identity theft and financial fraud, and also ties into the AML side of crypto-currency regulation. Anti-Money Laundering is exactly what it sounds like, and it also covers terrorist financing and other criminal funding activities.
Facebook has arguably the world’s largest social graph, and the deepest knowledge of many people (many people essentially stream their entire lives, and the lives of their families, on Facebook). Facebook can easily verify who people are (and in many cases they already have, via your phone number and so on) in a way that almost nobody else can. Combined with Facebook’s reach (they can simply add Libra capability to their website and mobile client and boom, hundreds of millions of people have access instantly), this gives them a potential advantage no other crypto-currency has ever had.
In the process of outlining several use cases across discrete economic application sectors, we covered multiple industry verticals, as well as some use cases which cover multiple verticals simultaneously. For this document, we considered a use case as relevant when it provides the potential for any of the following:
disruption of existing business models or processes;
strong benefits for an organization, such as financial, improvement in speed of transactions, auditability, etc.;
large and widespread application; and
concepts that can be applied in real-world scenarios.
From concept to the production environment, we also identified six separate stages of maturity (concept, proof of concept, prototype, pilot, pilot production, and production) to get a better assessment of how much work has been done within the scope and how much more work remains to be done.
Some of the industry verticals which we identified are finance, supply chain, media/entertainment, and insurance, all of which are ripe for disruption from a technological point of view.
The document also clearly identified the expected benefits from the adoption of DLTs/blockchain in these use cases, the type of DLT, the use of private vs. public blockchain, the infrastructure provider (CSP), and the type of services (IaaS, PaaS, SaaS). Other key features of the use case implementations, such as smart contracts and distributed databases, have also been outlined.
The working group hopes this document will be a valuable reference to all key stakeholders in the blockchain/DLT ecosystem, as well as contribute to its maturity.
Documentation of Relevant Distributed Ledger Technology and Blockchain Use Cases v2
By Elisa Morrison, Marketing Intern, Cloud Security Alliance
When CSA was started in 2009, ‘Uber’ was just a German word for ‘super,’ and CSA stood only for Community Supported Agriculture. Now in 2019, spending on cloud infrastructure has finally exceeded spending on-premises, and CSA is celebrating its 10th anniversary. For those who missed the Summit, this is the CSA Summit Recap Part 2; in this post we will be highlighting key takeaways from sessions geared towards CSPs and CISOs.
During this session, Jason Garbis identified three steps towards implementing Zero Trust: reducing attack surfaces, securing access, and neutralizing adversaries. He also addressed how to adopt modern security architecture to make intelligent actions for trust. In implementing Zero Trust, Garbis highlighted the need for:
Authentication. From passwords to biometrics to tokens. That said, authentication alone is not sufficient for adequate security; as he warned, it comes too late in the process.
Network technology changes. Firewall technology is too restrictive (e.g., IP addresses are shared across multiple people), and the only question it can answer is yes-or-no access. This is not Zero Trust. Better security is based on many attributes, such as the role of the person and the definition of the data being accessed.
Access control requirements. There is a need for requirements that dynamically adjust based on context. If possible, organizations need to find a unified solution via Software-Defined Perimeter.
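As a rough illustration of the contrast Garbis drew, here is a minimal sketch comparing an IP-only firewall decision with an attribute-based check. The policy, roles, and data labels are hypothetical examples of the idea, not any vendor's actual API.

```python
# Minimal sketch contrasting an IP-only firewall decision with the
# attribute-based, Zero Trust-style check described above. The policy,
# roles, and classification labels here are hypothetical examples.

ALLOWED_IPS = {"10.0.0.5"}  # firewall model: yes/no by source address only

def firewall_allows(src_ip: str) -> bool:
    """The old model: a single binary question about the address."""
    return src_ip in ALLOWED_IPS

def zero_trust_allows(user: dict, resource: dict) -> bool:
    """Grant access from many attributes: how the person authenticated,
    who they are (role), and how the data is classified."""
    if not user["mfa_verified"]:          # authentication is required,
        return False                      # but it alone isn't sufficient
    if resource["classification"] == "restricted":
        return user["role"] in resource["allowed_roles"]
    return True

alice = {"role": "finance", "mfa_verified": True}
payroll = {"classification": "restricted", "allowed_roles": {"finance", "hr"}}

print(firewall_allows("10.0.0.9"))       # shared/NATed IPs make this unreliable
print(zero_trust_allows(alice, payroll))
```

The point of the contrast: the firewall can only say yes or no per address, while the attribute-based check can adjust dynamically as the user's context (role, authentication state, data sensitivity) changes.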
Every CEO wants to embrace cloud, but how can you do it securely? The old world was network-centric, and the data center was the center of the universe; we could build a moat around our network with firewalls and proxies. The new world is user-centric, and network control is fluid. Not to mention, the network is decoupled from security, and we rely on policy-based access.
In order to address this challenge, organizations need to view security with a clean slate. Applications and the network must be decoupled. More cloud traffic is encrypted, which also gives malicious users a way to hide, so proxies and firewalls should be used to inspect traffic.
Ten Years in the Cloud – PANEL
The responsibility to protect consumers and enterprise has expanded dramatically. Meanwhile, the role of the CISO is changing – responsibilities now include both users and the company. CISOs are faced with challenges as legacy tools don’t always translate to the cloud. Now there is also a need to tie the value of the security program to business, and the function of security has changed especially in support. In light of these changes, the panel unearthed the following five themes in their discussion of lessons learned in the past 10 years of cloud.
Identity as the new perimeter. How do we verify that people are who they say they are?
DevOps as critical for security. DevOps allows security to be embedded into the app, but it is also a risk since there is faster implementation and more developers.
Ensuring that security is truly embedded in the code. Iterations in real time require codified security.
Threat and data privacy regulations. These are on the legislative to-do list for many states, comparable to the attention privacy already receives in financial services and health care.
Security industry as a whole is failing us all. It is not solving problems in real-time; as software becomes more complex it poses security problems. Tools are multiplying but they do not address the overall security environment. Because of this, there’s a need for an orchestrated set of tools.
Now we have entered the gateway wars: Web vs. CASB vs. SDP. Whoever wins, the problem of BYOD and unmanaged devices still remains, as does the issue that we can’t secure endpoint users’ mobile devices. As is, mirror gateway and forward proxy technologies solve the sins of the “reverse proxy” and have become indispensable blades. Forward proxy is the solution for all apps when you can manage the endpoint; mirror gateway can be used for all users, all endpoints, and all sanctioned apps.
Cloud is a means to an end … and the end requires organizations to truly transform. This is especially important as regulators expect a high level of control in a cloud environment. Below are the key takeaways presented:
Cloud impacts strategy and governance, from controls to monitoring, measuring, and managing information, all the way to external communications.
The enterprise cloud requires a programmatic approach with data as the center of the universe and native controls only get you so far. Cloud is a journey, not just a change in technology.
Developing a cloud security strategy requires taking into account service consumption, IaaS, PaaS, and SaaS. It is also important to keep in mind that cloud is not just an IT initiative.
This session examined how Valvoline went to the cloud to transform its security program and accelerate its digital transformation. When Valvoline split via IPO into two global multi-billion-dollar startups, neither had a datacenter. The data was flowing like water, complexity and control created friction, and there was a lack of visibility.
They viewed cloud as security’s new north star, and said ‘The Fourth Industrial Revolution’ was moving to the cloud. So how did they get there? The following are the five lessons they shared:
Inspired by the cryptocurrency model, OpenCPEs is a way to revolutionize how security professionals measure their professional development experiences.
OpenCPEs provides a method of validating experiences listed on your resume without maintaining or storing an individual’s personal data. Learn more about this project by downloading the presentation slides.
By Elisa Morrison, Marketing Intern, Cloud Security Alliance
CSA’s 10th anniversary, coupled with the bestowal of the Decade of Excellence Awards, gave this Summit a sense of accomplishment that bodes well, yet also challenges the CSA community to continue its pursuit of excellence.
The common theme was the ‘Journey to the Cloud,’ emphasizing how organizations can not only go faster but also reduce costs along the way. This year’s Summit also touched on the future of privacy and disruptive technologies, and introduced CSA’s newest initiatives in Blockchain and IoT along with the launch of the STAR Continuous auditing program. Part 1 of this CSA Summit Recap highlights sessions from the Summit geared toward the enterprise perspective.
Every CEO wants to embrace the cloud, but how do you do it securely? To answer this question, this trio looked at the journeys other companies, such as Kellogg and NCR, took to the cloud. In Kellogg’s case, they found that when it comes to your transformation, single-tenant VMs won’t cut it. They also raised the question of the ineffectiveness of services such as hybrid security: why pay the tax for services you don’t use?
For NCR, the major themes were how to streamline connectivity and access to cloud services. The big question was: how do end users access NCR data in a secure environment? They found that applications and the network must be decoupled. And while more cloud traffic is encrypted, encryption also offers another way for malicious users to get in, so their solution was to use proxies and firewalls to inspect traffic.
The Future of Privacy: Futile or Pretty Good? – Jon Callas
ACLU technology fellow Jon Callas brought to light the false dichotomy we see when discussing privacy. It is easy to be nihilistic about privacy, but positives are out there as well.
There is movement in the right direction that we can already see. Examples include GDPR, the California Privacy Law, the Illinois Biometric Privacy Law, and the Carpenter, Riley, and Oakland magistrate decisions. A precedent has also been set for laws that give consumers more privacy. For organizations, privacy has become a focus of competition: companies such as Apple, Google, and Microsoft all compete on privacy. Encrypted protocols such as TLS and encrypted DNS are also becoming the norm. Other positive trends include default encryption, and the fact that disasters are documented, reported on, and treated as a concern.
Unfortunately, there has also been movement in the wrong direction. There is a balancing act between the design for security versus design for surveillance. The surveillance economy is increasing, and too many platforms and devices are now collecting data and selling it. Lastly, government arrogance and the overreach to legislate surveillance over security is an issue.
All in all, Callas concluded that the future is neither futile nor pretty good, and that it’s necessary to balance both moving forward.
From GDPR to California Privacy – Kevin Kiley
This session touched on third-party breaches, regulatory liability, the importance of strong data processing practices to scoping, and how to comply with GDPR and CCPA. Kiley identified the need for a holistic approach with more detailed vendor-vetting requirements. He outlined five areas organizations should improve to better their vendor risk management.
Onboarding. Who’s doing the work for procurement, privacy, or security?
Populating & Triaging. Leverage templated vendor evaluation assessments and populate with granular details.
Documentation and demonstration
Building an Award-Winning Cloud Security Program – Pete Chronis and Keith Anderson
This session covered key lessons learned as Turner built its award-winning cloud security program. One constant challenge Turner faced was the battle between speed to market and the security program. To improve their program, Turner enacted continuous compliance measurement, using open-source tooling for cloud-plane assessment. They also ensured each user attestation was signed by both the executive and technical support. For accounts, they implemented intrusion prevention, detection, and security monitoring. They learned to define what good looks like, while also developing a lexicon and definitions for security. It was emphasized that organizations should always be iterating from new > good > better. Lastly, when building your cloud security program, they emphasized that not all things need to be secured the same way, and not all data needs the same level of security.
MGM’s global user base meant they wanted to expand functions to guest services, check-in volume management and find a way of bringing new sites online faster. To accomplish this, MGM embarked on a cloud journey. Their journey was broken into business requirements (innovation velocity and M&A agility) along with necessary security requirements (dealing with sensitive data, the need to enable employees to move faster, and the ability to deploy a security platform).
As they described MGM’s digital transformation, the question was raised: where is sensitive data stored in the cloud? An emerging issue that continues to come up is API management. Eighty-seven percent of companies permit employees to use unmanaged devices to access business apps, and the BYOD policy is often left unmanaged or unenforced. In addition, MGM found that the average organization has 14 misconfigured IaaS services running at any given time, and 1,527 DLP incidents in PaaS/IaaS per month.
To address these challenges, organizations need to consider the relations between devices, network and the cloud. The session ended with three main points to keep in mind during your organization’s cloud journey. 1) Focus on your data. 2) Apply controls pertinent to your data. 3) Take a platform approach to your cloud security needs.
There is a gap in the security controls framework for IoT. With the landscape changing at a rapid pace and over 20 billion IoT devices expected, the need is great. Added to that is the fact that IoT manufacturers typically do not build security into devices; hence the need for the security controls framework. You can learn more about the framework and its accompanying guidebook covered in this session here.
Panel – The Approaching Decade of Disruptive Technologies
While buzzwords can mean different things to different organizations, organizations should still implement processes around new and emerging technologies such as AI, machine learning, and blockchain, and be conscious of what is implemented.
This session spent much of its time examining Zero Trust. The perimeter now sits in different locations, and finding the best place to establish the security perimeter is challenging. It can no longer be a fixed point, but must flex with the mobility of users; mobile phones, for example, require very flexible boundaries. Zero Trust can help address these issues, and it’s BYOD-friendly. There are still challenges, but Web Authentication helps as a standard for Zero Trust.
Cloud has revolutionized security in the past decade. With cloud, you inherit security, and with it the idea of a simple system has gone out the window. One of the key questions asked was, “Why are we not learning the security lessons from the cloud?” The answer? Because the number of developers grows exponentially with each new technology.
The key takeaway: Don’t assume your industry is different. Realize that others have faced these threats and have come up with successful treatment methodologies when approaching disruptive technologies.
CISO Guide to Surviving an Enterprise Cloud Journey – Andy Kirkland, Starbucks
Five years ago, the Director of Information and Security for Starbucks, Andy Kirkland, recommended against going to the cloud, out of caution. Since then, Starbucks has migrated to the cloud and learned a lot along the way. Below is an outline of Starbucks’ survival tips for organizations embarking on a cloud journey:
Establish workload definitions to understand criteria
Utilize standardized controls across the enterprise
Provide security training for the technologist
Have a security incident triage tailored to your cloud provider
Establish visibility into cloud security control effectiveness
Define the security champion process to allow for security to scale
PANEL – CISO Counterpoint
In this keynote panel, leading CISOs discussed their cloud adoption experiences for enterprise applications. Jerry Archer, CSO for Sallie Mae, described their cloud adoption journey as “nibbling our way to success.” They started by putting small things into the cloud. By keeping up constant conversations with regulators, there were no surprises during the migration. Now, they don’t have any physical infrastructure remaining. Other takeaways: in 2019, containers have evolved, and we now see ember security, arbitrage workloads, and RAIN (Refracting Artificial Intelligence Networks).
Download the full summit presentation slides here.
In the process of outlining several use cases across discrete economic application sectors, we covered multiple industry verticals, as well as some use cases which cover multiple verticals simultaneously.
For the purpose of this document, we considered a use case as relevant when it provides potential for any of the following:
—disruption of existing business models or processes;
—strong benefits for an organization, such as financial, improvement in speed of transactions, auditability, etc.;
—large and widespread application; and
—concepts that can be applied in real-world scenarios.
From concept to production environment, we also identified six separate stages of maturity—concept, proof of concept, prototype, pilot, pilot production, and production—to get a better assessment of how much work has been done within the scope and how much more work remains to be done.
Some of the industry verticals we identified are traditional industries, such as shipping, airline ticketing, insurance, banking, supply chain, and real estate, all of which are ripe for disruption from a technological point of view.
We also clearly identified the expected benefits from the adoption of DLTs/blockchain in these use cases, the type of DLT, the use of private vs. public blockchain, the infrastructure provider (CSP), and the type of services (IaaS, PaaS, SaaS). Other key features of the use case implementations, such as smart contracts and distributed databases, have also been outlined.
Future iterations of this document will provide updates on these use cases, depending on the level of progress seen over time. We hope this document will be a valuable reference to all key stakeholders in the blockchain/DLT ecosystem, as well as contribute to its maturity.