Why the Capital One Data Breach is a Poster Child for Today’s Cybersecurity Threats

Capital One Data Breach
The Case: USA v. Paige A. Thompson

Earlier this week, the FBI arrested and charged a Seattle-area hacker, Paige A. Thompson, with accessing and threatening to distribute personal data from approximately 106 million Capital One Financial customers, including U.S. Social Security and Canadian Social Insurance numbers, bank account identifiers, and credit card applications dating back to 2005. The breach comes just a week after credit-reporting agency Equifax, another high-profile cybersecurity target, reached a $700 million settlement with U.S. regulators over its 2017 breach, which exposed personal information from over 147 million consumers.

The negative business outcomes of such a massive hack, one of the largest ever to affect a financial services firm, are both immediate and material. In addition to prolonged negative press and social media exposure for both Capital One and its executives, the company forecast direct short-term costs in the $100–150 million range, and its stock lost 6% of its value immediately following the announcement, erasing $2.5 billion from its market cap.

While the size and scope of the Capital One hack will keep it in the headlines for months or even years to come, what makes this an important case study for executives and cybersecurity practitioners alike is that it clearly illustrates several of the most important risks that enterprises currently face – securing DevOps processes when transitioning to the cloud, controlling privileged access, detecting malicious insiders, and treating data as a liability as well as an asset. With all of these hallmarks, the Capital One incident is a poster child for today’s cybersecurity threats.

In the Cloud, Who’s On First?

For most organizations, complex multi-cloud and hybrid cloud architectures are a fact of life, and here to stay. In fact, IDC predicts that by 2021, 9 out of 10 companies will have multiple cloud services in their digital infrastructure. Capital One was both an early adopter and a leading advocate of migrating to the cloud, with CIO Rob Alexander presenting a keynote at the 2015 AWS conference, and the company planning to eliminate its own data centers within the next few years.

But despite some narratives that the cloud is inherently less safe, or somehow to blame for incidents like this, it’s important to note that the security layers for which cloud providers are actually responsible rarely fail – in this case, the underlying AWS S3 infrastructure was not compromised, and its default configurations generally offer adequate protection. Instead, seemingly minor setup, configuration, or integration errors become vulnerabilities that are magnified by the volume and sensitivity of data being collected, processed, moved, and stored within the cloud.

In this incident, Capital One did, in fact, determine that “a firewall misconfiguration permitted commands to reach and be executed by that server, which enabled access to folders or buckets of data in Capital One’s storage space at the cloud computing company”. It’s a perfect example of how inadequate DevOps processes and standards can lead to multi-million dollar losses.
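Misconfigurations of this kind are exactly what automated DevOps guardrails can catch before deployment. As an illustration only – the rule schema, port list, and function name below are assumptions for the sketch, not any vendor’s actual API – a simple pre-deployment check might flag firewall-style rules that expose sensitive ports to the entire internet:

```python
# Hypothetical pre-deployment guardrail: flag firewall rules that expose
# sensitive ports to the whole internet. The rule format is illustrative.

SENSITIVE_PORTS = {22, 3389, 9000}  # e.g. SSH, RDP, admin consoles

def find_risky_rules(rules):
    """Return rules that open a sensitive port to 0.0.0.0/0."""
    risky = []
    for rule in rules:
        open_to_world = rule.get("source_cidr") == "0.0.0.0/0"
        if open_to_world and rule.get("port") in SENSITIVE_PORTS:
            risky.append(rule)
    return risky

rules = [
    {"name": "web",   "port": 443, "source_cidr": "0.0.0.0/0"},
    {"name": "admin", "port": 22,  "source_cidr": "0.0.0.0/0"},
]
print([r["name"] for r in find_risky_rules(rules)])  # flags "admin" only
```

Running a check like this in a CI pipeline, and failing the build on any hit, turns a one-off review step into a repeatable standard – the kind of DevOps discipline whose absence this incident exposed.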

What Does Privileged Access Even Mean Anymore?

One of the more interesting angles being reported in the Capital One incident is the fact that the hacker, Paige A. Thompson, was a former systems engineer at Amazon. Whether this makes her a true “insider” is debatable, but she likely had detailed knowledge of AWS, and may have exploited that knowledge in her attack. What we do know is that she managed to run privileged commands to exfiltrate data from the Capital One cloud environment. A firewall misconfiguration opened the door, but where were the access controls that might have prevented, or at least detected, this transfer of data?

This scenario shows that in a world with no “perimeters” clearly delineating what is inside versus outside the enterprise, we must re-imagine the tools and capabilities needed to protect sensitive data. That data now moves fluidly and seamlessly between devices, applications, platforms, and databases that may be owned or operated by any number of companies, cloud providers, partners, and end users. What controls are needed to stop malicious code from opening connections and transferring sensitive data from servers, data lakes, or SaaS platforms that may be far from your enterprise data center?
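One pragmatic control along these lines is baseline-driven egress monitoring: even when access itself looks authorized, a sudden spike in outbound data volume can reveal exfiltration. The sketch below is a simplified illustration, not production code – the host names, traffic figures, and three-sigma threshold are all assumptions:

```python
# Illustrative egress-anomaly check: flag any host whose outbound traffic
# today far exceeds its historical baseline (mean + sigma * stddev).

from statistics import mean, stdev

def egress_alerts(history_gb, today_gb, sigma=3.0):
    """Return hosts whose outbound GB today exceeds baseline + sigma*stddev."""
    alerts = []
    for host, past in history_gb.items():
        baseline = mean(past)
        spread = stdev(past)
        if today_gb.get(host, 0.0) > baseline + sigma * spread:
            alerts.append(host)
    return alerts

history = {
    "app-server-1": [1.2, 1.0, 1.3, 1.1, 1.2],
    "app-server-2": [0.5, 0.6, 0.5, 0.4, 0.5],
}
today = {"app-server-1": 1.25, "app-server-2": 30.0}  # sudden 30 GB egress
print(egress_alerts(history, today))  # only app-server-2 is flagged
```

A real deployment would feed this from flow logs or a SIEM and tune the threshold per workload, but the underlying question is the same one posed above: who is watching the data as it leaves?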

When Data Becomes a Liability

A final factor in the massive size and scope of this cybersecurity incident is related to another hot topic in digital transformation – the principles of privacy and data protection enshrined in legislation such as GDPR. Take a closer look at this statement from Capital One: “The largest category of information accessed was information on consumers and small businesses as of the time they applied for one of our credit card products from 2005 through early 2019.”

Given the current predilection of architects and developers to throw everything into a giant data lake forever “just in case”, it’s doubtful that executives at Capital One ever weighed the ROI of retaining application forms from fourteen years earlier in a hot S3 bucket against the potential liability of exposing them in a breach. Although most data protection laws do not specify concrete retention periods, the spirit of these mandates generally falls into the category of not keeping personally identifiable information (PII) longer than needed for the purposes for which it was processed. At some point, companies need to ask “is keeping this worth it?”
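That question can be operationalized as a routine audit. As a hedged sketch – the seven-year window, field names, and record IDs here are illustrative assumptions, not Capital One’s actual policy – a retention check might simply surface records that have outlived their purpose:

```python
# Hypothetical retention audit: list records older than the retention
# window so they can be reviewed and purged. Policy values are illustrative.

from datetime import date

RETENTION_YEARS = 7  # assumed policy for credit applications

def overdue_for_deletion(records, today, years=RETENTION_YEARS):
    """Return IDs of records created before the retention cutoff."""
    cutoff = date(today.year - years, today.month, today.day)
    return [r["id"] for r in records if r["created"] < cutoff]

records = [
    {"id": "app-2005-001", "created": date(2005, 3, 14)},
    {"id": "app-2018-042", "created": date(2018, 6, 1)},
]
print(overdue_for_deletion(records, today=date(2019, 7, 29)))
```

The point is not the code itself but the discipline: a scheduled job like this, paired with storage lifecycle rules, makes “delete what you no longer need” an enforced default rather than a forgotten intention.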

Although this factor seems more concerned with policy than technology, it perfectly illustrates that in our digitally-transformed world, cybersecurity, identity, access, privacy, and data protection are all interdependent – and that architecture or infrastructure decisions made solely by technology practitioners in a vacuum can generate tremendous risk and lead to severe business consequences down the line.

A Way Forward

The reality is that all of us face the same risks as Capital One. Every organization needs to embrace the cloud to become more scalable and agile, and in doing so, we create ever-more complex hybrid IT environments. Coupled with a well-known shortage of security professionals, this makes misconfigurations and process failures all but inevitable. Additionally, the volume, sensitivity, and value of data being collected to drive today’s digital experiences, amplified by the increasing number of accounts and services with elevated privileges needed to manage them, have created a perfect storm for hackers.

All of this means that our current security model – focused largely on point solutions that artificially silo capabilities such as identity management, privileged access, or API security – is becoming too slow, too error-prone, and too reliant on human scaling to address today’s elevated risks. We need to change our response to these threats by combining security and connectivity with artificial intelligence and automation into platforms capable of monitoring data across the entire enterprise, responding instantly to perceived threats, and safeguarding trust across the data custody chain from mainframe to mobile and IoT. At Broadcom, this is the approach to cybersecurity we’re currently sharing with our customers, and we invite you to learn more as well.

About the author

Clayton Donley is currently the Head of Broadcom’s Security and Integration Business, which delivers leading solutions for Privileged Access Management, Identity & Access Management, Payment Security, and API Management.

Prior to Broadcom, Clayton held key engineering and product management roles at CA Technologies and Oracle. Most recently he led engineering for CA’s core security products. At Oracle, he was responsible for product management for Oracle’s security portfolio, and also held the role of VP of Engineering. He joined Oracle in 2005 with the acquisition of OctetString, a virtual directory software company that he founded in 2000. Prior to founding OctetString, Clayton held a variety of software engineering, consulting, and IT roles, including two years in Beijing, China, where he led IT Architecture for Motorola’s Cellular Infrastructure Group.

Clayton was a prolific contributor to the open source community in the early days of LDAP directories, and authored a book on LDAP programming. In his spare time, Clayton enjoys photography, experimenting with home automation, and traveling with his wife of 22 years and their two middle-school-aged children. While originally from the Chicago area, he currently enjoys sunshine and snowless winters in the San Francisco Bay Area.