During this extended period of social distancing filled with increased online activity, I can’t help but reflect on all the user data that has been created, stored, hacked, exposed, bought, shared and sold over the last 10 years. What’s known as the black market is built on this immeasurable and personally identifiable data – information both believed to be secured and known to be exposed – and frankly, it is entirely of our own creation.
The transition from traditional onsite data colocation to the use of third-party cloud shared tenant services should be on everyone’s minds. With this growing shift, everyone from individuals to enterprises will continue to fuel threat actors by improperly storing information in the cloud.
Adversaries today do not have to spend nearly as much time or effort exploiting an organization – it’s a no-brainer for them to suck down improperly secured data from the cloud. In fact, I would argue that the amount of data exposed by misconfigured S3 buckets and third-party vendors (for example, misconfigured MongoDB databases, Elasticsearch clusters or other applications) far exceeds exposure by any other threat actor activity.
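To illustrate how mechanical these misconfigurations are to spot, here is a minimal sketch (the policy document and function name are my own, illustrative choices) that flags an S3-style bucket policy granting read access to anyone on the internet:

```python
import json

def is_publicly_readable(policy_json: str) -> bool:
    """Return True if any statement allows anonymous s3:GetObject access."""
    policy = json.loads(policy_json)
    for stmt in policy.get("Statement", []):
        actions = stmt.get("Action", [])
        if isinstance(actions, str):
            actions = [actions]
        # A wildcard principal with an Allow effect means "everyone".
        if (stmt.get("Effect") == "Allow"
                and stmt.get("Principal") in ("*", {"AWS": "*"})
                and any(a in ("s3:*", "s3:GetObject") for a in actions)):
            return True
    return False

# A wide-open policy of the kind behind many breach headlines:
open_policy = json.dumps({
    "Version": "2012-10-17",
    "Statement": [{"Effect": "Allow", "Principal": "*",
                   "Action": "s3:GetObject",
                   "Resource": "arn:aws:s3:::example-bucket/*"}],
})
print(is_publicly_readable(open_policy))  # True
```

Attackers run checks just like this at scale against every bucket name they can enumerate, which is why a single wildcard principal is all it takes.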
Major factors contributing to improperly secured data include a misconception that the cloud is inherently more secure than storing data on-premises, the struggle to define the scope of an enterprise environment and a lack of visibility into threat actor activity, the perpetual selling of security solutions as if they were silver bullets, and a shortage of security professionals.
The cloud is only as secure as we make it
I regularly hear people say the cloud is so much more secure, but when asked, “Why is it more secure?” the responses are not reassuring. Larger organizations are likely to have highly skilled teams to secure their own infrastructure, but the cloud model is designed for ease of use and reduced friction and complexity – a ripe combination for folks with less technical skill to launch data into the cloud. In fact, placing the data you govern into a shared tenant service is as easy as putting in a valid credit card.
However, many companies move to virtual servers in cloud services and simply duplicate traditional on-site services. They do not consider that in order to remain secure, these servers require the same attention an on-site server does: continuous backporting and patching, network firewalling and identity and access management. Because the cloud is often adopted by organizations that do not have robust security teams, this maintenance and security hygiene often goes unchecked.
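The firewalling piece alone is routinely neglected. As a sketch of what "unchecked hygiene" looks like in practice, the snippet below (the rule format and port list are hypothetical, modeled loosely on cloud security-group rules) audits ingress rules for sensitive services exposed to the whole internet:

```python
# Ports whose exposure to 0.0.0.0/0 is almost never intentional.
RISKY_PORTS = {22: "SSH", 3389: "RDP", 27017: "MongoDB", 9200: "Elasticsearch"}

def audit_ingress(rules):
    """Flag ingress rules that open sensitive ports to the entire internet."""
    findings = []
    for rule in rules:
        if rule["cidr"] == "0.0.0.0/0" and rule["port"] in RISKY_PORTS:
            findings.append(
                f"{RISKY_PORTS[rule['port']]} (port {rule['port']}) "
                "open to the internet"
            )
    return findings

rules = [
    {"port": 443, "cidr": "0.0.0.0/0"},    # public HTTPS: expected
    {"port": 27017, "cidr": "0.0.0.0/0"},  # database wide open: a finding
    {"port": 22, "cidr": "10.0.0.0/8"},    # SSH limited to internal net: fine
]
print(audit_ingress(rules))  # ['MongoDB (port 27017) open to the internet']
```

A lift-and-shifted database server that was safe behind a corporate perimeter becomes the second rule above the moment it lands in a default cloud network.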
You can’t protect unknown data from unknown attackers
It’s well understood by now that organizations are challenged by defining the boundaries and scope of their environments, and by knowing where web application traffic ingresses and egresses, whether an environment has adequate segmentation, or whether it’s a flat network. But it bears repeating that in order to protect data, you have to know everywhere it is and what it means.
Conducting tabletop exercises that leverage threat modeling frameworks such as STRIDE or Trike is one way to track the most likely ways a threat actor could gain access or circumvent intended security controls, but many organizations are unprepared to complete these exercises or internally discuss the technical issues surrounding the results. In other words, they lack the language and ability to quantify threats to the organization, which prevents them from defining their appetite for risk.
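For teams missing that shared language, even a bare-bones scaffold helps. The sketch below enumerates the six STRIDE categories with a discussion prompt for each (the prompts are my own illustrative phrasings, not drawn from any official checklist) so a tabletop exercise can walk one component at a time:

```python
# The six STRIDE threat categories, each paired with a tabletop prompt.
STRIDE = {
    "Spoofing": "Could an actor impersonate a user, service or host?",
    "Tampering": "Could data be modified in transit or at rest?",
    "Repudiation": "Could an actor deny an action because audit logs are missing?",
    "Information disclosure": "Could data leak to an unauthorized party?",
    "Denial of service": "Could the component be made unavailable?",
    "Elevation of privilege": "Could an actor gain rights they should not have?",
}

def tabletop_prompts(component: str):
    """Generate one discussion prompt per STRIDE category for a component."""
    return [f"[{cat}] {component}: {q}" for cat, q in STRIDE.items()]

for line in tabletop_prompts("customer-data S3 bucket"):
    print(line)
```

Running every in-scope component through all six questions is crude, but it forces the conversation about risk appetite that the paragraph above describes.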
The industry is still selling silver bullets and alert fatigue
Enterprise security solutions are being sold as silver bullets. Many are little more than syslog tools dressed up with hot words like “AI” and “Next Generation” – lipstick on a pig. These solutions are often the cause of alert fatigue, and companies quickly lose sight of the forest for the trees.
It doesn’t matter how easy a tool is to use or how positive the intended outcome is – an organization must be able to remediate its identified risk and have a plan to determine whether the risk is greater than the technical debt. Oftentimes this looks like delaying a product rollout, and ultimately revenue, or working in haste by dumping data into a new, easy-to-use cloud product and creating unaccounted-for risk.
A lack of “highly seasoned” IT professionals
At the crux of the issues surrounding improperly secured information in the cloud is the lack of IT professionals available in the market today. Companies that lack robust IT teams, understandably, seek out flexible options to keep their business operations streamlined and continue supporting growth.
While organizations are hyper-focused on alert fatigue, those with underfunded security teams, or that simply cannot find the needed talent, will be at greater risk of having their data stolen. Hiring managers should consider expanding their search radius for filling these roles, as there are many talented job seekers who could get up to speed quickly if time is allotted for training.
The transition to third-party cloud environments as an enabler… eventually
I do believe third-party cloud environments will eventually be the enabler we prop them up to be. For larger organizations, the cloud can mean more control, with heavily secured and controlled CI/CD pipelines feeding properly sanitized development, quality control and testing environments. After all, it’s easy to quickly duplicate and/or burn down environments in the cloud. However, many traditional security controls are bypassed by decisions to adapt quickly to modern third-party platforms.
Data has overtaken the materials of old as the currency that drives the world. As we move further into this decade, it behooves organizations large and small to consider what data they actually need to collect or store; how and where they are securing it; and the role they may play in fueling the underground economy. Assessing how data loss will affect a company (and a company’s tolerance for such loss) is certainly complex, but it is imperative. I implore organizations to leverage threat modeling frameworks and avoid the pitfall of believing the cloud is inherently more secure.