Risk and the Pareto Principle: Applying the 80/20 rule to your risk management strategy

Enterprises are putting more resources into monitoring and managing business risk. And with good reason – faced with a growing number of vulnerabilities and advanced threats, they’re dealing with a more complex risk environment, one that also affects their technology partners and other third parties.

Of course, unknown and hidden vulnerabilities increase enterprise risk by leaving organizations susceptible to data theft, cyber espionage and other business disruptions. For regulated industries, vulnerability exploits can also result in hefty financial penalties and additional audits. What may be less obvious – and ultimately costlier – is that these vulnerabilities can leave an organization exposed to attack for years down the road.

Take the Yahoo breach as an example of the long, far-reaching tail of business risk. While the breach itself likely dated back to 2012, it reared its head four years later in August 2016, when a hacker publicly announced that he had placed 200 million Yahoo login credentials – including MD5-hashed passwords and dates of birth – for sale on an underground marketplace. Will this breach manifest in other ways down the road? More than likely.

While organizations are investing in Threat and Vulnerability Management (TVM) solutions to understand their exposure to risk, they’re also realizing that it’s nearly impossible to address the explosion of vulnerabilities that they’re suddenly detecting in their environment. A TVM solution might be a step in the right direction, but organizations also need to approach their risk posture more strategically.

Research indicates that the majority of risk (about 80 percent) can be traced to a small fraction of vulnerabilities (20 percent or less). That means organizations need to prioritize the vulnerabilities that present the most risk. By focusing on critical flaws with the greatest potential for damage, enterprises can make a huge dent in business risk while also streamlining threat management processes to be more efficient, cost-effective and smarter.

Threat and vulnerability management and the Pareto Principle

In light of these threat trends, it’s not surprising that enterprise organizations are paying more attention to their risk posture and actively monitoring business risk, including the growing number of cyberattacks and insider breaches that are often buried inside millions of events and vulnerabilities.

How can organizations hope to wrap their arms around all of those vulnerabilities hidden in their network? The short answer is that they probably can’t – and shouldn’t try. In order to truly understand their risk posture and address the threats that have the potential to cause the most damage, they need to be more strategic.

To start, organizations need to understand the Pareto Principle – otherwise known as the 80-20 rule – and how it applies to their threat environment. At a high level, the Pareto Principle, named for economist Vilfredo Pareto, stipulates that roughly 80 percent of the effects or results are attributed to 20 percent of the causes or invested input.

It’s a universal concept that also applies to an organization’s vulnerability environment. From a risk standpoint, that means approximately 80 percent of the potential business harm comes from just 20 percent of the vulnerabilities.
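As a rough sketch of what this concentration looks like in practice, the snippet below (using hypothetical risk scores, not real data) computes the fraction of vulnerabilities that accounts for a target share of total risk:

```python
# Illustrative sketch: measuring how concentrated risk is across a set of
# vulnerabilities. The scores are made up to show a skewed distribution.

def pareto_cutoff(risk_scores, target=0.80):
    """Return the fraction of vulnerabilities (sorted by risk, highest
    first) needed to cover `target` of the total risk."""
    ranked = sorted(risk_scores, reverse=True)
    total = sum(ranked)
    running = 0.0
    for i, score in enumerate(ranked, start=1):
        running += score
        if running >= target * total:
            return i / len(ranked)
    return 1.0

# A skewed distribution: a few critical flaws, many low-risk ones.
scores = [9.8, 9.1, 8.7, 8.5] + [0.5] * 16
print(pareto_cutoff(scores))  # -> 0.2: 20% of flaws carry 80% of the risk
```

With a flatter distribution the cutoff climbs toward 1.0, which is one quick way to check whether the 80/20 assumption actually holds for a given environment.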

So, in order to successfully gauge risk and develop an effective risk strategy, organizations will not only need to find vulnerabilities, they’ll need to identify the right vulnerabilities – the ones that present the biggest risk to both their business and their security posture.

In short, this is the equivalent of “finding a needle in a stack of needles.” Locating, triaging and then patching the most serious vulnerabilities is far more challenging than simply finding them. For that, organizations will need to invest in business risk and intelligence technologies, which often include some form of Threat and Vulnerability Management (TVM) solution designed to streamline the aggregation and correlation of asset vulnerability data with threat intelligence, score risk, and use analytics to prioritize actions that align tightly with business objectives. In fact, solutions now couple risk intelligence with TVM capabilities, and can be delivered from the cloud to accommodate an enterprise’s unique environment and scale as it grows.

But that’s just the beginning. Whatever solution is adopted needs to incorporate three salient macro-dimensions that help enterprises apply the Pareto Principle to their risk environment – rapidly identifying the 20 percent of vulnerabilities that are most critical while mitigating 80 percent of the impact more effectively.

The Data Model: Like the foundation of a building, the ability to locate, query and prioritize the data is where it all starts, essentially setting the stage for an effective Pareto Principle approach to risk.

It’s no secret that as organizations are now required to support thousands of practitioners and millions of asset objects, enterprise risk has become a big data challenge. To address this, organizations will need a strategy designed to effectively query, assess, analyze and prioritize the most important threat and risk data. Among other things, this includes smart connectivity with a large number of ecosystem partners, which enables organizations to quickly populate the model, and an advanced correlation engine that ensures high performance regardless of the query.
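To make the idea concrete, here is a minimal sketch of such a data model – the asset fields, severity scale and correlation logic are illustrative assumptions, not any particular product’s schema:

```python
# Minimal sketch of a queryable risk data model (all names hypothetical):
# assets carry business context, vulnerabilities reference assets, and a
# simple correlation step joins the two for prioritized queries.
from dataclasses import dataclass

@dataclass
class Asset:
    asset_id: str
    criticality: int       # 1 (low) .. 5 (business critical)

@dataclass
class Vulnerability:
    cve_id: str
    asset_id: str
    severity: float        # e.g. a CVSS-style base score

def correlate(assets, vulns):
    """Join vulnerabilities with asset context so queries can rank
    flaws by both technical severity and business criticality."""
    by_id = {a.asset_id: a for a in assets}
    joined = [(v, by_id[v.asset_id]) for v in vulns if v.asset_id in by_id]
    # Highest risk first: severity weighted by asset criticality.
    return sorted(joined,
                  key=lambda pair: pair[0].severity * pair[1].criticality,
                  reverse=True)

assets = [Asset("db-01", 5), Asset("kiosk-07", 1)]
vulns = [Vulnerability("CVE-A", "kiosk-07", 9.0),
         Vulnerability("CVE-B", "db-01", 7.5)]
top = correlate(assets, vulns)[0][0].cve_id  # CVE-B: 7.5*5 outranks 9.0*1
```

The point of the join is visible in the example: the technically nastier flaw on a low-value kiosk ranks below a moderate flaw on a business-critical database.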

Automation: These days, automation is not a luxury but a necessity for any organization attempting to get ahead of business risk. Automation lets organizations streamline the process of operationalizing their security solutions – this includes content mapping, pre-built workflows, data ingestion with filtering, self-service business intelligence and UI customization, among other capabilities now available “out of the box.” Beyond streamlining operations, automation has become essential for data collection, providing organizations with security threat information and asset discovery on an ongoing basis. And the biggest advancement in automation is the ability to configure changes rather than program them.
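A small illustration of the “configure, not program” idea – the feed format and filter rules below are hypothetical, but they show how ingestion behavior can change by editing configuration alone:

```python
# Sketch of configuration-driven ingestion with filtering (hypothetical
# feed format): the filtering rules live in a config dict, so changing
# what gets ingested means editing configuration rather than code.
FILTER_CONFIG = {
    "min_severity": 7.0,                        # drop low-severity noise
    "allowed_sources": {"scanner", "threat-feed"},
}

def ingest(records, config=FILTER_CONFIG):
    """Apply the configured filters to raw feed records before storage."""
    return [r for r in records
            if r["severity"] >= config["min_severity"]
            and r["source"] in config["allowed_sources"]]

raw = [
    {"id": 1, "severity": 9.1, "source": "scanner"},
    {"id": 2, "severity": 3.2, "source": "scanner"},   # filtered: too low
    {"id": 3, "severity": 8.0, "source": "pastebin"},  # filtered: source
]
kept = ingest(raw)  # only record 1 survives the filters
```

Tightening or loosening the pipeline is then a config change – lower `min_severity`, add a source – with no code deployment required.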

Risk Scoring and Analytics: For organizations, one of the biggest priorities is board reporting, which means they need quick and easy access to dashboards and heat maps that can be generated in near real time. They also need the ability to slice and dice risk intelligence as needed for business leaders, security personnel and IT team members, and to assemble vulnerability and threat intelligence feeds into comprehensive analytics that reflect their own business-specific risk likelihoods and impacts.

Specifically, they require one data model, but multiple reporting options. The good news is that there are numerous innovations in risk scoring algorithms to quantify and prioritize risks based on multi-attribute weightings for business priorities, security data, and operational and compliance policies.
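One common way to implement such multi-attribute weightings is a weighted sum of normalized factors. The factor names and weights below are illustrative assumptions, not any standard algorithm:

```python
# Hedged sketch of multi-attribute risk scoring: each vulnerability gets
# a single 0..100 score from weighted, normalized factors. Weights and
# factor names are made up for illustration.
WEIGHTS = {
    "severity": 0.4,          # technical severity, normalized to 0..1
    "exploit_activity": 0.3,  # evidence of active exploitation, 0..1
    "asset_criticality": 0.2, # business value of the affected asset, 0..1
    "compliance_exposure": 0.1,  # regulatory exposure, 0..1
}

def risk_score(factors, weights=WEIGHTS):
    """Weighted sum of normalized factors, scaled to 0..100."""
    return 100 * sum(weights[k] * factors[k] for k in weights)

flaw = {"severity": 0.95, "exploit_activity": 1.0,
        "asset_criticality": 0.8, "compliance_exposure": 0.5}
score = risk_score(flaw)  # roughly 89 out of 100
```

Because the weights are data rather than logic, each line of business (or each compliance regime) can carry its own weighting profile over the same underlying data model – one data model, multiple reporting options.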

Modern analytics are mandatory for enterprises to quickly visualize business-critical risk and make remediation immediately actionable. Among other things, this means leveraging scoring algorithms that quantify and prioritize vulnerabilities based on business requirements, threat exploits and vulnerability impacts. It also means correlating assets with business context and threat intelligence, and conducting event analysis, so organizations can see the entire picture of their risk posture. Prioritization relies on configurable vulnerability risk scoring for security operations, as well as business risk scoring across multiple lines of business and third parties.

When facing scrutiny from the board, analytics can provide key risk metrics – trending business risk prioritization and remediation effectiveness, including factors such as aging, cost per vulnerability and incident reduction. Analytics also enable organizations to correlate vulnerabilities with patch information, including the ability to prioritize and group vulnerabilities based on criteria such as asset criticality, compliance regulations, vendors and SLA commitments. What’s more, for compliance and internal audits, analytics are vital for tracking vulnerability exceptions – critical flaws that policy has allowed to remain unremediated.
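As one example of those metrics, vulnerability aging can be computed directly from open dates. The records and dates below are purely illustrative:

```python
# Sketch of one board-level metric: vulnerability aging, i.e. how long
# critical flaws have stayed open. All records here are hypothetical.
from datetime import date

open_vulns = [
    {"cve": "CVE-X", "opened": date(2016, 5, 1),  "critical": True},
    {"cve": "CVE-Y", "opened": date(2016, 8, 20), "critical": True},
    {"cve": "CVE-Z", "opened": date(2016, 9, 1),  "critical": False},
]

def mean_age_days(vulns, as_of, critical_only=True):
    """Average open age, in days, of (critical) vulnerabilities."""
    ages = [(as_of - v["opened"]).days
            for v in vulns if v["critical"] or not critical_only]
    return sum(ages) / len(ages)

avg = mean_age_days(open_vulns, as_of=date(2016, 9, 30))  # -> 96.5 days
```

Tracked over successive reporting periods, a falling average age is a simple, board-friendly signal that remediation is keeping pace with discovery.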

Organizations can’t manage what they can’t see. A big-picture view of the risk environment is a start. But ultimately, homing in on the most important 20 percent – understanding where to look and what to look at – will offer a crucial leg up in managing the threats and vulnerabilities with the potential to cause the most damage.