Patching strategies

Cybercriminals have initiated an arms race, refining the malware development process to systematically bypass defence mechanisms. Traditional defence mechanisms have many limitations; security patches, however, are a primary and effective means of escaping this arms race because they remediate the root cause of compromise. The major challenge lies in the timely patching of software portfolios, which is like chasing a continuously moving target.

To delve deeper into this challenge, Secunia has compared different patching strategies under the assumption of limited resources. Measurements demonstrate that an intelligent patching strategy results in increased resilience against exploits, lowering risk levels by up to 80% while maximising operational efficiency.

An attacker’s mind-set
Cybercriminals constantly refine their tactics in line with the evolution of the industry. The opportunity available to a cybercriminal can therefore be expressed by a simple formula: Opportunity = #Hosts x #Vulnerabilities.
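The formula above can be sketched in a few lines of code. This is purely illustrative; the figures used are hypothetical, not Secunia's data.

```python
def opportunity(num_hosts: int, num_vulnerabilities: int) -> int:
    """Attacker opportunity grows with both the host population and the
    number of exploitable vulnerabilities per host (Opportunity = #Hosts x #Vulnerabilities)."""
    return num_hosts * num_vulnerabilities

# The product form means reducing either factor reduces opportunity proportionally:
# halving the vulnerabilities per host halves the total opportunity.
baseline = opportunity(2_000_000_000, 10)   # hypothetical: 2bn hosts, 10 vulns each
patched = opportunity(2_000_000_000, 5)     # same hosts, half the vulns patched
assert patched * 2 == baseline
```

This is why patching is effective even when the host population keeps growing: it attacks the one factor defenders control.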

The number of hosts correlates with the 2 billion users with Internet access – a figure that has grown by more than 400% in the last decade. With such a large population of Internet users, it becomes clear why end-points are increasingly targeted: even the smallest success rate for an attack translates into a considerable number of compromised systems.

Corporate and private end-points are both extremely rewarding targets for cybercriminals. End-points are difficult to defend due to their dynamic environments and users’ unpredictable usage patterns. They are also highly valuable because they are where the most valuable data is least protected – e.g. access to all the data needed to conduct an organisation’s business. Even if no sensitive data is present, the end-point’s computing power and bandwidth provide valuable resources, for example as an infection point, a proxy, or for distributed password cracking services.

In other words, everyone who uses the Internet – around 31% of the Earth’s population – is a target.

Evolving vulnerability risks
The recent white paper, “How to Secure a Moving Target with Limited Resources” by Secunia, tracks a representative end-point comprising the operating system (Windows XP) and a software portfolio with the industry’s top 50 most prevalent programs. This representative portfolio has programs from 14 different vendors installed: 26 programs from Microsoft and 24 programs from third parties (non-Microsoft).

To measure the number of vulnerabilities per host, Secunia used data gathered from over 3 million users of its free, lightweight scanner, which identifies and patches insecure programs on end-points. The analysis of this data reveals an alarming trend: the number of vulnerabilities affecting this typical end-point increased by 71% in the last year. These findings suggest that end-points are increasingly targeted, with the majority of vulnerabilities remotely exploitable and thereby providing direct system access to the attacker.

Figure 1 – History of the number of vulnerabilities affecting a typical end-point with Windows XP (left); distribution of the origin of vulnerabilities in 2010: OS = operating system, MS = Microsoft programs, TP = third-party programs (right).

Threat origins

A breakdown of these vulnerabilities by origin reveals the driver behind this trend – vulnerabilities in third-party programs by far outnumber vulnerabilities in the operating system or Microsoft programs. In fact, the share of vulnerabilities found in the operating system and the Microsoft programs, versus all vulnerabilities found on the end-point, almost halved from 55% in 2006 to 31% in 2010. This means that, while patching the operating system and all Microsoft products on a typical end-point remediated 55% of the vulnerabilities in 2006, the same patching strategy remediated only 31% of the vulnerabilities in 2010.

The sheer complexity of patching will undoubtedly leave a large number of systems incompletely patched – and thus vulnerable. To fully patch a typical end-point, for instance, you need to master at least 14 different update mechanisms from 14 different vendors. Remediating the 31% of vulnerabilities found in the operating system and the 26 Microsoft programs is the easy part – just one “Microsoft Update” is needed. Dealing with the 24 third-party programs, which account for 69% of the vulnerabilities, is altogether more complex and time-consuming: another 13 update mechanisms must be mastered to remediate these threats.

This process then needs to be repeated for each and every end-point under your control.

This complexity in keeping an end-point fully patched has a measurable effect on security. Secunia’s research reveals that, on average, between 6% and 12% of third-party programs are found to be insecure, compared to only 2% of Microsoft programs. The 69% share of third-party vulnerabilities, paired with their low patch level, means that cybercriminals don’t even need precious 0-day exploits in order to carry out their “work”.

In terms of defence, creators of malicious software and botnet agents have developed and used a broad spectrum of tools and techniques that easily bypass traditional anti-virus technologies. These findings therefore turn two common perceptions – that the operating system and Microsoft products are the primary attack vector, and that traditional defence methods provide sufficient security against vulnerabilities – on their heads.

Shifting patch management goal posts
A patch remediates the root cause of compromise and thereby neutralises a large number of attack vectors. In light of the limitations of anti-virus and other defence technologies, and the effectiveness of patches in remediating the root cause of compromise, controlled and timely patching of the infrastructure should be considered a primary security measure for minimising business risk. For typical organisations, though, patching all programs is operationally and economically prohibitive.

However, a “one approach fits all” strategy no longer works, and the ever-evolving threat landscape causes the goal posts to move continually. The main dilemma is identifying the critical programs worth patching to achieve the largest reduction in risk. This is where the concept of the moving target comes into play: while some programs are vulnerable in several consecutive years, many are vulnerable in some years and not in others, and even programs with low prevalence are frequently found to be critical in some years.

From a security perspective, it is a poor investment to deploy a patch for a program whose vulnerabilities are rated “Not critical” or “Less critical” while programs with “Highly critical” vulnerabilities remain unpatched.

It is not just the most popular and widely used programs – the “usual suspects” – that should be monitored with caution. Today’s attacks typically use a large number of different exploits against a wide range of vulnerable programs; cybercriminals do not rule out less prevalent programs, which can equally lead to compromise.

Intelligent patching
How can you ensure that you identify the right vulnerability to patch at the right time? To answer this, patching strategies are compared to find the one that achieves the largest reduction in risk for a given investment of security resources. Consider an organisation with 200 programs in its infrastructure that already patches the operating system – reasonable given the availability of “Microsoft Update” – and has the resources to patch 10 additional programs. The challenge is to identify and patch the “right” 10 programs out of the 200, as it is this choice that results in the largest reduction in risk.

To analyse the effectiveness of different patching approaches, two strategies for selecting 10 out of 200 programs to be patched every year over a five-year period are compared: patch the top 10 by largest market share (static strategy) or the top 10 by risk (dynamic strategy), based on the criticality of vulnerabilities taken from Secunia’s vulnerability database.

Results show that the choice of patching strategy has a considerable effect on how much risk can be remediated. Averaged over the last six years, patching the top 10 most critical programs remediates 71% of the total risk, while patching the top 10 most prevalent programs remediates only 31% of the risk – 2.3 times less. The total risk is the risk of the entirely unpatched portfolio of 200 programs. The strategy of patching the top 10 most critical programs every year covered 18 different programs in total over five years, which further supports the notion of the moving target.
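The two strategies can be sketched as follows. This is a minimal illustration, not Secunia’s methodology: the `programs` list below is synthetic, with made-up market-share and risk values standing in for real prevalence data and vulnerability criticality.

```python
def remediated_share(programs, key, n=10):
    """Patch the top-n programs ranked by `key`; return the fraction of
    the portfolio's total risk that those patches remediate."""
    total_risk = sum(p["risk"] for p in programs)
    chosen = sorted(programs, key=key, reverse=True)[:n]
    return sum(p["risk"] for p in chosen) / total_risk

# Synthetic 200-program portfolio: share and risk are hypothetical values.
programs = [{"name": f"prog{i}", "share": 200 - i, "risk": (i * 37) % 100}
            for i in range(200)]

static = remediated_share(programs, key=lambda p: p["share"])   # by prevalence
dynamic = remediated_share(programs, key=lambda p: p["risk"])   # by criticality

# Ranking by risk can never remediate less risk than any other ranking
# of the same size, which is the core of the dynamic strategy's advantage.
assert dynamic >= static
```

The dynamic strategy must be re-run every year, since the risk ranking shifts as new vulnerabilities are disclosed – hence the “moving target”.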

Knowing what to patch is crucial in light of limited security resources. While patching the same number of programs per year at roughly the same expense, the optimal strategy remediates 2.3 times more risk. Intelligent patch prioritisation also pays off considerably: if risk requirements demand that at least 80% of the risk of unpatched programs be remediated, this can be achieved either by patching the Top-12 most critical programs or by patching the Top-37 most prevalent programs.
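Finding the smallest Top-N that meets a risk threshold is a simple greedy computation. A sketch, again on hypothetical risk values rather than Secunia’s data:

```python
def programs_needed(risks, threshold=0.80):
    """Smallest number of programs (ranked by risk, descending) whose
    combined risk reaches `threshold` of the total portfolio risk."""
    ranked = sorted(risks, reverse=True)
    total = sum(ranked)
    covered = 0.0
    for n, r in enumerate(ranked, start=1):
        covered += r
        if covered >= threshold * total:
            return n
    return len(ranked)

# Hypothetical portfolio where risk is concentrated in a few programs:
# the two riskiest programs already cover 80% of the total risk.
print(programs_needed([50, 30, 10, 5, 5]))  # -> 2
```

With a good prioritisation metric, the risk curve is steep at the front, which is why 12 well-chosen programs can match 37 chosen by prevalence alone.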

Figure 2 – Risk remediated by A) patching the Top-N most prevalent programs or B) patching the Top-N most critical programs in 2010.

Yes you can – achieve more with less!
Research indicates that of all vulnerabilities affecting a typical end-point in 2010, 65% had a patch available on the day the vulnerability was disclosed, and 75% had a patch available within 10 days of disclosure. Organisations cannot hide behind the threat of 0-days when a solution is available to remediate 65% of vulnerabilities.

The dynamics of a software portfolio, paired with rapid changes in the threat environment, call for a flexible approach that ensures organisations patch what is most critical. Significantly, Secunia’s research concludes that it is not the amount invested in IT security that matters for achieving optimal risk reduction with the same or fewer resources – rather, it is the type of technology deployed and its capabilities.