WhiteHat Security released the tenth installment of its Website Security Statistics Report, providing a first-time breakdown of the state of website security by industry and company size.
Compiled using data from more than 2,000 production websites across 350 organizations, this latest issue shines a spotlight on the need for organizations to focus on improving responsiveness in remediating vulnerabilities in order to reduce risk and improve the effectiveness of the SDLC over time.
Until now, no metrics have been available for organizations to apply as a benchmark for evaluating themselves against their industry peers. WhiteHat’s research findings give executives the insight they need to determine whether the resources invested in source code reviews, threat modeling, developer training and security tools are making a measurable impact in reducing their website security risk.
Furthermore, the industry breakdown allows them to see how their efforts compare to their peers, and if any significant changes need to be made to strengthen website security. For example, the data shows that financial services organizations have learned that quick identification and remediation of SQL Injection vulnerabilities, which if exploited give attackers access to corporate databases, are imperative. And yet, that industry still struggles with overall remediation rates.
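To illustrate why SQL Injection remediation matters, here is a minimal sketch (not taken from the report) contrasting a vulnerable, string-concatenated query with its parameterized fix; the table, column names, and payload are illustrative assumptions:

```python
import sqlite3

# Set up a throwaway in-memory database for demonstration purposes.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

user_input = "' OR '1'='1"  # classic injection payload

# Vulnerable: concatenating input into the SQL string. The payload turns
# the WHERE clause into a tautology and returns every row in the table.
leaked = conn.execute(
    "SELECT secret FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Remediated: a parameterized query treats the input as data, not SQL,
# so the payload matches no rows.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (user_input,)
).fetchall()

print(len(leaked), len(safe))  # → 1 0
```

The remediation is a one-line change, which is part of why financial services firms have been able to fix this class quickly once identified.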
Based on WhiteHat’s research, organization size does not significantly impact an industry’s average number of serious vulnerabilities, the specific vulnerability classes that affect it, or time-to-fix metrics. However, there is a correlation regarding remediation rate, a key indicator of risk.
Typically, the larger the organization, the fewer vulnerabilities that are resolved (by percentage). The average website contained nearly 13 serious vulnerabilities, and large organizations (more than 2,500 employees) had the highest average number of serious vulnerabilities. In terms of industry, banking, insurance and healthcare were top performers, while IT, retail and education were at the bottom of the stack.
Knowing that websites are under constant siege and code security is imperfect, there are three important website security metrics that organizations must track:
1. the number of serious and remotely exploitable vulnerabilities
2. the time-to-fix once identified, and
3. the remediation rate.
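The three metrics above can be computed directly from a vulnerability-tracking dataset. The following sketch uses hypothetical records (the field names and dates are illustrative, not WhiteHat data):

```python
from datetime import date

# Hypothetical vulnerability records: severity, remote exploitability,
# discovery date, and fix date (None if still open).
vulns = [
    {"serious": True,  "remote": True,  "found": date(2010, 1, 4),  "fixed": date(2010, 1, 20)},
    {"serious": True,  "remote": True,  "found": date(2010, 2, 1),  "fixed": None},
    {"serious": True,  "remote": False, "found": date(2010, 3, 10), "fixed": date(2010, 4, 2)},
    {"serious": False, "remote": False, "found": date(2010, 3, 15), "fixed": date(2010, 3, 18)},
]

# 1. Number of serious, remotely exploitable vulnerabilities.
serious_remote = sum(1 for v in vulns if v["serious"] and v["remote"])

# 2. Average time-to-fix in days, over vulnerabilities that were fixed.
fixed = [v for v in vulns if v["fixed"] is not None]
avg_time_to_fix = sum((v["fixed"] - v["found"]).days for v in fixed) / len(fixed)

# 3. Remediation rate: share of all identified vulnerabilities resolved.
remediation_rate = len(fixed) / len(vulns)

print(serious_remote, avg_time_to_fix, remediation_rate)  # → 2 14.0 0.75
```

Tracking these numbers over time is what lets an organization tell whether its remediation effort is improving or stagnating.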
“When organizations look at their risk management, they are challenged by the question ‘How secure is secure enough?’” said Jeremiah Grossman, founder and chief technology officer, WhiteHat Security. “Rather than relying on arbitrary best practices or check-box security, WhiteHat’s statistics give organizations baseline metrics from which they can evaluate their own security posture compared to the risk profile of others in their industry.”
WhiteHat’s tenth report contains data collected between January 1, 2006 and August 25, 2010, through the deployment of WhiteHat Sentinel. Cross-Site Scripting (XSS) and Information Leakage remain by far the most prevalent, occurring in seven out of 10 websites. As predicted in past reports, Cross-Site Request Forgery (CSRF) has moved up to the fourth spot on the “Overall Top Vulnerability Classes” chart.
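The standard remediation for reflected XSS is to encode user input before placing it in HTML. A minimal sketch using Python's standard library (the payload and page fragment are illustrative assumptions):

```python
from html import escape

payload = '<script>alert("xss")</script>'  # attacker-supplied input

# Unsafe: reflecting user input into the page verbatim lets the
# browser execute the injected script.
unsafe_html = "<p>Hello, " + payload + "</p>"

# Remediated: HTML-encoding the input renders it as inert text.
safe_html = "<p>Hello, " + escape(payload) + "</p>"

print(safe_html)
```

Because XSS appears in roughly seven out of 10 sites, consistent output encoding like this is one of the highest-leverage fixes available.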
Additionally, there is a newcomer: Brute Force. Only a year or two ago, most organizations did not consider these issues dangerous or worth spending time on. Today, malicious hacker activity has elevated awareness, and organizations are demanding that these attacks be identified and reported on before exploitation occurs. These changes demonstrate the dynamic nature of website security and the need for ongoing website security programs that continuously evaluate risk.
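One common defense against login brute-forcing is throttling: lock an account after too many failed attempts within a time window. A minimal sketch under assumed thresholds (the limits, function name, and data structure are illustrative; real deployments layer this with CAPTCHAs and monitoring):

```python
import time

MAX_FAILURES = 5      # illustrative threshold
WINDOW_SECONDS = 300  # illustrative window

failures = {}  # username -> list of failure timestamps

def record_failure(user, now=None):
    """Record a failed login; return True if the account is now locked."""
    now = time.time() if now is None else now
    # Keep only failures inside the sliding window, then add this one.
    recent = [t for t in failures.get(user, []) if now - t < WINDOW_SECONDS]
    recent.append(now)
    failures[user] = recent
    return len(recent) >= MAX_FAILURES

# Simulate six rapid failed logins against one account.
for attempt in range(6):
    locked = record_failure("alice", now=1000.0 + attempt)
print(locked)  # → True
```

Detecting and reporting this pattern before exploitation is exactly the kind of continuous evaluation the report argues for.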