There is a downward trend in IT’s ability to consistently coordinate, measure and improve security data management processes, including log management, compliance reporting, real-time monitoring, forensic investigation and incident response, according to Sensage.
While resourcing was cited as a major issue this year, a shift from the 2011 results, limited access and poor data fidelity emerged as the top barriers preventing organizations from achieving a more sustainable, consistent security management program.
The Sensage report, which analyzes results over a three-year period, indicates that the massive (and mostly manual) effort associated with collecting and interpreting security data has created a severe downturn both in the mood of security teams and in the perception of their effectiveness by stakeholders.
“While many referred to 2011 as ‘The Year of the Breach,’ we see 2012 shaping up to be ‘The Year of Inspection,’” said Joe Gottlieb, CEO of Sensage. “Given the responses highlighting the need for better data access, and revealing inconsistent measurement and process improvements, this year’s respondents appear to be much more honest, realistic and self-aware. This is a significant change compared to previous years, as professionals are becoming more vocal about their dissatisfaction with traditional security practices’ inability to provide the intelligence necessary to counter evolving threats and address organizations’ changing requirements.”
The Sensage survey further highlights the demands placed on resource-constrained security teams, identifying a close relationship between the fidelity of security data and the work required to analyze and act on information. Many practitioners want more actionable information faster, and there is an overall lack of trust in the data they collect. In 2011 and 2012, Sensage asked respondents whether they needed better data access and analysis:
- In 2011, 57% said “Yes,” clearly indicating a prevalent challenge in this area.
- In 2012, awareness of this challenge appears to have grown significantly, with 79% noting that they need better data access and analysis.
When comparing year-over-year responses on whether measurement practices were “consistent” or “inconsistent,” Sensage discovered that while slightly more than 50% of respondents felt they were measuring inconsistently in 2010 and 2011, 61% reported that challenge in 2012.
While responses in 2010 and 2011 reflected a close split between those who considered their processes coordinated and those who did not, that was not the case in 2012, when 66% of respondents felt they were resorting to reactive triage or had no coordination at all.
In 2010 and 2012, a similar percentage of teams had no proactive process improvement. A closer look inside the numbers yielded troubling findings:
The bad news: The share of respondents who felt they had a consistent and adequately staffed process improvement program dropped massively, from 18% in 2010 to 5% in 2012.
More bad news: The share of respondents maintaining consistent process improvement also fell significantly, from 65% in 2011 to 40% in 2012.
Worse news: 96% of 2012 respondents had no process, an inconsistent process, or a consistent process that was understaffed.