IT security spending should be aligned with risk analysis results. Too frequently, though, this is not the case.
Thought leaders in information security have been describing a shift in security control effectiveness over time from network-based, to application-based, and ultimately to data-centric security.
Eight years on from the Jericho Forum identifying the phenomenon of de-perimeterization, and its consequences for information security, it is more than a little discouraging how much of the security technology spend across the industry is still focused on network security controls, and how little is spent on application security and data-centric security controls.
A proper alignment of risk management and analysis with IT security spending can help individual organizations to drive towards the optimal security controls for their environment, based on their identified risks.
Some root causes of poorly aligned risk management and IT security spending and focus include:
- Failure to assess risks broadly and to maintain a risk register, with the result that significant risks are overlooked
- Failure to perform deep-dive analysis of the most significant risks, to fully understand the probable frequency and probable impact of future loss
- Reliance on a “checklist compliance culture” that identifies mandatory security controls (and blindly implements them), frequently at the expense of a genuinely risk-based approach to security
- Reliance on (and overspending on) perimeter security models and technologies, when the threat environment has moved on, perimeter security technologies are no longer stopping attacks, and application and data-centric security controls are needed to secure information. The security industry principle of “defense in depth” is too often forgotten in practice, as is the last line of defense (protecting the data with strong access controls and encryption).
- A continued over-reliance on preventive controls, without the detective and reactive controls needed to quickly spot security issues, respond, and limit the damage
- Reliance on ineffective legacy risk models and frameworks that do not quantify risk, leaving the business to interpret High-Medium-Low or Red-Yellow-Green risk rankings, which provide little help in understanding the magnitude and impact of risks, and no means of evaluating the relative risk reduction of various mitigation options
- The “point in time” nature of risk analysis, which provides a further challenge in maintaining alignment with security spending. The components comprising risk change over time: threats change, the value of assets changes, and control effectiveness changes. Moving risk management programs towards a “real-time risk posture” should be a goal, while recognizing that this is a hard problem to solve.
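To illustrate what a quantified alternative to H-M-L rankings can offer, the sketch below runs a simple Monte Carlo simulation of annualized loss for a baseline scenario versus a hypothetical mitigation (stronger data-at-rest encryption). All figures, distributions, and function names are illustrative assumptions, not part of any standard; a real analysis would use calibrated estimates from subject-matter experts.

```python
import random

def simulate_annual_loss(freq_min, freq_likely, freq_max,
                         loss_min, loss_likely, loss_max,
                         trials=100_000, seed=42):
    """Monte Carlo estimate of annualized loss exposure.

    Loss-event frequency and per-event loss are each drawn from a
    triangular distribution (min, most likely, max) -- a common way
    to turn rough expert estimates into a quantified result.
    """
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        # Number of loss events in this simulated year
        events = round(rng.triangular(freq_min, freq_max, freq_likely))
        # Sum of per-event losses for the year
        total = sum(rng.triangular(loss_min, loss_max, loss_likely)
                    for _ in range(events))
        totals.append(total)
    totals.sort()
    mean = sum(totals) / trials
    p90 = totals[int(trials * 0.90)]  # 90th-percentile "bad year"
    return mean, p90

# Hypothetical figures: baseline vs. adding data-at-rest encryption,
# which reduces the probable loss per event but not event frequency.
base_mean, base_p90 = simulate_annual_loss(0.5, 2, 6, 50_000, 200_000, 1_000_000)
mit_mean, mit_p90 = simulate_annual_loss(0.5, 2, 6, 10_000, 40_000, 200_000)

print(f"Baseline:  mean ${base_mean:,.0f}/yr, 90th pct ${base_p90:,.0f}")
print(f"Mitigated: mean ${mit_mean:,.0f}/yr, 90th pct ${mit_p90:,.0f}")
print(f"Expected annual risk reduction: ${base_mean - mit_mean:,.0f}")
```

Output in this form (a probable annual loss figure, and the dollar reduction a control buys) gives the business a direct way to weigh a control's cost against its benefit, which a Red-Yellow-Green ranking cannot.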
Aligning risk management and analysis to security spending requires a strong risk management culture, and the use of a risk framework which produces useful results that can inform security spending.
The Open FAIR body of knowledge from The Open Group, comprising the Risk Taxonomy Standard (O-RT), and the Risk Analysis Standard (O-RA), facilitates risk analyses with quantified results, and allows for easier comparison of various risk mitigation options. Open FAIR also makes communication with business executives more straightforward, enabling discussions regarding risk and security spending to be grounded in probabilities, financial impact, and relative risk reduction for various security spending alternatives.
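As a simplified illustration of how the Open FAIR taxonomy decomposes risk, the point-estimate walkthrough below derives loss event frequency from threat event frequency and vulnerability, and loss magnitude from primary and secondary loss. All numbers are hypothetical, and a real Open FAIR analysis would use calibrated ranges and simulation rather than single values.

```python
# Illustrative point-estimate walk through the Open FAIR risk taxonomy.
# All figures are hypothetical assumptions for the sake of the example.

threat_event_frequency = 10   # probable threat events per year
vulnerability = 0.2           # probability a threat event becomes a loss event

# Loss Event Frequency = Threat Event Frequency x Vulnerability
loss_event_frequency = threat_event_frequency * vulnerability  # 2 per year

primary_loss = 75_000         # direct loss per event (response, replacement)
secondary_loss = 25_000       # probable secondary loss (fines, reputation)
loss_magnitude = primary_loss + secondary_loss

# Risk = probable frequency x probable magnitude of future loss
annualized_risk = loss_event_frequency * loss_magnitude
print(f"Loss event frequency: {loss_event_frequency}/yr")
print(f"Annualized risk exposure: ${annualized_risk:,.0f}")
```

Because each factor is expressed in plain units (events per year, dollars per event), the result can be discussed with business executives in terms of probabilities and financial impact, as the standard intends.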