Despite a positive and significant decrease from over 4 million unfilled cybersecurity jobs in 2019, the world still faces a staggering shortage of 3.12 million workers with cybersecurity skills. You may find this somewhat inevitable: IT innovation changes things so quickly that business will always be playing catch-up. However, I argue that we have the tools to tackle the gap, and might have closed it already were it not for our grave misunderstanding of the challenge.
Many thought leaders have approached the skills shortage from a cumulative perspective. They ask, “How on Earth can companies afford to keep re-training their teams for the latest cyber-threats?” The challenge, to them, stems from the impracticality of entry-level training becoming obsolete as new threats emerge.
Of course, the question of ongoing training is very important, but I believe it has misled us in our evaluation of the growing disparity between the supply and demand of cyber-professionals. What we should be asking is “How can we create a generation of cyber-professionals with improved digital skills and resilience to tackle an enemy that continually mutates?”
Defining the relationship between people and tech is of the utmost importance here. Cybersecurity is not merely a technical problem, it’s a human problem. This is a critical intersection. People are not the weakest link in an effective cybersecurity defense strategy, but the most crucial. However, technology is the apparatus that can properly arm us with the skills to defend against attacks.
The silver bullet
The only thing we can be certain of is that cyberattacks are taking place right now and will continue to take place for the foreseeable future. As a result, cybersecurity will remain one of the most critical elements for maintaining operations in any organization.
There is a growing appetite for reform in cybersecurity training, particularly among higher education institutions: the UK’s top universities now offer National Cyber Security Centre (NCSC) certified Bachelor’s and Master’s programs. It is in the interest of the British government that this appetite continues to grow, as the Department for Culture, Media & Sport reported nearly 400,000 cybersecurity-related job postings from 2017 to 2020.
In addition, COVID-19 has been a significant catalyst for the uptake of and emphasis on cyber skills: the steep rise in the use of digital platforms in both our work and personal lives has expanded the attack surface and created new vulnerabilities.
Overall, though, young people remain our best hope for tackling the global cyber skills gap, and only by presenting cybersecurity to them as a viable career option can we start to address it. This is the critical starting point. Once we do this, the next important step is to give universities and schools the facilities to offer sophisticated cyber training.
Empowering the next generation
If we’re being honest, professors and CTOs are often concerned with providing their students and employees with a theoretical understanding of cybersecurity; that is, what the motives behind attacks might be, the means they use to carry out attacks, and the potential losses involved. While this provides a great theoretical background for cyber-training and may encourage vigilance, it is not always helpful in practical terms.
By encouraging young people to take up courses in computer science or cybersecurity, whilst also supporting their learning with military and enterprise-grade platforms, we can equip the next generation of professionals to enter the workforce.
Giving young people access to the best resources in the field is the only way to ensure they will play an active part in closing the skills gap. Standardizing cyber training practices, from teens right through to experienced consultants, will empower workers at every level to take an active role in reformulating their own organizations’ training strategies, strengthening them and enabling seamless integration between teams.
Cyber range technology has emerged as the frontrunner for fostering this kind of bottom-up resilience in cybersecurity. A cyber range enables the user – be that a university, business, or government – to generate a realistic, capable, and credible virtual environment in which trainees respond to simulated cyber-attacks in real time. Within the simulated network, users learn to cope under high levels of stress while locating and exploiting vulnerabilities on various network systems. This helps them develop the skills to identify, monitor, and resist cyber-attacks.
Cyber ranges can mimic an organization’s IT systems and provide sophisticated training in the form of task-driven Capture-The-Flag exercises (CTFs), live-fire exercises, or a combination of both (threat hunting). Open-source implementations are available, and they can be deployed quickly through the cloud, making roll-out anywhere in the world a smooth process.
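To make the CTF format concrete: a task-driven exercise hides a “flag” string that trainees recover by exploiting a planted vulnerability, and a scoring service validates their submissions. The sketch below is a minimal, hypothetical flag validator; the flag format, challenge IDs, and function names are illustrative assumptions, not taken from any particular cyber range platform.

```python
import hashlib
import hmac

# Hypothetical scoring store: servers typically keep a hash of each flag
# rather than the flag itself, so a leaked database does not reveal answers.
FLAG_HASHES = {
    "web-101": hashlib.sha256(b"CTF{sql_injection_found}").hexdigest(),
}

def check_flag(challenge_id: str, submission: str) -> bool:
    """Return True if the submitted flag matches the stored hash."""
    expected = FLAG_HASHES.get(challenge_id)
    if expected is None:
        return False
    submitted = hashlib.sha256(submission.encode()).hexdigest()
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(expected, submitted)
```

Real platforms layer per-team flags, rate limiting, and audit logging on top of a core check like this, but the trainee-facing loop is the same: exploit, recover, submit.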
This technology is already the gold standard for governments, but its real disruptive capability lies in its deployment to higher-education institutions and even high schools. Here, students can hone their skills and prepare for tackling real cyber-attacks.
Cyber skills gap: Simplifying the problem
The key to solving the cyber skills gap lies in mobilizing the next generation of already tech-savvy young people, and shifting our focus towards helping them develop cyber skills before they enter the workplace.
By taking this two-pronged approach, combining a change in focus with the newest and most sophisticated technology on the market, we can start to implement a real, viable strategy for tackling this immense challenge before it’s too late.