Microsoft unveils AI-powered Security Copilot analysis tool

Microsoft has unveiled Security Copilot, an AI-powered analysis tool that aims to simplify, augment and accelerate security operations (SecOps) professionals’ work.

Using Microsoft Security Copilot

Security Copilot takes the form of a prompt bar through which security operations center (SOC) analysts ask questions in natural language and receive actionable responses.

They can ask it to identify ongoing incidents, analyze code snippets, provide information about links or files they submit, analyze alerts from other security tools the enterprise uses, and more.

Microsoft bills Security Copilot as a tool for cutting through, among other things, the chaos of alerts and incidents constantly popping up and demanding SOC analysts’ attention.

It can surface prioritized threats in real time, offer predictive guidance to help analysts foil attackers’ next steps and, in general, accelerate incident investigation and response.

The tool can integrate insights and data from various security tools, including Microsoft Defender (antivirus), Sentinel (cloud-native SIEM and SOAR), Purview (unified data governance for SaaS), and Intune (unified endpoint management for corporate and BYOD devices).

The answers it provides are based on both internal (corporate) data and external data.

An AI tool that facilitates security analysis, collaboration and learning

Security Copilot also enables faster work and collaboration.

Helpful responses can be pinned to a digital pin board, which becomes a summary of findings for a specific investigation and can be shared with colleagues, who can also add their own findings to it.

Prompts designed to automatically perform a sequence of steps toward a specific goal can be folded into a “prompt book,” so others in the SOC can use it even if they lack the knowledge that went into creating it.

“A security team’s capacity will always be limited by the team’s size and the natural limits of human attention. Security Copilot boosts your defenders’ skills with its ability to answer security-related questions – from the basic to the complex,” explained Vasu Jakkal, corporate vice president of security, compliance, and identity management at Microsoft.

“Security Copilot continually learns from user interactions, adapts to enterprise preferences, and advises defenders on the best course of action to achieve more secure outcomes. It also supports learning for new team members as it exposes them to new skills and approaches as they develop. This enables security teams to do more with less, and to operate with the capabilities of a larger, more mature organization.”

Accuracy and data security

Microsoft doesn’t hide the fact that Security Copilot doesn’t always get everything right, but counts on users to identify errors in its answers and report them to the company to help improve the tool.

“Security Copilot is a closed-loop learning system, which means it’s continually learning from users and giving them the opportunity to give explicit feedback with the feedback feature that is built directly into the tool. As we continue to learn from these interactions, we are adjusting its responses to create more coherent, relevant and useful answers,” Jakkal added.

Finally, Microsoft promises to protect the data users enter into Security Copilot, and to refrain from monetizing it or using it to train or enrich the foundation AI models used by other customers.

At the moment, Security Copilot is still in private preview.