Microsoft announced its new AI-powered service for security professionals, Security Copilot, in early 2023. If you aren't familiar with the service, it's a security-focused extension that runs on top of Azure OpenAI. The overarching idea is to feed all relevant security signals - such as events, incidents, and logs - to a Large Language Model and extract insights and intelligence from all that noise through well-crafted prompts.
The added value might come from Microsoft's extensive visibility across millions of tenants and hundreds of millions of users, and from how that data might serve as a basis for training the LLM.
Microsoft Security Copilot is not yet publicly available in preview, and I do not have access to any private previews.
What is the Security Copilot?
The anticipated prompts you'd ask the Copilot service include "Do I have active incidents?" and "What active threats do we have?". The responses would then draw on whatever base data the LLM has, plus take into account any custom data in your tenant.
In a way, you could argue that you could achieve something similar with Azure OpenAI directly, once you feed your custom security-related data to it - but building that is probably unfeasible for many companies. The recently released (and then pulled) Azure ChatGPT is a viable starting point for such a custom Copilot implementation.
Preparing for the arrival of Security Copilot
I've gathered three key things you can do today to prepare for the arrival of Microsoft Security Copilot.
First, take stock of the security services and capabilities you currently use from Microsoft. Include services such as Microsoft Sentinel, relevant Log Analytics Workspaces, Defender for Endpoint, Defender for Cloud, etc. The intention here is to verify which data points you could utilize via Security Copilot - as not everything might be available through the service initially. Knowing where you already have data will make it much easier to expand the AI's knowledge later.
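If you already use Microsoft Sentinel or a Log Analytics Workspace, a quick way to inventory which data points you actually have is a query along these lines - a sketch to run in the Logs blade of your workspace, where the table names and counts will naturally depend on your connectors and retention settings:

```kusto
// List tables that received data in the last 7 days, with row counts.
// Run in the Logs blade of your Log Analytics workspace.
union withsource=TableName *
| where TimeGenerated > ago(7d)
| summarize Rows = count() by TableName
| sort by Rows desc
```

Tables with zero or very few rows are good candidates to investigate before assuming Security Copilot will have that signal to work with.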
Second, plan how you will trial and test Microsoft Security Copilot. In your production tenant? While possible, I suggest using a separate Microsoft 365 tenant and, preferably, a separate Microsoft Entra ID tenant. This lets you try out all Security Copilot capabilities without exposing live production incidents and data. Keep in mind that the new service most probably won't make any changes itself, but it will produce scripts, diagrams, and tooling for you to use - and ideally, you wouldn't run these against a production environment.
For this second part, you can spin up a new Microsoft 365 tenant and acquire a single license for your test usage. Ideally, that would be a Microsoft 365 E5 license, but you could also get by with E3 or even E1, depending on what licenses and features you have in your production environment.
Third, and last: if you haven't already, learn the basics of KQL and PowerShell. These will be crucial building blocks on your journey towards securing your platforms.
For KQL, I suggest this free Microsoft Learn course. Also, Rod Trent's Must Learn KQL is a splendid resource to advance to. For PowerShell, I suggest this free course on Azure PowerShell, and this free course on Microsoft 365 PowerShell.
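To give a feel for what those resources teach, here's the kind of simple KQL query you'll quickly be able to write - a hypothetical example that assumes the SigninLogs table is connected to your workspace:

```kusto
// Failed Microsoft Entra ID sign-ins in the last 24 hours, grouped by user.
// ResultType is a string in SigninLogs; "0" denotes a successful sign-in.
SigninLogs
| where TimeGenerated > ago(24h)
| where ResultType != "0"
| summarize FailedAttempts = count() by UserPrincipalName
| sort by FailedAttempts desc
```

Being fluent in queries like this will also help you sanity-check whatever Security Copilot eventually generates for you.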
I fear that Microsoft Security Copilot will be very "1.0" upon its arrival. That's not necessarily bad, but any fancy tooling - such as the fantastic m365cli from the PnP crew - might not initially plug into the service.
Lastly, consider which users you will be required to license for Microsoft Security Copilot. It's too early to talk about licensing specifics, but presumably, you won't license all your tenant users. Once we know more about the exact cost - perhaps mirroring the announced price of Microsoft 365 Copilot at around $30/month/user - we can better prepare for the rollout of the service.