As many of you know from reading here, I’ve been on a trek through the important aspects of AI security, including figuring out how to secure it and how to monitor it. To fully accomplish this, I’ve had to create my own apps so they could generate the data needed to identify what can be monitored with modern tools like Microsoft Sentinel.
In the Security Copilot section of my blog, you’ll find my meager attempts at creating those apps: https://rodtrent.substack.com/t/security-copilot
But those simple apps have been valuable, generating the data I needed to begin identifying what can be monitored in the AI streams. So that you can start your own trek down this path, you’ll find my apps, code, KQL, Sentinel detections, and more, all open-sourced at: https://aka.ms/RodAI
Now I’m on to the next phase of the endeavor. Instead of developing solutions around Python code, I’m using the ease and functionality of Azure Cognitive Services and Azure AI Studio to create in minutes what took me days before. I’ve really fallen in love with Azure Cognitive Services and how easy it is now to create a web app fed by my own data with just a few clicks.
I intend to write this up in greater detail later on, but I wanted to include the steps involved, with Microsoft Learn links and descriptions, so those of you who want to can build your own now.
I have a session coming up soon for MEMUG in Denver where I’ll be demoing each of these steps in person.
Step 1 - Storage Container
The first step to building the Copilot is to provision a Storage account in the Azure portal, with a container to hold the files the Copilot will access and that the Indexer will index against.
See: Quickstart: Use Azure Storage Explorer to create a blob
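If you’d rather script this step than click through Storage Explorer, here’s a minimal sketch using the azure-storage-blob Python SDK. The connection string, container name ("copilotdata"), and file name are placeholders you’d swap for your own.

```python
# Minimal sketch of Step 1: create a container and upload a document with azure-storage-blob.
from azure.storage.blob import BlobServiceClient

# Connection string from the Storage account's "Access keys" blade (placeholder).
conn_str = "<your-storage-connection-string>"

service = BlobServiceClient.from_connection_string(conn_str)
container = service.get_container_client("copilotdata")

# Create the container if it doesn't exist yet.
if not container.exists():
    container.create_container()

# Upload a document for the Indexer to pick up later.
with open("kql-docs.md", "rb") as data:
    container.upload_blob(name="kql-docs.md", data=data, overwrite=True)
```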
Step 2 - Indexer
After the Container has been created, create an Indexer in the Azure portal. The Indexer will run on a schedule you select to index the documents stored in the Container.
See: Quickstart: Create a search index in the Azure portal
I have my Indexer running once a day, but if you’re updating more or less frequently than I am, you can adjust the schedule to match your requirements.
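The same setup can be scripted with the azure-search-documents SDK. This is a rough sketch only: the search endpoint and admin key are placeholders, the names ("copilot-blob-ds", "copilot-indexer", "copilot-index") are mine, and it assumes the target index already exists.

```python
# Sketch of Step 2: a blob data source plus an indexer that runs once a day.
from datetime import timedelta

from azure.core.credentials import AzureKeyCredential
from azure.search.documents.indexes import SearchIndexerClient
from azure.search.documents.indexes.models import (
    IndexingSchedule,
    SearchIndexer,
    SearchIndexerDataContainer,
    SearchIndexerDataSourceConnection,
)

client = SearchIndexerClient(
    endpoint="https://<your-search-service>.search.windows.net",
    credential=AzureKeyCredential("<your-search-admin-key>"),
)

# Point the search service at the Storage container from Step 1.
data_source = SearchIndexerDataSourceConnection(
    name="copilot-blob-ds",
    type="azureblob",
    connection_string="<your-storage-connection-string>",
    container=SearchIndexerDataContainer(name="copilotdata"),
)
client.create_or_update_data_source_connection(data_source)

# Index the container once a day; shorten the interval if your data changes more often.
indexer = SearchIndexer(
    name="copilot-indexer",
    data_source_name="copilot-blob-ds",
    target_index_name="copilot-index",  # an index you've already created
    schedule=IndexingSchedule(interval=timedelta(days=1)),
)
client.create_or_update_indexer(indexer)
```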
Step 3 - Add your own data
Now that your source data is stored and being indexed, jump into Azure AI Studio and use the new “Add your data” feature to configure where your chatbot will look for it.
This feature has a few data source options. In this case, I’m choosing Azure Cognitive Search and selecting the search service and the index produced by the Indexer I created in Step 2.
See: Quickstart: Chat with Azure OpenAI models using your own data
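Under the covers, the “Add your data” wiring looks roughly like this call against Azure OpenAI. Treat it as a sketch: the endpoint, key, deployment name, and index name are placeholders, and the exact shape of the data_sources payload depends on the api-version you target, so check the current “use your data” reference before relying on it.

```python
# Sketch of grounding a chat completion in the Cognitive Search index from Step 2.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-aoai-resource>.openai.azure.com",
    api_key="<your-azure-openai-key>",
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="<your-gpt-deployment-name>",
    messages=[{"role": "user", "content": "Summarize the KQL join operator."}],
    # Tell the service to answer from your own indexed data.
    extra_body={
        "data_sources": [
            {
                "type": "azure_search",
                "parameters": {
                    "endpoint": "https://<your-search-service>.search.windows.net",
                    "index_name": "copilot-index",
                    "authentication": {
                        "type": "api_key",
                        "key": "<your-search-query-key>",
                    },
                },
            }
        ]
    },
)

print(response.choices[0].message.content)
```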
Step 4 - Deploy an App (or update an existing one)
Once the data has been loaded into Azure AI Studio, you can deploy that configuration to a new web app, or update an existing one, by using the “Deploy to” option at the top right of the interface. This process usually takes less than 10 minutes - most times a lot less.
Once it’s completed, the web app will have its own URL in the default form mywebappname.azurewebsites.net.
See: Deploy a web app
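A quick sanity check once the deployment finishes is to confirm the default *.azurewebsites.net URL is answering; "mywebappname" below is a placeholder for whatever name you chose.

```python
# Confirm the newly deployed web app responds at its default URL.
import requests

url = "https://mywebappname.azurewebsites.net"
resp = requests.get(url, timeout=30)
print(url, "->", resp.status_code)  # expect 200 once the app is up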
One big, important tip here is that you can create a Copilot for any topic and limit who has access to each. Each “Copilot” just needs its own data source (Container and Indexer) and web app. You could effectively give HR its own Copilot so that only HR data is available. And because it’s using Azure OpenAI and Cognitive Services, you can restrict who has access to each Copilot using RBAC. To keep things simple, use a separate Resource Group per web app so you can control access more easily. Each web app will have its own URL, so just deploy that URL to the appropriate audience.
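Here’s a hedged sketch of that per-Copilot RBAC idea using azure-mgmt-authorization: grant a group access at the Resource Group scope. The subscription ID, resource group name ("rg-hr-copilot"), and group object ID are placeholders, and the built-in Reader role is just an example - swap in whatever role your audience actually needs.

```python
# Sketch: assign the built-in Reader role to a group, scoped to one Copilot's Resource Group.
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

subscription_id = "<your-subscription-id>"
client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)

# Scope the assignment to the resource group that holds one Copilot's web app.
scope = f"/subscriptions/{subscription_id}/resourceGroups/rg-hr-copilot"

# Built-in "Reader" role definition ID (example only).
role_definition_id = (
    f"/subscriptions/{subscription_id}/providers/Microsoft.Authorization/"
    "roleDefinitions/acdd72a7-3385-48ef-bd42-f606fba81ae7"
)

client.role_assignments.create(
    scope,
    str(uuid.uuid4()),  # role assignment name must be a GUID
    RoleAssignmentCreateParameters(
        role_definition_id=role_definition_id,
        principal_id="<hr-group-object-id>",
        principal_type="Group",
    ),
)
```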
If you want to go even further, you could also do what I’ve done and generate custom domains for the web app.
See: Map an existing custom DNS name to Azure App Service
Step 5 - Maintenance
Want to enhance, update, or add more enterprise data to your web app? Jump back to the Storage Container in the Azure portal and just upload new files or replace existing ones. The Indexer will do the rest and keep your web app source content up to date.
See: Quickstart: Upload, download, and list blobs with the Azure portal
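If you’d rather script the maintenance too, here’s a sketch that reuses the placeholder names from the earlier snippets: overwrite a document in the container and, optionally, kick off the Indexer right away instead of waiting for the daily schedule.

```python
# Sketch of Step 5: replace a document and optionally run the indexer on demand.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents.indexes import SearchIndexerClient
from azure.storage.blob import BlobServiceClient

# Overwrite the existing blob with the updated file.
service = BlobServiceClient.from_connection_string("<your-storage-connection-string>")
container = service.get_container_client("copilotdata")
with open("kql-docs.md", "rb") as data:
    container.upload_blob(name="kql-docs.md", data=data, overwrite=True)

# Optional: run the indexer now rather than waiting for the scheduled run.
indexer_client = SearchIndexerClient(
    endpoint="https://<your-search-service>.search.windows.net",
    credential=AzureKeyCredential("<your-search-admin-key>"),
)
indexer_client.run_indexer("copilot-indexer")
```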
Step 6 - Keep it handy
Lastly, I could have gone down the route of building a browser plugin (and maybe I will eventually), but Microsoft Edge makes it so easy to install my web app as a Sidebar app that it’s always available in the sidebar. My source data is the entire library of KQL documentation (shown in the previous image, with the full datasets available HERE), but I’m continuing to add more data I feel is important as I go along. Additionally, because Edge is managed like any other application on a Windows device, you can deploy your web app icon to your users per group, department, org, etc.
See:
Next Steps
Soon, I’ll go back and update the web app’s look and feel (icons, branding, etc.) and share how to do that here on the blog. I’ll also be implementing some of the new GPT-4 capabilities, including plugins and actions.
But I wanted to supply this information now to show how easy it is to create your own Copilot (chatbot assistant) using Azure OpenAI, Azure Cognitive Search, Azure Storage, and Azure AI Studio, and to prepare you to join me on the next steps.
With more information about Security Copilot soon forthcoming, this will serve as a primer for understanding how it all works and give unique insight into the true value of a build-your-own solution versus Security Copilot. I promise, as this year advances, you’ll come to understand how truly valuable Security Copilot is.
[Want to discuss this further? Hit me up on Twitter or LinkedIn]
[Subscribe to the RSS feed for this blog]
[Subscribe to the Weekly Microsoft Sentinel Newsletter]
[Subscribe to the Weekly Microsoft Defender Newsletter]
[Subscribe to the Weekly Azure OpenAI Newsletter]
[Learn KQL with the Must Learn KQL series and book]
[Learn AI Security with the Must Learn AI Security series and book]