Supercharged SecOps Series — Forensics Triage with Azure and KQL 🚀
Intro to the Supercharged SecOps series
Welcome to my new blog series: Supercharged SecOps. In this series I'll aim to share real insights into how to take your Security Operations Centre to the next level! This series will cover DFIR, ticketing, automation and leveraging cloud services in a SOC.
Recently I’ve been working on ways to improve some key DFIR metrics:
- Time to triage forensics data
- Time to first finding
Along with generally standardising an approach to conducting DFIR engagements, particularly intrusion engagements. The intrusion focus matters here, because those engagements are all about speed and efficiency.
In this article I’ll show you how to immediately supercharge your DFIR triage process, utilising Azure services and Kusto Query Language (KQL).
Step 1 — Parse your forensics data!
Note: I won’t be covering how to acquire forensics data at scale; you can, however, use tools such as Velociraptor, CrowdStrike Falcon Real Time Response, KAPE etc.
In this example, I’ll keep it simple with a single ‘Security.evtx’ file from a real Domain Controller that contains evidence of compromise. Due to its sensitive nature, I can’t share it, but if you want to follow along with some data that’s worth hunting in, I’d highly recommend the logs from:
https://github.com/sbousseaden/EVTX-ATTACK-SAMPLES
My favourite tool for parsing event logs is: https://www.sans.org/tools/evtxecmd/ by none other than https://www.sans.org/profiles/eric-zimmerman/. This won’t be a lesson in using Zimmerman tools, but here is the command line I used:
PS > {PathToEvtxECmd}\EvtxECmd.exe -f "{PathToEvtxFile}\Security.evtx" --csv "{PathToYourOutput}" --csvf Output.csv
An example with the placeholders filled in:
PS > C:\Users\mikecybersec\Documents\DFIRTools\EvtxECmd.exe -f "C:\Users\mikecybersec\Documents\DFIREvidence\Security.evtx" --csv "C:\Users\mikecybersec\Documents\DFIREvidence\EvtxOutput" --csvf Output.csv
If successful, you should see something like this:
Step 2 — Send to Azure, the fun bit!
So to begin, you’ll need two services in Azure:
- Azure Data Explorer (a data analytics platform)
- Azure Storage Account & Container
Navigate to: https://dataexplorer.azure.com/freecluster and claim your free cluster. There are some limitations on the free tier, but it works great for this scenario.
Navigate to: https://portal.azure.com and create yourself a Storage Account. MS Docs here on how to make one: https://learn.microsoft.com/en-us/azure/storage/common/storage-account-create?tabs=azure-portal
Once done, make yourself a container: https://learn.microsoft.com/en-us/azure/storage/blobs/blob-containers-portal#create-a-container
Once that’s done, you can upload your Output.csv file to your new container! It should look like this:
Step 3 — Bringing data into ADX!
Okay, your data is in Azure storage now. The next step is to import this into your free ADX cluster and start the supercharging process 🚀
To connect your storage container to ADX, go to “Get Data” and select “Azure Storage” as your source.
On the next page, you will be able to configure the ingestion:
- Select your Azure subscription
- Select the storage account you’ve created in the prior step
- Select the container you also created
- Optional: Add a file filter. I added a .csv filter, as I have other file types in the root of the container.
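If you'd rather script this step than click through the wizard, you can also do the ingestion with KQL management commands from the ADX query window. A minimal sketch, assuming a table called ‘EventLogsTest’ (the name used in the queries below) with just the columns this post relies on, plus a SAS URI generated on your container:
// Create a destination table. Real EvtxECmd CSVs have more columns, so either
// extend this definition to match your file or let the Get Data wizard infer the schema.
.create table EventLogsTest (TimeCreated: datetime, EventId: int, Channel: string, Computer: string, PayloadData1: string, PayloadData2: string)
// One-off ingestion straight from the blob; ignoreFirstRecord skips the CSV header row.
.ingest into table EventLogsTest (h'https://{YourStorageAccount}.blob.core.windows.net/{YourContainer}/Output.csv?{SAS-Token}') with (format='csv', ignoreFirstRecord=true)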
Step 4 — Perform forensics investigations using KQL
The next step is to write some KQL queries that will uncover evil in your forensics artefacts!
Here’s an example KQL query to get things going:
EventLogsTest
| where EventId == 4776
// The next line excludes any 4776 events that contain your corporate hostname naming convention; my test environment devices start with SEC, i.e. SEC-Desktop01
| where PayloadData2 !contains "SEC-"
// There were a bunch of events where Workstation was null; this removes them
| where PayloadData2 != "Workstation:"
| project TimeCreated, EventId, Channel, Computer, PayloadData1, PayloadData2
| sort by TimeCreated asc
This query identifies the potential presence of a rogue workstation attempting to validate credentials. Event ID 4776 is logged on the Domain Controller for NTLM credential validation and includes the source workstation name, so by excluding our expected hostname naming convention, we should be left with some hosts worth looking at.
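To size up each suspect workstation at a glance, the same data pivots nicely into a frequency view. Here's a quick sketch against the same table; the column aliases are my own, and I'm assuming PayloadData1 carries the target account, as it did in my data:
EventLogsTest
| where EventId == 4776
| where PayloadData2 !contains "SEC-" and PayloadData2 != "Workstation:"
// Count attempts and distinct target accounts per source workstation
| summarize Attempts = count(), Accounts = dcount(PayloadData1), FirstSeen = min(TimeCreated), LastSeen = max(TimeCreated) by PayloadData2
| sort by Attempts desc
A workstation spraying attempts across many accounts in a short window will jump straight to the top.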
Putting it all together
If you have supporting data, you should see something like my results below. The data I used was from a genuine intrusion (Akira ransomware) where the attackers used workstations named ‘kali’ and ‘DESKTOP-XXXX’ (sorry, unable to share the specific hostname).
You can now see our ‘kali’ workstation, immediately cutting through the noise with an analytical approach.
You can then export these to visuals (click “Add Visual”) and pin them to an ADX dashboard! Any future Security.evtx logs you import into that same DB table will be picked up by this query and displayed on the dashboard. Neat, right?
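If you want a tile that shows activity over time, a time-bucketed count renders directly as a chart. A sketch (the one-hour bin size is an arbitrary choice):
EventLogsTest
| where EventId == 4776
| where PayloadData2 !contains "SEC-" and PayloadData2 != "Workstation:"
// Bucket credential-validation attempts per hour, split by workstation name
| summarize Attempts = count() by bin(TimeCreated, 1h), PayloadData2
| render timechart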
Repeatable forensics analytics, at scale, in the cloud.
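One way to make that repeatability concrete is to save the hunt as a stored function in your ADX database, so every analyst and every dashboard tile runs the same logic. A minimal sketch; the function name and docstring are my own:
// Store the rogue-workstation hunt so it can be called like a table
.create-or-alter function with (docstring = "4776 events from workstations outside our naming convention") RogueWorkstationAuth() {
    EventLogsTest
    | where EventId == 4776
    | where PayloadData2 !contains "SEC-" and PayloadData2 != "Workstation:"
    | project TimeCreated, EventId, Channel, Computer, PayloadData1, PayloadData2
    | sort by TimeCreated asc
}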
Ok, what else can you do?
More queries! Ultimately, if you can convert any forensics artefact to CSV, such as:
- MFT
- Amcache
- Web History
- etc.
Then you can bring it into ADX and write/store queries for a repeatable, efficient forensics approach. You can also automate your acquisition process and initial parsing using the same Zimmerman tools, or something like log2timeline.
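As an example of that reuse, here's what a hunt over MFTECmd output could look like. This is a sketch, assuming a hypothetical table named ‘MftTest’ holding MFTECmd's CSV columns (ParentPath, FileName, FileSize, Created0x10):
MftTest
| where FileName endswith ".exe"
// Executables dropped into user-writable staging locations are worth a look
| where ParentPath contains @"\Users\Public" or ParentPath contains @"\Windows\Temp"
| project Created0x10, ParentPath, FileName, FileSize
| sort by Created0x10 asc
Same workflow, different artefact: parse, upload, ingest, hunt.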