Azure AD Log Analytics KQL queries via API with PowerShell

Log Analytics is a fantastic tool in the Azure Portal for querying Azure Monitor events, letting you quickly create queries using KQL (Kusto Query Language). Once you’ve created a query, however, you may want to run it through automation, negating the need to use the Azure Portal every time you want the associated report data.

In this post I detail;

  • creating a Log Analytics Workspace
  • enabling API Access
  • querying Log Analytics using the REST API with PowerShell
  • outputting data to CSV

Create a Workspace

We want to create a Workspace for our logs and queries. I created mine using the Azure Cloud Shell in the Azure Portal, in an existing Resource Group. If you want it in a new Resource Group, either create the RG through the portal or via the CLI using New-AzResourceGroup.

$rgName = 'MYLogAnalytics-REPORTING-RG'
$location = 'australiaeast'
New-AzOperationalInsightsWorkspace -ResourceGroupName $rgName -Name Azure-Active-Directory-Logs -Location $location -Sku free

The Workspace will be created.

Create LogAnalytics Workspace.PNG

Next we need to get the logs into our Workspace. In the Azure Portal under Azure Active Directory => Monitoring => Diagnostic settings select + Add Diagnostic Setting and configure your Workspace to get the SignInLogs and AuditLogs.

API Access

In order to access the Log Analytics Workspace via API we need to create an Azure AD Application and assign it permissions to the Log Analytics API. I already had an Application I was using to query the Audit Logs, so I added the Log Analytics permissions to it.

On your Azure AD Application select Add a permission => APIs my organization uses and type Log Analytics => select Log Analytics API => Application permissions => Data.Read => Add permissions

Finally, select Grant admin consent (for your Subscription) and take note of the API URI for your Log Analytics API endpoint (for me, as shown below).

API Access to Log Analytics with KQL

Under Certificates and secrets for your Azure AD Application create a Client Secret and record the secret for use in your script.

Azure AD Application Secret.PNG

Link Log Analytics Workspace to Azure AD Application

On the Log Analytics Workspace that we created earlier we need to link our Azure AD App so that it has permissions to read data from Log Analytics.

On your Log Analytics Workspace select Access Control (IAM) => Add => Role = Reader and select your Azure AD App => save

Link Log Analytics Workspace to Azure AD Application.PNG

I actually went back and also assigned Log Analytics Reader access to my Azure AD Application, as I encountered a couple of instances of InsufficientAccessError: “The provided credentials have insufficient access to perform the requested operation”.
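The same role assignment can be scripted rather than clicked through the portal. A minimal sketch with Az PowerShell, where the App ID and the scope path are placeholders you replace with your own Application (client) ID, subscription and workspace details:

```powershell
# Assign Log Analytics Reader on the workspace to the app's service principal.
# <appId> and the scope segments are placeholders for your environment.
New-AzRoleAssignment -ServicePrincipalName '<appId>' `
    -RoleDefinitionName 'Log Analytics Reader' `
    -Scope '/subscriptions/<subscriptionId>/resourceGroups/MYLogAnalytics-REPORTING-RG/providers/Microsoft.OperationalInsights/workspaces/Azure-Active-Directory-Logs'
```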

API Access to Log Analytics with KQL - Log Analytics Reader.PNG

Workspace ID

In order to query Log Analytics using KQL via the REST API you will need your Log Analytics Workspace ID. In the Azure Portal search for Log Analytics, select the Log Analytics Workspace you want to query via the REST API, then select Properties and copy the Workspace ID.
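If you'd rather not click through the portal, the Workspace ID is the CustomerId property on the workspace object, so it can also be retrieved with Az PowerShell using the names from the workspace we created earlier:

```powershell
# Retrieve the Workspace ID (CustomerId) for the workspace created above
$workspace = Get-AzOperationalInsightsWorkspace `
    -ResourceGroupName 'MYLogAnalytics-REPORTING-RG' `
    -Name 'Azure-Active-Directory-Logs'
$workspace.CustomerId
```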

WorkspaceID for REST API Query.PNG

Querying Log Analytics via REST API

With the setup and configuration all done, we can now query Log Analytics via the REST API. I’m using my oAuth2 quick start method to make the requests. For the first Authentication request use the Get-AzureAuthN function to authenticate and authorise the application. Subsequent authentication events can use the stored refresh token to get a new access token using the Get-NewTokens function. The script further below has the parameters for the oAuth AuthN/AuthZ process.

function Get-AuthCode { ... }
function Get-AzureAuthN ($resource) { ... }
function Get-NewTokens { ... }

Get-AzureAuthN $resource
# Future calls can just refresh the token with the Get-NewTokens function
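If you are running unattended with the Application Permission (Data.Read) granted earlier, an app-only (client credentials) token is an alternative to the delegated flow above. A minimal sketch, assuming the Azure AD v1 token endpoint and the Log Analytics resource URI; the tenant ID, App ID and secret are placeholders:

```powershell
# Hedged sketch: app-only (client credentials) token for the Log Analytics API.
# Tenant ID, App ID and client secret are placeholders for your environment.
$tenantId = '<tenantId>'
$body = @{
    grant_type    = 'client_credentials'
    client_id     = '<applicationId>'
    client_secret = '<clientSecret>'
    resource      = 'https://api.loganalytics.io'
}
$response = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" -Body $body
$Global:accesstoken = $response.access_token
```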

To call the REST API we use our Workspace ID we got earlier, our URI for our Log Analytics API endpoint, a KQL Query which we convert to JSON and we can then call and get our data.

$logAnalyticsWorkspace = "d03e10fc-d2a5-4c43-b128-a067efake"
$logAnalyticsBaseURI = ""
$logQuery = "AuditLogs | where SourceSystem == `"Azure AD`" | project Identity, TimeGenerated, ResultDescription | limit 50"
$logQueryBody = @{"query" = $logQuery} | ConvertTo-Json

$result = Invoke-RestMethod -Method POST -Uri "$($logAnalyticsBaseURI)/$($logAnalyticsWorkspace)/query" -Headers @{Authorization = "Bearer $($Global:accesstoken)"; "Content-Type" = "application/json"} -Body $logQueryBody

Here is a sample script that authenticates to Azure as the Application, queries Log Analytics, and then outputs the data to CSV.
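The query result comes back as a tables/columns/rows structure rather than a flat list of records, so it needs flattening before it is useful for Export-Csv. Here's a sketch of that step, shown with a mocked response in the shape the API returns; with the real call, $result comes from the Invoke-RestMethod above:

```powershell
# Mocked response in the shape the Log Analytics query API returns;
# with the real call this is the $result from Invoke-RestMethod above.
$result = @{
    tables = @(
        @{
            columns = @(@{name = 'Identity'}, @{name = 'TimeGenerated'})
            rows    = @(@('Jane Doe', '2019-05-01T10:00:00Z'))
        }
    )
}

# Flatten each row into a PSCustomObject keyed by column name, then export
$table = $result.tables[0]
$columnNames = $table.columns | ForEach-Object { $_.name }
$records = foreach ($row in $table.rows) {
    $properties = [ordered]@{}
    for ($i = 0; $i -lt $columnNames.Count; $i++) {
        $properties[$columnNames[$i]] = $row[$i]
    }
    [PSCustomObject]$properties
}
$records | Export-Csv -Path .\AuditLogs.csv -NoTypeInformation
```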


If you need to use the power of KQL to obtain data from Log Analytics programmatically, leveraging the REST API is a great approach. And with a little PowerShell magic we can output the resulting data to CSV. If you are just getting started with KQL queries, this document is a good place to start.

Output Log Analytics to CSV.PNG

VSCode Configuration Sync between environments

With every new project comes another development environment. Another installation of Visual Studio Code and the inevitable loss of productivity whilst you get all the necessary extensions installed and configured. If only there was a quick and painless way to perform a VSCode Configuration Sync between environments.

A quick search today for a method to sync configuration between VSCode environments revealed a VSCode Extension from Shan Ali Khan.

The VSCode Settings Sync Extension does everything the name implies, leveraging GitHub Gists as the configuration store. The process is super simple too. Simply;

  • Install the Settings Sync Extension
  • Generate a GitHub Access Token to allow it to create a configuration Gist (one time task)
  • Upload your VSCode Configuration from your VSCode instance that has your Gold Config
  • Install the Settings Sync Extension on the target machine
  • Download your config and watch as it retrieves your configuration and installs all your extensions

The last two steps can then be repeated in each new environment. And if you enable Auto-upload (disabled by default) on your Master Configuration machine, the configuration stored in the associated Gist will always be the latest. Likewise, you can Auto Download your configuration to target machines (also disabled by default).

The documentation for Settings Sync is super easy to follow and will have your environments in sync super quick.

The screenshot below shows Settings Sync downloading, into a new environment, the 36 extensions I have configured for the different projects I work on.

VSCode Configuration Settings Sync between computers

Thanks for your work Shan Ali Khan. I’ve donated to your great extension.

Windows Terminal with Tabs, on Steroids

PowerShell Cmdline Emojis Windows 10 Tabbed Terminal

At Microsoft Build last week, one of the many announcements was a new Windows Terminal.

If you spend any time in an IT Support/DevOps type role and you check out that second link above, you’ll be mightily keen for this new Terminal.

Tabs in a Terminal Window? YES (heck, I remember paying for a product that gave me that in a browser 15+ years ago); a Terminal Window that is a standard command prompt (with Unicode support)? YES; a Terminal for cross-platform CMD, PowerShell, PowerShell Core and Windows Subsystem for Linux? DAMN YES. And of course you don’t want to have to wait for this, you want it now.

So did I, so I built the Preview Alpha Release. This post details how I did it.

Windows 10 Tabbed Terminal with icons


There are a few hoops you need to jump through to get on this right now, as it isn’t available as a download. It will be coming to Windows 10 in a few months, but let’s get it now.

  • Become a Windows Insider by registering for a Windows Insider Account here
  • Have a Windows 10 v 1903 build (via registering for Windows Insiders above)
    • the process to do this I show below
  • Inside your Windows 10 machine you will then need;
    • Windows 10 SDK v 1903
    • Visual Studio 2017 (I use 2019)
      • Choose the following Workloads
        • Desktop Development with C++
        • Universal Windows Platform Development
        • For Visual Studio 2019, you’ll also need to install the “v141 Toolset” and “Visual C++ ATL for x86 and x64”
    • Git for Windows command-line

Windows 10 Test Machine Version 1903

I built a Windows 10 1709 Virtual Machine in Azure from the Azure MarketPlace. Having connected to it, I needed to enable the Windows Insider Program on it. To do that select;

Windows => Settings => Update & Security => Windows Insider Program => Get Started

Enable Windows Insider.PNG

Select Link an account and provide the account you used to sign up for Windows Insiders.

Link an Account.PNG

If, when you attempt to link an account you get a blank login window/page when being prompted for your Windows Insider Account you may need to make a couple of changes to the Windows 10 Local Security Policy Security Options. Below is the configuration of my test Windows Insider Windows 10 Virtual Machine. I’ve highlighted a few options I needed to update.

Local Security Policy.PNG

Select Skip ahead to the next Windows release to update Windows 10.

Skip ahead to the next Windows Release.PNG

If you are doing this like I am on a Windows 10 Virtual Machine in Azure, you’ll first go from build 1709 to 1803.

Windows 1709 to 1803.PNG

After Windows 10 has updated to 1803, log back in, go back to Windows Insider Program and choose Skip ahead to the next Windows release.

Skip ahead to the next Windows Release - 1903.PNG

Under Settings => Update & Security => Windows Update select Check for Updates and you will see Windows 10 version 1903 become available.

1803 to 1903.PNG

Under Windows => Settings => Update & Security => Enable Developer Mode

Enable Developer Mode.PNG

Terminal Application

With the other dependencies detailed in the prerequisites above (Windows 10 1903 SDK, Visual Studio etc) downloaded and installed on your Windows 10 machine we can get on to the fun bit of building the new Terminal. Create a folder where you want to put the source to build the terminal and from a command prompt change directory into it and run the following commands;

git clone
cd Terminal
git submodule update --init --recursive

Git Clone.PNG

Then in Visual Studio select Open a project or solution and open the Terminal Visual Studio Solution. Select SDK Version 10.0.18362.0 and No Upgrade for the Platform Toolset.

Open the Solution in VS -1.PNG

Select Release and x64 and then from the Build menu Build Solution.

Build Release x64.PNG

Finally, right click on CascadiaPackage and select Deploy.


Terminal (Dev) will then be available through the Start Menu.

Windows Terminal Dev.PNG

Opening the Windows Terminal will give you a single Terminal Window. Press Ctrl + T to open an additional tab.

Use the drop down menu to select Settings and you will be presented with the JSON configuration document. See below under Icons for mine, which enables CMD, PWSH, PowerShell, WSL – Ubuntu and WSL – SUSE.


To have icons for your terminal tabs obtain some 32×32 pixel icons for your different terminals and drop them into the RoamingState directory under the Windows Terminal App. For me that is


Then update your profiles.json configuration file located in the same directory and add the name of the appropriate icon for each terminal.


As much as we use nice UIs for a lot of what we do as Devs/IT Pros, there are still numerous tasks we perform using terminal shells. A tabbed experience for these, complete with customisation, brings them into the 21st century. Now the wait of another month or two to have it delivered as part of the next Windows 10 Build.

Microsoft Build – ‘Build apps that integrate, automate, and manage security operations’ Presentation

User Secure Score Risk Profile - 640px

At Microsoft Build last week I was honoured to co-present the “Build apps that integrate, automate, and manage security operations” session on the Microsoft Security Graph with Preeti Krishna and Sarah Fender.

The session was recorded and is available here.

The PowerPoint presentation itself is available here: BUILD19_SecurityDeveloperPlatform.

I provided a demo of my Microsoft U.S.E.R (User Security Evaluation Reporter) Application, which leverages the Microsoft Security Graph amongst other APIs.

The GitHub Repo for the project is available here.

Building Apps in Azure with only Scripting Skills – Global Azure Bootcamp 2019

Today (27 April 2019) is Global Azure Bootcamp day. It is the 6th year for the free event that is run in communities all over the world to teach Azure Cloud technologies. This year is my second. Last year I presented on Creating the Internet of YOUR Things and today I’m presenting Building Apps in Azure with only Scripting Skills.

In this session I gave a beginner’s guide to building an Azure Web App using VSCode and NodeJS. In the demos we only needed to visit the Azure Portal for a small Azure Function for our Web App to integrate with.

The objective of the presentation was to show IT Pros who have never written a Web App before how to use our modern tools (VSCode) and Azure with Serverless infrastructure to provide a streamlined process for writing a Web App, with a very small amount of JavaScript and CSS and, of course, PowerShell.

Here is the GitHub Repo for the small Web App that I used for the demos, a Bastard Operator from Hell Excuse Generator (that I also used in my AutoRest post here), along with the PowerPoint presentation.

Building Apps in Azure with only Scripting Skills – April 2019

GitHub Repo

Below is a screenshot of an excuse and a suggested remediation beer.

What is today's excuse.PNG

See you next year for Global Azure Bootcamp 2020.

Generating and Configuring Free SSL Certs for Azure Windows IaaS Virtual Machines

Infrastructure-as-a-Service provides the ability to quickly deploy hosts in Cloud environments such as Azure. However, a host deployed this way doesn’t come with an SSL certificate ready for Web Services. I hadn’t had to do this for quite some time, and it became apparent that whilst there are a few guides available, they are for different scenarios than what I require. In my development environments I’m looking for an SSL Certificate that;

  • is free
  • can be verified through HTTP verification methods (not DNS as we obviously don’t own the * namespace)
  • can be used on Windows Server IIS Websites
  • is trusted by the major browsers
  • is trusted by Azure PaaS Services (e.g Azure API Management)

This blog post details the process to generate free SSL Web Server Certificates using ZeroSSL and Let’s Encrypt, along with the process to implement them on IIS on Windows. Specifically;

  • Generating the Certificate using ZeroSSL
  • Converting the certificate to PFX format for IIS
  • Installing the Certificate on IIS
  • Testing a website configured with the SSL Cert

Generating the Certificate

Start by navigating to ZeroSSL and select Start SSL Certificate Wizard

Enter the address of the host you are generating the certificate for. This can be found in the Overview section of your Virtual Machine in the Azure Portal under DNS Name. Select Accept ZeroSSL TOS and Accept Let’s Encrypt SA (pdf) and select Next.

Generate Azure IaaS Windows SSL Certificate Start.PNG

The Certificate Signing Request will be generated. Select Next.

Generate Azure IaaS Windows SSL Certificate 1.PNG

The Account Key will be generated. You can save the CSR and Account Keys from this page. Select Next.

Generate Azure IaaS Windows SSL Certificate 2.PNG

Create the folder structure in IIS as described in the ZeroSSL Verification page under your Windows Server IIS webroot. This by default will be c:\inetpub\wwwroot.

Generate Azure IaaS Windows SSL Certificate 3.PNG

You will need to create the .well-known folder using a Windows command prompt.
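File Explorer won’t let you create a folder whose name starts with a dot, hence the command prompt. Assuming the default webroot, something like the following (works from PowerShell or an elevated prompt):

```powershell
# Create the ACME verification path under the default IIS webroot
New-Item -ItemType Directory -Path 'C:\inetpub\wwwroot\.well-known\acme-challenge' -Force
```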

Generate Azure IaaS Windows SSL Certificate 4

Create a web.config file in the acme-challenge directory so that IIS will provide access to the file created.

      <configuration>
        <system.webServer>
          <staticContent>
            <mimeMap fileExtension="." mimeType="text/plain" />
          </staticContent>
        </system.webServer>
      </configuration>

Ensure the permissions for the new directory and file will also allow access.

Generate Azure IaaS Windows SSL Certificate 4a

Click on the file link to make sure that ZeroSSL can access your IIS Website with the file in the path required. Depending on your Azure Networking, you may require an inbound NSG Rule for port 80 as shown below.

Generate Azure IaaS Windows SSL Certificate NSG 80 Incoming.PNG

Once you have validated the file is accessible, select Next to allow ZeroSSL to verify you have created the file with the appropriate content to validate you own the host. You will then be provided with your certificate.

Generate Azure IaaS Windows SSL Certificate 6.PNG

The top certificate file contains both your domain certificate and the issuer’s certificate.

  • Save it to your local Windows 10 workstation with the .txt extension. e.g. ServerCerts.txt

The second file contains your domain key.

  • Save the domain key to another file with the .txt extension e.g. ServerDomainKey.txt

Converting the Certificate

In order to be able to import the certificate into IIS we need it in PFX format. OpenSSL allows us to convert the .txt files into PFX format, and it is available in Windows Subsystem for Linux by default. Jumping into a WSL window, run OpenSSL with the following syntax, referencing the two files you created above.

openssl pkcs12 -export -out ServerCert.pfx -inkey ServerDomainKey.txt -in ServerCerts.txt

Provide a secure password for the PFX file. You will need this when importing the certificate on your IIS webserver.

Convert Cert Format.PNG

Checking out the certificate, we can see that the certificate generated is a Let’s Encrypt Cert for our Azure IaaS host.

Cert Details 1

and the certificate path all looks good.

Cert Details 2

Installing the Certificate

On our IIS Server, under our IIS Host in Internet Information Services (IIS) Manager, select Server Certificates and then Import to import the new Certificate as shown below.

Import Cert.PNG

Then selecting the IIS Website we want the certificate for, select Bindings and edit your HTTPS binding selecting the new certificate.

IIS Server Bindings.PNG

Testing the Certificate

After an IIS Reset you should then be able to access the IIS based Website using HTTPS. In my case the Lithnet FIM/MIM Service REST API.

Lithnet MIM Service REST API.PNG


Now that I have documented the end-to-end process, generating a certificate for the next host will be much quicker. With the certificate details saved I can also easily update the certificate when it expires in three months’ time.

I also have Azure API Management integration with the MIM Service using the Lithnet FIM/MIM Service REST API up and running in another environment.

MIM Service via API Management.PNG



Querying SailPoint IdentityNow Virtual Appliance Clusters with PowerShell

Today I was configuring an Integration Module for SailPoint IdentityNow. As part of that integration I needed the ID of an IdentityNow Virtual Appliance Cluster. It seemed I hadn’t previously documented doing this and I couldn’t find my previous script, so here’s a quick post for the next time I need to get info about the SailPoint IdentityNow Virtual Appliance Clusters in my environment.

The following script uses v3 Authentication as detailed in this post.


In the script below, update;

  • line 2 with your IdentityNow Org name
  • line 5 with your Admin Account Name
  • line 6 with your Admin Password
  • line 16 with your IdentityNow v3 API Client ID
  • line 17 with your IdentityNow v3 API Client Secret
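With a v3 token in hand, the cluster query itself is a single REST call. A heavily hedged sketch; the /cc/api/cluster/list path is an assumption based on IdentityNow’s private API and may differ in your environment, and $v3Token is assumed to be the token object returned by the v3 AuthN process above:

```powershell
# Hedged sketch: query the Virtual Appliance Cluster list.
# $orgName and the /cc/api/cluster/list path are assumptions.
$orgName = '<yourOrg>'
$clusters = Invoke-RestMethod -Method Get `
    -Uri "https://$orgName.api.identitynow.com/cc/api/cluster/list" `
    -Headers @{ Authorization = "Bearer $($v3Token.access_token)" }
$clusters | Select-Object id, name, description
```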



Outputting data from an Azure Function to Power BI with PowerShell

PowerShell Azure Function to Power BI via Event Hub and Stream Analytics

Last week I wrote this post that detailed how to use the Azure Table Storage output binding in an Azure PowerShell Function. As part of the same solution I’m working on, I also need to get data/events into Power BI for reporting dashboards. An Azure Function (PowerShell) has the ability to obtain the data but the path to Power BI requires a number of steps that start with using the Azure Function Event Hub output binding.

The end to end process is;

  • Azure Function obtains information from an API (Brewery DB)
    • processes the data and;
  • Sends the data to an Azure Event Hub
  • A Stream Analytics Job picks up the data and puts it into a Power BI Dataset.

Azure Function to Event Hub to Power BI.png

This post details how to configure that pipeline, using the same Beer Styles example from the Azure Table Storage Output Binding post here, as the documentation doesn’t give a working example for PowerShell.

The inputs and Azure Function are the same. Just the output from the Azure Function to Azure Event Hub is added. You can have multiple output bindings, so this example will add the Event Hub output whilst keeping the Azure Table Service output as well.

Azure Event Hub

Before we can add and configure an Azure Event Hub Output Binding on an Azure Function we need to create an Azure Event Hub Namespace.

From the Azure Portal, create a resource (Event Hub).

Created Event Hub Namespace.PNG

Once the Event Hub Namespace has been created, we need to create an Event Hub. Select the Event Hub Namespace => Event Hubs => + Event Hub and give it a name.

Created Event Hub.PNG

Power BI

From the Azure Portal create a resource Power BI Embedded and create a Workspace if you don’t already have one.

Azure Stream Analytics

Now we need to create the Stream Analytics Job that will take the event data from the Event Hub and put it into a Power BI Dataset.

From the Azure Portal create a resource Stream Analytics Job. Give it a name and select create.

Stream Analytics Job.PNG

Once created, select your Stream Analytics Job => Inputs => Add stream input => Event Hub. Provide an input Alias, select your Azure Subscription, the Event Hub Namespace and Event Hub created earlier, and select Create.

Stream Analytics Input.PNG

Select Outputs from your Stream Analytics Job => + Add => Power BI => Authorize to authorise access to Power BI. Provide an output Alias, select your Group workspace and provide a Dataset name and Table name => Save.

Stream Analytics Output.PNG

Select Query and update the query to copy the input to the output. Select Save. If you weren’t filtering what you wanted into the Power BI Dataset on the Azure Function (or other input to the Event Hub) you could filter it with a query here.
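For a straight pass-through the query is a simple select; the bracketed names are the input and output aliases you created above (the names here are placeholders for yours):

```sql
SELECT
    *
INTO
    [powerbi-output]
FROM
    [eventhub-input]
```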

Stream Analytics Query.PNG

Select the Job Overview and select Start => Now.

Stream Analytics Job Overview Start.PNG

Azure Function App Event Hub Output

With all the wiring in place to get our data out to Power BI we can now configure our Azure Function App to output to our Event Hub.

Select your Azure Function => Integrate => Outputs => New Output => Event Hub => New Connection => Event Hub and select your Event Hub Namespace, Event Hub => Select.

Azure Function Output Event Hub.PNG

Update the Event Hub name to the name of your Event Hub and select Save.

Output Event Hub Name.PNG

Taking our Azure Function from the previous blog post, adding in the following line will also copy our output object to our Event Hub Output Binding.

 $outputArray | ConvertTo-Json | Out-File $outputEventHubMessage

Az Func Output to Event Hub.PNG

Processing our Events

Executing our Azure Function sends the events to our Output Bindings.

Azure Function Run.PNG

We can see the progress by looking at our Event Hub Metrics and the Stream Analytics Overview (a spike in events as they are processed).

Job Processing.PNG

After a minute we can see the dataset has been created in Power BI.

Power BI Dataset.PNG

We can then create a report from the dataset

New Report.PNG

using the data that was ingested via our Azure Function.

Create Report Info.PNG


Using the Azure Function Event Hub Output Binding in conjunction with Stream Analytics we can easily get data to Power BI for reporting and dashboards.

Leveraging the Azure Functions Table Storage Output Binding with PowerShell

Recently I wrote this post on using PowerShell to bulk load data into Azure Table Service. Whilst this method works great, it does rely on the AzureRM PowerShell module to provide the ability to batch ingest data into Azure Table Service.

I’m working on a solution that requires automation to obtain data from Microsoft Graph events and ingest that data into Azure Table Service, and that doesn’t work with the AzureRM PowerShell Module.

Azure Functions provide additional Bindings for Input and Output, but I’d never had the need to spend the time working out how to output to Azure Table Storage with PowerShell. The documentation covers examples for C#, JavaScript, Java and Python. But what about PowerShell? Nothing. In this post I cover how to use the Azure Table Storage Azure Function Output Binding.

Azure Function Configuration

If you’re creating a new Azure Function App in 2019 and want to use PowerShell, after creating your Azure Function App you need to configure the Function App settings for Runtime version 1. This can only be configured prior to creating a Function.

Set Azure Function to v1.PNG

Using Azure Storage Explorer select your Storage Account associated with the Azure Function Plan and under Tables create the table you will be putting data into.

Azure Table Service myEvents Table.PNG

After creating your Azure PowerShell Function select Integrate and under Outputs add Azure Table Storage. Provide the Azure Storage Account Table that you created above.

Azure Function Table Service Output Binding.PNG

Example PowerShell Azure Function

Here is an example Azure PowerShell Function that connects to the BreweryDB API to obtain the 175 Beer Styles.

It then creates a PowerShell Object for each style and adds it to an Array of Beer Styles. The array is then converted to JSON and passed to the Azure Table Service Output Binding outputTable configured earlier.

As this is just an example there is no error handling etc, but a working example of obtaining data, transforming it and sending it to Azure Table Service.
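A hedged sketch of the shape of that Function body follows; the BreweryDB key handling, the lowercase partitionKey/rowKey property names expected by the binding, and the $outputTable binding path are assumptions based on the description above, not a verbatim copy of the script:

```powershell
# Hedged sketch of the Function body (v1 PowerShell). BreweryDB key handling,
# property names and the $outputTable binding path are assumptions.
$apiKey = $env:BreweryDBAPIKey
$response = Invoke-RestMethod -Uri "https://api.brewerydb.com/v2/styles?key=$apiKey"

$outputArray = @()
foreach ($style in $response.data) {
    $outputArray += [pscustomobject]@{
        partitionKey = 'BeerStyles'
        rowKey       = [string]$style.id
        name         = $style.name
        description  = $style.description
    }
}

# The Table Storage output binding picks up the JSON written to $outputTable
$outputArray | ConvertTo-Json | Out-File -Encoding UTF8 $outputTable
```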

Looking at the Azure Table Service Table with Azure Storage Explorer after executing the Azure Function all the Beer Styles have been added.

Beer Styles Added to Azure Table Service.PNG


The Azure Table Service Output Binding for Azure Functions provides a quick and simple method to allow data to be ingested into Azure Table Service. An added benefit over my previous integration is that the data doesn’t need to be split into batches of 100 records.

Forefront/Microsoft Identity Manager – Attempted to access an unloaded AppDomain

This post is more a note-to-self for future me in case I’m in this scenario again. Today I encountered the error Attempted to access an unloaded AppDomain.

I have a custom Forefront/Microsoft Identity Manager Management Agent that requires multiple credentials for the Web Service it is integrating with. In order to secure the parts of the credentials that cannot be provided via the Connectivity configuration tab in the Management Agent Properties, I have generated them and exported them using Export-Clixml as detailed in this post here.
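As a reminder of how that export/import pairing behaves: on Windows, Export-Clixml protects the SecureString with DPAPI, so the file can only be re-imported by the same account (on the same machine) that exported it. A small sketch, where the account name, password and file path are purely illustrative:

```powershell
# Build a credential and export it; on Windows the password is DPAPI-protected
# and tied to the exporting user. Names and paths here are illustrative only.
$password = ConvertTo-SecureString 'Sup3rS3cret!' -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ('DOMAIN\svcMIMSync', $password)
$credential | Export-Clixml -Path .\svcCredentials.xml

# Later, running as the SAME account (e.g. the MIM Synchronisation service account):
$imported = Import-Clixml -Path .\svcCredentials.xml
$imported.UserName
```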

Today I was migrating a Management Agent from one environment to another and was sure I’d regenerated the credentials correctly. However, the Management Agent wasn’t working as expected. Looking into the Applications and Services Logs => Forefront Identity Manager Management Agent Log …

Forfront Identity Manager Management Agent.PNG

I found the following error …

Unhandled exception, the CLR will not terminate: System.AppDomainUnloadedException: Attempted to access an unloaded AppDomain

Attempted to access an unloaded AppDomain.PNG

Retracing my steps, I had logged on to the Synchronisation Server with the incorrect credentials when generating my protected credentials XML file.

When the Synchronisation Server ran the Management Agent and attempted to run Import-Clixml “credentialFilename”, the account that had exported the credentials did not match the service account that the Synchronisation Server was running with, and the error listed above was thrown.


Import-Clixml and Export-Clixml do exactly what they are supposed to and respect the context under which the credentials were exported; the credentials will only be accessible when imported under that same context. The error doesn’t really tell you that, but it does hint at it with Attempted to access an unloaded AppDomain, if you know what you are trying to do.