Leveraging the Azure Functions Table Storage Output Binding with PowerShell

Recently I wrote this post on using PowerShell to bulk load data into Azure Table Service. Whilst that method works great, it relies on the AzureRM PowerShell module to batch ingest data into Azure Table Service.

I’m working on a solution that requires automation to obtain event data from Microsoft Graph and ingest that data into Azure Table Service. That scenario doesn’t work with the AzureRM PowerShell module.

Azure Functions provide additional Bindings for Input and Output, but I’d never had the need to work out how to output to Azure Table Storage with PowerShell. The documentation covers examples for C#, JavaScript, Java and Python. But what about PowerShell? Nothing. In this post I cover how to use the Azure Table Storage Output Binding from a PowerShell Azure Function.

Azure Function Configuration

If you’re creating a new Azure Function App in 2019 and want to use PowerShell, after creating the Function App you need to configure the Function App settings for Runtime version 1. This can only be changed prior to creating a Function.

Set Azure Function to v1.PNG

Using Azure Storage Explorer select your Storage Account associated with the Azure Function Plan and under Tables create the table you will be putting data into.

Azure Table Service myEvents Table.PNG

After creating your Azure PowerShell Function select Integrate and under Outputs add Azure Table Storage. Provide the Azure Storage Account Table that you created above.

Azure Function Table Service Output Binding.PNG

Example PowerShell Azure Function

Here is an example Azure PowerShell Function that connects to the BreweryDB API to obtain the 175 Beer Styles.

It then creates a PowerShell object for each style and adds it to an array of Beer Styles. The array is then converted to JSON and passed to the Azure Table Service Output Binding (outputTable) configured earlier.

As this is just an example there is no error handling etc., but it is a working example of obtaining data, transforming it and sending it to Azure Table Service.
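For reference, a minimal sketch of what such a run.ps1 could look like is below. The BreweryDB endpoint, the App Setting used for the API key and the properties selected are assumptions rather than the original code.

# Minimal sketch only — BreweryDB endpoint/key handling and property selection are assumptions
$apiKey = $env:BreweryDBAPIKey   # assumed App Setting containing the BreweryDB key
$styles = Invoke-RestMethod -Method Get -Uri "https://sandbox-api.brewerydb.com/v2/styles/?key=$($apiKey)"

$beerStyles = @()
foreach ($style in $styles.data) {
    # Table Storage entities require a PartitionKey and RowKey
    $beerStyles += [pscustomobject]@{
        PartitionKey = 'BeerStyles'
        RowKey       = [string]$style.id
        Name         = $style.name
        Description  = $style.description
    }
}

# In v1 PowerShell Functions the output binding is written as a file whose
# path is supplied in the $outputTable variable
$beerStyles | ConvertTo-Json | Out-File -Encoding UTF8 -FilePath $outputTable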

Looking at the Azure Table Service Table with Azure Storage Explorer after executing the Azure Function all the Beer Styles have been added.

Beer Styles Added to Azure Table Service.PNG

Summary

The Azure Table Service Output Binding for Azure Functions provides a quick and simple method to ingest data into Azure Table Service. An added benefit over my previous integration is that the data doesn’t need to be split into batches of 100 records.

Forefront/Microsoft Identity Manager – Attempted to access an unloaded AppDomain

This post is more a note-to-self for future me in case I’m in this scenario again. Today I encountered the error Attempted to access an unloaded AppDomain.

I have a custom Forefront/Microsoft Identity Manager Management Agent that requires multiple credentials for the Web Service it is integrating with. In order to secure parts of the credentials that cannot be provided as part of the Connectivity configuration tab on the Management Agent Properties, I have generated them and exported them using Export-Clixml as detailed in this post here.

Today I was migrating a Management Agent from one environment to another and was sure I’d regenerated the credentials correctly. However, the Management Agent wasn’t working as expected. Looking into the Applications and Services Logs => Forefront Identity Manager Management Agent log …

Forfront Identity Manager Management Agent.PNG

… I found the following error:

Unhandled exception, the CLR will not terminate: System.AppDomainUnloadedException: Attempted to access an unloaded AppDomain

Attempted to access an unloaded AppDomain.PNG

Retracing my steps, I had logged on to the Synchronisation Server with the incorrect credentials to generate my protected credentials XML file.

When the Synchronisation Server was running the Management Agent and attempting Import-Clixml "credentialFilename", the account that had exported the credentials did not match the service account the Synchronisation Server was running as, and the error listed above was thrown.

Summary

Import-Clixml and Export-Clixml do exactly what they are supposed to: they respect the context under which the credentials were exported, and the credentials will only be accessible when imported under that same context. The error doesn’t really tell you that, but Attempted to access an unloaded AppDomain does hint at it if you know what you are trying to do.
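As a refresher, here is a minimal sketch of the export/import pattern (the file path is illustrative):

# Run this interactively while logged on as the MIM synchronisation service account.
# Export-Clixml protects the PSCredential's password with DPAPI, tying it to this
# user (and machine) context.
$credential = Get-Credential
$credential | Export-Clixml -Path 'D:\MIMData\webServiceCred.xml'   # illustrative path

# Later, inside the Management Agent running as that same service account:
$credential = Import-Clixml -Path 'D:\MIMData\webServiceCred.xml'
# Importing under a different account fails to decrypt the password,
# surfacing here as the unloaded AppDomain error.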


Using AutoRest for PowerShell to generate PowerShell Modules

AutoRest for PowerShell

Recently the Beta of the AutoRest for PowerShell Generator has been made available. At the recent Microsoft MVP Summit in Seattle Garrett Serack gave those that were interested a 1 hr corridor session on getting started with AutoRest for PowerShell.

AutoRest is a tool that generates client libraries for accessing RESTful web services. Microsoft are moving towards using AutoRest to generate SDKs for their APIs in the standard languages they provide SDKs for. In addition, the AutoRest for PowerShell generator aims to automate the generation of PowerShell modules for Azure APIs.

In this post I’ll give an intro to getting started with AutoRest to generate PowerShell modules using an AutoRest example and then an Azure Function based API.

AutoRest for PowerShell Prerequisites

AutoRest is designed to be cross-platform. As such its dependencies are NodeJS, PowerShell Core and Dotnet Core.

AutoRest for PowerShell Installation

Installation is done via the command line once you have NodeJS installed and configured:

npm install -g autorest@beta

If you have previously installed AutoRest you will want to update to the latest version, as updates and fixes are released regularly:

autorest --reset

AutoRest Update.PNG

Building your first AutoRest for PowerShell Module

The AutoRest documentation has a couple of examples that are worth using initially to get to know the process and the expected output from AutoRest.

From a PowerShell Core command prompt (type pwsh in a command window) make the following Invoke-WebRequest (IWR) to retrieve the XKCD Swagger/OpenAPI file. I pre-created a sub-directory named XKCD under the directory where I ran the following command.

iwr https://raw.githubusercontent.com/Azure/autorest/master/docs/powershell/samples/xkcd/xkcd.yaml -outfile ./xkcd/xkcd.yaml

Download XKCD Yaml.PNG

We can now start the process to generate the PowerShell module. Change directory to the directory where you put the xkcd.yaml file and run the following command.

autorest --powershell --input-file:./xkcd.yaml

AutoRest Generate Libs.PNG

Following a successful run a generated folder will be created. We can now build the PowerShell module, adding the -test flag so that once the module is generated Pester will be used to test it. First run PowerShell Core (pwsh), then build-module.

pwsh 
./generated/build-module.ps1 -test

AutoRest - Build Module.PNG

With the module built and the tests passed we can run it. We load the module and list the cmdlets that have been built for the API.

.\generated\run-module.ps1 xkcd.psm1
get-command -module xkcd

AutoRest - Load Module.PNG

Using the Get-XkcdComicForToday cmdlet we can query the XKCD API and get the comic of the day.

Get-XkcdComicForToday | fl

AutoRest - Comic of the Day.PNG

Taking it one step further, we can download the comic of the day and open it with this PowerShell one-liner.

invoke-webrequest (Get-XkcdComicForToday).img -outfile image.png ; & ./image.png

Download and Display XKCD Comic of the Day.PNG

Let’s Create a Simple API – BOFH Excuse Generator

I created an HTTP Trigger based PowerShell API Function that takes GET requests and returns a BOFH – Bastard Operator from Hell (warning: link has audio) style excuse. For those that haven’t been in the industry for 20+ years or aren’t familiar with the Bastard Operator from Hell, it is essentially a set of (semi) fictional transcripts of user interactions from a Service Desk Operator’s (from Hell) perspective, set in a University environment. Part of the schtick is an Excuse of the Day. My API, when queried, returns a semi-plausible Excuse of the Day.

Here is my HTTP Trigger PowerShell Azure Function (v1). A simple random excuse generator.
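A minimal sketch of such a v1 HTTP trigger run.ps1 is shown below; the excuse list is illustrative, not the original set.

# Minimal sketch of a v1 HTTP trigger PowerShell Function — the excuses are illustrative
$excuses = @(
    'Solar flares are interfering with the network card',
    'The switch is stuck in half-duplex mood today',
    'The backup tape was eaten by the cleaner''s vacuum',
    'DNS is being rebuilt, one character at a time'
)

$excuse = @{ excuse = (Get-Random -InputObject $excuses) } | ConvertTo-Json

# In v1 PowerShell Functions the HTTP response is written to the file path in $res
Out-File -Encoding UTF8 -FilePath $res -InputObject $excuse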

I configured the function for anonymous authorisation, GET operations only, and the route to be excuse.

Testing the Function from my local machine showed it was working and returning an excuse.

Now it’s time to generate the OpenAPI Spec. Select the Azure Function App => Platform Features => API definition

Select Generate API definition template and the basics of the OpenAPI spec for the new API will be generated. I then updated it with the application/json output and what the API returns, as shown in the highlighted sections below. Select Save.

Now we can test it out and we see that we have success.

Taking the OpenAPI spec (YAML file) for our BOFH Excuse API we go through the same steps to generate the PowerShell module using AutoRest. Running the freshly baked cmdlet from the BOFH PowerShell module returns a BOFH excuse. Awesome.
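Assuming the spec is saved locally as bofh.yaml (the file, module and cmdlet names below depend on the spec and are assumptions), the steps mirror the XKCD example:

# Generate, build, test and load the module — names are assumptions based on the spec
autorest --powershell --input-file:./bofh.yaml
./generated/build-module.ps1 -test
./generated/run-module.ps1 bofh.psm1

# The generated cmdlet name depends on the operationId defined in the OpenAPI spec
Get-Command -Module bofh
Get-BofhExcuse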

Summary

Using AutoRest it is possible to generate PowerShell modules for our APIs. Key to successful generation is the definition of the OpenAPI spec for the API. In my example above, if you don’t define what the API call returns then the resulting cmdlet will query the API but won’t return the result.

Empowering your long running PowerShell Automation Scripts with SMS/Text Notifications

18 months ago I wrote this post that detailed integrating Push Notifications into your scripts. That still works great, but it does require that you have the associated Pushbullet application installed in your browser or on your devices. More recently I wrote about using BurntToast for progress dialogs for long-running scripts. That too is great if you are present on the host running those scripts. But what if you want something a little more native and ubiquitous? Notifications for those autonomous or long-running scripts where you aren’t active on the host running them, and you don’t want the hassle of another application specific to that purpose? How about SMS/Text notifications?

This post details how to use Twilio (a virtual telco) from your PowerShell scripts to send SMS/Text alerts, so you can receive notifications like this;

Everything is on Fire.PNG

Twilio

Twilio is a virtual telco (amongst other products) that allows you to use services such as the mobile network from your application. For their SMS/TXT service they even give you a credit to get started. Sending SMS/TXT messages from Australia costs AU$0.0550 per message.

Sign up for a Twilio trial account, then enable and verify it. Take note of the following items as you will need them for your script;

  • Service mobile number (initially the Trial Number)
  • Account SID
  • Auth Token

Using these pieces of information with the Twilio API, you can send SMS/Text notifications from your PowerShell scripts (well, from any language, but I’m showing you how with PowerShell). You can get your Service Number, Account SID and Auth Token from the Dashboard after registering for a Trial Account.

Trial Account Dashboard.PNG

To enable SMS/TXT go to the Twilio Programmable SMS Dashboard here and create a New Messaging Service.

For my use (notifications from scripts) I selected Notifications, Outbound Only.

Create New Messaging Service.PNG

Once created you will see the following on the Programmable SMS Dashboard. That’s it, you’ve activated SMS/TXT in Trial Mode.

SMS Dashboard.PNG

The Script

Here is a Send-TextNotification PowerShell Function that takes;

  • Mobile number to send the notification to
  • Mobile number the message is coming from
  • Message to send
  • Auth Token
  • Account SID
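A minimal sketch of such a function, using Twilio’s Messages REST endpoint (parameter names are illustrative):

function Send-TextNotification {
    # Minimal sketch — parameter names are illustrative
    param (
        [Parameter(Mandatory = $true)][string]$AccountSid,
        [Parameter(Mandatory = $true)][string]$AuthToken,
        [Parameter(Mandatory = $true)][string]$From,     # your Twilio service number
        [Parameter(Mandatory = $true)][string]$To,       # destination mobile number
        [Parameter(Mandatory = $true)][string]$Message
    )

    # Twilio uses HTTP Basic authentication with the Account SID and Auth Token
    $bytes = [System.Text.Encoding]::UTF8.GetBytes("$($AccountSid):$($AuthToken)")
    $headers = @{ Authorization = "Basic $([Convert]::ToBase64String($bytes))" }

    $uri = "https://api.twilio.com/2010-04-01/Accounts/$($AccountSid)/Messages.json"
    $body = @{ To = $To; From = $From; Body = $Message }

    Invoke-RestMethod -Method Post -Uri $uri -Headers $headers -Body $body
}

Calling it is then a one-liner, e.g. Send-TextNotification -AccountSid $sid -AuthToken $token -From '+61400000000' -To '+61400111222' -Message 'Everything is on fire'.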

Line 46 sends an SMS/TXT notification using my Send-TextNotification script. Update;

  • Line 32 for your Twilio Account SID
  • Line 34 for your Twilio Account Token
  • Line 40 for the mobile number to send the message to
  • Line 42 for the authorised number you verified to send from
  • Line 44 for your notification message

Summary

Using the Twilio service, my small function and a few parameters, you can quickly add SMS/TXT notifications to your PowerShell scripts. Once you have it up and running I encourage you to upgrade your account and pay the few dollars for use of the service, which also removes the "Sent from your Twilio trial account" text from the messages. Twilio also has a WhatsApp Notification Service.



Changing SailPoint IdentityNow Identity Profiles Priorities using PowerShell

In SailPoint IdentityNow a single user is highly likely to be represented on multiple Sources, which in turn are likely to be authoritative for different SailPoint IdentityNow Identity Profiles. Often the first or last Identity Profile you create isn’t the one you wish to have the highest or lowest priority, and you therefore need to change an Identity Profile’s precedence so that the correct Identity Profile is associated with your identities.

The priority of IdentityNow Identity Profiles cannot be changed through the Portal, but it is possible to perform the change via the API as detailed in this Compass document.

Rather than following the Postman path described in that document, and knowing I’m going to need to do this irregularly but relatively quickly, I’ve written a little PowerShell script to make the changes.

By default, when an Identity Profile is created it is added to the bottom of the list and its priority is incremented by 10 from the last Identity Profile’s priority. The script will by default make the priority of the Identity Profile you choose 5 higher than the Identity Profile you’re moving it above.

The following screenshot shows 5 Identity Profiles in their priority order. Let’s say we wanted to move the System Accounts Identity Profile from the bottom priority to between Cloud Identities and Badged Identities.

IdentityNow Priority List.PNG

Using the script (at the bottom of this post) we can authenticate to IdentityNow and retrieve the IdentityNow Profiles with their priorities. It will ask which IdentityNow Profile you wish to increase the priority of, defaulting to the one with the lowest priority.

Identity Profile Priority Changer 1.PNG

You are then prompted for where you would like to move it. Type the name of the Identity Profile you want to move it above.

Identity Profile Priority Changer 2a.PNG

Confirm your selections by typing ‘y’. Anything else will cancel the operation.

Identity Profile Priority Changer 3.PNG

The update will be made in IdentityNow and the output will indicate the updated priority given to the Identity Profile that was moved.

Identity Profile Priority Changer 4.PNG

Checking in the IdentityNow Portal we can see that the Identity Profile was moved from the bottom to between Cloud Identities and Badged Identities.

IdentityNow Priority List Updated.PNG

The Script

Below is the script that performs the changes to Identity Profile priorities. Update the following script for;

  • Line 2 for your Client ID
  • Line 4 for your Client Secret
  • Line 8 for your Org name
  • Line 10 for your Admin Account name
  • Line 11 for your Admin Account password

Summary

Using this script is a quick way to change the priority of Identity Profiles in SailPoint IdentityNow.

SailPoint IdentityNow Identity Profiles Mapping Report

Last year I wrote this post here that detailed using the SailPoint IdentityNow API to generate an IdentityNow Sources HTML Report using PowerShell.
In a similar vein, this post does the same but for IdentityNow Identity Profiles. The example script below connects to IdentityNow, extracts all the Identity Profiles, pulls out the details of the Mappings and creates an HTML report with a section for each Identity Profile.

SailPoint IdentityNow Identity Profiles Report.PNG

Report Script

Update the script below for;

  • Line 2 for your v1 API ClientID
  • Line 4 for your v1 API Client Secret
  • Line 8 for your Org Name
  • Line 10 for your Admin Name
  • Line 11 for your Admin Password
  • Line 22 for an image path for the report
  • Line 27 for the output path for the report

The Report

The resulting report, located in the output path you specified, will let you expand each of your Identity Profiles and see the attribute mapping configuration associated with it. A snippet of an Identity Profile is shown below.

SailPoint IdentityNow Identity Profiles Report Details.PNG

Summary

The ability to report on the attribute mappings for Identity Profiles gives a quick way to document your configuration. If you’re so inclined, the script can easily be extended to report on all other aspects of an IdentityNow Identity Profile’s configuration.

The image I’m using in the report is from this page and sized at 240 x 82 px.

Darren’s PowerShell Snippets Volume 1

I live in PowerShell and my memory is pretty good. There are a number of common PowerShell commands, one-liners and functions that I use a lot and can remember. However, there are also the ones I use less regularly, where I often find myself trying to recall the last script I used them in, just to locate that script and grab those couple of lines. So I’m posting a bunch of them here, if for nothing else than to help me find them quickly. Consider this my common PowerShell snippets, Volume 1.

Unix Time

For a number of APIs I interact with I need to provide the current time in Unix format as part of the API request. This one-liner around Get-Date does that.

$utime = [int][double]::Parse((Get-Date -UFormat %s))

The output looks like this

1551906706

URL Encode

Often you need to encode a URL or query. The following shows taking a query and URL encoding it.

$query = "attributes.firstname='Darren' AND attributes.lastname='Robinson'"
Add-Type -AssemblyName System.Web
$queryEncoded = [System.Web.HttpUtility]::UrlEncode($query)

The encoded query then looks like

attributes.firstname%3d%27Darren%27+AND+attributes.lastname%3d%27Robinson%27

Basic Authentication Header

The following will create a Basic Authentication Header from a ClientID and Client Secret that can then be used with Invoke-RestMethod or Invoke-WebRequest

$clientID = 'abcd1234567'
$clientSecret = 'abcd12345sdkslslfjahd'
$Bytes = [System.Text.Encoding]::utf8.GetBytes("$($clientID):$($clientSecret)")
$encodedAuth =[Convert]::ToBase64String($Bytes)
$header = @{Authorization = "Basic $($encodedAuth)" }

You then use the $header variable in your web request, e.g.

invoke-restmethod -method get -uri "https://webservice.com" -headers $header

Converting a String to Proper/Title Case

Sometimes you get a string that is SHOUTING at you, or just badly formatted, and you need to make it look as it should. The following will convert a string to title case.

$Surname = (Get-Culture).textinfo.totitlecase("BaDlY-ForMAtted SurNAME".tolower())

The BaDlY-ForMAtted SurNAME will then become

Badly-Formatted Surname

TLS

I have written this up before in more detail for a slightly different scenario here. But often the quick one-liner to force PowerShell to use TLS 1.2 for web requests is

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

Splitting a Large Collection into manageable chunks

Way too regularly I have a dataset of 100k+ objects. In order to parallelise the processing of that data across multiple threads I need to break that 100k+ collection into smaller chunks.

The following takes $LARGEDATASET, say a collection of tens or hundreds of thousands of objects, and splits it into groups of 1000.

$counter = [pscustomobject] @{ Value = 0 }
$groupSize = 1000
$groups = $LARGEDATASET | Group-Object -Property { [math]::Floor($counter.Value++ / $groupSize) }

You can then determine how many it created with

$groups.Count

and then check out each group by indexing into the collection array

$groups[0].Group

Parallel Processing

As an extension of splitting a large collection into smaller collections, you may then want to process a collection with multiple threads (e.g. in parallel).

The following snippet leverages the Invoke-Parallel function from Rambling Cookie Monster that I’ve mentioned previously, such as here.

Change the Throttle switch for the number of threads. Change the timeout if required (if, for instance, you are smashing your own (not someone else’s) API).

. .\PATH-TO\invoke-parallel.ps1   # dot-source the Invoke-Parallel function so it is available in this session
$result = Invoke-Parallel -InputObject $CollectionOfObjects -throttle 10 -runspaceTimeout 60 -ImportVariables -ImportModules -ScriptBlock {
   try {
        $query = Invoke-RestMethod -method Get -uri "https://api.application.com" -headers $header
      } catch {
        write-host $_
      }
   @($query)
}
$result.Count

Join Large Collections

I mentioned this previously in this post here. This snippet quickly joins objects between two large collections. It utilises the Join-Object function, again from Rambling Cookie Monster.

$reportData = Join-Object -Left $collection1 -Right $collection2 -LeftJoinProperty UserPrincipalName -RightJoinProperty UserPrincipalName -Type AllInLeft

Un-escaping a JSON Document

Using ConvertTo-JSON will escape special characters as shown below.

{"description": "Australian G\u0026f Logistics Ltd"}

Using the following will unescape the JSON document (if the system you’re interfacing with doesn’t unescape on consumption).

# $myObject is the object being serialised; Unescape reverses the escaping applied by ConvertTo-Json
$myObject | ConvertTo-Json | ForEach-Object { [System.Text.RegularExpressions.Regex]::Unescape($_) }

The JSON document will then become

{"description": "Australian G&f Logistics Ltd"}

That’s it for Vol 1 of Darren’s PowerShell snippets. I’ll start compiling others as I search for them and don’t find them in this Vol.

Creating SailPoint IdentityNow Access Profiles via API and PowerShell

Managing SailPoint IdentityNow Access Profiles is easy enough to do using the SailPoint IdentityNow Portal. But what if you have the requirement to update, report on, or create numerous Access Profiles? That’s where the SailPoint IdentityNow API comes into play. The Access Profiles API is documented here but doesn’t go into a lot of detail. In this post I’ll detail interfacing with it using PowerShell, primarily to create and update Access Profiles.

Prerequisites

You will need to authenticate to the IdentityNow API. Both the v2 and v3 authentication methods work. I detail the v2 method here and the v3 method here. Personally I’m using the v3 method. Just make sure you adjust the request Headers and variable names to match whichever method you use.

For reference my v3 token variable is $v3Token, so my Authentication Header is then @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"}

Getting Access Profiles

The Access Profiles API URI is

https://$($yourOrgName).api.identitynow.com/v2/access-profiles

If you know the ID of the Access Profile you can return just that Access Profile via its ID (where $accessProfileID is the ID that looks like 2c91808466a64e330112a96902ff1f69)

https://$($yourOrgName).api.identitynow.com/v2/access-profiles/$($accessProfileID)

The following script will return Access Profiles from your SailPoint IdentityNow Tenant. Update;

  • Line 2 for your IdentityNow Org name
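If you just want the shape of the call, a minimal sketch (assuming the $v3Token variable from the Prerequisites section) looks like this:

# Minimal sketch — assumes $v3Token from the authentication step above
$orgName = "yourIdentityNowOrg"
$headers = @{ Authorization = "$($v3Token.token_type) $($v3Token.access_token)" }

# Return all Access Profiles
$accessProfiles = Invoke-RestMethod -Method Get -Uri "https://$($orgName).api.identitynow.com/v2/access-profiles" -Headers $headers

# Or return a single Access Profile by its ID
$accessProfileID = "2c91808466a64e330112a96902ff1f69"
$accessProfile = Invoke-RestMethod -Method Get -Uri "https://$($orgName).api.identitynow.com/v2/access-profiles/$($accessProfileID)" -Headers $headers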

Updating Access Profiles

To update an Access Profile the API URI is;

https://$($orgName).api.identitynow.com/v2/access-profiles/$($accessProfileID)

The following will update an existing Access Profile to make Request Comments Required and Denied Comments Required equal False.

Update;

  • Line 2 for your IdentityNow Org name
  • Line 6 for the ID of the Access Profile you want to update
  • Lines 9 and 10 for the settings to update

Preparing to create an Access Profile

In order to create an Access Profile, there are a number of configuration items that you will need to provide. The key items are;

  • SourceID is the (currently) five-digit ID of a Source that you can get from the IdentityNow Portal when looking at the properties of a Source, or via the API as I detailed in this post.
  • OwnerID is the Identity ID for the user you will make the owner. To do that you will need to query IdentityNow for the user (see below for an example)
  • Entitlements
    • In order to get the Entitlement ID(s) to assign to the Access Profile you will need to query the Source. This post here details querying Sources to get Entitlements, where you can get the ExternalID of Entitlements.

Search for Owner ID Request Object

Here is an example Search Request JSON Object required for the Search User call. Update it for a unique attribute for an Identity to query and return.

Update it for your criteria, e.g. if you copy the JSON below;

  • Create a variable for it
    • $requestFilter = 'JSON snippet content from below'
  • Update the search criteria for your search
    • $newRequestFilter = $requestFilter.Replace("darren.robinson@customer.com.au","yourUser@mydomain.com")
  • Then search for the user and get the ID of the identity
  • Update
    • Line 2 for your Orgname
    • Line 8 for the user to search for that matches the JSON object from above

Creating an Access Profile

Finally, now that we have the prerequisite information to create an Access Profile we can create it. Modify for your environment based on the information retrieved from the processes above. Namely;

  • Line 1 for your Orgname
  • Line 2 for the SourceID associated with the Access Profile
  • Line 3 for the Access Profile Owner’s ID
  • Lines 7-10 for your Access Profile Details
  • Line 14 for the Entitlements
  • Line 19 for the Approver (see below for more details)

Access Profile Approvers

For approvers you can specify the order of approval. For approval by the Access Profile Owner and then the Manager, use the following when creating the Access Profile in Line 19 above.

  • $accessProfile.add("approvalSchemes","accessProfileOwner, manager")

Other options are:

  • SourceOwner
  • appOwner
  • Governance Group. See managing Governance Groups here to get the Governance Group ID (GUID format)
    • workgroup: 86929844-3391-4ce2-80ef-760127e15813
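Pulling the pieces above together, here is a hedged sketch of what the creation call could look like. The body property names (name, description, sourceId, ownerId, entitlements, approvalSchemes) and the POST verb are assumptions rather than confirmed API fields, and the IDs shown are purely illustrative.

# Hedged sketch — body property names and values are assumptions/illustrative
$orgName = "yourIdentityNowOrg"
$headers = @{ Authorization = "$($v3Token.token_type) $($v3Token.access_token)"; "Content-Type" = "application/json" }

$accessProfile = @{}
$accessProfile.Add("name", "Sydney Office Access")            # illustrative name
$accessProfile.Add("description", "Access for Sydney staff")  # illustrative description
$accessProfile.Add("sourceId", "12345")                       # five-digit Source ID
$accessProfile.Add("ownerId", "2c918083606d670c01606f35a30a0349")        # owner's Identity ID (illustrative)
$accessProfile.Add("entitlements", @("2c91808a66a64e330112a96902ff1f69")) # Entitlement ID(s) (illustrative)
$accessProfile.Add("approvalSchemes", "accessProfileOwner, manager")

Invoke-RestMethod -Method Post -Uri "https://$($orgName).api.identitynow.com/v2/access-profiles" -Headers $headers -Body ($accessProfile | ConvertTo-Json)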

Summary

Whilst the creation of an Access Profile via API does require some configuration, if you have multiple to create and you know the criteria it is possible to automate the task. I hope this helps others.


Searching and Returning SailPoint IdentityNow Entitlements using the API and PowerShell

Entitlements on IdentityNow Sources can be leveraged for many purposes within IdentityNow. Recently I’ve been looking to automate the creation of some Access Profiles that will in turn have entitlements associated with them.

This post details how to query for Entitlements in IdentityNow using the v3 API and PowerShell.

Prerequisites

You will need to Authenticate to the IdentityNow API. The v3 authentication method is required. I detail the v3 method here. The Headers for the requests detailed in this post use the following variables for the JWT oAuth Token.

My v3 token variable is $v3Token, so my Authentication Header is then
@{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"; "Content-Type" = "application/json"}

Searching for Entitlements

The Base API URI to search for entitlements is;

https://$($org).api.identitynow.com/cc/api/entitlement/list

You will also need to provide a timestamp and the Source for which you want to retrieve entitlements.

Generating the Timestamp

The timestamp is in Unix format which can be generated in PowerShell like this;

$utime = [int][double]::Parse((Get-Date -UFormat %s))

Getting a list of Sources

I’ve previously described listing IdentityNow Sources in this post and this post. Essentially though you can return a list of all sources by performing a GET request to

https://$($orgName).api.identitynow.com/cc/api/source/list

Obtain the Source ExternalID of the source you wish to return entitlements for.

Entitlement Results

You can limit the number of entitlements returned by using the limit option. The following will return the first 1000 entitlements for a source starting at 0

&start=0&limit=1000

If the source has more than 1000 then you will need to page the results to return the next 1000. Continue until you’ve returned them all.

&start=1000&limit=1000

Of course you can omit the limit entirely and all entitlements will be returned in a single call.
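Putting the pieces together, a sketch of the request is below. The query string parameter names used for the timestamp (_dc) and the Source (CISOURCEID) are assumptions, so verify them against your tenant before relying on this.

# Hedged sketch — the _dc and CISOURCEID parameter names are assumptions
$orgName = "yourIdentityNowOrg"
$headers = @{ Authorization = "$($v3Token.token_type) $($v3Token.access_token)"; "Content-Type" = "application/json" }

$utime = [int][double]::Parse((Get-Date -UFormat %s))
$sourceExternalID = "12345"   # ExternalID of the Source to return entitlements for

$uri = "https://$($orgName).api.identitynow.com/cc/api/entitlement/list?_dc=$($utime)&CISOURCEID=$($sourceExternalID)&start=0&limit=1000"
$sourceEntitlements = Invoke-RestMethod -Method Get -Uri $uri -Headers $headers

# Number of entitlements returned for the Source
$sourceEntitlements.items.Count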

Finding an Entitlement on a Source

Using the power of PowerShell it is quick to find the entitlement you want if you know some information about it. For referencing an Entitlement when creating an Access Profile via the API you will need the Entitlement ID, e.g.

$myEntitlement = $sourceEntitlements.items | Select-Object | Where-Object {$_.displayName -like "*Sydney*"}
$myEntitlement.id

Summary

Using the IdentityNow API and the v3 endpoints we can retrieve entitlements for a Source and quickly locate the entitlement and the entitlement ID for use during automation of IdentityNow tasks such as Access Profile creation.

Azure AD/Active Directory User Security Evaluation Reporter

From December 2018 to February 2019 Microsoft ran an online Microsoft Graph Security Hackathon on Devpost.

The criterion for the hackathon was;

  • Build or update a functioning Microsoft Graph-powered solution that leverages the Microsoft Graph Security API

Following the announcement of the Hackathon I was encouraged by Kloud management to enter. During the busy month of December I started to formulate a concept for an entry, taking learnings from the hackathon I entered in 2018. Over the Xmas holiday period I started working on my entry, which continued into January and February at nights and on weekends.

Problem

A Security Administrator within an organisation enables security-related configuration options on an Azure Tenant to implement security controls that align the organisation with Microsoft recommendations and best practice.

The Azure Security Score provides an evaluation of an organisation’s alignment with best practice; however, to some extent it still requires end users to have the right configuration for security-related elements of their profile. As a Service Desk Operator or Cyber Security Officer there isn’t a single view of a user’s security posture that gives you an individual user security score summary. My solution…

Microsoft User Security Evaluation Reporter (USER)

Microsoft User Security Evaluation Reporter (USER) is an Azure AD and Active Directory tool for Service Desk Operators and Cyber Security Officers that gives instant visibility of an organisation’s Azure Security Score and allows them to evaluate current risks within the organisation, right down to individual users.

When Microsoft USER loads, the current Azure Security Score is retrieved, evaluated and displayed for alignment with Microsoft recommendations. Also on load, the last 5 active Security Risk Events are displayed.

Microsoft USER Recent Risk Events and Azure Secure Score.PNG

The Service Desk Operator or Cyber Security Officer can select one of the recent Security Events or search for a user and drill down into the associated identity. They will quickly be able to understand the user’s individual security posture and its alignment with best practice.

What are the recent Security Risk Events for that user? Does that user;

  • Have MFA enabled? Is MFA enabled with an Authenticator App as the primary method?
  • Is the user’s Active Directory password in the Pwned Passwords v4 list from Have I Been Pwned?
  • Has the user recently been attempting Azure Password Reset functions?
  • What are the last 10 logins for that user?
  • What is the base user information for that user and what devices are registered to that user? Are they Azure AD Joined?

User Secure Score Summary.PNG

The clip below gives a more detailed walk-through of my Microsoft USER tool.

How I built it

The solution is built using;

  • NodeJS and JavaScript
  • Azure Functions to interface with Azure AD, Microsoft Graph and Azure Table Service
  • Lithnet Password Protection for Active Directory, which in turn leverages the Have I Been Pwned v4 dataset
  • Azure Key Vault for all secrets
  • Application Insights enabled on the WebApp
  • The WebApp deployed in a Docker container to Azure App Service

The architecture is shown below.

MS User Security Evaluation Reporter Architecture

The Code

A repo with the code can be found here. Keep in mind I’m not a developer and this is my first WebApp; it was put together late at night and over weekends and only tested in Chrome and Edge. The Readme contains, hopefully, everything you need to deploy it.