Quickly generating a dataset of fictitious Users using Randomised Real Data and PowerShell

Introduction

I’ve lost count of the number of times I’ve needed to generate a representative dataset of users. Of course I have access to many production datasets, but for many reasons they can’t be used. Finding the datasets I’ve previously generated always seems to take longer than it should, so with my most recent iteration of generating a fictitious list of users with Australian addresses, I’ve documented how I went about it, along with the source data I used and the script to create it.

Source Data

For the data sources to base my dataset on, I wanted representative Australian data for both people’s names and locations. After a few quick searches I found;

  • Data South Australia has lists of baby names for both male and female babies in SA. I downloaded the 2017 lists as CSVs.
  • For surnames, also from Data South Australia, I borrowed the 19th Century Arrivals list, split the Fullname column on “,” and used Excel’s Remove Duplicates function. I deleted all other columns so that I was left with just over 13,000 surnames in a CSV file.
  • Matthew Proctor’s list of Australian Postcodes as a CSV. This provides Postcode, Suburb and State.
  • Brisbane City Council (Australia’s largest Council) has a dataset of all bus stop locations that includes street names, as a CSV. As with the surnames, I used Excel’s Remove Duplicates function, removed the blanks and the other columns, and was left with just over 1,600 street names.

The Script

The script is pretty simple. It imports each of the CSVs listed above and, for each attribute, picks a record at random using an index bounded by the number of records in the file.
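The shape of it is roughly this; a condensed sketch with hypothetical file and column names (the full script and source files are in the repo below):

# Condensed sketch only; file and column names are illustrative
$dataPath   = 'C:\Data\RandomUsers'
$givenNames = Import-Csv "$dataPath\babynames.csv"
$surnames   = Import-Csv "$dataPath\surnames.csv"
$locations  = Import-Csv "$dataPath\postcodes.csv"     # Postcode, Suburb, State
$streets    = Import-Csv "$dataPath\streetnames.csv"

$users = for ($i = 0; $i -lt 10; $i++) {
    # Pick a random record from each source file
    $location = $locations[(Get-Random -Maximum $locations.Count)]
    [PSCustomObject]@{
        GivenName = $givenNames[(Get-Random -Maximum $givenNames.Count)].GivenName
        Surname   = $surnames[(Get-Random -Maximum $surnames.Count)].Surname
        Street    = "$(Get-Random -Minimum 1 -Maximum 999) $($streets[(Get-Random -Maximum $streets.Count)].StreetName)"
        Suburb    = $location.Suburb
        Postcode  = $location.Postcode
        State     = $location.State
    }
}
$users | ConvertTo-Json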

The GitHub Repo contains the PowerShell script along with the source files. Change line 3 for the location where you store the CSV files and change line 66 for the number of users to generate. I’ve left the end of the script empty. I either insert the API call to create the users, or the PowerShell cmdlet with the data to do the creation depending on where I’m creating the users.

GitHub: darrenjrobinson/Generate-Random-Users – Using real data, randomise it to create realistic users with Australian addresses

The Output

Here is a sample output in JSON format.

{
  "Street": "370 Miskin St",
  "Surname": "Burne",
  "Suburb": "WOODBROOK",
  "Postcode": "3451",
  "State": "VIC",
  "GivenName": "Miro"
}
{
  "Street": "293 Preston Rd",
  "Surname": "Partingale",
  "Suburb": "MARRARA",
  "Postcode": "812",
  "State": "NT",
  "GivenName": "Daniella"
}
{
  "Street": "409 Orchard St",
  "Surname": "Liaseyer",
  "Suburb": "THURGOONA",
  "Postcode": "2640",
  "State": "NSW",
  "GivenName": "Ariana"
}
{
  "Street": "775 Station Rd",
  "Surname": "Nevin",
  "Suburb": "AVON DOWNS",
  "Postcode": "862",
  "State": "NT",
  "GivenName": "Naria"
}

Summary

Using publicly available data and PowerShell, it is possible to quickly generate a dataset of representative users and addresses. Generating other attributes is as easy as extrapolating from the existing data or supplementing it with additional source data files.

A Voice Assistant for Microsoft Identity Manager

This is the third and final post in my series around using your voice to query/search Microsoft Identity Manager or as I’m now calling it, the Voice Assistant for Microsoft Identity Manager.

The two previous posts in this series detail some of my steps and processes in developing and fleshing out this concept. The first post detailed the majority of the base functionality whilst the second post detailed the auditing and reporting aspects into Table Storage and Power BI.

My final architecture is depicted below.

Identity Manager integration with Cognitive Services and IoT Hub 4x3

I’ve put together more of an overview in a presentation format embedded here.

GitPitch Presents: github/darrenjrobinson/MIM-VoiceAssistant/presentation


If you’re interested in building the solution, check out the GitHub repo here, which includes the ReSpeaker Python script, the Azure Function etc.

Let me know how you go @darrenjrobinson

Implementing Azure API Management with the Lithnet Microsoft Identity Manager Rest API

Introduction

Earlier this week I wrote this post that detailed implementing the Lithnet REST API for FIM/MIM Service. I also detailed using PowerShell to interact with the API Endpoint.

Now let’s imagine you are looking to have a number of Azure serverless features leverage your REST API-enabled Microsoft Identity Manager environment, or even offer it “as-a-Service”. You’ll want to have some visibility as to how it is performing, and you’ll probably want to implement features such as caching and rate limiting, let alone putting more security controls around it. Enter Azure API Management, which provides all those functions and more.

In this post I detail getting started with Azure API Management by using it to front-end the Lithnet FIM/MIM Rest API.

Overview

In this post I will detail;

  • Enabling Azure API Management
  • Configuring the Lithnet FIM/MIM Rest API integration with Azure API Management
  • Accessing MIM via Azure API Management and the Lithnet FIM/MIM Rest API using PowerShell
  • Reporting

Prerequisites

For this particular scenario I’m interfacing Azure API Management with a REST API that uses Basic Authentication. So even though mine is a Windows WCF web service, you could do something similar with any comparable API endpoint. If the backend API endpoint is using SSL it will need to have a valid certificate. Even though Azure API Management allows you to add your own certificates, I had issues with self-signed certificates; I have it working fine with Let’s Encrypt issued certificates. Obviously you’ll need an Azure Subscription as well as an App/Service with an API.

Enabling Azure API Management

From the Azure Portal select Create a resource and search for API management and select it.

Add API Mgmt.PNG

Select Create

Create API Mgmt.PNG

Give your API Management Service a name, select a subscription, resource group etc and select Create.

API Mgmt Config 1.PNG

Once you select Create it will take about 30 minutes to be deployed.

Configuring the Lithnet FIM/MIM Rest API integration with Azure API Management

Once your new API Management service has been deployed, from the Azure Portal select the API Management services blade and select the API Management service that you just created. Select APIs.

API Config 1.PNG

Select Add API and then select Add a new API

API Mgmt Config 2.PNG

Give the API a name and description, enter the URI for your API endpoint, and select HTTPS. I’m going to call this MIMSearcher, so I’ve entered that under API URL Suffix. For initial testing, under Products select Starter. Finally select Create.

API Mgmt Config 4.PNG

We now have our base API setup. From the Backend tile select the Edit icon.

API Mgmt Config 5.PNG

As the backend is authenticated using Basic Authentication, select Basic under Gateway credentials and enter the details of an account with access that will be used by the API Gateway. Select Save.

API Mgmt Config 6.PNG

Now from our API Configuration select Add operation.

API Mgmt Config 7.PNG

First we will create a test operation for the Help page on the Lithnet FIM/MIM Rest API. Provide a Display name, and for the URL add /v2/help. Give it a description and select Create.

Note: I could have had v2 as part of the base URI for the API in the previous steps. I didn’t, as I will be using APIs from both v1 and v2 and didn’t want to create multiple operations.

API Mgmt Config 8.PNG

Select the new Operation (Help)

API Mgmt Config 9.PNG

Select the Test menu. Select Send.

API Mgmt Config 10.PNG

If everything is set up correctly you will get a 200 Success OK response as below.

API Mgmt Config 11.PNG

Accessing MIM via Azure API Management and the Lithnet FIM/MIM Rest API using PowerShell

Head over to your API Portal. The URL is https://<yourAPIMName>.portal.azure-api.net/ where <yourAPIMName> is the name you gave your API Management Service (shown in the third screenshot at the top of this post). If you are doing this from the browser you used to create the API Management Service you should be signed in already. From the Administrator menu on the right select Profile.

Test API Mgmt 1.PNG

Click on Show under one of the keys and record its value.

Test API Mgmt 2.PNG

Using PowerShell ISE or VSCode update the following Code Snippet and test.

# APIM endpoint for the Help operation; replace <yourAPIMName> and <APIURLSuffix>
$APIURL = 'https://<yourAPIMName>.azure-api.net/<APIURLSuffix>/v2/help'
# Your APIM subscription key (from the Developer Portal profile page)
$secret = 'yourSecret'
$Headers = @{'Ocp-Apim-Subscription-Key' = $secret} 
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

$response = Invoke-RestMethod -Uri $APIURL -Headers $Headers -ContentType "application/json" -UseBasicParsing -Method Get
$response

The snippet will create a Web Request to the new API and display the results.

Test API Mgmt 3.PNG

Querying the Lithnet Rest API via Azure API Management

Now that we have a working solution end-to-end, let’s do something useful with it. Looking at the Lithnet Rest API, the Resources URI is the key one exposing Resources from the MIM Service.

Resources.PNG

Let’s create a new Operation for Resources similar to what we did for the Help. After selecting Create configure the Backend for Basic Authentication like we did for Help.

Get Resources.PNG

Testing out the newly exposed endpoint is very similar to before: just a new API URL, this time querying for Person resources to return all Person objects from the MIM Portal (see the snippet below). It lets us know it’s returned 7256 Person objects, and the results are still paged (100 by default).
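Using the same headers and key as the snippet above, the call is simply (placeholders and filter syntax as in the search example that follows):

# Same pattern as the Help call, just a different operation URI
$APIURL = 'https://<yourAPIMName>.azure-api.net/<APIURLSuffix>/v2/resources/?filter=/Person'
$persons = Invoke-RestMethod -Uri $APIURL -Headers $Headers -ContentType "application/json" -UseBasicParsing -Method Get
$persons   # reports the total number of Person objects; results are paged (100 by default)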

Get Persons.PNG

Let’s now Search for just a single user. Search for a Person object whose Display Name is ‘darrenjrobinson’.

$query = "Person[DisplayName='darrenjrobinson']"
$queryEncoded = [System.Web.HttpUtility]::UrlEncode($query)

$APIURL = "https://.azure-api.net//v2/resources/?filter=/$($queryEncoded)" 
$secret = 'yourSecret'
$Headers = @{'Ocp-Apim-Subscription-Key' = $secret} 
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

$user = Invoke-RestMethod -Uri $APIURL -Headers $Headers -ContentType "application/json" -UseBasicParsing -Method Get
$user

Executing, we get a single user returned.

Search for User.PNG

Reporting

Using the Publisher Portal we can get some Stats on what is happening with our API Management implementation.

Go to https://<yourAPIMName>.portal.azure-api.net/admin and select Analytics.

We then have visibility of what has been using the API Management Service. At a Glance gives an overview, and you can drill down into;

  • Top Users
  • Top Products
  • Top subscriptions
  • Top APIs
  • Top Operations

At a glance looks like this;

At a Glance Stats.PNG

And Top Operations looks like this;

Top Operations.PNG

Summary

That is a quick-start guide to implementing Azure API Management in front of a REST API and using PowerShell to integrate with it. Next steps would be to enable caching and get into more of the advanced features. Enjoy.

Getting started with the Lithnet REST API for the Microsoft Identity Manager Service

Introduction

A common theme with my posts on Microsoft Identity is its extensibility, particularly with the Lithnet tools that Ryan has released.

One such tool that I’ve used but never written about is the Lithnet REST API for the Microsoft Identity Manager Service. For a small proof of concept I’m working on, I was again using this REST API and needed to update it, as Ryan has recently added some new functionality. I realised I hadn’t set it up in a while, and while Ryan’s documentation is very good, it was written some time ago when IIS Manager looked a little different. So here are a couple of screenshots and a little extra info to get you started if you haven’t used it before, to supplement Ryan’s documentation located here.

Configuring the Lithnet REST API for the Microsoft Identity Manager Service

You can download the Lithnet REST API for the FIM/MIM Service from here.

If you are using the latest version of the Lithnet Rest API you will need to make sure you have .NET 4.6.1 installed. If you are running Windows Server 2012 R2 you can get it from here.

When configuring your WebSite make sure you choose .NET v4.5 Classic for the Application Pool.

WebSite AppPool Settings.PNG

The web.config must match your MIM version. Currently the latest is 4.4.1749.0 as detailed here. That therefore looks like this.

WebConfig Resource Management Version.PNG

Finally you’ll need an SSL Certificate. For development environments a Self-Signed Certificate is fine. Personally I use this Cert Generator. Make sure you put the certificate in the cert store on the machine you will be testing access with. Here’s an example of my command line for generating a cert.

Cert Generation.PNG

You could also use Lets Encrypt.

In your bindings in IIS have the Host Name match your certificate.

Bindings.PNG

If you’ve done everything right you will be able to hit the v2 endpoint help. By default with Basic Auth enabled you’ll be prompted for a username and password.

v2 EndPoint.PNG

Using PowerShell to query MIM via the Lithnet Rest API

Here is an example script to query MIM via the Lithnet MIM REST API. Update it for your credentials (lines 2 and 3), the URL of the server running the API endpoint (line 11), and what you are querying for (line 14). My script takes into account self-signed certs in a development environment.
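The embedded script isn’t reproduced here; a minimal equivalent, with a hypothetical hostname and query and the development-only certificate bypass, looks like this:

# Minimal equivalent of the embedded script; hostname and query are hypothetical
$username = 'DOMAIN\mimadmin'
$password = ConvertTo-SecureString 'yourPassword' -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ($username, $password)

# Development environments only: trust self-signed certificates
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

# The host running the Lithnet REST API endpoint
$baseURL = 'https://mimservice.customer.local:8080'

# What you are querying for (XPath filter)
$query = "Person[AccountName='darrenjrobinson']"
$uri = "$baseURL/v2/resources/?filter=/$([System.Uri]::EscapeDataString($query))"

$result = Invoke-RestMethod -Uri $uri -Credential $cred -ContentType 'application/json' -Method Get
$result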

Example output from a query is shown below.

Example Output.PNG

Summary

Hopefully that helps you quickly get started with the Lithnet REST API for the FIM/MIM Service. I showed an example using PowerShell directly, but using an Azure Function is also a valid pattern. I’ve covered similar functionality in the past.

Deploying a SailPoint IdentityNow Virtual Appliance in Azure

Introduction

The CentOS image that SailPoint provide for the IdentityNow Virtual Appliance that performs integration between ‘Sources’ and IdentityNow is VMWare based. I don’t have any VMWare Infrastructure to run it on and really didn’t want to run up any VMWare environments for this component. All my other infrastructure is in Azure. I’d love to run my VA(s) in Azure too.

In discussions with SailPoint I understand it is simply a case that they haven’t certified their CentOS image on Azure. So I figured I’d convert the VM, get it into Azure and see if it works from my Sandpit environment. This blog post details how I got it working.

Disclaimer: If you use this for more than a Sandpit/Test environment let your SailPoint CSM know. This isn’t an approved process or a supported configuration. That said, it works for me.

Overview

This is the high-level process I threw together that worked for me.

  1. Obtain the CentOS Image from the IdentityNow Virtual Appliance Setup
  2. Convert the VMWare VMDK image to Hyper-V VHD format using VirtualBox vboxmanage (free)
  3. From the Azure MarketPlace create a Seed VM based on CentOS (with new Resource Group, Storage Account, Virtual Network etc)
  4. Upload the VHD to the Azure Storage Account (associated with VM from Step 3) using Azure Storage Explorer
  5. Create a new VM based off the VM from Step 3 to use the disk from Step 4 as the Operating System disk
  6. Log in and configure the Virtual Appliance

Convert VMWare VM to Hyper-V.png

Prerequisites

  1. Virtual Box (for the disk image converter). You could probably do it with other tools but I’ve used this before and it just works.
  2. Enough hard disk space for the VA image and the converted image. The base image is ~2.8Gb and when converted to a fixed disk image it becomes ~128Gb (which can compress to ~3Gb for initial upload).
  3. Azure Storage Explorer. We’ll need this to upload the converted virtual disk to Azure.

SailPoint Virtual Appliance CentOS VMWare Image

To download the CentOS VMWare image, log in to the Admin section of your IdentityNow Tenant. Under Admin => Connections => Virtual Appliances create a New Cluster. Select that Cluster, then Virtual Appliances => New, and download the Appliance Package.

Create New VA.PNG

Converting the CentOS VMWare Virtual Disk to a Fixed Hyper-V Virtual Disk

I already had VirtualBox installed on my computer. I had to give the full path to VBoxManage (as shown below) and call it with the switches to convert the image (the source and target paths here are placeholders);

vboxmanage clonehd "<source>.vmdk" "<target>.vhd" --format VHD --variant Fixed

The --variant Fixed switch takes the dynamic image and converts it to fixed, as this is a requirement in Azure.

ConvertVADisk 1.PNG

The image conversion started and completed in under ten minutes.

Converted Fixed.PNG

Creating an Azure CentOS VM

In the Azure Portal I created a New Resource and chose CoreOS.

NewCoreOS 1

I gave it a name, chose HDD as the disk type and gave it a Username and Password.

NewCoreOS 2

I chose sizing in line with the recommendations for a Virtual Appliance.

NewCoreOS 3

And kept everything else simple (for my sandpit environment).

NewCoreOS 4

After the VM had deployed I had a Resource Group with the necessary Virtual Network, Storage Account etc.

Resource Group.PNG

Upload the Converted Disk to Azure Storage

I created a vhd container (in the Storage Account associated with the VM I just created) to hold the new VHD. Using Azure Storage Explorer I then uploaded the converted image, selecting Page Blob as the blob type.

Upload VHD

You’ll want to have a decent internet connection to do this. I converted the SailPoint image on an Azure VM (to which I added a 256Gb data disk too). I then uploaded the new 128Gb VHD disk image from within Azure to the target Resource Group in about 75 minutes.

Upload VHD 2

Below I show the SailPoint Virtual Appliance CentOS OS converted disk image uploaded to Azure Storage Account Blob Storage.

Upload VHD 3.PNG

Generate SAS Token / Get Blob URI

We won’t use a SAS Token, but this gives us easy access to the Storage Blob URL. Right-click on the VHD blob and select Generate Shared Access Signature, then select Create.

Right Click - Get Shared Access Signature

Copy the URL. We’ll need parts of this for the script to create a new CentOS VM with our VA Disk Image.

Get VHD and BLOB Details

Create the new VM for our Virtual Appliance

Update the script below for:

  • The Resource Group you created the Seed VM in (line 2)
  • The Seed VM Name (line 4)
  • The Seed VM Subnet Name (line 6)
  • The Disk Blob details in lines 8 and 10, as copied earlier

The first three are easily obtained from the Seed VM Summary as highlighted below.

After stepping through the script to create the new VM, and being happy with the new name etc., I executed the New-AzureRMVM command.
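The original script is linked above; a minimal sketch of the same approach using the AzureRM module, with hypothetical resource names throughout, looks like this:

# Sketch only; assumes the AzureRM module and hypothetical resource names
$rgName   = 'IdentityNow-RG'                  # Resource Group of the Seed VM
$location = 'australiaeast'
$vnet     = Get-AzureRmVirtualNetwork -ResourceGroupName $rgName -Name 'SeedVM-vnet'
$subnetId = $vnet.Subnets[0].Id               # the Seed VM subnet

# URI of the uploaded VHD (from the Blob URL copied earlier)
$osDiskUri = 'https://yourstorageacct.blob.core.windows.net/vhds/sailpointva.vhd'

# NIC for the new VM
$nic = New-AzureRmNetworkInterface -ResourceGroupName $rgName -Name 'sailpointva-nic' `
        -Location $location -SubnetId $subnetId

# Build the VM config, attaching the uploaded VHD as the OS disk
$vmConfig = New-AzureRmVMConfig -VMName 'SailPointVA' -VMSize 'Standard_DS2_v2'
$vmConfig = Set-AzureRmVMOSDisk -VM $vmConfig -Name 'sailpointva-osdisk' `
        -VhdUri $osDiskUri -CreateOption Attach -Linux
$vmConfig = Add-AzureRmVMNetworkInterface -VM $vmConfig -Id $nic.Id

New-AzureRmVM -ResourceGroupName $rgName -Location $location -VM $vmConfig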

Create New VM

And the VM was created in a couple of minutes.

Create VM Initiated

Accessing the new VM

Getting the IP address from the new VM Summary I SSH’d into it.

VM Started

And logged in with the default credentials from SailPoint. (Windows Subsystem for Linux is awesome).

SSH In to VA

Next Steps

  1. Change the password on your Virtual Appliance (passwd)
  2. Create a DNS Name, update the configuration as per SailPoint VA Configuration tasks
  3. Create the VA and Test the Connection from the IdentityNow Portal
  4. Delete your original SeedVM as it is no longer required
  5. Add an NSG to the new VM
  6. Create another VM in a different location for High Availability and configure it in IdentityNow

Below shows my Azure-based Virtual Appliance connected and all set up.

Cluster Up and Running.PNG

Summary

Whilst not officially supported, it is possible to convert the SailPoint Virtual Appliance VMWare-based image to an Azure-compatible Hyper-V image and assign it as the Operating System disk on an Azure Linux (CoreOS) Virtual Machine. If you need to do something similar I hope my approach gives you some ideas.

If you then need to create another Virtual Appliance in Azure, you now have a disk image you can upload to wherever it needs to be and assign to a VM for the creation of another Virtual Appliance VM.

How to use the FIM/MIM Azure Graph Management Agent for B2B Member/Guest Sync between Azure Tenants


Introduction

UPDATE: June 2018

When I originally wrote this post the intent was to test the ability of the Graph MA to export to Azure AD. That works.

That then extended to messing with an identity type other than member (which works to an extent), but I detailed guests. However, that is incomplete. I do have a working solution that utilises the Graph Invitation API via my favourite PowerShell MA (Granfeldt PS MA) and the PowerShell cmdlet New-AzureADMSInvitation from the Azure AD PowerShell Module.

That solution involves the MS Graph MA connected to a Partner tenant to get visibility of partner users, then creating relevant users in the home tenant via a PowerShell MA and the New-AzureADMSInvitation cmdlet. Another MS Graph MA connected to the home tenant provides easy visibility of additional guest user attributes for ongoing functions such as reporting and de-provisioning.

I will write that up later in July. Stay tuned, and keep the above in mind when reading this post.


Just landed from the Microsoft Identity Manager Engineering Team is a new Management Agent built specifically for managing Azure AD Users, Groups and Contacts.

Microsoft have documented a number of scenarios for implementing the management agent. The scenarios the MA has been built for are valid and I have customers that will benefit from the new MA immediately. There is however another scenario I’m seeing from a number of customers that is possible but not detailed in the release notes. That is B2B Sync between Azure Tenants; using Microsoft Identity Manager to automate the creation of Guests in an Azure Tenant.

This could be one-way or multi-way depending on what you are looking to achieve. Essentially this is the Azure equivalent of using FIM/MIM for Global Address List Sync.

B2B MA.png

Overview

The changes are minimal to the documentation provided with the Management Agent. Essentially;

  • ensure you enable Write Permissions for the Application you create in the AAD Tenant you will be writing to
  • Enable the Invite Guest users to the organization permission on the AAD Application
  • Create an Outbound Sync Rule to an AAD Tenant with the necessary mandatory attributes
  • Configure the Management Agent for Export Sync Profiles

In the scenario I’m detailing here I’m showing taking a number of users from Org2 and provisioning them as Guests in Org1.

What I’m detailing here supplements the Microsoft documentation. For configuring the base MA definitely checkout their documentation here.

Microsoft Graph Permissions

When setting up the Graph Permissions you will need to have Write permissions to the Target Azure AD for at least Users. If you plan to also synchronize Groups or Contacts you’ll need to have Write permissions for those too.

Graph Permissions 1

In addition as we will be automating the invitation of users from one Tenant to another we will need to have the permission ‘Invite guest users to the organization’.

Graph Permissions 2

With those permissions selected and while authenticated as an Administrator select the Grant Permissions button to assign those permissions to the Application.

Grant Permissions 1Grant Permissions 2

Repeat this in both Azure AD Tenants if you are going to do bi-directional sync.  If not you only need write and invite permissions on the Tenant you will be creating Guest accounts in.

Creating the Import/Inbound Sync Rules Azure Tenants

Here is an example of my Import Sync Rules to get Members (Users) in from an Azure Tenant. I have an inbound sync rule for both Azure Tenants.

Sync Rules.PNG

Make sure you have ‘Create Resource in FIM‘ configured on the source (or both if doing bi-directional) Graph Connector.

Sync Rule Relationship.PNG

The attribute flow rules I’ve used are below. They are a combination of the necessary attributes to create the corresponding Guest account on the associated management agent and enough to be used as logic for scoping who gets created as a Guest in the other Tenant. I’ve also used existing attributes negating the need to create any new ones.

Inbound SyncRule Flow.PNG

Creating the Export/Outbound Sync Rule to a Partner B2B Tenant

For your Export/Outbound rule make sure you have ‘Create resource in external system’ configured.

Export Relationship.PNG

There are a number of mandatory attributes that need to be flowed out in order to create Guests in Azure AD. The key attributes are;

  • userType = Guest
  • accountEnabled = True
  • displayName is required
  • password is required (and not export_password as normally required on AD-style MAs in FIM/MIM)
  • mailNickname is required
  • for dn and id I’m initially using the id (flowed on import to employeeID) from the source tenant. This needs to be provided to the MA to get the object created. Azure will generate new values on export so we’ll see a rename come back in on the confirming import
  • userPrincipalName is in the format of
    • SOURCEUPN (with @ replaced with _ ), then #EXT#@, then the DestinationUPNSuffix
    • e.g. user1_org2.com#EXT#@org1.com

Export Attributes.PNG

Here is an example of building a UPN.

UPN Rule.PNG
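Purely for illustration (the sync rule implements this as a custom expression, not PowerShell), the transformation is:

# Illustration only; the sync rule implements this as a custom expression
$sourceUpn  = 'user1@org2.com'
$destSuffix = 'org1.com'
$guestUpn = ($sourceUpn -replace '@', '_') + '#EXT#@' + $destSuffix
$guestUpn   # user1_org2.com#EXT#@org1.com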

Sets, Workflows and MPR’s

I didn’t need to do anything special here. I just created a Set, based on attributes coming in from the source Azure Tenant, to scope who gets created in the target Tenant, and an MPR that looks for transition into the Set and applies the Workflow that associates the Sync Rule.

End to End

After synchronizing in from the source (B2B Org 2), the provisioning rules trigger and create the users as Guests in B2B Org 1.

Prov to Org1 1.PNG

Looking at the Pending Export we can see our rules have applied.

Pending Export.PNG

On Export the Guest accounts are successfully created.

Export Success.PNG

On the confirming import we get the rename as Azure has generated a new CN and therefore DN for the Guest user.

Rename on Import 2.PNG

Looking into Azure AD we can see one of our new Guest users.

User in AAD.PNG

Summary

We can leverage the Microsoft Azure B2B Graph Management Agent to create users from one Tenant as Azure AD Members in another Tenant. Stay tuned for another post detailing the solution described in the Update in the Introduction.

Validating a Yubico YubiKeys’ One Time Password (OTP) using Single Factor Authentication and PowerShell

Multi-factor Authentication comes in many different formats. Physical tokens have historically been very common and, moving forward with the FIDO v2 standards, will likely continue to be so for many security scenarios where soft tokens (think Authenticator Apps on mobile devices) aren’t possible.

Yubico YubiKeys are physical tokens that have a number of properties that make them desirable: they don’t use a battery (so aren’t limited by battery life), they come in many differing formats (NFC, USB-3, USB-C), they can hold multiple sets of credentials, and they support open standards for multi-factor authentication. You can check out Yubico’s range of tokens here.

YubiKeys ship pre-configured to be validated against YubiCloud. Before we configure them for a user, I wanted a quick way to validate that a YubiKey is valid. You can do this using Yubico’s demo webpage here, but for other reasons I needed to write my own. There weren’t any PowerShell examples anywhere, so now that I’ve worked it out, I’m posting it here.

Prerequisites

You will need a YubiKey, and you will need to register and obtain a Yubico API key (using your YubiKey) from here.

Validation Script

Update the following script, changing line 2 to the ClientID you received after registering for the Yubico API above.
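The script itself is embedded in the original post; a minimal sketch of the validation call (an unsigned request over HTTPS, with a hypothetical ClientID) is:

# Minimal sketch of a YubiCloud OTP validation; the ClientID value is hypothetical
$clientId = '12345'                                     # line 2: your ClientID
$otp = Read-Host 'Touch your YubiKey to generate an OTP'

# Random alphanumeric nonce (16-40 chars) that YubiCloud echoes back
$nonce = -join ((97..122) + (48..57) | Get-Random -Count 32 | ForEach-Object { [char]$_ })

$uri = "https://api.yubico.com/wsapi/2.0/verify?id=$clientId&otp=$otp&nonce=$nonce"
$response = Invoke-RestMethod -Uri $uri -Method Get

# The plain-text response contains status=OK, REPLAYED_REQUEST, BAD_OTP etc.
$response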

Running the script validates that the key is valid.

YubiKey Validation.PNG

Re-running the submission of the same key (i.e. I didn’t generate a new OTP) gets the expected response that the request is replayed.

YubiKey Validation Failed.PNG

Summary

Using PowerShell we can negate the need to leverage any Yubico client libraries and validate a YubiKey against YubiCloud.


A synopsis of my first Microsoft (MVP) Summit

Last week I attended my first Microsoft Most Valuable Professional (MVP) Summit. Compared to a lot of the conferences I’ve been to over the years this was tiny, with just over 2000 attendees. The difference however is that every attendee is an expert in their field (associated with at least one Microsoft technology) and they come from over 80 countries. It is the most diverse mix of attendees for an event of that size.

The event is also not the typical tech conference that provides you with details on current trends, public road maps and guidance on how to implement or migrate technology. Instead it is a look behind the development curtain and an almost fully transparent dialogue with the product and engineering teams determining and building the future of each technology stream. It also isn’t held at a sterile function center; it’s held on site at Microsoft’s headquarters in Redmond, Washington. Everywhere you look you can find nuggets of Microsoft’s history. Nightly activities are predominantly centered around Bellevue (a short distance from Redmond).

Hotmail500px

My MVP award is associated with Identity & Access. Internally at Microsoft they refer to the small number of us in that category as Identity MVPs. I spent the week in deep technical sessions around Identity and Access Management, getting insights into the short, medium and longer term plans for all things Identity & Access Management related, and conversing with my peers. I can’t say more than that, as the privilege of that level of insight is only possible through a strict and enforced NDA (Non-Disclosure Agreement) between each MVP and Microsoft.

IMG_E6455-Small

I thoroughly enjoyed my first MVP Summit. I reconnected with a number of old colleagues and acquaintances and made a bunch of new connections, both within Microsoft and within the Identity MVP community. It has prepared me with a vision of what’s coming that will be directly applicable to many of the longer term projects I’m currently designing. It definitely filled in the detail between the lines of recent Microsoft announcements in the Identity and Access Management space.

Want to become an MVP? Looking to know what it takes to be awarded MVP status? Want a full rundown of the benefits? Check out this three-part blog post by Alan about the MVP program, starting here.

Identifying Active Directory Users with Pwned Passwords using Microsoft/Forefront Identity Manager v2, k-Anonymity and Have I Been Pwned


Background

In August 2017 Troy Hunt released a sizeable list of Pwned Passwords. 320 million, in fact.

I subsequently wrote this post on Identifying Active Directory Users with Pwned Passwords using Microsoft/Forefront Identity Manager, which called the API and set a boolean attribute in the MIM Service that could be used with business logic to force users with compromised passwords to change their password on next logon.

Whilst that was a proof of concept/discussion point of sorts, and I had a disclaimer about sending passwords across the internet to a third-party service, there was a lot of momentum around the HIBP API, so I developed a solution and wrote this update to check the passwords locally.

Today Troy has released v2 of that list and updated the API with new features and functionality. If you’re playing catch-up I encourage you to read Troy’s post from August last year, and my two posts about checking Active Directory passwords against that list.

Leveraging V2 (with k-Anonymity) of the Have I Been Pwned API

With v2 of the HIBP password list and API, the number of leaked credentials in the list has grown to half a billion; 501,636,842 Pwned Passwords to be exact.

With the v2 list, and in conjunction with Junade Ali from Cloudflare, the API has been updated so it can be queried with a level of anonymity. Instead of sending the full SHA-1 hash of the password you’re checking, you now send a truncated version of the SHA-1 hash, and the HIBP v2 API returns a set of partial matches that you compare against locally. This is done using a concept called k-anonymity, detailed brilliantly here by Junade Ali.

v2 of the API also returns a score for each password in the list: how many times the password has previously been seen in leaked credential lists. Brilliant.

Updated Pwned PowerShell Management Agent for Pwned Password Lookup

Below is an updated Password.ps1 script for my Pwned Password Management Agent for Microsoft Identity Manager, revised for v2 of the API (a sketch of the core lookup follows the list below). It functions by;

  • taking the new password received from PCNS
  • hashes the password to SHA-1 format
  • looks up the v2 HIBP API using part of the SHA-1 hash
  • updates the MIM Service with Pwned Password status
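The core of the v2 k-anonymity lookup, extracted as a sketch (not the full Password.ps1, which does this inside the MA):

# Sketch of the k-anonymity lookup only; the full script runs this within the MA
$password = 'P@ssw0rd'

# SHA-1 hash the password
$sha1 = [System.Security.Cryptography.SHA1]::Create()
$hashBytes = $sha1.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($password))
$hash = [System.BitConverter]::ToString($hashBytes) -replace '-', ''

# Only the first five characters of the hash leave the machine
$prefix = $hash.Substring(0, 5)
$suffix = $hash.Substring(5)

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$response = Invoke-RestMethod -Uri "https://api.pwnedpasswords.com/range/$prefix"

# Each returned line is HASHSUFFIX:COUNT; match our suffix locally
$match = ($response -split "`r`n") | Where-Object { $_ -like "$suffix*" }
if ($match) { "Pwned! Seen $(($match -split ':')[1]) times in breach data" }
else { "Not found in the HIBP list" }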

Check out the original post with all the rest of the details here.

Summary

Of course you can also download (recommended via torrent) the Pwned Passwords dataset. Keep in mind that the compressed dataset is 8.75 GB and uncompressed is 29.4 GB. Convert that into on-premises SQL table(s), as I did in the post linked at the beginning of this post, and you’ll be well in excess of that.

Awesome work from Troy and Junade.


Automating the generation of Microsoft Identity Manager Configuration Documentation

Introduction

Last year Microsoft released the Microsoft Identity Manager Configuration Documenter which is available here. It is a fantastic little tool from Microsoft that supersedes its predecessor from the Microsoft Identity Manager 2003 Resource Toolkit (which only documented the Sync Server Configuration).

Running the tool (a PowerShell Module) against a base out-of-the-box reference configuration for FIM/MIM servers, reconciled against an exported configuration from an implementation’s MIM Sync and Service servers, generates an HTML report that details the existing configuration of the MIM Service and MIM Sync.

Overview

Last year I wrote this post based on an automated solution I implemented to perform nightly backups of a FIM/MIM environment during development.

This post details how I’ve automated another daily task for a large development environment where a number of changes are going on, and I wanted documentation generated that detailed the configuration for each day: partly to be able to quickly work out what has changed when needing to roll back/re-validate changes, and also to have the individual configs from each day so they can be used if we need to roll back.

The process uses an Azure Function App that uses Remote PowerShell into MIM to;

  1. Leverage a modified (streamlined) version of my nightly backup Azure Function to generate the Schema.xml and Policy.xml MIM Service configuration files, and the Lithnet MIIS Automation PowerShell Module installed on the MIM Sync Server to export the MIM Sync Server configuration
  2. Create a sub-directory for each day under the MIM Documenter Tool to hold the daily configs
  3. Execute the generation of the Report and have the Report copied to the daily config/documented solution

Obtaining and configuring the MIM Configuration Documenter

Download the MIM Configuration Documenter from here and extract it to somewhere like c:\FIMDoco on your FIM/MIM Sync Server. In this example in my Dev environment I have the MIM Sync and Service/Portal all on a single server.

Then update the Invoke-Documenter-Contoso.ps1 (or whatever you’ve renamed the script to) to make the following changes;

  • Update the following lines for your version, include the new variable $schedulePath, and add it to the $pilotConfig variable. Create the C:\FIMDoco\Customer and C:\FIMDoco\Customer\Dev directories (replace Customer with something appropriate).
######## Edit as appropriate ####################################
$schedulePath = Get-Date -format dd-MM-yyyy
$pilotConfig = "Customer\Dev\$($schedulePath)" # the path of the Pilot / Target config export files relative to the MIM Configuration Documenter "Data" folder.
$productionConfig = "MIM-SP1-Base_4.4.1302.0" # the path of the Production / Baseline config export files relative to the MIM Configuration Documenter "Data" folder.
$reportType = "SyncAndService" # "SyncOnly" # "ServiceOnly"
#################################################################
  • Remark out the Host Settings as these won’t work via a WebJob/Azure Function
#$hostSettings = (Get-Host).PrivateData
#$hostSettings.WarningBackgroundColor = "red"
#$hostSettings.WarningForegroundColor = "white"
  • Remark out the last line as this will be executed as part of the automation and we want it to complete silently at the end.
# Read-Host "Press any key to exit"

It should then look something like this;

Azure Function to Automate execution of the Documenter

As per my nightly backup process;

  • I configured my MIM Sync Server to accept Remote PowerShell sessions. That involved enabling WinRM, creating a certificate, creating the listener, opening the firewall port and enabling the incoming port on the NSG. You can easily do all that by following my instructions here. From the same post I set up the encrypted password file, uploaded it to my Function App, and set the Function App Application Settings for MIMSyncCredUser and MIMSyncCredPassword.
  • I created an Azure PowerShell Timer Function App. Pretty much the same as I show in this post, except choose Timer.
    • I configured my Schedule for 6am every morning using the following CRON configuration
0 0 6 * * *
  • I also needed to increase the timeout for the Azure Function, as generating the configuration files and executing the report exceeds the default timeout of 5 mins in my environment (19 Management Agents). I increased the timeout to the maximum of 10 mins as detailed here, essentially adding the following to the host.json file in the wwwroot directory of my Function App.
{
 "functionTimeout": "00:10:00"
}

Azure Function PowerShell Timer Script (Run.ps1)

This is the Function App PowerShell Script that uses Remote PowerShell into the MIM Sync/Service Server to export the configuration using the Lithnet MIIS Automation and Microsoft FIM Automation PowerShell modules.

Note: If your MIM Service is on a different host you will need to install the Microsoft FIM Automation PowerShell Module on your MIM Sync Server and update the script below, changing references to http://localhost:5725 to whatever your MIM Service host is.
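The Run.ps1 itself is embedded in the original post; its shape, as a sketch with hypothetical hostnames and paths (and with the Sync Server export left as a placeholder, per the nightly backup post), is:

# Sketch of Run.ps1; hostnames and paths are hypothetical
$mimSyncServer = 'mimsync.customer.local'
$username = $env:MIMSyncCredUser
$password = $env:MIMSyncCredPassword | ConvertTo-SecureString -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ($username, $password)

$session = New-PSSession -ComputerName $mimSyncServer -Credential $cred -UseSSL
Invoke-Command -Session $session -ScriptBlock {
    $schedulePath = Get-Date -Format dd-MM-yyyy
    $exportPath = "C:\FIMDoco\Data\Customer\Dev\$schedulePath"
    New-Item -ItemType Directory -Path $exportPath -Force | Out-Null

    # Export the MIM Service configuration (FIMAutomation snap-in)
    Add-PSSnapin FIMAutomation
    $policy = Export-FIMConfig -PolicyConfig -PortalConfig -MessageSize 9999999 -Uri 'http://localhost:5725'
    $policy | ConvertFrom-FIMResource -File "$exportPath\policy.xml"
    $schema = Export-FIMConfig -SchemaConfig -MessageSize 9999999 -Uri 'http://localhost:5725'
    $schema | ConvertFrom-FIMResource -File "$exportPath\schema.xml"

    # Export the MIM Sync Server configuration here (see the nightly backup
    # post for the Lithnet MIIS Automation steps), then run the Documenter
    & 'C:\FIMDoco\Invoke-Documenter-Customer.ps1'
}
Remove-PSSession $session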

Testing the Function App

With everything configured, manually run the Function App and check the output window; if you’ve configured everything correctly you will see success in the Logs, as shown below. In this environment with 19 Management Agents it takes 7 minutes to run.

Running the Azure Function.PNG

The Report

The outcome every day just after 6am is that I have (via automation);

  • an Export of the Policy and Schema Configuration from my MIM Service
  • an Export of the MIM Sync Server Configuration (the Metaverse and all Management Agents)
  • the MIM Configuration Documenter Report generated
  • the ability to roll back changes at a daily granularity if needed (either a MIM Service change or an individual Management Agent change)

Under the c:\FIMDoco\Data\Customer\Dev\<date>\Report directory is the HTML Configuration Report.

Report Output.PNG

Opening the report in a browser we have the configuration of the MIM Sync and MIM Service.

Report