Automating the generation of Microsoft Identity Manager Configuration Documentation

Introduction

Last year Microsoft released the Microsoft Identity Manager Configuration Documenter which is available here. It is a fantastic little tool from Microsoft that supersedes its predecessor from the Microsoft Identity Integration Server (MIIS) 2003 Resource Toolkit (which only documented the Sync Server Configuration).

Running the tool (a PowerShell Module) against a base out-of-the-box reference configuration for FIM/MIM Servers, reconciled against an exported configuration from an implementation’s MIM Sync and MIM Service Servers, generates an HTML Report document that details the existing configuration of the MIM Service and MIM Sync.

Overview

Last year I wrote this post based on an automated solution I implemented to perform nightly backups of a FIM/MIM environment during development.

This post details how I’ve automated another daily task for a large development environment where a number of changes are going on, and where I wanted documentation generated each day detailing the configuration. Partly this is to be able to quickly work out what has changed when needing to roll back or re-validate changes, and partly to retain the individual configs from each day so they can be used if we need to roll back.

The process uses an Azure Function App that uses Remote PowerShell into MIM to;

  1. Leverage a modified (streamlined) version of my nightly backup Azure Function to generate the Schema.xml and Policy.xml MIM Service configuration files, and the Lithnet MIIS Automation PowerShell Module installed on the MIM Sync Server to export the MIM Sync Server Configuration
  2. Create a sub-directory for each day under the MIM Documenter Tool to hold the daily configs
  3. Execute the generation of the Report and have the Report copied into the daily config/documented solution

Obtaining and configuring the MIM Configuration Documenter

Download the MIM Configuration Documenter from here and extract it to somewhere like c:\FIMDoco on your FIM/MIM Sync Server. In this example in my Dev environment I have the MIM Sync and Service/Portal all on a single server.

Then update the Invoke-Documenter-Contoso.ps1 (or whatever you’ve renamed the script to) to make the following changes;

  • Update the following lines for your version, include the new variable $schedulePath, and add it to the $pilotConfig variable. Create the C:\FIMDoco\Customer and C:\FIMDoco\Customer\Dev directories (replace Customer with something appropriate).
######## Edit as appropriate ####################################
$schedulePath = Get-Date -format dd-MM-yyyy
$pilotConfig = "Customer\Dev\$($schedulePath)" # the path of the Pilot / Target config export files relative to the MIM Configuration Documenter "Data" folder.
$productionConfig = "MIM-SP1-Base_4.4.1302.0" # the path of the Production / Baseline config export files relative to the MIM Configuration Documenter "Data" folder.
$reportType = "SyncAndService" # "SyncOnly" # "ServiceOnly"
#################################################################
  • Comment out the Host Settings as these won’t work via a WebJob/Azure Function
#$hostSettings = (Get-Host).PrivateData
#$hostSettings.WarningBackgroundColor = "red"
#$hostSettings.WarningForegroundColor = "white"
  • Comment out the last line, as the script will be executed as part of the automation and we want it to complete silently at the end.
# Read-Host "Press any key to exit"

It should then look something like this;

Azure Function to Automate execution of the Documenter

As per my nightly backup process;

  • I configured my MIM Sync Server to accept Remote PowerShell Sessions. That involved enabling WinRM, creating a certificate, creating the listener, opening the firewall port and enabling the incoming port on the NSG. You can easily do all that by following my instructions here. From the same post I set up the encrypted password file, uploaded it to my Function App and set the Function App Application Settings for MIMSyncCredUser and MIMSyncCredPassword.
  • I created an Azure PowerShell Timer Function App. Pretty much the same as I show in this post, except choose Timer.
    • I configured my Schedule for 6am every morning using the following CRON configuration
0 0 6 * * *
  • I also needed to increase the timeout for the Azure Function, as the generation of the configuration files and the execution of the report exceeded the default timeout of 5 mins in my environment (19 Management Agents). I increased the timeout to the maximum of 10 mins as detailed here. Essentially I added the following to the host.json file in the wwwroot directory of my Function App;
{
 "functionTimeout": "00:10:00"
}

Azure Function PowerShell Timer Script (Run.ps1)

This is the Function App PowerShell Script that uses Remote PowerShell into the MIM Sync/Service Server to export the configuration using the Lithnet MIIS Automation and Microsoft FIM Automation PowerShell modules.

Note: If your MIM Service is on a different host you will need to install the Microsoft FIM Automation PowerShell Module on your MIM Sync Server and update the script below to change references to http://localhost:5725 to whatever your MIM Service host is.
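The full Run.ps1 was published as a gist with the original post. If it isn’t visible, the following is a minimal sketch of its structure. Server names, paths and the Function name are placeholders, and I’ve substituted the built-in svrexport.exe utility for the original Lithnet Sync-export step, so treat it as a guide rather than the actual script.

# Minimal sketch of Run.ps1 - placeholders throughout, not the original gist.
# Assumes the MIMSyncCredUser/MIMSyncCredPassword Application Settings and an
# encrypted password file uploaded with the Function (per the nightly backup post).
$username = $env:MIMSyncCredUser
$password = Get-Content "D:\home\site\wwwroot\DocumenterTimer\passwordfile.txt" | ConvertTo-SecureString
$credential = New-Object System.Management.Automation.PSCredential($username, $password)

$sessionOptions = New-PSSessionOption -SkipCACheck -SkipCNCheck -SkipRevocationCheck
$session = New-PSSession -ComputerName "mimsyncserver.customer.com" -Port 5986 -UseSSL -SessionOption $sessionOptions -Credential $credential

Invoke-Command -Session $session -ScriptBlock {
    # Daily sub-directory to hold today's configs (matches $schedulePath)
    $scheduleDate = Get-Date -Format dd-MM-yyyy
    $configPath = "C:\FIMDoco\Data\Customer\Dev\$scheduleDate"
    New-Item -ItemType Directory -Path "$configPath\ServiceConfig" -Force | Out-Null
    New-Item -ItemType Directory -Path "$configPath\SyncConfig" -Force | Out-Null

    # Export the MIM Service Schema and Policy configuration
    Add-PSSnapin FIMAutomation
    Export-FIMConfig -Uri "http://localhost:5725" -SchemaConfig | ConvertFrom-FIMResource -File "$configPath\ServiceConfig\schema.xml"
    Export-FIMConfig -Uri "http://localhost:5725" -PolicyConfig | ConvertFrom-FIMResource -File "$configPath\ServiceConfig\policy.xml"

    # Export the Sync Server configuration (the original uses the Lithnet
    # MIIS Automation module; svrexport.exe produces equivalent XML files)
    & "C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Bin\svrexport.exe" "$configPath\SyncConfig"

    # Generate the Documenter report and copy it into the daily folder
    Set-Location C:\FIMDoco
    .\Invoke-Documenter-Contoso.ps1
    Copy-Item -Path "C:\FIMDoco\Report" -Destination $configPath -Recurse -Force
}

Remove-PSSession $session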

Testing the Function App

With everything configured, manually run the Function App and check the output window. If you’ve configured everything correctly you will see success in the Logs, as shown below. In this environment with 19 Management Agents it takes 7 minutes to run.

Running the Azure Function

The Report

The outcome every day just after 6am is that I have (via automation);

  • an Export of the Policy and Schema Configuration from my MIM Service
  • an Export of the MIM Sync Server Configuration (the Metaverse and all Management Agents)
  • the MIM Configuration Documenter Report generated
  • the ability to roll back changes on a daily interval if I need to (either for a MIM Service change or an individual Management Agent change)

Under the c:\FIMDoco\Data\Customer\Dev\<dd-MM-yyyy>\Report directory is the HTML Configuration Report.

Report Output

Opening the report in a browser we have the configuration of the MIM Sync and MIM Service.

Report

 

Enabling and using Managed Service Identity to access an Azure Key Vault with Azure PowerShell Functions

Introduction

At the end of last week (14 Sept 2017) Microsoft announced a new Azure Active Directory feature – Managed Service Identity. Managed Service Identity helps solve the chicken-and-egg bootstrap problem of needing credentials to connect to the Azure Key Vault to retrieve credentials. Previously, when using Virtual Machines, Web Apps and Azure Functions, that meant having to implement methods to obfuscate the credentials that were stored within them. I touched on one method that I’ve used a lot in this post here, whereby I encrypt the credential and store it in the Application Settings, but it still required a keyfile to allow reversing of the encryption as part of the automation process. Thankfully those days are finally behind us.

I strongly recommend you read the Managed Service Identity announcement to understand more about what MSI is.

This post details using Managed Service Identity in PowerShell Azure Function Apps.

Enabling Managed Service Identity on your Azure Function App

In the Azure Portal navigate to your Azure Function Web App. Select it, and then from the main pane select the Platform Features tab, then select Managed service identity.

Enable Managed Service Identity

Turn the switch to On for Register with Azure Active Directory, then select Save.

Enable Managed Service Identity

Back in Platform Features under General Settings select Application Settings. 

Azure Function App Settings

Under Application Settings you will see a subset of the environment variables/settings for your Function App. In my environment I don’t see the Managed Service Identity variables there, so let’s keep digging.

Azure Function App Settings

Under Platform Features select Console.

Azure Function App Console

When the Console loads, type Set. Scroll down and you should see MSI_ENDPOINT and MSI_SECRET.

NOTE: These variables weren’t immediately available in my environment. The next morning they were present. So I’m assuming there is a back-end process that populates them once you have enabled Managed Service Identity, and that it takes more than a couple of hours.

MSI Variables

Creating a New Azure Function App that uses Managed Service Identity

We will now create a new PowerShell Function App that will use Managed Service Identity to retrieve credentials from an Azure Key Vault.

From your Azure Function App, next to Functions select the + to create a New Function. I’m using an HttpTrigger PowerShell Function. Give it a name and select Create.

New Azure Function

Put the following lines into the top of your function and select Save and Run.

# MSI Variables via Function Application Settings Variables
# Endpoint and Password
$endpoint = $env:MSI_ENDPOINT
$endpoint
$secret = $env:MSI_SECRET
$secret

You will see in the output the values of these two variables.

Managed Service Identity Variables

Key Vault

Now that we know we have Managed Service Identity all ready to go, we need to allow our Function App to access our Key Vault. If you don’t have a Key Vault already then read this post where I detail how to quickly get started with the Key Vault.

Go to your Key Vault and select Access Policies from the left menu list.

Azure Key Vault Access Policy

Select Add new, select Principal, locate your Function App and click Select.

Azure Key Vault Access Policy

As my vault contains multiple credential types, I enabled the policy for Get for all types. Select Ok. Then select Save.

Azure Key Vault Access Policy

We now have our Function App enabled to access the Key Vault.

Azure Key Vault Access Policy

Finally in your Key Vault, select a secret you want to retrieve via your Function App and copy out the Secret Identifier from the Properties.

Azure Key Vault Secret Identifier URI

Function App Script

Here is my Sample PowerShell Function App script that will connect to the Key Vault and retrieve credentials. Line 12 should be the only line you need to update for your Key Vault Secret that you want to retrieve. Ensure you still have the API version at the end (which isn’t in the URI you copy from the Key Vault) /?api-version=2015-06-01
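If the embedded script isn’t visible, the core of it looks like the sketch below. The vault and secret names are placeholders; the token request follows the App Service Managed Service Identity REST convention using the MSI_ENDPOINT and MSI_SECRET variables shown earlier.

# Sketch: get a Key Vault token from the MSI endpoint, then retrieve the secret
$vaultTokenURI = "$($env:MSI_ENDPOINT)?resource=https://vault.azure.net&api-version=2017-09-01"
$authResponse = Invoke-RestMethod -Method Get -Uri $vaultTokenURI -Headers @{ Secret = $env:MSI_SECRET }

# Secret Identifier copied from the Key Vault, with the API version appended
$secretURI = "https://<yourvault>.vault.azure.net/secrets/<yoursecret>/?api-version=2015-06-01"
$credential = Invoke-RestMethod -Method Get -Uri $secretURI -Headers @{ Authorization = "Bearer $($authResponse.access_token)" }
$credential.value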

When run, if you have everything correct, the output will look like below.

KeyVault Creds Output

Summary

We now have the basis of a script that we can use in our Azure Functions to allow us to use the Managed Service Identity feature to connect to an Azure Key Vault and retrieve credentials. We’ve limited the Function App’s access to the Key Vault to only GET the credential. The only piece of information we had to put in our Function App was the URI for the credential we want to retrieve. Brilliant.

Display Microsoft Identity Manager Sync Engine Statistics in the MIM Portal

Introduction

In the Microsoft / Forefront Identity Manager Synchronization Service Manager under Tools we have a Statistics Report. This gives a breakdown of each of the Management Agents and the Connectors on each MA.

I had a recent requirement to expose this information for a customer but I didn’t want them to have to connect to the Synchronization Server (and be given the permissions to allow them to). So I looked into another way of providing a subset of this information in the MIM Portal itself.  This post details that solution.

MIM / FIM Synchronization Server Management Agent & Metaverse Statistics

Overview

I approached this in a similar way to the User Object Report I recently developed. The approach is;

  • An Azure PowerShell Function App that uses Remote PowerShell to connect to the MIM Sync Server and leverage the Lithnet MIIS Automation PowerShell Module to enumerate all Management Agents and build a report containing the required information
  • A NodeJS WebApp that calls the Azure PowerShell Function App on load to generate the report and display it
  • The NodeJS WebApp is embedded in the MIM Portal as a new Nav Bar Resource and Page

The graphic below details the basic logical integration.

MVStatsReportOverview

Prerequisites

The prerequisites to perform this I’ve covered in other posts. In concept, as described above, it is similar to the User Object Report, which has the same prerequisites; I did a pretty good job of detailing those here. That post is the required reading to get you ready to implement this.

Azure PowerShell Function App

Below is the raw script from my Function App that connects to the MIM Sync Server and retrieves the Management Agent Statistics for the report.
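(The script was embedded as a gist. If you can’t see it, the sketch below conveys the idea; note it queries the Sync Server’s WMI provider for the statistics rather than reproducing the original Lithnet module calls.)

# Sketch: enumerate Management Agents and their connector statistics via WMI,
# then emit an HTML fragment for the report
$mas = Get-WmiObject -Namespace "root/MicrosoftIdentityIntegrationServer" -Class "MIIS_ManagementAgent"
$maStats = foreach ($ma in $mas) {
    [pscustomobject]@{
        Name           = $ma.Name
        Type           = $ma.Type
        Connectors     = $ma.NumConnectors().ReturnValue
        Disconnectors  = $ma.NumDisconnectors().ReturnValue
        TotalCSObjects = $ma.NumCSObjects().ReturnValue
    }
}
$maStats | ConvertTo-Html -Fragment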

NodeJS Web App

The NodeJS Web App is the app that gets embedded in the MIM Portal, calling the Azure Function to retrieve the data and then display it. To get started you’ll want a base NodeJS WebApp. This post will get you started: Implementing a NodeJS WebApp using Visual Studio Code.

The only extension I’m using on top of what is listed there is JQuery. So once you have NodeJS up and running, in your VSCode Terminal type npm install jquery and then npm install.

I’ve kept it simple and contained all in a single HTML file using JQuery.

In your NodeJS project you will need to reference your report.html file. It should look like this (assuming you name your report report.html);

var express = require('express');
var router = express.Router();
/* GET - Report page */
router.get('/', function(req, res, next) {
   res.sendFile('report.html', { root:'./public'});
});

module.exports = router;

The Embedded Report

This is what my report looks like embedded in the MIM Portal.

Microsoft Identity Manager Statistics Report

Summary

Integration of FIM / MIM with Azure Platform as a Service Services opens a world of functionality including the ability to expose information that was previously only obtainable by the FIM / MIM Administrator.

Quickly creating and using an Azure Key Vault with PowerShell

 

Introduction

A couple of weeks back I was messing around with the Azure Key Vault, looking to centralise a bunch of credentials for my ever-growing list of Azure Functions that are automating numerous tasks. What I found was that getting an Azure Key Vault set up, and getting credentials in and out, was a little more cumbersome than I thought it should be. At that same point this tweet appeared in my Twitter timeline from a retweet. I’m not too sure why, but maybe because I’ve been migrating to VSCode myself, I checked out Axel’s project.

Axel Agazoth tweet

Axel’s PowerShell Module simplifies creating and integrating with the Azure Key Vault. After messing with it and suggesting a couple of enhancements that Axel graciously entertained, I’m creating vaults, adding and removing credentials in the simplified way I’d wanted.

This quickstart guide to using this module will get you started too.

Create an Azure Key Vault

This is one of the beauties of Axel’s module. If the Resource Group and/or Storage Account you want associated with your Key Vault doesn’t exist, it creates them.

Update the following script for the location (line 8) and the name (line 10) that will be given to your Storage Account, Resource Group and Vault. Modify if you want to use different names for each.

Done, Key Vault created.

Create Azure KeyVault
Azure Key Vault Created

Connect to the Azure Key Vault

This script assumes you’re now in a new session and wanting to connect to the Key Vault. Again, a simplified version whereby the SG, RG and KV names are all the same.  Update for your location and Key Vault name.

Connected.

Connect to Azure Key Vault

Add a Certificate to the Azure Key Vault

To add a certificate to our new Key Vault use the command below. It will prompt you for your certificate password and add the cert to the key vault.

Add Cert to Vault

Certificate added to Key Vault.

Cert Added to Vault

Retrieve a Certificate from the Azure Key Vault

Retrieving a certificate from the Key Vault is just as simple.

$VaultCert = Get-AzureCertificate -Name "AADAppCert" -ResourceGroupName $name -StorageAccountName $name -VaultName $name
Retrieve a Cert

Add Credentials to the Azure Key Vault

Adding username/password or clientID/clientSecret to the Key Vault is just as easy.

# Store credentials into the Azure Key Vault
Set-AzureCredential -UserName "serviceAccount" -Password ($pwd = Read-Host -AsSecureString) -VaultName $name -StorageAccountName $name -Verbose

Credentials added to vault

Add Creds to Key Vault
Creds Added to Vault

Retrieve Credentials from the Azure Key Vault

Retrieving credentials is just as easy.

# Get credentials from the Azure Key Vault
$AzVaultCreds = Get-AzureCredential -UserName "serviceAccount" -VaultName $name -StorageAccountName $name -Verbose

Credentials retrieved.

Retrieve Account Creds

Remove Credentials from the Azure Key Vault

Removing credentials is also a simple cmdlet.

# Remove credentials from the Azure Key Vault
Remove-AzureCredential -UserName "serviceAccount" -VaultName $name -StorageAccountName $name -Verbose

Credentials removed.

Remove Credentials from Key Vault

Summary

Hopefully this gets you started quickly with the Azure Key Vault. Credit to Axel for creating the module. It’s now part of my toolkit that I’m using a lot.

Receive Push Notifications from Microsoft/Forefront Identity Manager on your Mobile/Tablet/Computer

Background

Recently in a FIM/MIM environment a daily automated process was executing, but the task it was performing was dependent on an upstream process that generates a feed, and the schedule for that feed had changed (without notice to me). Needless to say, FIM/MIM wasn’t getting the information it needed to process. This got me thinking about notifications.

If you’re anything like me you probably have numerous email accounts and your subconscious has all but programmed itself to ignore “new email” notifications. However Push Notifications I typically do notice. Whilst in the example above I did have some error handling in place if the process completely failed (it is a development environment), I didn’t have anything for partial failures. Anyway it did get me thinking that I’d like to receive a notification if something that should happen didn’t.

Overview

This post details using push notifications to advise when expected events don’t transpire. In this particular example, I have an Azure Function App that connects once a day to an FTP Server, retrieves a series of exports and puts them on my FIM/MIM Synchronisation Server. The Push Notification service I am using is Push Bullet. Free Push Bullet accounts (without a Pro subscription) are limited to 500 pushes per month. That should be more than enough; if I’ve got errors in excess of 500 per month I’ve got much bigger problems.

Getting Started

First up you will need to sign up for Push Bullet. It is very straightforward if you have a Facebook or Google account. As you’re probably wanting multiple people to receive the notifications, it would pay to set up a shared Google Account that your team can use to connect with their devices. Once you have an account, head to your new Account Settings page and create an Access Token. Record it for use in the scripts below.

Connecting to the API

Test you can access the Push Bullet API using your Access Token and PowerShell. Update the following script for your Access Token in line 3 and execute. You should see information returned associated with your new Push Bullet account.
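If the embedded script isn’t visible, a minimal version looks like this; it also sets up the $apiURI and $header variables used by the calls that follow.

# Test connectivity to the Push Bullet API (line 3 is your Access Token)
$apiURI = "https://api.pushbullet.com/"
$accessToken = "<your Push Bullet Access Token>"
$header = @{ "Access-Token" = $accessToken }
Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI + "v2/users/me")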

Next you will want to install the Push Bullet App on the device(s) you want to get the notification(s) on. I installed it on my Apple iPhone and also installed the Chrome Browser extension.

Using PowerShell we can then query to get the devices connected to the account. In the same PowerShell session you tested the API with above, run this API call;

$devices = Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI +"v2/devices")
$devices

This will return your registered devices.

Apple iOS iPhone Push Notification registered devices

If we want a notification to target a particular device we need to provide the Iden value associated with that device. If we don’t specify a target, the push notification will hit all devices. In my example above, with two devices registered, my iPhone was device two. So I could get its target Iden with;

$iphoneIden = $devices.devices[1].iden

Push Bullet allows for different notification types (Note, Link and File). Note is the one that I’ll be using. More info on the other types here.

Sending Test Notification

To perform a notification test, update the following script for your Access Token (line 3). I’ve omitted the Device Identifier to send the message to all devices. I also had to log out of the iOS Push Bullet App and log in again to get the notifications to show.
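If you can’t see the embedded script, a minimal sketch of the test push (a Note type, to all devices) looks like this. Adding a target_device_iden property (e.g. $iphoneIden) to the body would target a single device.

# Send a test 'note' push notification to all devices on the account
$notification = @{
    type  = "note"
    title = "FIM/MIM Automation"
    body  = "Test notification from my FIM/MIM environment."
} | ConvertTo-Json
Invoke-RestMethod -Method Post -Headers $header -ContentType "application/json" -Uri ($apiURI + "v2/pushes") -Body $notification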

Success. I received the notification on my iPhone and also in my Chrome browser.

Apple iOS iPhone Push Notification from FIM/MIM Identity Manager

Implementation

Getting back to my requirement of being notified when a job didn’t find what it expected, I updated my PowerShell Function App (based off this blog post here) to evaluate what it processed and, if it didn’t find what was expected, send me a notification. I already had some error handling in my implementation based off that blog post, but it handled full failure, not partial failure (which is what I was experiencing, whereby only one part of the process wasn’t returning data).

NOTE: I also had to add the ServerCertificateValidationCallback line into my Function App script before calling the API POST to send the notification, as I was getting the following dreaded PowerShell Invoke-RestMethod / Invoke-WebRequest error when sending the notification via the Function App. I didn’t get that error on my dev workstation, which is a bit weird.

Invoke-WebRequest : The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure 
channel.

If you also receive the error above (or you will be sending Push Notifications via Azure Function Apps), insert this line before your Invoke-RestMethod call.

 [System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}

Summary

Essentially this is my first foray into enabling anything for Push Notifications and this post is food for thought on what can be easily enabled within FIM/MIM to give timely visibility to automated scheduled functions when they don’t perform as expected. It was incredibly simple to set up and get working. I see myself enabling more FIM/MIM functions with Push Notifications in the future.

Creating an AzureAD WebApp using PowerShell to leverage Certificate Based Authentication

Introduction

Previously I’ve posted about using PowerShell to access the Microsoft AzureAD/Graph API in a number of different ways. Two such examples I’ve listed below. The first uses a Username and Password method for Authentication, whilst the second uses a registered application and therefore ClientID and Client Secret.

As time has gone on I have numerous WebApps doing all sorts of automation. However they all rely on accounts with a username and password, or a clientID and client secret, where the passwords and secrets expire. Granted, the secrets have a couple of years of life and are better than passwords, which depending on the environment roll every 30-45 days.

However using Certificates would allow for a script that is part of an automated process to run for much longer than the key lifetime available for WebApps and definitely longer than passwords. Obviously there is security around the certificate to be considered so do keep that in mind.

Overview

This post is going to detail a couple of simple but versatile scripts;

  1. Using PowerShell we will;
    1.  Configure AzureAD
      1. Create a Self Signed 10yr Certificate
      2. Create an AzureAD WebApp and assign the Certificate to it
      3. Apply permissions to the WebApp (this is manual via the Azure Portal)
      4. Record the key parameters for use in the second script
    2. Connect to AzureAD using our Certificate and new WebApp

Creating the AzureAD WebApp, Self Signed Certificate and Assigning Application Permissions

The script below does everything required. Run it line by line, or in small chunks as you step through the process. You will need the AzureRM and Azure AD Utils PowerShell Modules installed on the machine you run this script on. A condensed sketch of the key steps follows the list below.

Change;

  • Lines 3 & 4 if you want a certificate with a time-frame other than 10yrs
  • Line 5 for the password you want associated with the certificate for exporting/importing the private key
  • Line 6 for the certificate subject name and the location where it’ll be stored
  • Line 8 for a valid location to export it to
  • Line 11 for the same path as provided in Line 8
  • Lines 24 & 25 for an account to automatically connect to AAD with
  • Line 31 for the name of your WebApp
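If the embedded script isn’t visible, here is a condensed sketch of the key calls it makes. Names, paths and passwords are placeholders, and the line numbers referenced above belong to the full script, not this sketch.

# Condensed sketch: create a 10yr self-signed cert and an AzureAD WebApp that uses it
$startDate = Get-Date
$endDate = $startDate.AddYears(10)
$certPwd = ConvertTo-SecureString -String "<cert export password>" -AsPlainText -Force
$cert = New-SelfSignedCertificate -CertStoreLocation "Cert:\CurrentUser\My" -Subject "CN=AzureADWebAppCert" -NotAfter $endDate
Export-PfxCertificate -Cert $cert -FilePath "C:\temp\AzureADWebAppCert.pfx" -Password $certPwd | Out-Null

# Create the WebApp with the certificate as its credential
Login-AzureRmAccount
$keyValue = [System.Convert]::ToBase64String($cert.GetRawCertData())
$app = New-AzureRmADApplication -DisplayName "MyAutomationWebApp" -HomePage "https://localhost" -IdentifierUris "https://localhost/MyAutomationWebApp" -CertValue $keyValue -StartDate $startDate -EndDate $endDate
New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId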

Before running line 37, login to the Azure Portal and assign permissions to the WebApp, e.g. AzureAD Directory Permissions. When you then run Line 37 it will trigger a GUI for AuthN and AuthZ to be presented. Sign in as an Admin and accept the oAuth2 Permission Authorizations for whatever you have requested on the WebApp.
Permissions1

e.g. Graph API Read/Write Permissions

Permissions2

Connecting to AzureAD using our Certificate and new WebApp

Update lines 3, 4, 6 and 7 as you step through lines 40-43 of the configuration script above, which copy the key configuration settings to the clipboard.

The following script then gets our certificate out of the local store and takes the Tenant and WebApp parameters and passes them to Connect-AzureAD in Line 15 which will connect you to AAD and allow you to run AzureAD cmdlets.
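If the embedded script isn’t visible, the essence is just a few lines (a sketch; the tenant and application IDs are the placeholders you copied from the configuration script):

# Sketch: connect to AzureAD using the certificate-backed WebApp
$tenantId = "<your tenant GUID>"
$appId = "<the WebApp Application ID>"
$cert = Get-ChildItem Cert:\CurrentUser\My | Where-Object { $_.Subject -eq "CN=AzureADWebAppCert" }
Connect-AzureAD -TenantId $tenantId -ApplicationId $appId -CertificateThumbprint $cert.Thumbprint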

If you wish to go direct to the GraphAPI, lines 20 and 23 show leveraging the AzureADUtils Module to connect to AzureAD via the GraphAPI.

Notes on creating your Self-Signed Certificate in PowerShell

I’m using the PowerShell New-SelfSignedCertificate cmdlet to create the self-signed certificate. If you get the error shown below when you run New-SelfSignedCertificate, make sure you have Windows Management Framework 5.1. If you don’t have Visual Studio or the Windows 8.1/10 SDK, get the Windows 8.1 SDK from here and just install the base SDK as shown further below.

No Makecert

Win8.1 SDK

Once the install is complete, copy C:\Program Files (x86)\Windows Kits\8.1\bin\x86\makecert.exe to C:\windows\system32.

Summary

The two scripts above show how, using PowerShell, we can quickly create a self-signed certificate, create an Azure AD WebApp and grant it some permissions. Then, using a small PowerShell script, we can connect and query AAD/GraphAPI using our certificate and not be concerned about passwords or keys expiring for 10yrs (in this example; it can be any timeframe you wish).

Synchronizing Passwords from Active Directory to the IBM/Lotus Domino Identity Vault using Microsoft Identity Manager – Part 3

Introduction

As the title suggests this is Part 3, the final part in a three-part post on configuring FIM/MIM to synchronise users’ passwords from AD to the Domino ID Vault via PCNS and FIM/MIM.
Part 1 here detailed the creation of a PowerShell Management Agent to join users from Domino to the MIM Sync Metaverse.
Part 2 here detailed the creation and configuration of the Domino Agents to receive password changes via the PS MA into the ID Vault.

This post will wrap it all up with the details on calling the Domino Agents on password sync events (from PCNS via MIM).

Prerequisites

You will need the IBM Notes client installed and configured on your MIM Sync Server in order to put a document in the database we created in Part 2 and start the agent to process the document(s).

Overview

Essentially this is the process;

  • Password changed for a user (either by an admin, or by the user via their domain joined workstation, password reset or any other password change mechanism)
  • Password change is captured by the AD PCNS Filter installed and configured on each (writeable) Domain Controller
  • The DC, using the PCNS Config in the domain, locates the MIM Sync Server to send the password change to
  • The MIM Sync Server has the associated AD Domain configured as a Password Sync Source
  • Our new PowerShell ID Vault Notes MA is configured as a Password Target
  • MIM Sync passes off the password change event for MIM joined users to the ID Vault Password Change MA which initiates the Password.ps1 script (below)
  • The password.ps1 script creates a document (that contains the details for the password change) in our ID Vault Password Sync Database we created in Part 2 of this series and then tells the MIMPwdTrigger Agent to start
  • The MIMPwdTrigger Agent picks up the document, passes it to the MIMPasswordSync Agent which sends the password change to the ID Vault

Domino PowerShell Management Agent Password.ps1 Script

Put this Password.ps1 script in the same location you put the Schema, Import and Export scripts earlier.
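The Password.ps1 script was published as a gist with the original post. If it isn’t visible, this trimmed sketch shows its shape: the parameter block is roughly what the Granfeldt PS MA passes to a password script, and the document field names, database file and server are placeholders for the values you used in Part 2.

param (
    $Username,
    $Password,
    $Credentials,
    $OldPassword,
    $Action,
    $UnlockAccount,
    $ForceChangeAtLogOn,
    $ValidatePassword
)

# Sketch: create a password-change document in the ID Vault Password Sync
# database via the Notes COM client, then trigger the Domino agent (Part 2).
# Field names, database file and server below are placeholders.
$notes = New-Object -ComObject Lotus.NotesSession
$notes.Initialize("<Notes ID password>")
$database = $notes.GetDatabase("xxxNotes1/xxxxx-Aus", "mimpwdsync.nsf")

$doc = $database.CreateDocument()
$doc.ReplaceItemValue("Form", "PwdChange") | Out-Null
$doc.ReplaceItemValue("UserName", $Username) | Out-Null
$doc.ReplaceItemValue("NewPassword", $Password) | Out-Null
$doc.Save($true, $false) | Out-Null

# Tell the MIMPwdTrigger Agent to start processing the document(s)
$agent = $database.GetAgent("MIMPwdTrigger")
$agent.Run() | Out-Null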

Testing Password Sync End to End (Active Directory to the ID Vault)

The following screen shots show me tracing through the logs for a password change as it makes its way from the AD Domain Controller, to MIM Sync, to the MA, to the MA Password script, to the Notes DB as a document triggered to be processed by the Notes Agent, and finally to the user updated in the ID Vault.

First the password change event is initiated to the MIM Sync Service by the Domain Controller that captured the password change.

Password Change1

PCNS provides all the details for the password change.

Password Change2

The MIM Sync Server determines where to send the change which includes our PS Notes MA.

Password Change3

Our PS Notes MA logged the process.

Notes MA LOG

=============================================================

Display Name: Jane XXX/xxx/xxxxx-Aus

Action: Set

Old pwd:

New pwd: Password123456

Unlock: False

Force change: False

Validate: False

Database: System.__ComObject

As did the Notes Agent as it processed the change.

Notes Agent Log

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:22 PM: Reseting password …

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:22 PM: Server: xxxNotes1/xxxxx-Aus User:Jane xxx/xxx/xxxxx-Aus

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:23 PM: Return value: true

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:23 PM: Removed User ID Vault change document from ‘xxxNotes1/xxxxx-Aus’

And finally we see the change reflected in the ID Vault. Looking at the time-stamps along the way we see that it all happened in approximately 2 seconds.

Password Change4

Summary

This three-part blog post has shown how to get passwords from an Active Directory connected source via MIM Sync across to IBM Domino and into the ID Vault, using the Granfeldt PowerShell Management Agent and some configuration with a Database in Domino with two Domino Agents.

What have you synchronised passwords to using FIM/MIM?

Synchronizing Passwords from Active Directory to the IBM/Lotus Domino Identity Vault using Microsoft Identity Manager – Part 1

Introduction

Recently I wrote about getting started with the latest IBM/Lotus Notes/Domino Management Agent for Microsoft Identity Manager. In a recent engagement we are using that MA to provision and manage identities into Domino. We are also using the MA to synchronise passwords via PCNS and MIM to the Notes users’ Internet (HTTP) password.

What you may or may not be aware of is that IBM introduced a new feature with Domino 8.5 called the ID Vault. The ID Vault is a Domino based application that holds protected copies of Notes user IDs. Now here’s the twist. The Microsoft Domino MA only supports password synchronisation to the HTTP password, not to the ID Vault.

My customer is using the ID Vault, and naturally we need to synchronise password changes to both the HTTP Password and the ID Vault (for users’ Notes IDs). This post is the first in a series that details how I recently accomplished synchronising passwords to the Domino ID Vault.

  1. This post provides the introduction and the creation of a PowerShell Management Agent into Domino to join identities into the MIM Metaverse
  2. Post two details Creating Domino Agents that will handle taking requests from the MIM PS MA to change users ID Vault password
  3. Post three will detail calling the Domino Agents on password sync events (from PCNS via MIM)

Overview

The following diagram shows a high-level overview of password synchronisation using FIM/MIM from AD to Domino. Password changes/resets can be initiated using a number of methods: the FIM/MIM Self Service Password Reset functionality, users changing their password via their domain-joined workstations as defined by AD Password Policies, using the AD FS Password Change function, or even on behalf of users by a Service Desk/Administrator. In each scenario, implementing Microsoft’s Password Change Notification Service will get the password change to FIM/MIM. I’m not going to cover PCNS as it is out of the box and straightforward to install and configure. This MS PFE PCNS implementation document covers it quite well.

Password Sync Overview

Likewise I’m not going to go into any detail about password sync to the HTTP Password. That’s out-of-the-box functionality that is pretty much the same as any other MA configured as a Password Sync Target. That said, in my environment I did have to configure the MS Domino MA like this to get password events out to Domino.

ID Vault FIM/MIM PowerShell Management Agent

First up, we are going to need a Management Agent to join Notes users to our users from Active Directory in the Metaverse. I’ve gone to my favourite PowerShell Management Agent (Granfeldt) for this.

The Granfeldt PS MA will be configured to;

  • Import and join Domino Users to the Metaverse. The MA will be slimline in the number of attributes it brings in; enough to perform the join and have enough information about the user’s context in Domino to be able to perform the password sync event
  • be a target for Password Synchronisation
  • send the password change event to the Domino Agent we will build to perform the password change. A Domino Agent is required as the ID Vault will only accept password changes from a process run on the Domino Server(s). More on this in parts 2 and 3

The integration of the MIM Sync Engine with Domino via the PowerShell Management Agent is done using LDAP, as the Name and Address Book is easily accessed that way.

To get started I looked up the Server Document for the Domino Server I wanted to connect to that had the Name and Address Book. Selecting the Directory tab I could see that LDAP(S) 389/636 was enabled.

LDAP Domino

I then went to the Name and Address Book and looked up my Admin Notes ID to get its context so I could translate it to LDAP format. Darren Kloud Robinson/OrgU/Org-AUS translates to CN=Darren Kloud Robinson,OU=OrgU,O=Org-AUS.

NAB Info

Knowing my Notes ID Password I used LDP from Windows Server to bind to the Domino Directory. You could use any LDAP Browser/Tool.

Bind

Once I validated I could connect and browse the tree, I knew I had my connection details sorted, and I translated that to PowerShell.

That looked something like this;
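(The original snippet was embedded as a gist; this reconstruction is a sketch using System.DirectoryServices.Protocols, with a placeholder server name and credentials.)

# Sketch: bind to the Domino LDAP Name and Address Book and run a test search
Add-Type -AssemblyName System.DirectoryServices.Protocols

$dominoServer = "dominoserver.customer.com"
$username = "CN=Darren Kloud Robinson,OU=OrgU,O=Org-AUS"
$password = "<Notes ID password>"

$credential = New-Object System.Net.NetworkCredential($username, $password)
$connection = New-Object System.DirectoryServices.Protocols.LdapConnection("$($dominoServer):389")
$connection.AuthType = [System.DirectoryServices.Protocols.AuthType]::Basic
$connection.Bind($credential)

# Validate connectivity by enumerating users from the directory
$scope = [System.DirectoryServices.Protocols.SearchScope]::Subtree
$request = New-Object System.DirectoryServices.Protocols.SearchRequest("O=Org-AUS", "(objectClass=dominoPerson)", $scope, @("cn", "mail"))
$response = $connection.SendRequest($request)
$response.Entries.Count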

I then wrapped this into a FIM/MIM PowerShell Management Agent. The Granfeldt PS MA Scripts are below.

Domino PowerShell Management Agent Schema Script

As described above the Schema Script is very light on the number of attributes specified. Basically the minimum required to get a join and give us the context of the user to process password sync events.

Domino PowerShell Management Agent Import Script

As detailed above the Import brings through enough metadata to perform the join and give us the attributes needed for the user context to be able to sync passwords through.

Domino PowerShell Management Agent Export Script

The file just needs to exist; it is not used in this scenario.

Domino PowerShell Management Agent Password Script

See Part 3 in this series for the Password.ps1 script. But if you are following sequentially, copy the empty Export.ps1 script for now and name it Password.ps1 and have it located in the same directory as the other PS MA scripts.

Wiring it all together

As for creating the PS MA, I’ve detailed this in-depth many times. Check out this post (or the many other similar I’ve posted) and the Getting Started section if you are new to the Granfeldt PowerShell Management Agent. Copy the above scripts to the directory you create, and when creating the MA provide the paths to the scripts (in 8.3 format).

A key item though is to configure the PS MA as a Password Sync Target, as per the screenshot below. You will also need to configure where passwords are coming from to send to this new MA. If it is Active Directory, open the Properties of your AD MA, select Configure Directory Partitions, then under Password Synchronization enable the checkbox Enable this partition as a password synchronization source. Select Targets and select your newly created Notes ID Vault Password MA. Select Ok, then Ok again.

PWD Target

After creating a Run Profile and doing a Stage and Import, based on your Join rule (probably email address) you should have a heap of connectors. In my environment displayName contains the context of the user, e.g. Full Name/OrgU/Org. We’ll need this to send the password change event to the ID Vault.

CS

Summary

Through the PowerShell MA as detailed above we have been able to enumerate users from Domino and join them to existing users in the MIM Sync Metaverse. We can now set about creating Domino Agents to take password sync events from this MA and change users passwords in the Domino ID Vault.

Part 2 in this series here details creating the Domino Agents and configuring Domino to accept the changes.

Integration of Microsoft Identity Manager with Azure Platform-as-a-Service Services

Overview

This isn’t an out of the box solution. This is a bespoke solution that takes a number of elements and puts them together in a unique way. I’m not expecting anyone to implement this specific solution (but you’re more than welcome to) but to take inspiration from it to implement solutions relevant to your environment(s). This post supports a presentation I did to The MIM Team User Group on 14 June 2017.

This post describes a solution that;

  • Leverages an Azure WebApp (NodeJS) to present a simple website. That site can be integrated easily in the FIM/MIM Portal
  • The NodeJS website leverages an Azure Function App to get a list of users from the FIM/MIM Synchronization Server and allows the user to use typeahead functionality to find the user they want to generate a FIM/MIM object report on
  • On selection of a user, a request will be sent to another Azure Function App to generate and return the report to the user in a new browser window

This is shown graphically below.

 

Report Request UI

The NodeJS WebApp is integrated into the FIM/MIM portal. Bootstrap Typeahead is used to find the user to generate a report on. The Typeahead userlist is fulfilled by an Azure Function into the MIM Sync Metaverse. The Generate Report button fires off a call to FIM/MIM via another Azure Function into the MIM Sync and MIM Service to generate the report.

The returned report opens in a new tab in the user’s browser. The report contains details of the FIM/MIM connectors the user is represented on.

The values of all attributes for the user’s hologram from the Metaverse are displayed, along with the MA the value came from and the last modified date.

Finally, the report displays the metadata from the MIM Service MA Connector Space and the MIM Service.

Prerequisites

These are numerous, but I’ve previously posted about them. You will need;

I encourage you to digest those posts to understand how to configure the prerequisites for this solution.

Additional Solution Requirements

To bring all the individual components together, there are a few additional tasks to enable this solution.

  • Enable CORS on your Azure Function App Configuration (see details further below)
  • If you want to display User Object Photos as part of the report, you will likely need to synchronize them into FIM/MIM from an authoritative source (e.g. Office 365/Exchange Online). Check out this post, and the additional details further below
  • In order to embed the NodeJS WebApp into the FIM/MIM Portal, this post provides the details. Change the target URL from PowerBI URL to your NodeJS site
  • Object Report Request WebApp (see below for sample site)

Azure Functions Cross Origin Resource Sharing (CORS)

You will need to configure CORS to allow the NodeJS WebApp to access the Azure Functions (from both local and Azure). Reflect your port number if it is different from 3000, and use the DNS name for your Azure WebApp.

Sample UI NodeJS HTML

Here is a sample HTML file for your NodeJS WebApp, with the UI to provide input for the LoginID, fulfilled by the NodeJS JavaScript file further below.

Sample UI NodeJS JavaScript

The following NodeJS JavaScript supports the HTML UI above. It populates the LoginID typeahead box and handles the Submit Report button to fulfil the report for the desired object(s). And yes, if you use the UI to select (individually) multiple different objects, all will be returned in their separate output windows.

As the HTML file above indicates you will need to obtain and make available as part of your NodeJS project the typeahead.bundle.js library.

Azure PowerShell Trigger Function App for AccountNames Lookup

The following Azure Function takes the call from the load of the NodeJS WebApp to populate the typeahead userlist.

Azure PowerShell Trigger Function App for User Object Report

Similar in structure to the Username List Lookup Azure Function above, but in the ScriptBlock you embed the Report Generation Script that is detailed here. Modify for what you want to report on.

Photos in the Report

If you want to display images in your report, you will need to determine if the user has an image during the MV metadata report generation part of the script. Add the following lines (updating for the name of your Image attribute; mine is named EXOPhoto) after the Try {} Catch {} in this section $obj = @() ; foreach ($attr in $attributes.Keys)

 # Display the Objects Photo rather than Base64 string
 if ($attr.equals("EXOPhoto")){
     $objectphoto = "<img src=$([char]0x22)data:image/jpeg;base64,$($attributes.$attr.Values.Valuestring)$([char]0x22)>"
     $val = "System.Byte[]"
 }

Then in the output of the HTML report at the end of the report generation insert the $objectphoto variable into the HTML stream.

# Output MIM Service Object Data
 $MIMServiceObjOut = $MIMServiceObjectMetaData | Sort-Object -Property Attribute | ConvertTo-Html -Fragment
 $htmlreport = ConvertTo-HTML -Body "$htmlcss<h1>Microsoft Identity Manager User Object Report</h1><h2>Query</h2>$sourcequery</br><b><center>$objectphoto</br>NOTE: Only attributes with values are displayed.</center></b><h2>Connector(s) Summary</h2>$connectorsummary<h2>MetaVerse Data</h2>$objectmetadata <h2>MIM Service CS Object Data</h2>$MIMServiceCSobjectmetadata <h2>MIM Service Object Data</h2>$MIMServiceObjOut" -Title "MIM Object Report"

As you can see above I’ve also injected the CSS ($htmlcss) into the output stream at the beginning of the Body section.  Somewhere in your script block you will need to define your CSS values. e.g.

 # StyleSheet for nice pretty output
 $htmlcss = "<style>
    h1, h2, th { text-align: center; }
    table { margin: auto; font-family: Segoe UI; box-shadow: 10px 10px 5px #888; border: thin ridge grey; }
    th { background: #0046c3; color: #fff; max-width: 400px; padding: 5px 10px; }
    td { font-size: 11px; padding: 5px 20px; color: #000; }
    tr { background: #b8d1f3; }
    tr:nth-child(even) { background: #dae5f4; }
    tr:nth-child(odd) { background: #b8d1f3; }
 </style>"

Summary

An interesting solution integrating Azure PaaS Services with Microsoft Identity Manager via PowerShell and the extremely versatile Lithnet FIM/MIM PowerShell Modules.

Please share your implementations enhancing your FIM/MIM Solution.

How to access Microsoft Identity Manager Hybrid Report data using PowerShell, Graph API and oAuth2

Hybrid Reporting is a great little feature of Microsoft Identity Manager. A small agent installed on the MIM Sync Server will send reporting data to Azure for MIM SSPR and MIM Group activities. See how to install and configure it here.

But what if you want to get the reporting data without going to the Azure Portal and looking at the Audit Reports? Enter the Azure AD Reports and Events REST API that is currently in preview. It took me a couple of cracks to get this working, because the documentation is a little vague, especially when accessing it via PowerShell and oAuth2. So I’ve written it up and hope it helps anyone else looking to go down this route.

Gotchas

Accessing the Reports via the API has a couple of caveats that I had to work through:

  • Having the correct permissions to access the report data. Pretty much everything you read tells you that you need to be a Global Admin. Once I had my oAuth tokens I messed around a little and was able to also get the following back from the API when purposely using an identity that didn’t have the right permissions. The key piece is “Api request is not from global admin or security admin or security reader role”. I authorized the WebApp using an account that is in the Security Reader Role, and can successfully access the report data.

  • Reading the documentation here on MSDN I incorrectly assumed each category was the report name. Only when I called “https://graph.windows.net//$metadata?api-version=beta” and looked at the list of reports did I notice each report name was plural. The three that I wanted to access (and report on) are obviously the MIM Hybrid Reports;
"Name":  "mimSsgmGroupActivityEvents",
"Name":  "mimSsprActivityEvents",
"Name":  "mimSsprRegistrationActivityEvents",

Here is the full list of Reports available as of 24 May 2017.

{
    "Name":  "b2cAuthenticationCountSummary",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cMfaRequestCount",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cMfaRequestEvent",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cAuthenticationEvent",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cAuthenticationCount",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cMfaRequestCountSummary",
    "LicenseRequired":  "False"
}
{
    "Name":  "tenantUserCount",
    "LicenseRequired":  "False"
}
{
    "Name":  "applicationUsageDetailEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "applicationUsageSummaryEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "b2cUserJourneySummaryEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cUserJourneyEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "cloudAppDiscoveryEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "mimSsgmGroupActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "ssgmGroupActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "mimSsprActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "ssprActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "mimSsprRegistrationActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "ssprRegistrationActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "threatenedCredentials",
    "LicenseRequired":  "False"
}
{
    "Name":  "compromisedCredentials",
    "LicenseRequired":  "False"
}
{
    "Name":  "auditEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "accountProvisioningEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "signInsFromUnknownSourcesEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "signInsFromIPAddressesWithSuspiciousActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "signInsFromMultipleGeographiesEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "signInsFromPossiblyInfectedDevicesEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "irregularSignInActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "allUsersWithAnomalousSignInActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "signInsAfterMultipleFailuresEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "applicationUsageSummary",
    "LicenseRequired":  "True"
}
{
    "Name":  "userActivitySummary",
    "LicenseRequired":  "False"
}
{
    "Name":  "groupActivitySummary",
    "LicenseRequired":  "True"
}

How to Access the Reporting API using PowerShell

What you need to do is;

  • Register a WebApp
    • Assign a Reply URL of https://localhost
    • Assign it the Read Directory data permission
  • Get an oAuth2 Authentication Code using an account that is either Global Admin or in the Security Admin or Security Reader Azure Roles
  • Use your Bearer and Refresh tokens to query for the reports you’re interested in

Register your WebApp

In the Azure Portal create a new Web app/API app and assign it https://localhost as the Reply URL. Record the Application ID for use in the PowerShell script.

Assign the Read Directory data permission as shown below

Obtain a key from the Keys option on your new Web App.  Record it for use in the PowerShell script.

Generate an Authentication Code, get a Bearer and Refresh Token

Update the following script, changing Lines 5 & 6 for the ApplicationID/ClientId and Client Secret for the WebApp you created above.

Run the script and you will be prompted to authenticate. Use an account in the tenant where you created the Web App that is a Global Admin or in the Security Admin or Security Reader Azure Roles. You will need to change the location where you want the refresh.token stored (line 18).

If you’ve done everything correctly, you have authenticated and got an AuthCode, which was then used to get your Authorization Tokens. The value of the $Authorization variable should look similar to this;

Now you can use the Refresh token to generate new Authorization Tokens when they time out, simply by calling the Get-NewTokens function included in the script above.
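If you can’t see the embedded script, the core exchange it performs looks roughly like this sketch; it assumes $authCode has already been captured from the https://localhost redirect after the interactive sign-in.

# Sketch: exchange the oAuth2 authorization code for Bearer and Refresh tokens
$clientId = "<Application ID of your WebApp>"      # line 5 in the full script
$clientSecret = "<Key from your WebApp>"           # line 6 in the full script
$body = @{
    grant_type    = "authorization_code"
    client_id     = $clientId
    client_secret = $clientSecret
    code          = $authCode
    redirect_uri  = "https://localhost"
    resource      = "https://graph.windows.net"
}
$tokens = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/common/oauth2/token" -Body $body
$Authorization = "Bearer $($tokens.access_token)"
$refreshToken = $tokens.refresh_token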

Querying the Reporting API

Now that you have the necessary prerequisites sorted you can query the Reporting API.

Here are a couple of simple queries to return some data to get you started. Update the script for the tenant name of your AzureAD. With the $Authorization values from the script above you can get data for the MIM Hybrid Reports.
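The query scripts were embedded as gists; a minimal sketch of one of them looks like this (the tenant name is a placeholder, and $Authorization comes from the token script above).

# Sketch: query one of the MIM Hybrid Reports via the Reports and Events API
$tenant = "customer.onmicrosoft.com"
$reportName = "mimSsprActivityEvents"
$url = "https://graph.windows.net/$tenant/reports/$($reportName)?api-version=beta"
$report = Invoke-RestMethod -Method Get -Uri $url -Headers @{ Authorization = $Authorization }
$report.value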