Automating the generation of Microsoft Identity Manager Configuration Documentation

Introduction

Last year Microsoft released the Microsoft Identity Manager Configuration Documenter, which is available here. It is a fantastic little tool from Microsoft that supersedes its predecessor from the Microsoft Identity Integration Server (MIIS) 2003 Resource Toolkit (which only documented the Sync Server configuration).

Running the tool (a PowerShell module) against a base out-of-the-box reference configuration for FIM/MIM servers, reconciled against a configuration exported from an implementation's MIM Sync and MIM Service servers, generates an HTML report that details the existing configuration of the MIM Service and MIM Sync.

Overview

Last year I wrote this post based on an automated solution I implemented to perform nightly backups of a FIM/MIM environment during development.

This post details how I’ve automated another daily task for a large development environment where a number of changes are going on and I wanted documentation generated that detailed the configuration for each day: partly so I can quickly work out what has changed when I need to re-validate or roll back changes, and partly so the individual configs from each day are available if a rollback is required.

The process uses an Azure Function App that uses Remote PowerShell into MIM to;

  1. Leverage a modified (streamlined) version of my nightly backup Azure Function to generate the Schema.xml and Policy.xml MIM Service configuration files, and the Lithnet MIIS Automation PowerShell Module installed on the MIM Sync Server to export the MIM Sync Server configuration
  2. Create a sub-directory for each day under the MIM Documenter Tool to hold the daily configs
  3. Execute the generation of the Report and have the Report copied to the daily config/documented solution

Obtaining and configuring the MIM Configuration Documenter

Download the MIM Configuration Documenter from here and extract it to somewhere like c:\FIMDoco on your FIM/MIM Sync Server. In this example, in my Dev environment, I have the MIM Sync and Service/Portal all on a single server.

Then update the Invoke-Documenter-Contoso.ps1 (or whatever you’ve renamed the script to) to make the following changes;

  • Update the following lines for your version, include the new variable $schedulePath and add it to the $pilotConfig variable. Create the C:\FIMDoco\Customer and C:\FIMDoco\Customer\Dev directories (replace Customer with something appropriate).
######## Edit as appropriate ####################################
$schedulePath = Get-Date -format dd-MM-yyyy
$pilotConfig = "Customer\Dev\$($schedulePath)" # the path of the Pilot / Target config export files relative to the MIM Configuration Documenter "Data" folder.
$productionConfig = "MIM-SP1-Base_4.4.1302.0" # the path of the Production / Baseline config export files relative to the MIM Configuration Documenter "Data" folder.
$reportType = "SyncAndService" # "SyncOnly" # "ServiceOnly"
#################################################################
  • Remark out the Host Settings as these won’t work via a WebJob/Azure Function
#$hostSettings = (Get-Host).PrivateData
#$hostSettings.WarningBackgroundColor = "red"
#$hostSettings.WarningForegroundColor = "white"
  • Remark out the last line as this will be executed as part of the automation and we want it to complete silently at the end.
# Read-Host "Press any key to exit"

It should then look something like this;

Azure Function to Automate execution of the Documenter

As per my nightly backup process;

  • I configured my MIM Sync Server to accept Remote PowerShell Sessions. That involved enabling WinRM, creating a certificate, creating the listener, opening the firewall port and enabling the incoming port on the NSG. You can easily do all that by following my instructions here. From the same post I set up the encrypted password file, uploaded it to my Function App and set the Function App Application Settings for MIMSyncCredUser and MIMSyncCredPassword.
  • I created an Azure PowerShell Timer Function App. Pretty much the same as I show in this post, except choose Timer.
    • I configured my Schedule for 6am every morning using the following CRON configuration
0 0 6 * * *
  • I also needed to increase the timeout for the Azure Function, as the generation of the files and the execution of the report exceeded the default timeout of 5 mins in my environment (19 Management Agents). I increased the timeout to the maximum of 10 mins as detailed here. Essentially, I added the following to the host.json file in the wwwroot directory of my Function App.
{
 "functionTimeout": "00:10:00"
}

Azure Function PowerShell Timer Script (Run.ps1)

This is the Function App PowerShell Script that uses Remote PowerShell into the MIM Sync/Service Server to export the configuration using the Lithnet MIIS Automation and Microsoft FIM Automation PowerShell modules.

Note: If your MIM Service is on a different host you will need to install the Microsoft FIM Automation PowerShell Module on your MIM Sync Server and update the script below to change references to http://localhost:5725 to whatever your MIM Service host is.
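The full Run.ps1 is specific to my environment, but the following condensed sketch shows its shape. Server names, paths and the credential decryption are placeholders, and the Lithnet export cmdlet usage is as per my nightly backup post; verify it against the module version you have.

# Condensed sketch only; server name, paths and credential handling are placeholders
$username = $env:MIMSyncCredUser
$securePassword = $env:MIMSyncCredPassword | ConvertTo-SecureString   # decrypt per the key-file method in the backup post
$cred = New-Object System.Management.Automation.PSCredential ($username, $securePassword)

$session = New-PSSession -ComputerName "mimsync.customer.com" -Credential $cred -UseSSL

Invoke-Command -Session $session -ScriptBlock {
    $schedulePath = Get-Date -Format dd-MM-yyyy
    $configDir = "C:\FIMDoco\Data\Customer\Dev\$($schedulePath)"
    New-Item -ItemType Directory -Path $configDir -Force | Out-Null

    # MIM Service config via the FIMAutomation snap-in
    Add-PSSnapin FIMAutomation
    Export-FIMConfig -Uri http://localhost:5725 -schemaConfig | ConvertFrom-FIMResource -File "$configDir\schema.xml"
    Export-FIMConfig -Uri http://localhost:5725 -policyConfig -portalConfig | ConvertFrom-FIMResource -File "$configDir\policy.xml"

    # MIM Sync Server config via the Lithnet MIIS Automation module (cmdlet per my backup post)
    Import-Module LithnetMIISAutomation
    Export-MIMSyncConfiguration -Path $configDir

    # Generate the Documenter report
    Set-Location C:\FIMDoco
    .\Invoke-Documenter-Contoso.ps1
}

Remove-PSSession $session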

Testing the Function App

With everything configured, manually running the Function App and checking the output window will, if you’ve configured everything correctly, show success in the Logs as shown below. In this environment with 19 Management Agents it takes 7 minutes to run.

Running the Azure Function.PNG

The Report

The outcome every day just after 6am is that I have (via automation);

  • an Export of the Policy and Schema Configuration from my MIM Service
  • an Export of the MIM Sync Server Configuration (the Metaverse and all Management Agents)
  • the MIM Configuration Documenter Report generated
  • the ability to roll back changes on a daily basis if I need to (either for a MIM Service change or an individual Management Agent change)

Under the c:\FIMDoco\Data\Customer\Dev\<date>\Report directory is the HTML Configuration Report.

Report Output.PNG

Opening the report in a browser we have the configuration of the MIM Sync and MIM Service.

Report


Provisioning Hybrid Exchange/Exchange Online Mailboxes with Microsoft Identity Manager

Introduction

Working for Kloud, all our projects involve cloud services, and all our customers have varying and unique requirements. Recently one of our customers embarked on their migration from on-premise Exchange to Exchange Online. Nothing really groundbreaking there, however they had a number of unique requirements, including management of Litigation Hold, and that needed to be integrated with their existing Microsoft Identity Manager implementation (which currently provisions new users to their Exchange 2013 environment). They also required that management of the Exchange environment still be possible via the Exchange Management Console against a local Exchange server. This post details how I integrated the environments using MIM.

Overview

In order to integrate the Provisioning and Lifecycle management of Exchange Online Mailboxes in a Hybrid Exchange deployment with Microsoft Identity Manager, I created a custom PowerShell Management Agent, simply because it provides the flexibility I needed.

Provisioning is based on the following process;

  1. MIM Creates new user in Active Directory (no changes to existing MIM provisioning process)
  2. Azure Active Directory Connect synchronises the user to Azure Active Directory
  3. The Exchange Online MIM Management Agent sees the corresponding AAD account for the new user
  4. MIM Declarative Rules trigger the creation of a new Remote Mailbox for the AD/AAD user against the local Exchange 2013 On Premise Server. This allows the EMC to be used to manage mailboxes On Premise even though the mailbox resides in Office365/Exchange Online
  5. AADC/Exchange synchronises the information as part of the Hybrid Exchange topology
  6. MIM sees the EXO Mailbox configuration for the new user and enables Litigation Hold against the EXO Mailbox (if required)

The following diagram graphically depicts this process.

EXO IDM Provisioning Solution.png

Exchange Online PowerShell MA

As always I’m using my favourite PowerShell Management Agent, the Granfeldt PS MA, now available on Github here.

Schema Script

The Schema script configures the schema required for current and future EXO management requirements. The Schema is based on a single Object Class “MailUser” but pulls the information from a combination of Azure AD User and Exchange Online Mailbox object classes for an associated account. Azure AD User objects are prefixed by ‘AAD’. Non AAD prefixed attributes are EXO Mailbox attributes.
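If you haven’t used the Granfeldt PS MA before, the Schema script simply returns a template object whose NoteProperty names are in Attribute|Type format. A cut-down sketch (the attribute names here are indicative only, not my full schema);

$obj = New-Object -TypeName PSCustomObject
$obj | Add-Member -MemberType NoteProperty -Name "Anchor-id|String" -Value "df39d3f5-0000-0000-0000-000000000000"   # AAD ObjectId as the anchor
$obj | Add-Member -MemberType NoteProperty -Name "objectClass|String" -Value "MailUser"
$obj | Add-Member -MemberType NoteProperty -Name "AADUserPrincipalName|String" -Value "user@customer.com"
$obj | Add-Member -MemberType NoteProperty -Name "AADAccountEnabled|Boolean" -Value $true
$obj | Add-Member -MemberType NoteProperty -Name "DisplayName|String" -Value "Display Name"
$obj | Add-Member -MemberType NoteProperty -Name "LitigationHoldEnabled|Boolean" -Value $false
$obj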

Import Script

The Import script connects to both Azure AD and Exchange Online to retrieve Azure AD User accounts and, if present, the associated mailbox for each user.

It retrieves all Member AAD User Accounts and puts them into a hash table; connectivity to AAD is via the AzureADPreview PowerShell module. It also retrieves all Mailboxes into a second hash table. It then processes all the mailboxes first, including the associated AAD User account (utilising a join via userPrincipalName).

Once all mailboxes have been processed, the remaining AAD accounts (those without mailboxes) are processed, as sketched below.
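The full Import script is long, but the core join logic looks something like this sketch (session setup abbreviated; the EXO remote session providing Get-Mailbox is assumed to already be imported);

Import-Module AzureADPreview
Connect-AzureAD -Credential $aadCred | Out-Null          # $aadCred built from the MA's Username/Password

# All Member AAD users, keyed by UPN
$aadUsers = @{}
Get-AzureADUser -All $true | Where-Object { $_.UserType -eq "Member" } | ForEach-Object { $aadUsers[$_.UserPrincipalName] = $_ }

# All EXO mailboxes, keyed by UPN (Get-Mailbox imported from the EXO remote session)
$mailboxes = @{}
Get-Mailbox -ResultSize Unlimited | ForEach-Object { $mailboxes[$_.UserPrincipalName] = $_ }

# Mailboxes first, joined to their AAD user on userPrincipalName...
foreach ($upn in $mailboxes.Keys) {
    $aadUser = $aadUsers[$upn]
    # ...build the PSMA import object from $mailboxes[$upn] + $aadUser here
}
# ...then the remaining AAD accounts without mailboxes
foreach ($upn in ($aadUsers.Keys | Where-Object { -not $mailboxes.ContainsKey($_) })) {
    # ...build the AAD-only import object here
}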

Export Script

The Export script performs the necessary integration against the on-premise Exchange 2013 Server for provisioning, and against Exchange Online for the rest of the management. Both utilise Remote PowerShell. It also leverages the Lithnet MIIS Automation PowerShell Module to query the Metaverse and validate current object statuses.
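As a sketch, the two halves of that logic look like the following. Server names, credentials and the routing domain are placeholders, and variables such as $samAccountName/$upn come from the PSMA export object.

# Provisioning: create the Remote Mailbox via the on-premises Exchange 2013 server
$opSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri http://exchange2013.customer.com/PowerShell/ -Credential $opCred
Import-PSSession $opSession -CommandName Enable-RemoteMailbox -AllowClobber | Out-Null

Enable-RemoteMailbox -Identity $samAccountName -RemoteRoutingAddress "$alias@customer.mail.onmicrosoft.com"

# Management: enable Litigation Hold against the EXO mailbox once it exists
$exoSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $exoCred -Authentication Basic -AllowRedirection
Import-PSSession $exoSession -CommandName Set-Mailbox -AllowClobber | Out-Null

Set-Mailbox -Identity $upn -LitigationHoldEnabled $true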

Wiring it all up

The scripts above will allow you to integrate a FIM/MIM implementation with AAD/EXO for management of users’ EXO Mailboxes. You’ll need connectivity from the MIM Sync Server to AAD/O365 in order to manage them. Everything else I wired up using a few Sets, Workflows, Sync Rules and MPRs.

Geographically Visualizing your workforce using Microsoft Identity Manager, xMatters and Power BI


Introduction

In the last couple of weeks I’ve posted about visualizing relationships of data from Microsoft Identity Manager using Power BI. Earlier this week I posted about building a Management Agent for Microsoft Identity Manager to integrate with xMatters.

In this post I combine data from those two posts in order to visualise the geographic office locations for an organisation, along with summary data about each location (how many employees are located there, and which departments).

Prerequisites

You’ll need an Azure AD and Office 365 subscription to allow you to create a Power BI Application. To create a Power BI Application see Registering a Power BI Application in this post here.

You’ll also need the Power BI PowerShell Module. I’m using 2.0.0.9 available from the PowerShell Gallery here and of course the Lithnet MIIS PowerShell Module available from here.

Overview

Using our registered Power BI Application we’ll create a Dataset consisting of two tables. One for the xMatters Sites (that we also get the geographic co-ordinates of from the xMatters Management Agent), and the other with our xMatters Users that contains the officeLocation that maps to an xMatters Site.

I create a relationship between the two tables on xMattersSite displayName (which is the location name) and the xMattersUsers officeLocation. We can then create a nice visual using data from both tables.

Create the Dataset (two tables with relationship)

Initially I tried to create the dataset with a relationship as I’ve previously shown here. However that didn’t work. After some debugging and trial and error using the Power BI API Explorer I got the result I wanted. So I’ll provide you with the raw JSON format for creating a new Dataset, two tables (xMattersSites and xMattersUsers) and a relationship between them (where xMattersSites\displayName joins with xMattersUsers\officeLocation), as per my xMatters Management Agent detailed here.

Start by authenticating to the Power BI API Explorer with an account in the environment where you created your Power BI Application and navigate to the Create Dataset section here.

Create Dataset

Update this JSON formatted object that details the Dataset, Tables and Relationships for your environment.
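The shape of that JSON body is sketched below. The table and relationship names match my xMatters MA; treat the column lists as indicative for your environment rather than my exact schema.

{
  "name": "xMatters",
  "tables": [
    {
      "name": "xMattersSites",
      "columns": [
        { "name": "displayName", "dataType": "String" },
        { "name": "latitude", "dataType": "Double" },
        { "name": "longitude", "dataType": "Double" }
      ]
    },
    {
      "name": "xMattersUsers",
      "columns": [
        { "name": "targetName", "dataType": "String" },
        { "name": "displayName", "dataType": "String" },
        { "name": "department", "dataType": "String" },
        { "name": "officeLocation", "dataType": "String" }
      ]
    }
  ],
  "relationships": [
    {
      "name": "SitesToUsers",
      "fromTable": "xMattersUsers",
      "fromColumn": "officeLocation",
      "toTable": "xMattersSites",
      "toColumn": "displayName",
      "crossFilteringBehavior": "bothDirections"
    }
  ]
}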

Paste your validated JSON object into the Body section of the API Explorer and select Call Resource.

Dataset Body

If your JSON object is formatted correctly you’ll get a 201 response and your DataSet and Tables with Relationship will be created.

Create Success

Switching over to Power BI you’ll see the xMatters Dataset in the bottom left, and the two tables on the right-hand side with their columns.

xMatters DataSet PBI.PNG

Load xMatters User Data into Power BI

Now that we have somewhere to put the data, let’s populate the dataset. I’m using the Lithnet MIIS Automation PowerShell Module (detailed in the prerequisites) to query the Metaverse and return all users. Then I refine the list down to those that are Active (based on my employeeActive Boolean attribute) and finally to only those users that are connected on the xMatters Management Agent (see lines 14 & 18).

The script will drop any existing values from the xMatters Users table then upload what we have retrieved from the Metaverse (and refined).
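A condensed sketch of that script is below. The Lithnet query/filter pattern and the attribute access are approximate; check them against your module versions.

Import-Module LithnetMIISAutomation
Import-Module PowerBIPS -RequiredVersion 2.0.0.9

# Active users from the MV
$queries = @(New-MVQuery -Attribute employeeActive -Operator Equals -Value $true)
$mvUsers = Get-MVObject -ObjectType person -Queries $queries

# Only users connected on the xMatters MA
$xmUsers = $mvUsers | Where-Object { $_.CSMVLinks.ManagementAgentName -contains "xMatters" }

$rows = foreach ($u in $xmUsers) {
    [pscustomobject]@{
        targetName     = $u.Attributes["accountName"].Values.ValueString      # attribute access pattern approximate
        displayName    = $u.Attributes["displayName"].Values.ValueString
        department     = $u.Attributes["department"].Values.ValueString
        officeLocation = $u.Attributes["officeLocation"].Values.ValueString
    }
}

$authToken = Get-PBIAuthToken -ClientId $clientID
$dataSet   = Get-PBIDataSet -authToken $authToken -dataSetName "xMatters"

Clear-PBITableRows -authToken $authToken -dataSetId $dataSet.id -tableName "xMattersUsers"
Add-PBITableRows   -authToken $authToken -dataSetId $dataSet.id -tableName "xMattersUsers" -rows $rows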

Upload Users.PNG

Load xMatters Site Data into Power BI

Again I’m also using the Lithnet MIIS Automation PowerShell Module to query the Metaverse and return all xMatters Sites.

The script will drop any existing values from the xMatters Sites table then upload what we have retrieved from the Metaverse.

Upload Sites.PNG

Creating the Power BI Visual

Now that we have data we can build the visual. I’m using the ArcGIS Maps for Power BI visual, which is available in the default set of visuals. Then, by selecting displayName and geo, the map will automagically show all xMatters Sites at their respective co-ordinates.

xMatters Sites to Map

We can then add a Card Visual, choose officeLocation and configure the visual for Count of officeLocation, which gives a count of the employees at that location. As we can see below, with the Sydney location selected on the map the card updates to tell me there are 665 employees at that officeLocation.

Count of Employees at Selected Location

Pretty quickly we can also expand out other data points, such as departments at a location, employees and so on, as shown below (I’ve obfuscated the departments and a number of the other office locations).

Summary.PNG

Conclusion

We haven’t generated any new data. We’ve taken information we already have in Microsoft Identity Manager from connected systems and quickly visualized it via Power BI. However, providing this to the business, with the ability for consumers of the information to export it from the visual, can be pretty powerful.

Building a FIM/MIM Management Agent for xMatters

Introduction

A couple of weeks ago one of my customers had a requirement to provision and manage identities into xMatters. The xMatters API documentation looked straightforward and I figured it would be pretty quick to knock up a PowerShell Management Agent.

The identification of users (People) in xMatters was indeed pretty quick. I was quickly able to enumerate all users (that had initially been seeded independently of FIM/MIM) and join them to corresponding users in the Metaverse.

It was then as I started digging deeper that the relationship between Sites (Locations) and Email/Mobile (Devices) attributes became apparent. This post details how I approached it and a base xMatters MA that should get you started if you need to do something similar.

Overview

A key concept to keep in mind is that at the simplest level there are 3 key Object Types in xMatters;

  • People
    • User Objects along with basic naming attributes
  • Device
    • Each contact medium is a device. Email Address, Mobile Phone, Home Phone, Text Phone (SMS) etc.
  • Site
    • Location of the entity (person)

Associated with each is an id which can be either dynamically created on provisioning (by xMatters) or specified. For People there is also targetName which is the equivalent of UID/sAMAccountName. When using the API (for people) you can use either their ID or their targetName. For all other entities you need to use the ID.

For each entity, as you’d expect, there are different API URIs. They are;

  • Base URI https://customer.hosted.xmatters.com
  • People URI https://customer.hosted.xmatters.com/api/xm/1/people
  • Devices URI https://customer.hosted.xmatters.com/api/xm/1/devices
  • Sites URI (legacy API) https://customer.hosted.xmatters.com/reapi/2015-04-01/sites

Finally to retrieve devices for a person use;

  • Devices associated with a person https://customer.hosted.xmatters.com/api/xm/1/people/{ID}/devices
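For orientation, a GET against the People and Devices URIs with basic auth looks like this sketch (paging omitted; the API caps page sizes, so handle the paging links for large directories);

$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes("apiuser:$apiPassword")) }
$baseURI = "https://customer.hosted.xmatters.com"

$people = Invoke-RestMethod -Method Get -Uri "$baseURI/api/xm/1/people?limit=1000" -Headers $headers
foreach ($person in $people.data) {
    # each person has both an id and a targetName; the devices call needs the id
    $devices = Invoke-RestMethod -Method Get -Uri "$baseURI/api/xm/1/people/$($person.id)/devices" -Headers $headers
    # $devices.data holds the Email/SMS etc. devices, each with its own id
}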

Other key points to consider that I uncovered are;

  • if you are updating a Device (e.g. someone’s Email Address or Phone Number) don’t specify the owner attribute (as you do when you create the Device). It considers that you are trying to change the owner and won’t allow it.
  • to update a Device you need to know the ID of the Device. I catered for this on my Import by bringing through the People and Device IDs.
  • When creating/updating a users location you need to specify the Site ID and Site Name. I brought these through as a separate ObjectClass into FIM/MIM and query the MV for them when Exporting
  • In my initial testing the API returned a number of different errors 400 (Bad Request), 409 Conflict (when trying to Add a Device that already exists), 404 (Not Found) along with API Timeouts. You need to account for these and perform processing appropriately
  • On success of Update, Create or Delete the API returns the full object that you performed the operation on. You need to capture this and let MIM know that on Success a full object being returned is Success and not an error
  • xMatters expects phone numbers to be in E.164 format (e.g. +61 400 123 456). I catered for this on import on another Management Agent
  • xMatters time zones are in Country/Region format. For Australia they are as follows (and correct, it doesn’t accept Australia/Canberra for the ACT);
$timeZones = @{
    "NSW" = "Australia/Sydney"
    "VIC" = "Australia/Melbourne"
    "QLD" = "Australia/Brisbane"
    "ACT" = "Australia/Sydney"
    "WA"  = "Australia/Perth"
    "TAS" = "Australia/Hobart"
    "NT"  = "Australia/Darwin"
}

xMatters PowerShell Management Agent

With all that introduction, here is a base xMatters PowerShell MA (implemented using the Granfeldt PowerShell MA) to get you started. You’ll need to tailor it for your environment, trigger Provisioning, Deletes and Flow Rules appropriately, and handle the xMatters API for your integration.

Schema Script

I’ve created two Object Classes: User and Site. User incorporates the User’s Devices; Site is the locations (Sites) from xMatters.

Import Script

Credentials for the Import script to connect to xMatters are flowed in from the Management Agent Username and Password attributes. This isn’t using Paged Imports. If you have a large number of users you may want to consider that. After retrieving all of the People entities each is queried to obtain their Devices. I’m only bringing through SMS and Email Devices. You’ll need to modify for additional Devices.

Ensure that you flow into the Metaverse (onto custom attributes) the IDs associated with your Devices (e.g. MobileID and EmailID). That will allow you to use the ID when updating those attributes.

For Sites, I created a custom ObjectClass (Site) in the MV and used the SiteID as the objectID and the Site Name as the displayName (as shown below).

Attribute Flows.png

Export Script

This is where it gets a little more complicated. As PowerShell is not good at reporting web request responses, we have to deal with the return from each API call and determine whether we were successful or not, then let FIM/MIM know so it can report that via the UI.

The Export script below deals with Adding, Deleting and Updating users. Update line 31 for your API URI for xMatters.

Summary

The detail above will get you started and give you a working Management Agent to import Users and Sites. You’ll need to do the usual steps (Set, Workflow, Sync Rule and MPR) to trigger Provisioning on the MA, and decide how you will handle deletes.

Graphically Visualizing Identity Hierarchy and Relationships

Almost 15 years ago Microsoft released Microsoft Identity Integration Server (MIIS) 2003. Microsoft also released a couple of Resource Toolkits for MIIS to assist customers and IT integrators implement the product, as up to that time its predecessor (Microsoft Metadirectory Services) was only available as part of a Microsoft Consulting engagement.

At the same time Microsoft provided a Beta product – Microsoft PolyArchy Server. For someone whose brain is wired in a highly visual way, this was a wow moment. PolyArchy Server took a dataset from the Synchronisation Server and wrapped a small IIS website around it to expose intersecting relationships between data. When you selected a datapoint the visual would flip to the new context and display a list of entities associated with that relationship.

Microsoft proposed to deliver PolyArchy Server in calendar year 2006. However the product never made it to market. The concept of visualizing identity data was seeded in my brain and something I’ve always surfaced in one method or another as part of many Identity Management projects.

In this post I’ll detail how I’ve recently used Power BI to visualize relationship data from Microsoft Identity Manager.  The graphic below is an example (with node labels turned off) that represents Managers by Department by State.

Managers by Dept by State - Graphical.png

Using filters in the same report allows whoever is viewing it to refine the visual based on State and Department. Selecting a State from the map dynamically updates the visual to show that state only; selecting a department shows only that department in each state.

Managers by Dept by State - Filtered.png

Hovering over the nodes displays the detail. I’ve turned off the node labels so as not to expose the source of my dataset.

Managers by Dept by State - NSW Detail.png

Getting MIM MV User MetaData into Power BI

My recent post here details the necessary steps to get started publishing data directly in a Power BI Dataset using PowerShell. Follow the details listed there to register a Power BI Application.

Creating the DataSet

With that done, the script below will create a DataSet in Power BI. My dataset is obviously specific to the environment I developed it in. You probably won’t have some of the attributes, so you will need to update accordingly. The script is designed to run on the MIM Sync Server, and the MIM Sync Server will need to be able to connect to Azure and Power BI.

Publish data to the DataSet

Now that we have a Power BI DataSet (Table) we need to extract the data from the MIM MV and push it into the table. Using the Lithnet MIIS Automation PowerShell Module makes this extremely simple. Using the table schema created above I retrieve the values for each Active User, build a PowerShell Object and use the Power BI PowerShell Module to push the data to Power BI.

Creating the Power BI Visualization

The visualisation I’m using is the Journey Chart by MAQ Software which is available in the Power BI Store (free).

Journey Visual.PNG

With the Journey Visualization selected and dropped in we just have to select the attributes we want to visualize and the order of the relationships. The screenshot below shows the data sorted by State => managerName => accountName with Measure Data being accountName.

Visual Config.PNG

Conclusion

We never got PolyArchy Server from Microsoft, but we can quickly visualize basic relationship data from MIM with Power BI.

Automate the update of the data into Power BI, embed the Power BI Reports into your MIM Portal and provide access to the appropriate personnel.


A modern way to track FIM/MIM Attribute Value History utilizing Power BI

Introduction

Microsoft Identity Manager is fantastic for keeping data consistent between connected systems. Often however you want to know what a previous value of an attribute was. FIM/MIM however can only tell you the current value and the Management Agent it was received on and when.

In the past, where I’ve had to provide a solution either to make sure an attribute keeps a unique value forever (e.g. email addresses or loginIDs that must never be reused) or just to keep attribute value history, I’ve used two different approaches;

  • Store previous values in an SQL Table and have an SQL MA that flows out the values
  • Store historical values in a Multi-Valued attribute on the user object in the Metaverse

Both are valid approaches but often fall down when you want to quickly get a report on that metadata.

Recently we had a similar request: to be able to know when employees’ EndDates were updated in HR. This is specifically useful for contractors who have their contracts extended. Instead of stuffing the info into a Multi-Valued attribute or an SQL DB, this time I used Power BI. This provides the benefit of being able to quickly develop a graphical report and embed it in the FIM/MIM Portal.

Such a report looks like the screenshot below.

Power BI Report

Using the filters on the right hand side of the report you can find a user (by EmployeeID or DisplayName), select them and see attribute value history details for that user in the main part of the report. As per the screenshot below Andrew’s EndDate was originally the 8th of December (as received on the 5th of November), but was changed to the 24th of November on the 13th of November.

End Date History

In this Post I describe how I quickly built the solution.

Overview

The process to do this involves;

  • creating a Power BI Application
  • creating a Power BI Dataset
  • creating a script to retrieve the data from the MV and inject it into the Power BI Dataset
  • creating a Power BI Report for the data
  • embedding the Report in the MIM Portal

Registering a Power BI Application

Head over to Power BI for Developers and register an Application for Power BI. Login to Power BI with an account for the tenant you’ll be reporting data for. Give your Application a name and choose Native Application. Set the Redirect URL to https://localhost.

CreatePBIApp

Choose the permissions for your Application. As we’ll be writing data into Power BI you’ll need a minimum of Read and Write all Datasets. Select Register App.

Create PBI App Permissions

Record your Client ID for your Application. We’ll need this to connect to Power BI.

Register the App

We need to authenticate to Power BI the first time using a UI to provide Authorization for our Application. In order to do that we need to add another Reply URL to our application. Head to the Apps Dev Portal, select your application and Edit the Application Manifest. Add an additional Reply URL for https://login.live.com/oauth20_desktop.srf as shown below.

Add Reply URL for AuthZ

The following PowerShell commands will then allow us to authenticate utilizing the Power BI PowerShell module. If you don’t have it installed, un-comment Install-Module PowerBIPS -RequiredVersion 1.2.0.9 -Force to install it.

Update for your Client ID for the App you registered in the previous steps.

# Install-Module PowerBIPS -RequiredVersion 1.2.0.9 -Force
Import-Module PowerBIPS -RequiredVersion 1.2.0.9

# PowerBI App
$clientID = "4036df76-4de6-43cb-afe6-1234567890"

$authtoken = Get-PBIAuthToken -ClientId $clientID

Sign in with an account for the Tenant where you created the Power BI App.

Interactive Login for Dataset Creation

Accept the permissions you chose when registering the Power BI App.

Authorize PowerBI App

Creating the Power BI Dataset

Now we will create the Power BI Table (Dataset) that we will use when we insert the records.

My table is named Employee and the DataSet EmployeeEndDateReport. I’m keeping the table slim, with just enough info for our purpose: the date added to the dataset, the employee’s AccountName, DisplayName, Active state, EndDate and EndDateReceived. The following script will create the Dataset.
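The creation itself boils down to something like this sketch (using the PowerBIPS module authenticated above; storing Active as a string here is a judgment call, adjust to taste);

$dataSetSchema = @{
    name = "EmployeeEndDateReport"
    tables = @(
        @{
            name = "Employee"
            columns = @(
                @{ name = "Date";            dataType = "Datetime" }
                @{ name = "AccountName";     dataType = "String" }
                @{ name = "DisplayName";     dataType = "String" }
                @{ name = "Active";          dataType = "String" }
                @{ name = "EndDate";         dataType = "Datetime" }
                @{ name = "EndDateReceived"; dataType = "Datetime" }
            )
        }
    )
}

New-PBIDataSet -authToken $authtoken -dataSet $dataSetSchema -Verbose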

Populating the Dataset

With our table created, let’s populate it with employees that have an EndDate. As this is the first time we run it, we set a watermark date to add people from; I’ve gone with the previous year. I then query the MV for Employees with an EndDate within the last 365 days, build a PowerShell Object with the columns from our table and insert them into Power BI. I also set a watermark for the last time we had an EndDate received from the MA and output that to the watermark file, so that next time we can quickly get only users with an EndDate received since the last run.

NOTE: for full automation you’ll need to change line 6 to your secure method of choice for providing credentials to scripts.
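A condensed sketch of that populate-and-watermark logic follows. The attribute names are from my environment, and the query pattern and the metadata property for the "received" time are assumptions to verify against your Lithnet module version.

Import-Module LithnetMIISAutomation

$watermarkFile = "C:\scripts\EndDateWatermark.txt"
if (Test-Path $watermarkFile) { $watermark = Get-Date (Get-Content $watermarkFile) }
else { $watermark = (Get-Date).AddDays(-365) }

$queries = @(New-MVQuery -Attribute employeeEndDate -Operator IsPresent)   # query pattern approximate
$users   = Get-MVObject -ObjectType person -Queries $queries

$rows = foreach ($u in $users) {
    $received = $u.Attributes["employeeEndDate"].LastModified              # metadata property name assumed
    if ($received -gt $watermark) {
        [pscustomobject]@{
            Date            = Get-Date
            AccountName     = $u.Attributes["accountName"].Values.ValueString
            DisplayName     = $u.Attributes["displayName"].Values.ValueString
            Active          = [string]$u.Attributes["employeeActive"].Values.ValueString
            EndDate         = $u.Attributes["employeeEndDate"].Values.ValueString
            EndDateReceived = $received
        }
    }
}

if ($rows) {
    $dataSet = Get-PBIDataSet -authToken $authtoken -dataSetName "EmployeeEndDateReport"
    Add-PBITableRows -authToken $authtoken -dataSetId $dataSet.id -tableName "Employee" -rows $rows

    # Save the newest EndDateReceived as the watermark for the next run
    ($rows | Measure-Object -Property EndDateReceived -Maximum).Maximum.ToString("o") | Out-File $watermarkFile
}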

Create a Power BI Report

Now in Power BI select your DataSet and design your report. Here is a sample one that I’ve put together. I simply selected the columns from the dataset and updated the look and feel. I then added in columns (individually for AccountName, DisplayName and Active) and chose each as a Filter so that I have various ways of finding whoever I’m looking for.

Power BI Report.png

Once the process has run for a while and values have changed for the attribute you are keeping history for, selecting a user with changed values will show that history.

End Date History

Summary

To complete the solution you’ll want to automate the script that queries the MV for changes (probably after each run from the MA that provides the attribute you are recording history for), and you’ll want to embed the report in the MIM Portal. In this post here I detail how to do that step by step.


Validate Your Authoritative Sources – Creating a Fuse for FIM/MIM Import and Sync run cycles


Introduction

The Microsoft Identity Manager Synchronisation Engine has been around for close to 20 years and is highly functional and very reliable.

The Achilles heel though for any IDAM Sync Engine will always be an authoritative source and the information it provides to the Sync Engine.

I’m seeing more and more SaaS services being used as the authoritative source for identity management systems. Think Success Factors and Workday. Connecting to these across the internet, the rate of change within organisations, and the common human factor of en-masse changes all mean it is even more important to validate your import feeds before processing them through your Sync Engine’s business logic.

Overview

This post details the foundation of a little logic that will call an Import from the Authoritative Source and analyse what is returned, evaluating it against the previous Import to understand the number of objects expected and determine whether it is within an acceptable tolerance (I’m using 1% of total managed objects). If it doesn’t check out, don’t run a Sync; send an email to someone to check it out and make a rational human decision (and maybe a manual sync). If the Import is valid then run the Sync.

Simply put;

  • Query the Authoritative Management Agent and get the last Import Run
    • set variables for the total number of objects processed; Adds, Renames, Updates and Deletes (and Total)
  • Run an Import cycle only
    • set variables for the total number of objects processed; Adds, Renames, Updates and Deletes (and Total)
    • Evaluate each of the Staging Adds, Renames, Updates and Deletes and see if the number of changes is less than the tolerance (1%)
      • if yes, proceed with a Sync Run
      • if no, send a notification and don’t run the Sync
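As a skeleton, the fuse looks like this. Get-StagingCounts is a placeholder helper you’d implement against your run history (I pulled my counts from the last run’s details), and the Lithnet cmdlet usage is approximate.

Import-Module LithnetMIISAutomation

$maName     = "HR-AuthSource"
$tolerance  = 0.01                       # 1% of total managed objects
$smtpServer = "smtp.customer.com"
$from       = "mimsync@customer.com"
$to         = "idamteam@customer.com"

$last = Get-StagingCounts $maName                                          # placeholder: totals from the previous Import run
Start-ManagementAgent -MA (Get-ManagementAgent $maName) -RunProfileName "Delta Import"   # Import only, no Sync yet
$this = Get-StagingCounts $maName

$changes = $this.Adds + $this.Renames + $this.Updates + $this.Deletes
if ($changes -le ($last.Total * $tolerance)) {
    Start-ManagementAgent -MA (Get-ManagementAgent $maName) -RunProfileName "Delta Sync"
}
else {
    $smtpBody = "Import staged $changes changes against $($last.Total) managed objects. Sync was NOT run."
    Send-MailMessage -SmtpServer $smtpServer -From $from -To $to -Subject "MIM Fuse tripped on $maName" -Body $smtpBody
}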

Enhancements

This will need to be tailored for each environment. What is normal for the number of changes expected in your environment? You may see a lot of Updates, and the global 1% tolerance I have here may not work for that. So you may want a separate tolerance for each of Adds, Renames, Updates and Deletes.

Implementation

Where I’ve used this, I’ve saved the PowerShell script below into the same directory as the other scripts that automate the MIM Sync run sequence. I’ve updated the previous automation script, removed the Authoritative Delta Import / Delta Sync, Full Import / Delta Sync etc, and called this Auth Fuse script instead.

The Script

As always this uses the awesome LithnetMIISAutomation PowerShell Module from Ryan. Update;

  • lines 5-11 for your Auth Source.
  • lines 20-24 for SMTP Server and Notification settings
  • $smtpBody lines for what you want the notification emails to say

Summary

A simple piece of logic to check and validate your imports can save hours/days of work.

If your Auth Source doesn’t provide a full dataset and you haven’t checked your import before processing you could be deleting a LOT of accounts.

If HR changed the Org Structure and didn’t inform you or take into account IDAM Business Logic you could be about to process a lot of AD Account Moves. If it involved redundancies and they haven’t informed anyone yet, you could be exposing that information to an entire organisation. CHECK AND VALIDATE YOUR AUTHORITATIVE SOURCE IMPORTS !!

Display Microsoft Identity Manager Sync Engine Statistics in the MIM Portal

Introduction

In the Microsoft / Forefront Identity Manager Synchronization Service Manager, under Tools, we have a Statistics Report. This gives a breakdown of each of the Management Agents and the Connectors on each MA.

I had a recent requirement to expose this information for a customer, but I didn’t want them to have to connect to the Synchronization Server (or be given the permissions to allow them to). So I looked into another way of providing a subset of this information in the MIM Portal itself. This post details that solution.

MIM / FIM Synchronization Server Management Agent & Metaverse Statistics

Overview

I approached this in a similar way to the User Object Report I recently developed. The approach is;

  • Azure PowerShell Function App that uses Remote PowerShell to connect to the MIM Sync Server and leverage the Lithnet MIIS Automation PowerShell Module to enumerate all Management Agents and build a report on the information required in the report
  • A NodeJS WebApp calls the Azure PowerShell Function App onload to generate the report and display it
  • The NodeJS WebApp is embedded in the MIM Portal as a new Nav Bar Resource and Page

The graphic below details the basic logical integration.

MVStatsReportOverview

Prerequisites

The prerequisites to perform this I’ve covered in other posts. Conceptually, as described above, it is similar to the User Object Report, which has the same prerequisites; I detailed those pretty thoroughly here. That post is the required reading to get you ready to implement this.

Azure PowerShell Function App

Below is the raw script from my Function App that connects to the MIM Sync Server and retrieves the Management Agent Statistics for the report.
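Structurally, the Function body looks like this sketch. The per-MA counter methods on the MIIS WMI class are from memory, so verify the method names in your environment.

# Sketch of the Function body (remote session setup as per the prerequisites post)
$creds = New-Object System.Management.Automation.PSCredential ($env:MIMSyncCredUser, ($env:MIMSyncCredPassword | ConvertTo-SecureString))
$session = New-PSSession -ComputerName $env:MIMSyncServer -Credential $creds -UseSSL

$stats = Invoke-Command -Session $session -ScriptBlock {
    Get-WmiObject -Namespace root/MicrosoftIdentityIntegrationServer -Class MIIS_ManagementAgent | ForEach-Object {
        [pscustomobject]@{
            Name          = $_.Name
            Type          = $_.Type
            CSObjects     = $_.NumCSObjects().ReturnValue          # WMI method names from memory; verify
            Connectors    = $_.NumTotalConnectors().ReturnValue
            Disconnectors = $_.NumTotalDisconnectors().ReturnValue
        }
    }
}
Remove-PSSession $session

# Return as JSON for the NodeJS WebApp to render
$stats | ConvertTo-Json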

NodeJS Web App

The NodeJS Web App is the app that gets embedded in the MIM Portal; it calls the Azure Function to retrieve the data and then displays it. To get started you’ll want a base NodeJS WebApp. This post will get you started: Implementing a NodeJS WebApp using Visual Studio Code.

The only extension I’m using on top of what is listed there is JQuery. So once you have NodeJS up and running, in your VSCode Terminal type npm install jquery and then npm install.

I’ve kept it simple and contained all in a single HTML file using JQuery.

In your NodeJS project you will need to reference your report.html file. It should look like this (assuming you named your report report.html);

var express = require('express');
var router = express.Router();
/* GET - Report page */
router.get('/', function(req, res, next) {
   res.sendFile('report.html', { root:'./public'});
});

module.exports = router;

The Embedded Report

This is what my report looks like embedded in the MIM Portal.

Microsoft Identity Manager Statistics Report

Summary

Integration of FIM/MIM with Azure Platform as a Service services opens up a world of functionality, including the ability to expose information that was previously only obtainable by the FIM/MIM Administrator.

Receive Push Notifications from Microsoft/Forefront Identity Manager on your Mobile/Tablet/Computer

Background

Recently, in a FIM/MIM environment, a daily automated process was executing, but the task it was performing was dependent on an upstream process that generates a feed, and the schedule for that feed had changed (without notice to me). Needless to say, FIM/MIM wasn’t getting the information it needed to process. This got me thinking about notifications.

If you’re anything like me you probably have numerous email accounts, and your subconscious has all but programmed itself to ignore “new email” notifications. Push notifications, however, I typically do notice. Whilst in the example above I did have some error handling in place if the process completely failed (it is a development environment), I didn’t have anything for partial failures. Anyway, it got me thinking that I’d like to receive a notification if something that should happen didn’t.

Overview

This post details using push notifications to advise when expected events don’t transpire. In this particular example, I have an Azure Function App that connects once a day to an FTP Server, retrieves a series of exports and puts them on my FIM/MIM Synchronisation Server. The push notification service I am using is Push Bullet. Free Push Bullet accounts (without a Pro subscription) are limited to 500 pushes per month. That should be more than enough; if I’ve got errors in excess of 500 per month I’ve got much bigger problems.

Getting Started

First up you will need to sign up for Push Bullet. It is very straightforward if you have a Facebook or Google account. As you probably want multiple people to receive the notifications, it would pay to set up a shared Google Account that your team can use to connect their devices with. Once you have an account, head to your new Account Settings page and create an Access Token. Record it for use in the scripts below.

Connecting to the API

Test that you can access the Push Bullet API using your Access Token and PowerShell. Update the following script with your Access Token in line 3 and execute it. You should see information returned associated with your new Push Bullet account.
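In case you don’t have the original script handy, a minimal equivalent looks like this (the token value is a placeholder);

# Basic connectivity test against the Push Bullet API
$apiURI = "https://api.pushbullet.com/"
$accessToken = "o.xxxxxxxxxxxxxxxxxxxxxx"          # your Access Token
$header = @{ "Access-Token" = $accessToken }

Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI + "v2/users/me")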

Next you will want to install the Push Bullet App on the device(s) you want to get the notification(s) on. I installed it on my Apple iPhone and also installed the Chrome Browser extension.

Using PowerShell we can then query the devices connected to the account. In the same PowerShell session you tested the API with above, run this API call;

$devices = Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI +"v2/devices")
$devices

This will return your registered devices.

Apple iOS iPhone Push Notification registered devices

If we want a notification to target a particular device we need to provide the Iden value associated with that device. If we don’t specify a target, the push notification will hit all devices. In my example above, with two devices registered, my iPhone was device two. So I could get the target Iden with;

$iphoneIden = $devices.devices[1].iden

Push Bullet allows for different notification types (Note, Link and File). Note is the one that I’ll be using. More info on the other types here.

Sending Test Notification

To perform a notification test, update the following script with your Access Token (line 3). I’ve omitted the Device Identifier so the message goes to all devices. I also had to log out of the iOS Push Bullet App and log in again to get the notifications to show.
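A minimal stand-in for that script looks like this (again, the token is a placeholder; add a device_iden to the body to target a single device);

# Test push: a 'note' to all devices
$apiURI = "https://api.pushbullet.com/"
$accessToken = "o.xxxxxxxxxxxxxxxxxxxxxx"          # your Access Token
$header = @{ "Access-Token" = $accessToken }

$push = @{ type = "note"; title = "FIM/MIM Identity Manager"; body = "Test Push Notification from FIM/MIM" } | ConvertTo-Json
Invoke-RestMethod -Method Post -Headers $header -Uri ($apiURI + "v2/pushes") -Body $push -ContentType "application/json"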

Success. I received the notification on my iPhone and also in my Chrome browser.

Apple iOS iPhone Push Notification from FIM/MIM Identity Manager

Implementation

Getting back to my requirement of being notified when a job didn’t find what it expected: I updated my PowerShell Function App (based off this blog post here) to evaluate what it processed and, if it didn’t find what was expected, send me a notification. I already had some error handling in my implementation based off that blog post, but it was based on full failure, not partial (which is what I was experiencing, whereby only one part of the process wasn’t returning data).

NOTE: I had to also add the ServerCertificateValidationCallback line into my Function App script before calling the API POST to send the notification, as I was getting the dreaded PowerShell Invoke-RestMethod / Invoke-WebRequest error below when sending the notification via the Function App. I didn’t get that error on my dev workstation, which is a bit weird.

Invoke-WebRequest : The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure 
channel.

If you also receive the error above (or you will be sending push notifications via Azure Function Apps), insert this line before your Invoke-RestMethod call.

 [System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}

Summary

Essentially this is my first foray into enabling anything for Push Notifications and this post is food for thought on what can be easily enabled within FIM/MIM to give timely visibility to automated scheduled functions when they don’t perform as expected. It was incredibly simple to set up and get working. I see myself enabling more FIM/MIM functions with Push Notifications in the future.

Synchronizing Passwords from Active Directory to the IBM/Lotus Domino Identity Vault using Microsoft Identity Manager – Part 3

Introduction

As the title suggests this is Part 3, and the final part in a three-part post on configuring FIM/MIM to synchronise users’ passwords from AD to the Domino ID Vault via PCNS and FIM/MIM.
Part 1 here detailed the creation of a PowerShell Management Agent to join users from Domino to the MIM Sync Metaverse.
Part 2 here detailed the creation and configuration of the Domino Agents to receive password changes via the PS MA into the ID Vault.

This post will wrap it all up with the details on calling the Domino Agents on password sync events (from PCNS via MIM).

Prerequisites

You will need the IBM Notes client installed and configured on your MIM Sync Server in order to put a document in the database we created in Part 2 and start the agent to process the document(s).

Overview

Essentially this is the process;

  • Password changed for a user (either by an admin, or by the user via their domain joined workstation, password reset or any other password change mechanism)
  • Password change is captured by the AD PCNS Filter installed and configured on each (writeable) Domain Controller
  • The DC, using the PCNS Config in the domain, locates the MIM Sync Server to send the password change to
  • The MIM Sync Server has the associated AD Domain configured as a Password Sync Source
  • Our new PowerShell ID Vault Notes MA is configured as a Password Target
  • MIM Sync passes off the password change event for MIM joined users to the ID Vault Password Change MA which initiates the Password.ps1 script (below)
  • The password.ps1 script creates a document (that contains the details for the password change) in our ID Vault Password Sync Database we created in Part 2 of this series and then tells the MIMPwdTrigger Agent to start
  • The MIMPwdTrigger Agent picks up the document, passes it to the MIMPasswordSync Agent which sends the password change to the ID Vault

Domino PowerShell Management Agent Password.ps1 Script

Put this Password.ps1 script in the same location you put the Schema, Import and Export scripts earlier.
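If you’re rebuilding it, the skeleton looks like this. The parameter names follow the Granfeldt PSMA password script convention, and the Notes COM calls are trimmed to the essentials; the database file, form and item names are placeholders for what was set up in Part 2.

param (
    $Username,
    $Password,
    $OldPassword,
    $Action,               # "Set" or "Change"
    $UnlockAccount,
    $ForceChangeAtLogOn,
    $ValidatePassword
)

# Notes client session on the MIM Sync Server (requires the Notes client, per the prerequisites)
$notesIDPassword = "notesIDpassword"                                        # securely source this in production
$notesSession = New-Object -ComObject Lotus.NotesSession
$notesSession.Initialize($notesIDPassword)
$db = $notesSession.GetDatabase("xxxNotes1/xxxxx-Aus", "mimpwdsync.nsf")    # the DB created in Part 2 (file name placeholder)

# Create the password change document for the Domino agents to process
$doc = $db.CreateDocument()
$doc.ReplaceItemValue("Form", "PasswordChange") | Out-Null                  # form/item names placeholders
$doc.ReplaceItemValue("UserName", $Username)    | Out-Null
$doc.ReplaceItemValue("NewPassword", $Password) | Out-Null
$doc.ReplaceItemValue("Action", $Action)        | Out-Null
$doc.Save($true, $false) | Out-Null

# Tell the MIMPwdTrigger agent to start processing
$agent = $db.GetAgent("MIMPwdTrigger")
$agent.Run() | Out-Null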

Testing Password Sync End to End (Active Directory to the ID Vault)

The following screenshots show me tracing through the logs for a password change as it makes its way from the AD Domain Controller to MIM Sync, to the MA, to the MA Password script, to the Notes DB as a document triggered for processing by the Notes Agent, and finally to the user updated in the ID Vault.

First the password change event is initiated to the MIM Sync Service by the Domain Controller that captured the password change.

Password Change1

PCNS provides all the details for the password change.

Password Change2

The MIM Sync Server determines where to send the change which includes our PS Notes MA.

Password Change3

Our PS Notes MA logged the process.

Notes MA LOG

=============================================================

Display Name: Jane XXX/xxx/xxxxx-Aus

Action: Set

Old pwd:

New pwd: Password123456

Unlock: False

Force change: False

Validate: False

Database: System.__ComObject

As did the Notes Agent as it processed the change.

Notes Agent Log

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:22 PM: Reseting password …

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:22 PM: Server: xxxNotes1/xxxxx-Aus User:Jane xxx/xxx/xxxxx-Aus

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:23 PM: Return value: true

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:23 PM: Removed User ID Vault change document from ‘xxxNotes1/xxxxx-Aus’

And finally we see the change reflected in the ID Vault. Looking at the time-stamps along the way we see that it all happened in approximately 2 seconds.

Password Change4

Summary

This three-part blog post has shown how to get passwords from Active Directory via the MIM Sync connected source across to IBM Domino and into the ID Vault, using the Granfeldt PowerShell Management Agent and some configuration with a database in Domino with two Domino Agents.

What have you synchronised passwords too using FIM/MIM ?