Synchronizing Exchange Online/Office 365 User Profile Photos with FIM/MIM

Introduction

This is Part Two in the two-part blog post on managing users’ profile photos with Microsoft FIM/MIM. Part one here detailed managing users’ Azure AD/Active Directory profile photos. This post delves deeper into photos, specifically around Office 365 and why you may want to manage these via FIM/MIM.

Background

User profile photos should be simple to manage. But in a rapidly moving hybrid cloud world it can be a lot more complex than it needs to be. The best summary I’ve found of this evolving moving target is from Paul Ryan here.

Using Paul’s sound advice, we too are advising our customers to let users manage their profile photo (within corporate guidelines) via Exchange Online. However, as described in this article, photos managed in OnPremise Active Directory are synchronized to Azure AD and on to other Office365 services only once. And of course we want them to be consistent across AD DS, Azure AD, Exchange Online and all other Office365 services.

This post details synchronizing user profile photos from Exchange Online to MIM for further synchronization to other systems. The approach uses a combination of Azure GraphAPI and Exchange Remote PowerShell to manage Exchange Online User Profile Photos.

The following graphic depicts what the end goal is;

Current State

  • Users historically had a photo in Active Directory. DirSync/ADSync/AzureADConnect then synchronized that to Azure AD (and once only into Office 365).
  • Users update their photo in Office365 (via Exchange Online and Outlook Web Access)
    • the photo is synchronized across Office365 Services

Desired State

  • An extension of the Current State is the requirement to take the image uploaded by users in Exchange Online and synchronize it back to the OnPremise AD, and to any other relevant services that leverage a profile photo
  • Have AzureADConnect keep AzureAD consistent with the new photo obtained from Office365 that is synchronized to the OnPrem Active Directory
  • Sync the current photo to the MIM Portal

Synchronizing Office365 Profile Photos

Whilst Part one dealt with the AzureAD side of profile photos as an extension to an existing AzureAD PowerShell Management Agent for FIM/MIM, I’ve separated out the Office365 side to streamline it and make it as efficient as possible (more on that later). As such I’ve created a new PowerShell Management Agent specifically for Office365 User Profile Photos.

I’m storing the Exchange Online photo in the MIM Metaverse as a binary object just as I did for the AzureAD photo (but in a different attribute). I’m also storing a checksum of the photo (as I did for the AzureAD photo, again in a different attribute) to make it easy to compare what is in Azure AD and Exchange Online and determine whether changes have been made (eg. a user updated their profile photo).

Photo Checksum

For generating the hash of the profile photos I’m using Get-Hash from the PowerShell Community Extensions. Whilst PowerShell has Get-FileHash, I don’t want to write the profile photos out to disk and read them back in just to get the checksum; that slows the process down by ~25%. You can get the checksum using a number of different methods and algorithms. Just be consistent and use the same method across both profile photos and you’ll be comparing apples with apples and the comparison logic will work.
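As a minimal sketch of the in-memory approach (the byte array below is a stand-in for a photo fetched from the Graph API, and I’m showing the plain .NET hasher so it runs even without Pscx installed):

# Stand-in bytes; in the MA this comes straight from the Graph API response
[byte[]]$photoBytes = 0xFF,0xD8,0xFF,0xE0,0x00,0x10

# Hash the bytes directly instead of writing to disk for Get-FileHash
$md5 = [System.Security.Cryptography.MD5]::Create()
$checksum = [System.BitConverter]::ToString($md5.ComputeHash($photoBytes)) -replace '-',''
$md5.Dispose()

$checksum # hex string suitable for the EXOPhotoChecksum attribute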

Some notes on Photos and Exchange Online (and MFA)

This is where things went off on a number of tangents. Initially I tried accessing the photos using Exchange Online Remote PowerShell.

CAVEAT 1: If your Office365 Tenant is enabled for Multi-Factor Authentication (which it should be) you will need to get the Exchange Online Remote PowerShell Module as detailed here. Chances are you won’t have full Office365 Admin access though, so as long as the account you will be using is in the Recipient Management Role you should be able to go to the Exchange Control Panel using a URL like https://outlook.office365.com/ecp/?realm=<tenantname>&wa=wsignin1.0 where tenantname is something like customer.com.au. From the Hybrid menu in the right-hand pane you will then be able to download the Microsoft.Online.CSE.PSModule.Client.application. I had to use Internet Explorer to download the file and get it installed successfully. Once installed I used a few lines from this script here to load the Function and start my RPS session from within PowerShell ISE during solution development.

CAVEAT 2: The EXO RPS MFA PS Function doesn’t allow you to pass it your account password. You can pass it the identity you want to use, but not the password. That makes scheduled process automation with it impossible.

CAVEAT 3: The RPS session exposes the Get-UserPhoto cmdlet, which is great. But the RPS session leverages the GraphAPI, and the RPS PS Module doesn’t refresh its tokens, so if the import takes longer than 60 minutes then using this method you’re a bit stuffed.

CAVEAT 4: Using the Get-UserPhoto cmdlet detailed above, the syncing of photos is slow. As in, I was only getting ~4 profile photos per minute. This also compounds the token refresh issue: for pretty much any environment of the size I deal with, this is too slow and will timeout.

CAVEAT 5: You can whitelist the IP Address (or subnet) of your host so MFA is not required, using Contextual IP Addressing Whitelisting. At that point there isn’t really a need to use the MFA-enabled PREVIEW EXO RPS function anyway. That said, I still needed to whitelist my MIM Sync Server(s) from MFA to allow integration into the Graph API. I configured just the single host. The whitelist takes CIDR format, so a single host looks like /32 (eg. 11.2.33.4/32).

Performance Considerations

As I mentioned above,

  • using the Get-UserPhoto cmdlet was slow. ~4 per minute slow
  • using the GraphAPI into Exchange Online, looking at each user, determining if they had a photo and then downloading it, was also slow. Slow because at this customer only ~50% of their users have a photo on their mailbox; as such I was only able to retrieve ~145 photos in 25 minutes. *Note: all timings listed above were during development, actually outputting the images to disk to verify functionality.

Implemented Solution

After all my trial and error on this, here is my final approach and working solution;

  1. Use the Exchange Online Remote PowerShell (non-MFA version) to query and return a collection of all mailboxes with an image. *Note: add an exception for your MIM Sync host to the white-listed hosts for MFA (if your Office365 Tenant is enabled for MFA) so the process can be automated
  2. Use the Graph API to obtain those photos (a sketch of both steps follows this list)
    • with this I was able to retrieve ~1100 profile photos in ~17* minutes (after ~2 minutes to query and get the list of mailboxes with a profile photo)
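A minimal sketch of those two steps, assuming a non-MFA basic-auth RPS session, a $credential object, and an OAuth $accessToken for the WebApp described later; the HasPicture property and the Outlook photo endpoint are the key pieces, everything else is illustrative:

# Step 1: Exchange Online RPS - one bulk query for mailboxes that have a photo
$session = New-PSSession -ConfigurationName Microsoft.Exchange `
    -ConnectionUri "https://outlook.office365.com/powershell-liveid/" `
    -Credential $credential -Authentication Basic -AllowRedirection
Import-PSSession $session -CommandName Get-Mailbox -AllowClobber | Out-Null
$mailboxesWithPhotos = Get-Mailbox -ResultSize Unlimited |
    Where-Object { $_.HasPicture } | Select-Object UserPrincipalName

# Step 2: pull each photo as raw bytes via the Graph (Outlook REST) photo endpoint
$webClient = New-Object System.Net.WebClient
$webClient.Headers.Add("Authorization", "Bearer $accessToken")
foreach ($mbx in $mailboxesWithPhotos) {
    $uri = "https://outlook.office365.com/api/v2.0/Users('$($mbx.UserPrincipalName)')/photo/`$value"
    [byte[]]$photo = $webClient.DownloadData($uri)
    # checksum $photo and hand it to the MA here
}
Remove-PSSession $session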

Pre-requisites

There’s a lot of info above, so let me summarize the pre-requisites;

  • The Granfeldt PowerShell MA
  • Whitelist your FIM/MIM Sync Server from MFA (if your Office 365 environment is enabled for MFA)
  • Add the account you will run the MA as (which will in turn connect to EXO via RPS) to the Recipient Management Role
  • Create a WebApp for the PS MA to use to access users’ Profile Photos via the Graph API (fastest method)
  • PowerShell Community Extensions to generate the image checksum

Creating the WebApp to access Office365 User Profile Photos

Go to your Azure Portal and select the Azure Active Directory blade from the Resource Menu bar on the left. Then select App Registrations from the Manage section of the Azure Active Directory menu, and finally from the top of the main pane select “New Application Registration“.

Give it a name and select Web app/API as the type of app. Make the sign-in URL https://localhost and then select Create.

Record the ApplicationID that you see in the Registered App Essentials window. You’ll need this soon.

Now select All Settings => Required Permissions. Select Read all users’ basic profiles in addition to Sign in and read user profile. Select Save.

Under Required Permissions select Add and then select 1 Select an API, and select Office 365 Exchange Online then click Select.

Choose 2 Select Permissions and then select Read user profiles and Read all users’ basic profiles. Click Select.

Select Grant Permissions

From Settings select Keys, give your key a Description, choose a key lifetime and select Save. RECORD the key value. You’ll need this along with the WebApp ApplicationID/ClientID for the Import.ps1 script.

Using the information from your newly registered WebApp, we need to perform the first authentication (and authorization of the WebApp) to the Graph API. Take your ApplicationID, Key (Client Secret) and the account you will use on the Management Agent (the one you have assigned the Recipient Management Role in Exchange Online) and run the script detailed in this post here. It will authenticate you to your new WebApp via the GraphAPI after asking you to provide the account you will use on the MA and authorizing the permissions you selected when registering the app. It will also create a refresh.token file which we will give to the MA to automate our connection. The Authorization dialog looks like this.
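At run time the MA then redeems that refresh.token for an access token. A minimal sketch against the standard AzureAD OAuth2 token endpoint (the file path and placeholder values are yours to substitute):

# Placeholders for your WebApp details and wherever you stored refresh.token
$clientId     = "<ApplicationID>"
$clientSecret = "<Key>"
$refreshToken = Get-Content "C:\Temp\refresh.token"

$body = @{
    grant_type    = "refresh_token"
    refresh_token = $refreshToken
    client_id     = $clientId
    client_secret = $clientSecret
    resource      = "https://outlook.office365.com"
}
$response = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/common/oauth2/token" -Body $body
$accessToken = $response.access_token # used on the photo requests above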

Creating the Management Agent

Now we can create our Management Agent using the Granfeldt PowerShell Management Agent. If you haven’t created one before, check out a post like this one, which further down shows the creation of a Granfeldt PSMA. Don’t forget to provide blank export.ps1 and password.ps1 files in the directory where you place the PSMA scripts.

PowerShell Management Agent Schema.ps1
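The original schema script isn’t reproduced here, but a minimal stand-in using the Granfeldt schema convention (the anchor choice is an assumption; the attribute names are the ones used throughout this post) looks like:

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-UserPrincipalName|String" -Value "user@customer.com.au"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "EXOUser"
$obj | Add-Member -Type NoteProperty -Name "EXOPhoto|Binary" -Value 0x20
$obj | Add-Member -Type NoteProperty -Name "EXOPhotoChecksum|String" -Value "23973abc382373"
$obj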

PowerShell Management Agent Import.ps1

As detailed above, the PSMA will leverage the WebApp to read users’ Exchange Profile Photos via the Graph API. The Import script also leverages Remote PowerShell into Exchange Online (for the reasons also detailed above). The account you run the Management Agent as will need to be added to the Recipient Management Role Group in order to use Remote PowerShell into Exchange Online and get the information required.

Take the Import.ps1 script below and update the following;

  • Update lines 11, 24 and 42 for the path to where you have put your PSMA. Mine is under the Extensions directory in a directory named EXOPhotos.
  • Copy the refresh.token generated when authenticating and authorizing the WebApp earlier into the directory you specified in line 42 above.
  • Create a Debug directory under the directory you specified in lines 11, 24 and 42 above so you can see what the MA is doing as you implement and debug it the first few times.
  • I’ve written the Import to use Paged Imports, so make sure you tick the Paged Imports checkbox on the configuration of the MA.
  • Update lines 79 and 80 with your ApplicationID and Client Secret that you recorded when creating your WebApp.

 

Running the Exchange User Profile Photos MA

Now that you have created the MA, select the EXOUser ObjectClass and the attributes defined in the schema. You should also create the EXOPhoto (as Binary) and EXOPhotoChecksum (as String) attributes in the Metaverse on the person ObjectType (assuming you are using the built-in person ObjectType).

Configure your flow rules to flow the EXOPhoto and EXOPhotoChecksum on the MA to their respective attributes in the MV.

Create a Stage Only run profile and run it. If you have done everything correctly you will see photos come into the Connector Space.

Looking at the Connector Space, I can see EXOPhoto and EXOPhotoChecksum have been imported.

After performing a Synchronization to get the data from the Connector Space into the Metaverse it is time to test the image that lands in the Metaverse. That is quick and easy via PowerShell and the Lithnet MIIS Automation PowerShell Module.

$me = Get-MVObject -ObjectType person -Attribute accountName -Value "drobinson"
$me.Attributes.EXOPhoto.Values.ValueBinary
[System.IO.File]::WriteAllBytes("C:\Temp\myOutlookPhoto.jpg", $me.Attributes.EXOPhoto.Values.ValueBinary)

The file is output to the directory with the filename specified.

Opening the file reveals correctly my Profile Photo.

Summary

In Part one we got the AzureAD/Active Directory photo. In this post we got the Office365 photo.

Now that we have the images from Office365 we need to synchronize any photo updates to Active Directory (and in turn, via AADConnect, to Azure AD). Keep in mind the image size limits for Active Directory: we retrieved the largest photo available from Office365, so you’ll likely need to resize before synchronizing the photo on. There are a number of PowerShell modules for photo manipulation that will allow you to resize accordingly.
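As one hedged example using plain .NET (System.Drawing) rather than any particular gallery module, resizing down to the 96×96 commonly used for AD’s thumbnailPhoto:

Add-Type -AssemblyName System.Drawing

[byte[]]$photoBytes = [System.IO.File]::ReadAllBytes("C:\Temp\myOutlookPhoto.jpg")
$ms  = New-Object System.IO.MemoryStream (,$photoBytes)
$src = [System.Drawing.Image]::FromStream($ms)

# Redraw at thumbnailPhoto-friendly dimensions
$dst = New-Object System.Drawing.Bitmap (96, 96)
$gfx = [System.Drawing.Graphics]::FromImage($dst)
$gfx.DrawImage($src, 0, 0, 96, 96)

$out = New-Object System.IO.MemoryStream
$dst.Save($out, [System.Drawing.Imaging.ImageFormat]::Jpeg)
[byte[]]$resized = $out.ToArray() # byte array sized for AD thumbnailPhoto

$gfx.Dispose(); $dst.Dispose(); $src.Dispose(); $ms.Dispose(); $out.Dispose()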


How to Synchronize users Active Directory/Azure Active Directory Photo using Microsoft Identity Manager

Introduction

Whilst Microsoft FIM/MIM can be used to do pretty much anything your requirements dictate, dealing with object types other than text and references can be a little tricky when manipulating them the first time. User Profile Photos fall into that category as they are stored in the directory as binary objects. Throw in Azure AD and obtaining and synchronizing photos can seem like adding a double back-flip to the scenario.

This post is Part 1 of a two-part post. Part two is here. This is essentially the introduction and the how-to piece before extending the solution past a user’s Active Directory Profile Photo to their Office 365 Profile Photo. Underneath, the synchronization and the method for dealing with the binary image data are the same, but the APIs and methods used are different when you are looking to implement the solution at any scale.

As for why you would want to do this, refer to Part two here, which details the reasons.

Overview

As always I’m using my favourite PowerShell Management Agent (the Granfeldt PSMA). I’ve updated an existing Management Agent I had for Azure AD that is described here. I highly recommend you use that as the basis for the extra photo functionality that I describe in this post. Keep in mind the AzureADPreview, now AzureAD PowerShell Module, has changed the ADAL Helper Libraries. I detail the changes here so you can get AuthN to work with the new libraries.

Therefore the changes to my previous Azure AD PowerShell MA are to add two additional attributes to the Schema script, and to include the logic to import users’ profile photos (if they have one) in the Import script.

Schema.ps1

Take the schema.ps1 from my Azure AD PSMA here and add the following two lines to the bottom (before the $obj in the last line where I’ve left an empty line (29)).

$obj | Add-Member -Type NoteProperty -Name "AADPhoto|Binary" -Value 0x20 
$obj | Add-Member -Type NoteProperty -Name "AADPhotoChecksum|String" -Value "23973abc382373"

The AADPhoto attribute of type Binary is where we will store the photo. The AADPhotoChecksum attribute of type String is where we will store a checksum of the photo for use in logic if we need to determine if images have changed easily during imports.

 

Import.ps1

Take the import.ps1 from my Azure AD PSMA here and make the following additions;

  • On your MIM Sync Server download/install the Pscx PowerShell Module.
    • The Pscx PowerShell Module is required for Get-Hash (to calculate the image checksum) against variables, rather than a file on the local disk
    • You can get the module from the Gallery using Install-Module Pscx -Force
    • Add these two lines at the top of the import.ps1 script. Around line 26 is a good spot:
# PowerShell Module required for Get-Hash (to calculate image checksum)
Import-Module Pscx
  • Add the following lines into the Import.ps1 in the section where we are creating the object to pass to the MA. After the $obj.Add("AADCity",$user.city) line is a good spot.
  • What the script below does is create a WebClient, rather than use Invoke-RestMethod or Invoke-WebRequest, to get the user’s Azure AD Profile image, and only if the ‘thumbnailPhoto@odata.mediaContentType’ attribute exists, which indicates the user has a profile photo. I’m using the WebClient over the PowerShell Invoke-RestMethod or Invoke-WebRequest functions so that the returned object is in binary format (rather than being returned as a string), which saves having to convert it to binary or output it to a file and read it back in. The WebClient is also faster for transferring images/data.
  • Once the image has been returned (line 8 below) the image is added to the object as the attribute AADPhoto to be passed to the MA (line 11)
  • Line 14 gets the checksum for the image and adds that to the AADPhotoChecksum attribute in line 16.
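The original embed isn’t reproduced here, but a hedged reconstruction of those additions (the AAD Graph endpoint and api-version are assumptions; I’m also showing the plain .NET hasher where the post itself uses Pscx Get-Hash) looks like:

if ($user.'thumbnailPhoto@odata.mediaContentType') {
    # User has a photo; fetch it as raw bytes with a WebClient
    $photoUri  = "https://graph.windows.net/$tenantName/users/$($user.objectId)/thumbnailPhoto?api-version=1.6"
    $webClient = New-Object System.Net.WebClient
    $webClient.Headers.Add("Authorization", "Bearer $accessToken")
    [byte[]]$photo = $webClient.DownloadData($photoUri)

    # Add the image and its checksum to the object passed to the MA
    $obj.Add("AADPhoto", $photo)
    $md5 = [System.Security.Cryptography.MD5]::Create()
    $obj.Add("AADPhotoChecksum", ([System.BitConverter]::ToString($md5.ComputeHash($photo)) -replace '-',''))
}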

Other changes

Now that you’ve updated the Schema and Import scripts, you will need to;

  • Refresh your schema on your Azure AD PSMA to get the new attributes (AADPhoto and AADPhotoChecksum) added
  • Select the two new attributes in the Attributes section of your Azure AD PSMA
  • Create in your MetaVerse via the MetaVerse Designer two new attributes on the person (or whatever ObjectType you are using for users), for AADPhoto and AADPhotoChecksum. Make sure that AADPhoto is of type Binary and AADPhotoChecksum is of type String.

  • Configure your Attribute Flow on your Azure AD PSMA to import the AADPhoto and AADPhotoChecksum attributes into the Metaverse. Once done and you’ve performed an Import and Sync you will have Azure AD Photos in your MV.

  • How do you know they are correct? Let’s extract one from the MV, write it to a file and have a look at it. This small script using the Lithnet MIIS Automation PowerShell Module makes it easy. First I get my user object from the MV. I then have a look at the text string version of the image (to make sure it is there), then output the binary version to a file in the C:\Temp directory.
$me = Get-MVObject -ObjectType person -Attribute accountName -Value "drobinson"
[string]$myphoto = $me.Attributes.AADPhoto.Values.ValueString
[System.IO.File]::WriteAllBytes("C:\Temp\UserPhoto.jpg", $me.Attributes.AADPhoto.Values.ValueBinary)
  • Sure enough. The image is valid.

Conclusion

Photos are still just bits of data. Once you know how to get them and manipulate them you can do whatever you need to with them. See Part two, which takes this concept and extends it to Office 365.

Using the Lithnet PowerShell Modules to generate full object metadata FIM/MIM HTML Reports

How many times have you wanted a consolidated report out of FIM/MIM for an object? What connectors does it have, what are the values of the attributes, which Management Agent contributed the value(s) and when? Individually, of course, you can get that info using the Metaverse Search and looking at the object in the MIM Portal. But what if you wanted it all with a single query? This blog post provides an approach to doing just that. The graphic above shows a screenshot of a sample output. Click this Sample Report for a full resolution version of the screenshot above. Note: the updated version of the script below outputs DisplayName for the ExpectedRulesList attribute so it actually provides valuable information.

Overview

The approach is quite simple. It is;

  • Query the FIM/MIM Metaverse for an object
  • Take the response from the Metaverse to build the Connectors and Metaverse Hologram reports
  • Use the connector information to query the MIM Service MA (this example assumes it is on the same server; if not, add the following line into the script with the appropriate values) and get the object’s MIM Service Connector Space info
    Set-ResourceManagementClient -BaseAddress http://fimsvc:5727;
  • Take the information retrieved above to then query the MIM Service and return the information for the object.
  • Format all the output for HTML, apply a simple style sheet, output to file and display in the default browser

NOTE: If you combine this with the Get-MVObject query building script detailed here it can be a relatively simple solution. That script even uses the same variables, $queries and $query, as outputs from the search and inputs into the HTML Report.

NOTE: You could possibly run it remotely from the MIM Sync Server too, if you leverage Remote Powershell to your FIM/MIM Sync server as detailed here.
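As a hedged skeleton of that flow (the Lithnet cmdlets are as documented; the HTML shaping, stylesheet and file paths are illustrative only):

Import-Module LithnetMIISAutomation
Import-Module LithnetRMA
# Set-ResourceManagementClient -BaseAddress http://fimsvc:5727; # only if the MIM Service is remote

# 1. The Metaverse object drives the report
$mv = Get-MVObject -ObjectType person -Attribute accountName -Value "drobinson"

# 2. Its attribute/connector data becomes one HTML fragment
$mvFragment = $mv.Attributes | ConvertTo-Html -Fragment

# 3. The corresponding MIM Service object becomes another fragment
$svc = Get-Resource -ObjectType Person -AttributeName AccountName -AttributeValue "drobinson"
$svcFragment = $svc | ConvertTo-Html -Fragment

# 4. Stitch the fragments together with a stylesheet, write to file, open in the browser
$html = "<html><head><link rel='stylesheet' href='report.css'/></head><body>" +
        ($mvFragment -join "`n") + ($svcFragment -join "`n") + "</body></html>"
$html | Out-File "C:\Reports\objectReport.html"
Invoke-Item "C:\Reports\objectReport.html"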

The Script

Here it is. Lines 23 and 24 contain a hard-coded query. Update for your search criteria, or as detailed above combine this with the Get-MVObject query building script detailed here. The Output directory specified in line 7 is where the stylesheet and the resultant HTML file will be placed. Update for your needs.

For the Expected Rules List (unlike the screenshot as I’ve modified the script afterwards), the script gets the DisplayName for them and puts that in the report. DisplayName is more valuable than an ERE ObjectID.

Scripting queries for Lithnet Get-MVObject searches into the Microsoft Identity Manager Metaverse

It probably seems obvious by now, but I seem to live in PowerShell and Microsoft Identity Manager. I’m forever looking into the Microsoft Identity Manager Metaverse for objects.

However, sometimes I get tripped up by the differences in Object Classes between the FIM/MIM Service and the Metaverse, the names of the Object Classes (obviously not Person, Group and Contact) and in situations where they are case-sensitive. If you’re using the Sync Service Manager Metaverse Search function you get a pick list, but getting the data out to do something else with isn’t an option.

Solution

I’ve looked to quickly provide a similar function to the pick lists in the Metaverse Search GUI via PowerShell, with the resulting query then executed via the Lithnet Get-MVObject cmdlet.

UPDATE: 17 May 2017 The Lithnet MIIS Automation PowerShell Module has been updated for Get-MVObject to support the ObjectType Scope. I’ve updated the script to include the scope parameter based on the ObjectClass selected at the beginning of the script. 

I’ve defaulted the ObjectClass to Person so you can just press enter. But if you have custom ObjectClasses in your Metaverse you may need to change the index number in Line 48 from 5 to whichever index Person appears in your environment. Same goes for the default attribute of AccountName in the Attribute list. It appears at index 5 (Line 77) in my attribute list.

Process

Basically just run the PowerShell script and choose your options. The script needs interaction with the FIM/MIM Sync server, so you run it from the FIM/MIM Sync server. If you want to run it remotely (of course you do), then Remote PowerShell is your friend. Checkout how to do that to the FIM/MIM Server in this post here.

The Script itself will query the FIM/MIM MV Schema and return a list of Object Classes. As detailed above, in Line 48 of the script I have ‘index 5’ as the default which in my environment is Person and as such you can just hit enter if that is the Object Class you want to choose attributes from in the next step. Otherwise type the name of the ObjectClass you want. You don’t have to worry about case sensitivity as the script handles that. You can only choose a single ObjectClass obviously, but the menu ui I’ve used allows for multiple selections. Just press enter when prompted for another option for ObjectClass.

You’ll then be presented with a list of attributes from the chosen Object Class above. Again as detailed above I have it defaulting to ‘accountName’ which is index 5 in my list. Change (Line 77) for the default you want. This means you can just hit enter if accountName is what you’re querying on (which is common). Or choose another option. This then also allows you to also choose multiple attributes (which will be added to an array). This means you can use this for complex queries such as;

accountName startsWith 'dar'
sn startsWith 'rob'
mail contains '@kloud'
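For context, a hedged example of what the script ultimately assembles and executes for exactly those criteria (New-MVQuery and Get-MVObject are from the Lithnet MIIS Automation module; treat the operator spellings as assumptions for your module version):

$queries = @()
$queries += New-MVQuery -Attribute accountName -Operator StartsWith -Value "dar"
$queries += New-MVQuery -Attribute sn -Operator StartsWith -Value "rob"
$queries += New-MVQuery -Attribute mail -Operator Contains -Value "@kloud"

# Scope to the chosen ObjectClass and execute
$query = Get-MVObject -ObjectType person -Queries $queries
$query.Attributes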

If you want to choose multiple attributes for your query and one of them is the default option, make sure you specify one of the attributes that is not the default first so that you get the option to specify more. When you’ve chosen all the attributes you are going to use in your query hit enter and the script will take an empty response as the end of your choices.

Now for each attribute chosen you will be prompted for an Operator. Pretty simple. Just choose from the available options. Note: all operators are shown but not all operators can be used for all attribute types. e.g. Don’t select ‘EndsWith’ for a Boolean attribute type and expect it to work. If you choose an operator other than the default (equals in my example) hit enter when prompted for the second time and the script will take an empty response as the end of your choices.

Finally, provide the value of the search term for the attribute. If the value has spaces, don’t worry about putting the value in quotes; the script takes care of that.

The last two steps will iterate through, for queries where you have chosen multiple attributes.

And you’re done. $query is the variable that contains the results. In line 115 I’m using Show-Object from the PowerShellCookbook module. That then gives you a GUI representation of the result as shown below. If the query returns multiple results this will only show the last.

Line 114 outputs the value of the attributes ($query.attributes) to the console as well. If you have multiple objects returned $query will show them as shown below.

Finally if you want to run the query again, or just make a subtle change, you shouldn’t have to go through that again. Get the value of $querytxt and you’ll get the query and the command to execute it. $querytxt is also output to the console as shown below. Copy and paste it into Powershell ISE, update and execute.

The Script

Here is the raw script. Hardly any error handling etc, but enough to get you started and tailor it for your requirements. Enjoy.

Scripting the generation & creation of Microsoft Identity Manager Sets/Workflows/Sync & Management Policy Rules with the Lithnet Resource Management PowerShell Module

Introduction

Yes, that title is quite a mouthful. And this post is going to be quite long. But worth the read if you are having to create a number of rules in Microsoft/Forefront Identity Manager, or even more so the same rule in multiple environments (eg. Dev, Staging, Production).

My colleague David Minnelli introduced using the Lithnet RMA PowerShell Module and the Import-RMConfig cmdlet recently for bulk creation of MIM Sets and MPRs. David has a lot of the background on Import-RMConfig and getting started with it. Give that a read for a more detailed background.

In this post I detail using Import-RMConfig to create a Set, Workflow, Synchronization Rule and Management Policy Rule to populate a Development AD Domain with Users from a Production AD Domain. This process is designed to run on a combined MIM Service/Sync Server. If your roles are separated (as they likely will be in a Production environment) you will need to run these scripts on the MIM Sync Server (so they can query the Management Agents), and you will need to add a line to connect to the MIM Service (eg. Set-ResourceManagementClient) at the beginning of the script.

In my environment I have two Active Directory Management Agents, each connected to an AD Forest as shown below.

On each of the AD MAs I have a Constant Flow Attribute (named Source) configured to flow in a value representing the source AD Forest. I’m doing this in my environment as I have more than one production forest (hence the need for automation). You could simply use the Domain attribute for the Set criteria. That attribute is used in the Set later on; I’m mentioning it up front so it makes sense.

Overview

The Import-RMConfig cmdlet uses XML and XOML files that contain the configuration required to create the Set, Workflow, Sync Rule and MPR in the FIM/MIM Service. The order that I approach the creation is, Sync Rule, Workflow, Set and finally the MPR.

Each of these objects as indicated above leverage an XML and/or XOML input file. I’ve simplified base templates and included them in the scripts.

The Sync Rule script includes a prompt to choose a folder (you can create one through the GUI presented) to store the XML/XOML files to allow Import-RMConfig to use them. Once generated you can simply reference the files with Import-RMConfig to replicate the creation in another environment, as shown in the example below.
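For instance (file name and path illustrative; -Preview is Import-RMConfig’s validate-only switch):

# Validate the generated configuration without applying it
Import-RMConfig -File "C:\ProvisionDevAD\SyncRule.xml" -Preview -Verbose

# Then apply it in the target environment
Import-RMConfig -File "C:\ProvisionDevAD\SyncRule.xml" -Verbose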

Creating the Synchronization Rule

For creation of the Sync Rule we need to define which Management Agent will be the target for our Sync Rule. In my script I’ve automated that too (as I have a number to do). I’m querying the MIM Sync Server for all its Active Directory MAs and then providing a dialog to allow you to choose the target MA for the Sync Rule. That dialog simply looks like the one below.

Creating the Sync Rule will finally ask you to give the Rule a name. This name will then be used as the base Display Name for the Set, MPR and Workflow (and a truncated version as the Rule IDs).

The script below in the $SyncRuleXML section defines the rules of the Sync Rule. Mine is an Outbound Sync Rule with a base set of attributes, transforming the users’ UPN and DN (for the differing Development AD namespace). Update lines 42 and 45 for the users’ UPN and DN in your namespace.

Creating the Workflow

The Workflow script is pretty self-explanatory. A simple Action based workflow and is below.

Creating the Set

The Set is the group of objects that will be synchronized to the target management agent. As my Sync Rule is only for Users, my Set also only contains users. As stated in the Overview, I have an attribute that defines the authoritative source for the objects, and I’m also using that in my Set criteria.

Creating the Management Policy Rule

The MPR ties everything together. Here’s that part of the script.

Tying them all together

Here is the end-to-end automation, and the raw script that you could use as the basis for automating similar rule generation. The Sync Rule could easily be updated for Contacts or Groups. Remember the attributes and object classes are case sensitive.

  • Through the Browse for Folder dialog I created a new folder named ProvisionDevAD

  • I provided a Display Name for the rules

  • I chose the target Management Agent

  • The SyncRule, Workflow, Set and MPR are created. The whole thing takes seconds.

  • Script Complete

Let’s take a look at the completed objects through the MIM Portal.

Sync Rule

The Sync Rule is present as we named it. Including the !__ prefix so it appears at the top of the list.

Outbound Sync Rule based on a MPR, Set and Workflow

The Resources will be created and if deleted de-provisioned.

And our base attribute flows.

Set

Our Set was created.

Our naming aligns with what we input

And a Criteria based Set. As per the Overview, I have an attribute populated by a Constant flow rule that I based my Set on. You’ll want to update for your environment.

Workflow

The Action Workflow was created

All looks great

And it applies our Sync Rule

MPR

And finally our MPR. Created as a Transition In MPR with Action Workflow

Set Transition and naming all aligned

The Transition Set configured for the Set that was created

And the Workflow configured with the Workflow that was just created

Summary

When you have a lot of Sync Rules to create, or you know you will need to re-create them numerous times, potentially in different environments, automation is key. This just scratches the surface of what can be achieved, and it is made so much easier using the Lithnet PowerShell Modules.

Here is the full script. Note: you’ll need to make a couple of minor changes as indicated earlier, but you should be able to create a Provisioning Rule end to end pretty quickly to validate the process. Then customize accordingly for your environment and requirements. Enjoy.

How to create a PowerShell FIM/MIM Management Agent for AzureAD Groups using Differential Sync and Paged Imports

Introduction

I’ve been working on a project where I must have visibility of a large number of Azure AD Groups into Microsoft Identity Manager.

In order to make this efficient I need to use the Differential Query function of the AzureAD Graph API. I’ve detailed that before in this post: How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries. Due to the number of groups and the number of members in the Azure AD Groups I needed to implement Paged Imports on my favourite PowerShell Management Agent (the Granfeldt PowerShell MA). I’ve detailed that before too, here: How to configure Paged Imports on the Granfeldt FIM/MIM PowerShell Management Agent.

This post details using these concepts together specifically for AzureAD Groups.

Pre-Requisites

Read the two posts linked to above. They will detail Differential Queries and Paged Imports. My solution also utilises another of my favourite PowerShell Modules: the Lithnet MIIS Automation PowerShell Module. Download and install that on the MIM Sync Server where you will be creating the MA.

Configuration

Now that you’re up to speed, all you need to do is create your Granfeldt PowerShell Management Agent. That’s also covered in the post linked above, How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries.

What you need is the Schema and Import PowerShell Scripts. Here they are.

Schema.ps1

Two object classes are defined on the MA, as we need the users that are members of the groups on the same MA (membership is a reference attribute). When you bring the Groups through into the MetaVerse, assuming you have an Azure AD Users MA using the same anchor attribute, you’ll get the reference link for the members and their full object details.
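The original Schema.ps1 isn’t reproduced here; a hedged stand-in using the Granfeldt schema convention (anchor and attribute names are assumptions, and check the PSMA documentation for the exact multivalued reference declaration) would be:

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-objectId|String" -Value "guid"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "Group"
$obj | Add-Member -Type NoteProperty -Name "displayName|String" -Value "Group Display Name"
$obj | Add-Member -Type NoteProperty -Name "member|Reference[]" -Value ("guid1","guid2")
$obj

$obj2 = New-Object -Type PSCustomObject
$obj2 | Add-Member -Type NoteProperty -Name "Anchor-objectId|String" -Value "guid"
$obj2 | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "User"
$obj2 | Add-Member -Type NoteProperty -Name "userPrincipalName|String" -Value "user@tenant.onmicrosoft.com"
$obj2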

Import.ps1

Here is my PSMA Import.ps1 that performs what is described in the overview: enumerate AzureAD for Groups and import the active ones along with their group membership.

Summary

This is one solution for managing a large number of Azure AD Groups with large memberships via a PS MA, with paged imports showing progress during the import, and differential queries allowing for subsequent quick delta-sync run profiles.

I’m sure this will help someone else. Enjoy.

Follow Darren on Twitter @darrenjrobinson

Automate the nightly backup of your Development FIM/MIM Sync and Portal Servers Configuration

Last week in a customer development environment I had one of those oh shit moments where I thought I’d lost a couple of weeks of work: a couple of weeks of development around multiple Management Agents, MV Schema changes etc. Luckily for me I was just connecting to an older VM image, but it got me thinking. It would be nice to have an automated process that each night would;

  • Export each Management Agent on a FIM/MIM Sync Server
  • Export the FIM/MIM Synchronisation Server Configuration
  • Take a copy of the Extensions Folder (where I keep my PowerShell Management Agents scripts)
  • Export the FIM/MIM Service Server Configuration

And that is what this post covers.

Overview

My automated process performs the following;

  1. An Azure PowerShell Timer Function WebApp is triggered at 2330 each night
  2. The Azure Function App initiates a Remote PowerShell session to my Dev MIM Sync Server (which is also a MIM Service Server)
  3. In the Remote PowerShell session the script (sketched below);
    1. Creates a new subfolder under c:\backup with the current date and time (dd-MM-yyyy-hh-mm)
    2. Creates further subfolders for each of the backup elements
      • MAExports
      • ServerExport
      • MAExtensions
      • PortalExport
    3. Utilizes the Lithnet MIIS Automation PowerShell Module to enumerate each of the Management Agents on the FIM/MIM Sync Server and export each Management Agent to the MAExports folder, then export the FIM/MIM Sync Server Configuration to the ServerExport folder
    4. Copies the Extensions folder and subfolder contents to the MAExtensions folder
    5. Utilizes the FIM/MIM Export-FIMConfig cmdlet to export the FIM Service Server Configuration to the PortalExport folder
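A hedged sketch of the core of that remote script (the folder layout follows the steps above; Export-FIMConfig is the standard FIM Automation cmdlet, the Lithnet MA/server export calls are left as a comment, and all paths are illustrative):

$stamp = Get-Date -Format "dd-MM-yyyy-hh-mm"
$base  = "C:\backup\$stamp"
"MAExports","ServerExport","MAExtensions","PortalExport" |
    ForEach-Object { New-Item -ItemType Directory -Path "$base\$_" -Force | Out-Null }

# MA exports and the Sync Server configuration export via the
# Lithnet MIIS Automation module go here (MAExports / ServerExport)

# Copy the Extensions folder (PowerShell Management Agent scripts)
Copy-Item "C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Extensions" `
    -Destination "$base\MAExtensions" -Recurse

# Export the MIM Service (Portal) configuration
Add-PSSnapin FIMAutomation -ErrorAction SilentlyContinue
Export-FIMConfig -policyConfig -portalConfig -schemaConfig |
    Export-Clixml "$base\PortalExport\PortalPolicy.xml"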

Implementing the FIM/MIM Backup Process

The majority of the setup to get this to work I’ve covered in other posts, particularly around Azure PowerShell Function Apps and Remote PowerShell into a FIM/MIM Sync Server.

Pre-requisites

  • I created a C:\Backup Folder on my FIM/MIM Server. This is where the backups will be placed (you can change the path in the script).
  • I installed the Lithnet MIIS Automation PowerShell Module on my MIM Sync Server
  • I configured my MIM Sync Server to accept Remote PowerShell sessions. That involved enabling WinRM, creating a certificate, creating the listener, opening the firewall port and enabling the incoming port on the NSG. You can easily do all that by following my instructions here. From the same post I set up the encrypted password file and uploaded it to my Function App and set the Function App Application Settings for MIMSyncCredUser and MIMSyncCredPassword.
  • I created an Azure PowerShell Timer Function App. Pretty much the same as I show in this post, except choose Timer.
    • I configured my Schedule for 2330 every night using the following CRON configuration

0 30 23 * * *

  • I set the Azure Function App timezone to my timezone so that the nightly backup happened at the correct time relative to my timezone. I got my timezone index from here. I set the following variable in my Azure Function Application Settings to my timezone name, AUS Eastern Standard Time:

    WEBSITE_TIME_ZONE

The Function App Script

With all the pre-requisites met, the only thing left is the Function App script itself. Here it is. Update lines 2, 3 & 6 if your variables and password key file are different. The path to your password keyfile will be different on line 6 anyway.

Update line 25 if you want the backups to go somewhere else (maybe a DFS Share).
If your MIM Service Server is not on the same host as your MIM Sync Server, change line 59 to the hostname. You’ll need to get the FIM/MIM Automation PS Modules onto your MIM Sync Server too; details on how to achieve that are here.

Running the Function App I have limited output, but enough to see it run. The first part of the script runs very quickly; the Export-FIMConfig is what takes the majority of the time. That said, it’s less than a minute to get a nice point-in-time backup that is auto-magically executed nightly. Sorted.

Summary

The script itself can be run standalone and you could implement it as a Scheduled Task on your FIM/MIM Server. However I’m using Azure Functions for a number of things, and having something that is easily portable, repeatable and centralised with other functions (pun not intended) keeps things organised.

I now have a daily backup of the configurations associated with my development environment. I’m sure this will save me some time in the near future.

Follow Darren on Twitter @darrenjrobinson

How to configure Paged Imports on the Granfeldt FIM/MIM PowerShell Management Agent

Introduction

In the last 12 months I’ve lost count of the number of PowerShell Management Agents I’ve written to integrate Microsoft Identity Manager with a plethora of environments. The majority though have not been of huge scale (<50k objects) and the import of the managed entities into the Connector Space/Metaverse runs through in a pretty timely fashion.

However this week I’ve been working on an AzureAD Groups PS MA for an environment with 40k+ groups. That in itself isn’t that large, but when you start processing Group Memberships as well, the Import process can take an hour for a Full Sync. During this time, before the results are passed to the Sync Engine, you don’t have any visibility of where the Import is up to (other than debug logging), and the ability to stop the MA requires a restart of the Sync Engine Server.

I’ve wanted to mess with Paging the Imports for some time, but it hadn’t been a necessity. Now it is, so I worked out how to achieve it. The background information on Paged Imports is available at the bottom of the PSMA documentation page here. However there are no working examples. I contacted Søren and he had misplaced his demo scripts for the time being. With some time at hand (in between coats of paint on the long weekend renovation) I therefore worked it out for myself. I detail how to implement Paged Imports in this blogpost.

This post uses an almost identical Management Agent to the one described in this post. Review that post to get an understanding of the AzureAD Differential Queries. I’m not going to cover those elements in this post or setting up the MA at all.

Getting Started

There are two things you need to do in preparation for enabling Paged Imports on your PowerShell Management Agent;

  1. Enable Paged Imports (if your Import.ps1 is checking for this setting)
  2. Configure Page Size on your Import Run Profiles

The first is as simple as clicking the checkbox on the Global Parameters tab on your PS MA as shown below.

The 2nd is in your Run Profile. By default this will be 100. For my “let’s figure this out” process I dropped my Run Profiles to 50 on one Run Profile and 10 on another.

Import Script

With Paged Imports setup on the MA the rest of the logic goes into your Import Script. In your param section at the start of the script $usepagedimport and $pagesize are the variables that reflect the items from the two enablement components you did above.

$usepagedimport is either True or False. Your Import.ps1 script can check to see if it is set and process accordingly. In this example I’m not even checking if it is set and am doing Paged Imports anyway. For completeness, in a production example you should check to see what the intention of the MA is.

$pagesize is the page size from the Run Profile (100 by default, or whatever you changed yours to).

param (
    $Username,
    $Password,
    $Credentials,
    $OperationType,
    [bool] $usepagedimport,
    $pagesize
 )

 

An important consideration to keep in mind is that the Import.ps1 will be called multiple times (ie. #of_times = #ofObjects / pagesize).

So anything that you would normally expect in any other MA to only process once when the Import.ps1 runs you need to limit to only running once.

Essentially the way I’ve approached it is: retrieve all the objects that will be processed and put them in a Global variable. If the variable does not have any values/data then it is the first run, so go and get our source data. If the Global variable has values/data in it then we must be on a subsequent loop, so there’s no need to process that part again; just page through our import.

In my example below this appears as;

if (!$global:tenantObjects) {
    # Authenticate
    # Search and get the users
    # Do some rationalisation on the results (if required)
    # setup some global variables so we know where we are with processing the data
} # Finish the one time tasks

As you’ll see in the full import.ps1 script below there are more lines that could be added into this section so they don’t get processed each time. In a production implementation I would.

For the rest of the Import.ps1 script we are expecting it to run multiple times. This is where we do our logic and process our objects to send through to the Sync Engine/Connector Space. We need to keep track of where we are up to processing the dataset and continue on from where we left off. We also need to know how many objects we have processed in relation to the ‘pagesize’ we get from the Run Profile so we know when we’ve finished.

When we reach the pagesize but know we have more objects to process, we set $global:MoreToImport to $true and break out of the foreach loop.

When we have processed all our objects we set $global:MoreToImport = $false and break out of the foreach loop to finish.
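In skeleton form that looks like the following (shown with a while loop rather than the foreach/break described above; $global:cursor and the object-building lines are illustrative, the $pagesize/MoreToImport handling is the part that matters):

$count = 0
while (($global:cursor -lt $global:tenantObjects.Count) -and ($count -lt $pagesize)) {
    $group = $global:tenantObjects[$global:cursor]

    # Build the hashtable the PSMA expects and emit it to the connector space
    $obj = @{}
    $obj.Add("objectId", $group.objectId)
    $obj.Add("objectClass", "Group")
    $obj.Add("displayName", $group.displayName)
    $obj

    $global:cursor++
    $count++
}

# Tell the Sync Engine whether to call Import.ps1 again for another page
$global:MoreToImport = ($global:cursor -lt $global:tenantObjects.Count)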

With that explanation out of the way here is a working example. I’ve left in debugging output to a log file so you can see what is going on.

You can get the associated relevant Schema.ps1 from the Management Agent described in this post. You’ll need to update your Tenant name on line 29 and your directory paths on lines 10 and 47. If you are using a different version of the AzureADPreview PowerShell Module you’ll need to change line 26 as well.

Everything else is in the comments within the example script below and should make sense.

Summary

For managing a large number of objects on a PS MA we can now see progress as the import processes the objects, and we can now stop an MA if required.

I’m sure this will help someone else. Enjoy.

Follow Darren on Twitter @darrenjrobinson


How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries

Introduction

In August 2016 I wrote this post on how to use PowerShell to leverage the Microsoft GraphAPI and use Differential Queries. The premise behind that post was that I required a Microsoft Identity Manager Management Agent to synchronize identity information from AzureAD into Microsoft Identity Manager. However the environment it was intended for has a large AzureAD implementation, and performing a Full Sync every time is just too time consuming. Even more so with this limitation that still exists today in MIM 2016 with SP1.

In this blog post I’ll detail how to implement a PowerShell Management Agent for FIM/MIM to use the MS GraphAPI to synchronize objects into FIM/MIM, supporting Delta and Full Synchronization run profiles. I’m also using my favourite PowerShell Management Agent, the Granfeldt PowerShell Management Agent.

Pre-requisites

I’m using the ADAL helper library from the AzureADPreview PowerShell Module. Install that module on your MIM Sync Server via PowerShell (WMF5 or later) using the PowerShell command;

Install-Module AzureADPreview

Getting Started with the Granfeldt PowerShell Management Agent

If you don’t already have it, what are you waiting for? Go get it from here. Søren’s documentation is pretty good but does assume you have a working knowledge of FIM/MIM, and this blog post is no different.

Three items I had to work out that I’ll save you the pain of are;

  • You must have a Password.ps1 file. Even though we’re not doing password management on this MA, the PS MA configuration requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it. It just needs to be present
  • The credentials you give the MA to run as are the credentials for the account that has permissions to the AzureAD/Office365 Tenant. Just a normal account is enough to enumerate it, but you’ll need additional permissions if you intend to write back to AzureAD.
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use dir /x to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\AzureAD

Schema.ps1

My Schema is based around enumerating and managing users from AzureAD. You’ll need to create a number of corresponding attributes in the Metaverse Schema on the Person ObjectType to flow the attributes into. Use the Schema info below for a base set of attributes that will get you started. You can add more as required. I’ve prefixed most of them with AAD for my implementation.
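The schema embed isn’t reproduced here; a hedged stand-in for that base set (anchor choice assumed, attribute names per this post and Part 1 of the photos series) is:

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-objectId|String" -Value "guid"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "user"
$obj | Add-Member -Type NoteProperty -Name "userPrincipalName|String" -Value "user@tenant.onmicrosoft.com"
$obj | Add-Member -Type NoteProperty -Name "AADDisplayName|String" -Value "Display Name"
$obj | Add-Member -Type NoteProperty -Name "AADAccountEnabled|Boolean" -Value $true
$obj | Add-Member -Type NoteProperty -Name "AADCity|String" -Value "Sydney"
$obj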

If you want to manage Groups or Contacts or a combination of object types, you will need to update the Schema.ps1 script accordingly.

Import.ps1

The logic that the Import.ps1 implements is the same as detailed here in my post using Differential Queries. Essentially: perform a full import and create a file with the cookie/watermark, then allow Delta Sync run profiles by requesting the GraphAPI to return only changes since that cookie/watermark.
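In outline (the AAD Graph differential query endpoint and the aad.deltaLink response property are per the GraphAPI docs; the cookie path and variables are illustrative):

$cookieFile = "C:\Temp\AADUsersDeltaLink.txt" # illustrative watermark location

if (($OperationType -eq "Delta") -and (Test-Path $cookieFile)) {
    # Resume from the saved watermark; only changes since then are returned
    $uri = Get-Content $cookieFile
} else {
    # Full import: start a new differential query from scratch
    $uri = "https://graph.windows.net/$tenantName/users?api-version=1.6&deltaLink="
}

$result = Invoke-RestMethod -Uri $uri -Headers @{ Authorization = "Bearer $accessToken" }
# Process $result.value here, then persist the new watermark for the next delta
$result.'aad.deltaLink' | Out-File $cookieFile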

You’ll need to update the script for your AzureAD Tenant name on line 28. Also update the paths for where the cookie file and the debug file will go if they are different to mine (lines 11, 46 and 47).

Importing of the attributes is based around the names in the Schema.ps1 scripts. Any changes you made there will need to be reflected in the import.ps1.

Password Script (password.ps1)

Empty as not implemented

Export.ps1

Empty as not implemented in this example. If you are going to write information back to AzureAD you’ll need to put the appropriate logic into this script.

Management Agent Configuration

With the Granfeldt PowerShell Management Agent installed on your FIM/MIM Synchronisation Server, in the Synchronisation Server Manager select Create Management Agent and choose “PowerShell” from the list of Management Agents to create.

As this example is for Users, I’ve named my MA accordingly.

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using an Office 365 account to connect to AzureAD/Office365 and import the user data.

Paths to the Import, Export and Password scripts. Note: the Export and Password PS1 scripts files exist but are empty.

Object Type as configured in the Schema.ps1 file.

Attributes as configured in the Schema.ps1 file.

Anchor as per the Schema.ps1 file.

The rest of the MA configuration is up to your implementation. What you are going to join on and what attributes flow into the MV will vary based on your needs and solution. At a minimum you’d probably be looking to do a join on immutableID (after some manipulation) or UPN and flow in attributes such as AADAccountEnabled etc.

Completing the Configuration

To finalise the MA you’ll need to do the usual tasks of creating run profiles, staging the connector space from AzureAD/Office365 and syncing into the Metaverse. Once you’ve done your initial Stage/Full Sync you can perform Delta Syncs.

Summary

A “Full Import” on a small AzureAD (~8500 Users) took 2 minutes.
A subsequent “Delta Import” with no changes took 6 seconds.

A similar implementation of the logic, but for Groups, gives similar results/performance.
A “Full Import” on a small AzureAD (~9800 Groups) took 5 minutes.
A subsequent “Delta Import” with 7 Adds (new Groups) and 157 Updates took 1 minute.

 

Follow Darren on Twitter @darrenjrobinson

How to embed Power BI Reports into the Microsoft Identity Manager Portal

About seven years ago, at a conference in Los Angeles, I attended a session where a consultant from Oxford Computer Group gave a presentation on integrating Quest Identity Manager (now Dell One Identity Manager) with the Forefront Identity Manager Portal. I’ve recently had a requirement to do something similar, and Carol pointed me in the direction of her experiments with doing likewise, based off inspiration from that same presentation/session.

Well, it is now 2017, and FIM and SharePoint have both moved through a few versions, so doing something similar has changed. Now that I’ve got it working I thought I’d share how I’ve done it, and also solicit any improvements. I’ve done this with SharePoint 2013.

Overview

In this post I’ll detail;

  • Publishing a Power BI Report
  • Creating new Microsoft Identity Manager Navigation Bar Resources
  • Embedding as an IFrame the published Power BI Report in the Microsoft Identity Manager Portal so that it appears like below

Pre-requisites

Obviously to follow this verbatim you are going to need to have a Power BI workspace and a Power BI Report. But you could embed any page you want to test it out.

You’ll also need SharePoint Designer 2013, covered in its own section below.

Publish a Power BI Report

In Power BI select a Report you are looking to embed in the MIM Portal. I selected License Plans under Reports from my Power BI Workspace.

From the File menu select Publish to Web.

Select Create embed code.

Copy the link to your report somewhere where you can retrieve it easily later. Don’t worry about the HTML line or the size.

 

SharePoint Designer

Download and install SharePoint Designer 2013 with the defaults. I’m using the 64-bit version, and I installed it on my Development MIM Portal Server. I’m using the 2013 version as my MIM Portal is running on SharePoint 2013 Foundation (with SP1).

Once installed start SharePoint Designer and select Open Site.

Enter the URL for your MIM Portal and select Open.

Note: In order for SharePoint Designer to successfully load your MIM Portal Site, the URL you provide above must be in your SharePoint Alternate Access Mappings. If it isn’t you will probably get the error “The server could not complete your request. For more specific information, click the Details button.”

And in your Windows Application Event Log Event ID 3 – WebHost

WebHost failed to process a request.

 Sender Information: System.ServiceModel.ServiceHostingEnvironment+HostingManager/42194754

 Exception: System.ServiceModel.ServiceActivationException: The service '/_vti_bin/client.svc' cannot be activated due to an exception during compilation.

 

Select Microsoft Identity Management, then All Files. You should then see a list of all the files in the MIM Portal website.

Locate the aspx folder, right click on it and select New => Folder. Create a new folder under the aspx directory named ‘reports’.

Right click on your new Reports Folder and select New => ASPX. Create an aspx file named reports.aspx.

Repeat to create another aspx file named report.aspx.

Click on the reports.aspx file from the main pane and replace its contents with the following, overwriting everything else. Select Save.

<%@ Page Language="C#" %>
<html dir="ltr">

<head runat="server">
<meta name="WebPartPageExpansion" content="full" />
<title>Reports</title>
<script type="text/javascript">
window.open("report.aspx", "_self")
</script>
</head>
<body/>
</html>

Click on the report.aspx file and replace the contents with the following and select Save.

Replace <yourreportlink> in https://app.powerbi.com/view?r=<yourreportlink> with your Power BI link.

<%@ Page masterpagefile="~masterurl/custom.master" Title="Reports" language="C#" inherits="Microsoft.SharePoint.WebPartPages.WebPartPage, Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" meta:progid="SharePoint.WebPartPage.Document" UICulture="auto" Culture="auto" meta:webpartpageexpansion="full" %>
<%@ Register Tagprefix="SharePoint" Namespace="Microsoft.SharePoint.WebControls" Assembly="Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
<%@ Register Tagprefix="Utilities" Namespace="Microsoft.SharePoint.Utilities" Assembly="Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
<%@ Import Namespace="Microsoft.SharePoint" %>
<%@ Register Tagprefix="WebPartPages" Namespace="Microsoft.SharePoint.WebPartPages" Assembly="Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>

<asp:Content ContentPlaceHolderID="PlaceHolderTitleBar" Visible="true" runat="server">
</asp:Content>

<asp:Content id="content1" ContentPlaceHolderID="PlaceHolderMain" runat="server">
<iframe width="1140" height="685" src="https://app.powerbi.com/view?r=<yourreportlink>" frameborder="0" allowFullScreen="true"></iframe>

</asp:Content>

MIM Portal Navigation Resources

Now we need to create the MIM Portal Navigation Resources to link to our new files.

In the MIM Portal select Navigation Bar Resources, then select New.

 

Provide a Display Name and Description, and select Next. Ignore Usage Keyword for now; more on that later.

Make the Parent Order 8 to have it at the bottom of the Left Nav bar. Order is 0 as this is going to be our Group header. Select Next.

Provide the path to the reports.aspx file, ~/IdentityManagement/aspx/reports/reports.aspx, and select Next.

Provide the Localised Display name, select Finish and then Submit.

Repeat, this time for linking to ~/IdentityManagement/aspx/reports/report.aspx and name it Licensing Report or whatever makes sense for your report. Also make the Order 1 so it nests under Reports.

Perform an IISReset.

Refresh your MIM Portal page and you should see your new menu items at the bottom of the left Navigation Bar.

Click on Reports and your Licensing Report will auto-magically load, same as if you click on Licensing Report directly. You can now add as many reports as you need, and change which report you want to be the default by updating the reports.aspx file in SharePoint Designer.

You will probably also want to limit who sees what reports. You can do that through Usage Keywords and Sets etc. By default, as described here, the reports will only be visible to Administrators. Details to get you started on changing who can see what can be found here.

Let me know if you have any improvements.

 

Follow Darren Robinson on Twitter