Adapting to the changes in the AzureAD Preview PowerShell Module ADAL Helper Library

I’m a big proponent of using PowerShell for integration and automation of Azure Active Directory Services using the Azure AD GraphAPI. You may have seen many of my posts leverage the evolving Azure AD Preview PowerShell Module helper libraries. Lines in my scripts that use this library look like the one below (in this case using preview version 2.0.0.52).

# the default path to where the ADAL GraphAPI PS Module puts the Libs
Add-Type -Path 'C:\Program Files\WindowsPowerShell\Modules\AzureADPreview\2.0.0.52\Microsoft.IdentityModel.Clients.ActiveDirectory.dll'

The benefit of using this library is the simplification of Authentication to AzureAD, from which we can then receive a token and interact with the GraphAPI via PowerShell using Invoke-RestMethod.

Earlier this week it was brought to my attention that implementations of some of my scripts were failing when using the latest v2 releases of the AzureAD PowerShell Module (v2.0.0.98). Looking into it, the last version I had working is v2.0.0.52; v2.0.0.55 doesn’t work with my scripts either. So with anything after v2.0.0.52 the approach above will not work.

What’s Changed?

First up the PowerShell Module has been renamed. It is no longer AzureADPreview, it is just AzureAD. So the path it gets installed into (depending on the version you have) is now;

'C:\Program Files\WindowsPowerShell\Modules\AzureAD\2.0.0.98\Microsoft.IdentityModel.Clients.ActiveDirectory.dll'

Looking into the updated PowerShell Module there has been a change to the Microsoft.IdentityModel.Clients.ActiveDirectory.dll library.

A number of the methods in the library have changed. I believe this is part of Microsoft transitioning the endpoint to use GraphAPI. With that understanding, I approached using PowerShell to integrate with the GraphAPI in a way more akin to how I do it when not using the helper library.

Using PowerShell and the ADAL Helper Library to connect to AzureAD via the GraphAPI

Here is the updated script to connect (and retrieve a batch of users). You will need to update lines 4, 17 & 18 for your Tenant name and the username and password (non-MFA enabled) you will be connecting with.
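In essence the updated approach looks like the sketch below. The client id used is the well-known Azure AD PowerShell application id; treat the exact ADAL class and method names as assumptions to verify against the library version you have installed, and note that the line number references above relate to the full script rather than this sketch.

# the default path to where the AzureAD PS Module puts the ADAL Libs
Add-Type -Path 'C:\Program Files\WindowsPowerShell\Modules\AzureAD\2.0.0.98\Microsoft.IdentityModel.Clients.ActiveDirectory.dll'

# your Tenant, and the username and password (non-MFA enabled) to connect with
$tenant = "yourtenant.onmicrosoft.com"
$username = "user@yourtenant.onmicrosoft.com"
$password = "yourpassword"

$resource = "https://graph.windows.net"
$clientId = "1b730954-1685-4b74-9bfd-dac224a7b894" # well-known Azure AD PowerShell client id
$authority = "https://login.microsoftonline.com/$tenant"

# Acquire a token using the newer ADAL async methods
$authContext = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext($authority)
$userCred = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.UserPasswordCredential($username, $password)
$authResult = [Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContextIntegratedAuthExtensions]::AcquireTokenAsync($authContext, $resource, $clientId, $userCred).Result

# Use the token with Invoke-RestMethod to retrieve a batch of users
$headers = @{ Authorization = $authResult.CreateAuthorizationHeader() }
$users = Invoke-RestMethod -Method Get -Headers $headers -Uri "https://graph.windows.net/$tenant/users?api-version=1.6&`$top=100"
$users.value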

 

Getting started configuring the latest Microsoft Identity Manager IBM Notes Management Agent with Domino v9.x

Lotus Notes. My old nemesis as both a user and as an Administrator is back to haunt me again.

There’s a reasonable amount written by others on the trials and tribulations of getting the FIM/MIM Notes MA configured and working. However they are all referencing older versions of the MA and older versions of Domino. (If you are looking for details on the previous versions, check out Michael’s great post here.) The info on permissions is still valid, so make sure you’re on top of that.

The latest Notes MA (March 2017) is available from here and supports v9.x of Domino and v9.x of the Notes Client.

Here’s a couple of the differences straight up;

  • The v9 Lotus Notes Client does not carry the v8 client requirement of setting up the MA using the x86 architecture. Use “Process” as the MA Architecture as per the documentation
  • I managed to get the v9.0.0 and v9.0.1 FP4 clients to work. The latest is v9.0.1 FP8, available from here
  • There are two technical references from Microsoft for the Notes MA. This is the old version. After some hunting I located the documentation for the latest release here. Yes, the new version of the MA and the associated documentation only became available in mid-March 2017

Here’s a couple of quick tips for getting up and running with the latest MIM Notes MA;

  • Install the v9.x Lotus Notes Client on the MIM Sync Server
    • Select Single User Install
  • Open the names.nsf and schema.nsf databases using the Notes ID you will be supplying to the Notes MA configuration

Enable Logging

Like anything, knowing what is going on helps diagnose issues. This is how I enabled event log logging for the latest IBM Notes Management Agent.

  • Edit the miiserver.exe.config file in the BIN folder. More than likely that will be in this path C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Bin
  • Add the following lines as shown in the graphic below

<source name="ConnectorsLog" switchValue="Information">
  <listeners>
    <add name="ConnectorsLogListener" 
    type="System.Diagnostics.EventLogTraceListener, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
    initializeData="ForefrontIdentityManager.ManagementAgent"
    traceOutputOptions="LogicalOperationStack, DateTime, Timestamp, Callstack" />
  </listeners>
</source>

Restart the Forefront Identity Manager Synchronization Service in services and when you run the Notes MA you will then see output in the Event Log under Forefront Identity Manager Management Agent.

Change the following line, substituting Error, Warning, Information or Verbose for what you want logged.

<source name="ConnectorsLog" switchValue="Information">

If you want it to log to the Application Log then change the initializeData value to whatever you want. I used IBM Domino MA;

<source name="ConnectorsLog" switchValue="Information">
  <listeners>
    <add name="ConnectorsLogListener" 
    type="System.Diagnostics.EventLogTraceListener, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
    initializeData="IBM Domino MA"
    traceOutputOptions="LogicalOperationStack, DateTime, Timestamp, Callstack" />
  </listeners>
</source>

… restart the Forefront Identity Manager Synchronization Service in services and when you run the Notes MA the logged data will land in the Application Log.

If you’ve used the Notes MA before, this should get you up and running with the latest.

Joining Identities between Active Directory and Azure Active Directory using Microsoft Identity Manager

Introduction

One of the foundations of Identity Management is the ability to join an identity between disparate connected systems. As we extend our management of identities into cloud services this adds a few twists.

A key concept is to use an anchor that is persistent; something that doesn’t change through a user’s life-cycle. A user’s Security IDentifier (SID) in Active Directory is perfect. It doesn’t change even when a user or group gets renamed. What gets interesting is how the SID is represented when returned using different methods. That is what I quickly cover in this post.

Overview

The de facto method for connecting your OnPremise Active Directory to Azure Active Directory is to use Azure Active Directory Connect. AADC will synchronise users’ and groups’ SIDs to the corresponding object in AAD, into the onPremisesSecurityIdentifier attribute.

When the onPremisesSecurityIdentifier  attribute is retrieved via the GraphAPI the format looks like this: S-1-5-21-3878594291-2115959936-132693609-65242

Using an Active Directory Management Agent for FIM/MIM and synchronizing the objectSID from the OnPremise AD, the value is represented in the Metaverse in binary format, which when viewed as text looks like this: 15000005210002431664623112825230126105190232781820

Translating SID formats so we can join identities

What we need to do is translate the string representation of the SID returned from the GraphAPI and AzureAD so that we have it in a binary format. Then we can use those attributes in join rules to match users/groups between AzureAD and our OnPremise Active Directory.

Overview-1

 

In my environments I’m using the out of the box FIM/MIM Active Directory Management Agent. For Azure AD/Office 365 I’m using the Granfeldt PowerShell Management Agent to integrate with Azure AD via the GraphAPI.

On my AzureAD PowerShell Management Agent I have an attribute named AADonPremiseSID configured with the format as Binary in my PSMA Schema.ps1 as shown below.

$obj | Add-Member -Type NoteProperty -Name "AADonPremiseSID|Binary" -Value 0x10

On my Azure AD PSMA I have the following lines in my Import.ps1. They take the value retrieved from the GraphAPI (S-1-5-21-3878594291-2115959936-132693609-65242), convert it to a binary array (which in text looks something like 15000005210002431664623112825230126105190232721825400) and store it in the AADonPremiseSID binary attribute in the connector space.

# Create SID .NET object using SID string from AAD S-1-500-........ 
$sid = New-Object system.Security.Principal.SecurityIdentifier $user.onPremisesSecurityIdentifier
 
#Create a byte array for the length of the users SID
$BinarySid = new-object byte[]($sid.BinaryLength)

#Copy the binary SID into the byte array, starting at index 0
$sid.GetBinaryForm($BinarySid, 0)

#Add the SID to the user in the connector space
$obj.Add("AADonPremiseSID",$BinarySid)

This then lets me join my users (and groups using the same method) between AD and AAD.  Essentially a line to put it into Security Identifier format, two lines to convert it to a binary array and a line to store it in the connector space. Simple when you don’t over think it.

I’m posting this because I know I’m going to need to do this often. Hope it helps someone else too.

Migrating VirtualBox (Linux) Windows VDI Virtual Machines to Azure

Overview

Over the years I’ve transitioned through a number of laptops and for whatever reason they never fully get put out to pasture. Two specific laptops are used semi-regularly for functions associated with a few virtual machines they hold. Over the last 10 years or so, I’ve been a big proponent of VirtualBox. Its footprint and functionality aligned with my needs. The downside these days is needing to sometimes carry two laptops just to use an application or two contained inside a Virtual Machine on VirtualBox.

It’s 2017 and time to get with the times. So I dedicated an evening to working through the process of migrating those VMs.

DISCLAIMER and CONSIDERATIONS Keep in mind that if you are migrating legacy operating systems, you’ll need a method to remote into them once they are in Azure. Check their configuration before you convert and migrate them. Do they have firewalls enabled? Is the network interface on the VM configured for dynamic or static addressing? Does the VM have remote access configured (VNC, RDP, SSH)? As they are also likely to be less secure, my process below includes a Network Security Group as part of the Azure Resource Group with no rules specified. You’ll need to add some inbound rules for the method you’ll be using to remote into your Virtual Machine. And I STRONGLY suggest locking those rules down to a single host or home subnet.

The VM Conversion Process

This blog post covers the migration of a Windows Virtual Machine in VDI format from VirtualBox on SUSE Linux to Azure.

  • With the VM started, un-install the VBox Guest Additions from the virtual machine

RemoveExtensions2

  • Shutdown the VM
  • In VirtualBox Manager select the VM and Settings
    1. Select Storage. If the VBoxGuestAdditions CD/DVD is attached then remove it.
    2. Take note of the VM’s disk(s) location (WinXPv2.vdi in my case) and naming. Mine just had a single hard disk. You’ll need the path for the conversion utility.

RemoveVBoxAdditions

RemoveAdditionsDVD

  • VirtualBox includes a utility named vboxmanage. We can use that to convert the VM virtual hard disk from VDI to VHD format. Simply run vboxmanage clonehd with the --format VHD and --variant Fixed switches (the full example command is shown after the screenshots below)
    • You will need to make sure you have enough space on your laptop hard disk for the VHD which will be about the same size as your VDI Hard Disk
      • If you don’t on Linux you’ll get a slightly cryptic message like
        • Could not create the clone medium (VERR_EOF)
        • VBOX_E_FILE_ERROR (0x80bb0004)
      • the --variant Fixed switch is not shown in the virtual disk conversion screenshot (three images further down the page). One of my other VMs was Dynamic. The variant needs to be Fixed for the VHD to be associated with a VM in Azure
      • Below shows determining an existing disk that is Dynamic and needs to be converted to Fixed

DynamicDisk

  • Below shows determining an existing disk that is Fixed and doesn’t need to be converted

FixedDisk

  • Converting the VDI virtual disk to VHD

Convert60Percent
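For reference, the full conversion command for this example (using the WinXPv2.vdi disk noted earlier) is below; substitute your own source and destination file names.

vboxmanage clonehd WinXPv2.vdi WinXPv2.vhd --format VHD --variant Fixed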

Preparing our Azure Environment for our new Virtual Machine

  • Whilst the conversion was taking place I logged into the Azure Portal and created a new Resource Group for my VM to go into. I also created a new Storage Account in that Resource Group to put the VM’s VHD into. Basically I’m keeping these specific individual VM’s that serve a very specific purpose in their own little compartment.

RGandSG

  • Using the fantastic Azure Storage Explorer which works on Linux, Mac and Windows I created a Blob Container in my newly created Storage Account named vhds.

CreateBlobContainer

Upload the Converted Virtual Hard Disk

  • By now my virtual disk had converted. Using the Azure Storage Explorer I uploaded my converted virtual disk. NOTE: Make sure you have ‘upload vhd/vhdx files as Page Blobs’ selected.

UploadVHD

For a couple of other VMs I wrote a little PowerShell script to upload the VHDs to blob storage.
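That script isn’t reproduced here, but a minimal sketch using the AzureRM module’s Add-AzureRmVhd cmdlet (the Resource Group, Storage Account and file names below are placeholders) looks something like this:

# Sketch only - substitute your own Resource Group, Storage Account and file names
$resourceGroup = "VBoxMigration"
$storageAccount = "vboxmigrationstore"
$localVhd = "D:\VMs\WinXPv2.vhd"

Login-AzureRmAccount

# Destination URI for the page blob in the vhds container created earlier
$destination = "https://$storageAccount.blob.core.windows.net/vhds/WinXPv2.vhd"

# Add-AzureRmVhd uploads the VHD as a Page Blob, which is what Azure VMs require
Add-AzureRmVhd -ResourceGroupName $resourceGroup -Destination $destination -LocalFilePath $localVhd -NumberOfUploaderThreads 4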

Create the Azure VM

The following script follows on from the Resource Group, Storage Account and Virtual Machine virtual disk we created and uploaded to Azure, and creates the VM attached to that virtual disk.

All the variables are up front. We create the Network Security Group, Subnet and Virtual Network, then the Public IP and Network Interface. Finally we define the details for the VM with the networking and the uploaded VHD before creating the VM.
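The script itself isn’t embedded here; the following is a minimal sketch of those steps using the AzureRM module. All names, the location, VM size and VHD URI are placeholder assumptions to change for your environment.

# Sketch only - names, location, size and VHD URI are placeholders
$rg = "VBoxMigration"
$location = "AustraliaEast"
$vmName = "WinXPv2"
$vhdUri = "https://vboxmigrationstore.blob.core.windows.net/vhds/WinXPv2.vhd"

# Network Security Group with no rules - add inbound rules for your remote access method
$nsg = New-AzureRmNetworkSecurityGroup -ResourceGroupName $rg -Location $location -Name "$vmName-nsg"

# Subnet and Virtual Network
$subnet = New-AzureRmVirtualNetworkSubnetConfig -Name "default" -AddressPrefix "10.0.0.0/24" -NetworkSecurityGroup $nsg
$vnet = New-AzureRmVirtualNetwork -ResourceGroupName $rg -Location $location -Name "$vmName-vnet" -AddressPrefix "10.0.0.0/16" -Subnet $subnet

# Public IP and Network Interface
$pip = New-AzureRmPublicIpAddress -ResourceGroupName $rg -Location $location -Name "$vmName-pip" -AllocationMethod Dynamic
$nic = New-AzureRmNetworkInterface -ResourceGroupName $rg -Location $location -Name "$vmName-nic" -SubnetId $vnet.Subnets[0].Id -PublicIpAddressId $pip.Id

# Define the VM, attach the uploaded VHD as the existing OS disk and create the VM
$vm = New-AzureRmVMConfig -VMName $vmName -VMSize "Standard_A1"
$vm = Add-AzureRmVMNetworkInterface -VM $vm -Id $nic.Id
$vm = Set-AzureRmVMOSDisk -VM $vm -Name "$vmName-osdisk" -VhdUri $vhdUri -CreateOption Attach -Windows
New-AzureRmVM -ResourceGroupName $rg -Location $location -VM $vm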

And we’re done. VM created and started.

VMCreated
Happy days and good bye to a number of old laptops.

Standalone installation of the MIM Self Service Password Reset Portals ends prematurely

Today I was performing a standalone installation of the MIM Self Service Password Reset Portals (Enrollment and Reset). These Portals rely on IIS and not the normal prerequisites associated with the MIM Service Portal (SharePoint etc).  As such using PowerShell I’d only installed the Web Server Role with the usual dependencies.

On starting the MIM Service and Portal installation I got the dreaded Microsoft Identity Manager Service and Portal Setup Wizard ended prematurely dialog. So straight into debug mode running with an installation log as per the command below.

msiexec /i "f:\Service and Portal\Service and Portal.msi" /l*v c:\temp\install.log

InstallTerminatedPrematurely

Looking into the install.log file created by the installation process I notice the following error.

InstallLogError

In a more readable form

SFXCA: Ensure that the proper version of the .NET Framework is installed, or that there is a matching supportedRuntime element in CustomAction.config. If you are binding to .NET 4 or greater add useLegacyV2RuntimeActivationPolicy=true to the element.
CustomAction GetIISVersionFromRegistry returned actual error code 1603 (note this may not be 100% accurate if translation happened inside sandbox)

I have .NET Framework 4.5 installed. Could it be that MIM 2016 SP1 still needs .NET Framework 3.5?

DotNETVersions

Taking a punt I installed .NET 3.5.

DOTNET35Install

Restarted the Install. Success.

Success

There you go. Even when you’re only looking to have the SSPR Portals installed you need to have .NET Framework 3.5 installed. And FYI I needed to have the Windows Server 2012 R2 media handy to get the sources for .NET Framework 3.5.
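If you’d rather do that install from PowerShell than the GUI, something like the following should do it, assuming the Server 2012 R2 media is mounted as D:.

# Assumes the Windows Server 2012 R2 media is mounted as D:
Install-WindowsFeature -Name NET-Framework-Core -Source "D:\sources\sxs"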

A workaround for the Microsoft Identity Manager limitation of not allowing simultaneous Management Agents running Synchronisation Profiles

Why ?

For those of you that may have missed it, in early 2016 Microsoft released a hotfix for Microsoft Identity Manager that included a change that removed the ability for multiple management agents on a Microsoft Identity Manager Synchronization Server to simultaneously run synchronization run profiles. I detailed the error you get in this blog post.

At the time it didn’t hurt me too much as I didn’t require any other fixes that were incorporated into that hotfix (and the subsequent hotfix). That all changed with MIM 2016 SP1 which includes functionality that I do require. The trade-off however is I am now coming up against the inability to perform simultaneous delta/full synchronization run profiles.

Having had this functionality for all the previous versions of the product that I’ve worked with for the last 15+ years, you don’t realise how much you need it until it’s gone. I understand Microsoft introduced this constraint to protect the integrity of the system, but my argument is that it should be up to the implementor. Where I use this functionality a LOT is with Management Agents processing different object types from different connected systems (eg. Users, Groups, Contacts, Photos, other entity types). For speed of operation I want my management agents running synchronization run profiles simultaneously.

Overview of my workaround

First up, this is a workaround, not a solution to actually running multiple sync run profiles at once. I really hope Microsoft removes this limitation in the next hotfix. What I’m doing is trying to have my run sequences complete as quickly as possible.

What I’m doing is:

  • splitting my run profiles into individual steps
    • eg. instead of doing a Delta Import and a Delta Sync in a single multi-step run profile, I’m doing a Delta Import as one run profile and the sync as a separate one. Same for Full Import / Full Synchronization
  • running my imports in parallel. Essentially the Delta or Full imports are all running simultaneously
  • utilising the Lithnet MIIS Automation Powershell Module for the Start-ManagementAgent and Wait-ManagementAgent cmdlets to run the synchronisation run profiles

 

Example Script

This example script incorporates the three principles from above. The environment I built this example in has multiple Active Directory Forests. The example queries the Sync Server for all the Active Directory MAs, then performs the Imports and Syncs. This shows the concept; for your environment, potentially without multiple Active Directories, you will probably want to change the list of MAs you use as input to the execution script.

Prerequisites

You’ll need the Lithnet MIIS Automation PowerShell Module installed on your FIM/MIM Sync Server.

The following script takes each of the Active Directory Management Agents I have (via a dynamic query in Line 9) and performs a simultaneous Delta Import on them. If your Run Profiles for Delta Imports aren’t named DI then update Line 22 accordingly.

-RunProfileName DI
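The full script isn’t embedded here (the line numbers above refer to it), but the parallel import step boils down to something like the sketch below. It assumes the Lithnet MIIS Automation module is installed on the Sync Server; check the module’s help if the parameter names differ in your version.

# Sketch only - requires the Lithnet MIIS Automation PowerShell Module
Import-Module LithnetMiisAutomation

# Get all the Active Directory MAs (adjust the filter for your environment)
$adMAs = Get-ManagementAgent | Where-Object {$_.Name -like "*Active Directory*"}

# Kick off a Delta Import on each MA simultaneously as background jobs
foreach ($ma in $adMAs) {
    Start-ManagementAgent -MA $ma.Name -RunProfileName DI -AsJob
}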

 

With imports processed we want to kick into the synchronisation run profiles. Ideally though we want to kick these off as soon as the first import has finished.  The logic is;

  • We loop through each of the MA’s looking for the first MA that has finished its Import (lines 24-37)
  • We process the Sync on the first completed MA that completed its import and wait for it to complete (lines 40-44)
  • Enumerate through the remaining MA’s, verifying they’ve finished their Import and run the Sync Run Profile (lines 48-62)

If your Run Profiles for Delta Syncs aren’t named DS then update Lines 40 and 53 accordingly.

-RunProfileName DS
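Again, a simplified sketch only; it syncs the MAs in enumeration order rather than the first-finished-first-synced order described above, and the same Lithnet parameter caveat applies.

# Sketch only - wait for each MA's import to finish, then run its Delta Sync
foreach ($ma in $adMAs) {
    # Block until this MA's Delta Import run has completed
    Wait-ManagementAgent -MA $ma.Name

    # Run the Delta Sync and wait for it so the syncs don't overlap
    Start-ManagementAgent -MA $ma.Name -RunProfileName DS
}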

Summary

While we don’t have our old luxury of being able to choose for ourselves when we want to execute synchronisation run profiles simultaneously (please allow us to Microsoft), we can come up with band-aid workarounds to still get our synchronisation run profiles to execute as quickly as possible.

Let me know what solutions you’ve come up with to workaround this constraint.

 

 

How to create a PowerShell FIM/MIM Management Agent for AzureAD Groups using Differential Sync and Paged Imports

Introduction

I’ve been working on a project where I need visibility of a large number of Azure AD Groups in Microsoft Identity Manager.

In order to make this efficient I need to use the Differential Query function of the AzureAD Graph API. I’ve detailed that before in this post How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries. Due to the number of groups and the number of members in the Azure AD Groups I needed to implement Paged Imports on my favourite PowerShell Management Agent (Granfeldt PowerShell MA). I’ve detailed that previously too, here: How to configure Paged Imports on the Granfeldt FIM/MIM PowerShell Management Agent.

This post details using these concepts together specifically for AzureAD Groups.

Pre-Requisites

Read the two posts linked to above. They will detail Differential Queries and Paged Imports. My solution also utilises another of my favourite PowerShell Modules, the Lithnet MIIS Automation PowerShell Module. Download and install that on the MIM Sync Server where you will be creating the MA.

Configuration

Now that you’re up to speed, all you need to do is create your Granfeldt PowerShell Management Agent. That’s also covered in the post linked above  How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries.

What you need is the Schema and Import PowerShell Scripts. Here they are.

Schema.ps1

There are two object classes on the MA, as we need to have the users that are members of the groups on the same MA because membership is a reference attribute. When you bring the Groups through into the Metaverse, and assuming you have an Azure AD Users MA using the same anchor attribute, you’ll get the reference link for the members and their full object details. A sketch of that two object class schema is below.
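The Schema.ps1 itself isn’t embedded here; a minimal sketch of a two object class schema in the Granfeldt PSMA format follows. The attribute names, and the Reference[] notation for the multi-valued members attribute, are illustrative assumptions to check against the PSMA documentation.

# Sketch only - Granfeldt PSMA Schema.ps1 with a group and a user object class
$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-id|String" -Value "1"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "group"
$obj | Add-Member -Type NoteProperty -Name "displayName|String" -Value "Group Display Name"
$obj | Add-Member -Type NoteProperty -Name "members|Reference[]" -Value "member object id"
$obj

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-id|String" -Value "1"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "user"
$obj | Add-Member -Type NoteProperty -Name "displayName|String" -Value "User Display Name"
$obj | Add-Member -Type NoteProperty -Name "userPrincipalName|String" -Value "user@yourtenant.onmicrosoft.com"
$obj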

Import.ps1

Here is my PSMA Import.ps1 that performs what is described in the overview. Enumerate AzureAD for Groups, import the active ones along with group membership.

Summary

This is one solution for managing a large number of Azure AD Groups with large memberships via a PS MA. Paged imports show progress as the import processes the objects, and the differential queries allow for quick subsequent delta-sync run profiles.

I’m sure this will help someone else. Enjoy.

Follow Darren on Twitter @darrenjrobinson

An Azure Timer Function App to retrieve files via FTP and Remote PowerShell

Introduction

In an age of Web Services and APIs it’s almost a forgotten world where FTP Servers exist. However most recently I’ve had to travel back in time and interact with an FTP server to get a set of files that are produced by other systems on a daily basis. These files are needed for some flat-file imports into Microsoft Identity Manager.

Getting files off an FTP server is pretty simple. But needing to do it across a number of different environments (Development, Staging and Production) meant I was looking for an easy approach that I could also replicate quickly across multiple environments. As I already had Remote PowerShell setup on my MIM Servers for other Azure Function Apps I figured I’d use an Azure Function for obtaining the FTP Files as well.

Overview

My PowerShell Timer Function App performs the following:

  • Starts a Remote PowerShell session to my MIM Sync Server
  • Imports the PSFTP PowerShell Module
  • Creates a local directory to put the files into
  • Connects to the FTP Server
  • Gets the files and puts them into the local directory
  • Ends the session

Pre-requisites

From the overview above there are a number of pre-requisites. Other blog posts I’ve written detail the steps involved to set them up and configure them appropriately, so I’m going to link to those. Namely;

  • Configure your Function App for your timezone so the schedule is correct for when you want it to run. Checkout the WEBSITE_TIME_ZONE note in this post.

    WEBSITE_TIME_ZONE

  • You’ll need to configure your Server that you are going to put the files onto for Remote PowerShell. Follow the Enable Powershell Remoting on the FIM/MIM Sync Server section of this blogpost.
  • The credentials used to connect to the MIM Server are secured as detailed in the Using an Azure Function to query FIM/MIM Service section of this blog post.
  • Create a Timer PowerShell Function App. Follow the Creating your Azure App Service section of this post but choose a Timer Trigger PowerShell App.
    • I configured my Schedule for 1030 every day using the following CRON configuration
      0 30 10 * * *
  • On the Server you’ll be connecting to in order to run the FTP processes you’ll need to copy the PSFTP Module and files to the following directories. I unzipped the PSFTP files and copied the PSFTP folder and its contents to;
    • C:\Program Files\WindowsPowerShell\Modules
    • C:\Windows\System32\WindowsPowerShell\v1.0\Modules

Configuring the Timer Trigger Function App

With all the pre-requisites in place it’s time to configure the Timer Function App that you created in the pre-requisites.

The following settings are configured in the Function App Application Settings;

  • FTPServer (the server you will be connecting to, to retrieve files)
  • FTPUsername (username to connect to the FTP Server with)
  • FTPPassword (password for the username above)
  • FTPSourceDirectory (FTP directory to get the files from)
  • FTPTargetDirectory (the root directory under which the files will be put)

ApplicationSettings

You’ll also need Application Settings for a Username and Password associated with a user that exists on the Server that you’ll be connecting to with Remote PowerShell. In my script below these application settings are MIMSyncCredUser and MIMSyncCredPassword

Function App Script

Finally here is a raw script. You’ll need to add appropriate error handling for your environment. You’ll also want to change lines 48 and 51 for the naming of the files you are looking to acquire. And line 59 for the servername you’ll be executing the process on.
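The raw script isn’t embedded here (the line numbers above refer to it), but the flow condenses to something like the sketch below. The PSFTP cmdlet names and parameters are written from memory, and the MIM Sync Server hostname is hypothetical, so verify both for your environment.

# Sketch only - condensed version of the Timer Function logic
# Application Settings surface in the Function App as environment variables
$ftpServer = $env:FTPServer
$ftpUser = $env:FTPUsername
$ftpPassword = $env:FTPPassword
$sourceDir = $env:FTPSourceDirectory
$targetDir = $env:FTPTargetDirectory

# Credentials for the Remote PowerShell session to the MIM Sync Server
$mimUser = $env:MIMSyncCredUser
$mimPwd = ConvertTo-SecureString $env:MIMSyncCredPassword -AsPlainText -Force
$mimCred = New-Object System.Management.Automation.PSCredential ($mimUser, $mimPwd)

# Hypothetical MIM Sync Server hostname - change for your environment
$session = New-PSSession -ComputerName "mimsyncserver.yourdomain.com" -Credential $mimCred

Invoke-Command -Session $session -ArgumentList $ftpServer, $ftpUser, $ftpPassword, $sourceDir, $targetDir -ScriptBlock {
    param ($ftpServer, $ftpUser, $ftpPassword, $sourceDir, $targetDir)

    Import-Module PSFTP

    # Create a local directory to put the files into
    $localPath = Join-Path $targetDir (Get-Date -Format "dd-MM-yyyy")
    New-Item -Path $localPath -ItemType Directory -Force | Out-Null

    # Connect to the FTP Server
    $ftpPwd = ConvertTo-SecureString $ftpPassword -AsPlainText -Force
    $ftpCred = New-Object System.Management.Automation.PSCredential ($ftpUser, $ftpPwd)
    Set-FTPConnection -Credentials $ftpCred -Server $ftpServer -Session FTPSession -UsePassive
    $ftp = Get-FTPConnection -Session FTPSession

    # Get the files and put them into the local directory
    Get-FTPChildItem -Session $ftp -Path $sourceDir | ForEach-Object {
        Get-FTPItem -Session $ftp -Path $_.FullName -LocalPath $localPath | Out-Null
    }
}

# End the session
Remove-PSSession $session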

Summary

A pretty quick and simple little Azure Function App that will run each day and obtain daily/nightly extracts from an FTP Server. Cleanup of the resulting folders and files I’m doing with other on-box processes.

This post is cross-blogged on both the Kloud Blog and Darren’s Blog.

Automate the nightly backup of your Development FIM/MIM Sync and Portal Servers Configuration

Last week in a customer development environment I had one of those oh shit moments where I thought I’d lost a couple of weeks of work. A couple of weeks of development around multiple Management Agents, MV Schema changes etc. Luckily for me I was just connecting to an older VM Image, but it got me thinking. It would be nice to have an automated process that each night would;

  • Export each Management Agent on a FIM/MIM Sync Server
  • Export the FIM/MIM Synchronisation Server Configuration
  • Take a copy of the Extensions Folder (where I keep my PowerShell Management Agents scripts)
  • Export the FIM/MIM Service Server Configuration

And that is what this post covers.

Overview

My automated process performs the following;

  1. An Azure PowerShell Timer Function WebApp is triggered at 2330 each night
  2. The Azure Function App initiates a Remote PowerShell session to my Dev MIM Sync Server (which is also a MIM Service Server)
  3. In the Remote PowerShell session the script;
    1. Creates a new subfolder under c:\backup with the current date and time (dd-MM-yyyy-hh-mm)
    2. Creates further subfolders for each of the backup elements;
      1. MAExports
      2. ServerExport
      3. MAExtensions
      4. PortalExport
    3. Utilizes the Lithnet MIIS Automation PowerShell Module to;
      1. Enumerate each of the Management Agents on the FIM/MIM Sync Server and export each Management Agent to the MAExports Folder
      2. Export the FIM/MIM Sync Server Configuration to the ServerExport Folder
    4. Copies the Extensions folder and subfolder contents to the MAExtensions Folder
    5. Utilizes the FIM/MIM Export-FIMConfig cmdlet to export the FIM Service Server Configuration to the PortalExport Folder

Implementing the FIM/MIM Backup Process

The majority of the setup to get this to work I’ve covered in other posts, particularly around Azure PowerShell Function Apps and Remote PowerShell into a FIM/MIM Sync Server.

Pre-requisites

  • I created a C:\Backup Folder on my FIM/MIM Server. This is where the backups will be placed (you can change the path in the script).
  • I installed the Lithnet MIIS Automation PowerShell Module on my MIM Sync Server
  • I configured my MIM Sync Server to accept Remote PowerShell Sessions. That involved enabling WinRM, creating a certificate, creating the listener, opening the firewall port and enabling the incoming port on the NSG. You can easily do all that by following my instructions here. From the same post I set up the encrypted password file and uploaded it to my Function App and set the Function App Application Settings for MIMSyncCredUser and MIMSyncCredPassword.
  • I created an Azure PowerShell Timer Function App. Pretty much the same as I show in this post, except choose Timer.
    • I configured my Schedule for 2330 every night using the following CRON configuration

0 30 23 * * *

  • I set the Azure Function App Timezone to my timezone so that the nightly backup happened at the correct time relative to my timezone. I got my timezone index from here. I set the following variable in my Azure Function Application Settings to my timezone name, AUS Eastern Standard Time.

    WEBSITE_TIME_ZONE

The Function App Script

With all the pre-requisites met, the only thing left is the Function App script itself. Here it is. Update lines 2, 3 & 6 if your variables and password key file are different. The path to your password keyfile will be different on line 6 anyway.

Update line 25 if you want the backups to go somewhere else (maybe a DFS Share).
If your MIM Service Server is not on the same host as your MIM Sync Server change line 59 for the hostname. You’ll need to get the FIM/MIM Automation PS Modules onto your MIM Sync Server too. Details on how to achieve that are here.
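The full Function App script isn’t reproduced here (the line numbers above refer to it), but the remote script block boils down to something like the following sketch. The Lithnet export steps are left as placeholder comments; check the Lithnet MIIS Automation documentation for the exact export cmdlets.

# Sketch of the remote backup logic only
$timestamp = Get-Date -Format "dd-MM-yyyy-hh-mm"
$backupRoot = "C:\backup\$timestamp"

# Create the dated backup folder and the subfolders for each element
"MAExports","ServerExport","MAExtensions","PortalExport" | ForEach-Object {
    New-Item -Path (Join-Path $backupRoot $_) -ItemType Directory -Force | Out-Null
}

# Export each Management Agent and the Sync Server configuration to the MAExports
# and ServerExport folders here using the Lithnet MIIS Automation module

# Copy the Extensions folder (where the PowerShell Management Agent scripts live)
Copy-Item -Path "C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Extensions" -Destination (Join-Path $backupRoot "MAExtensions") -Recurse

# Export the FIM/MIM Service (Portal) configuration
Add-PSSnapin FIMAutomation
Export-FIMConfig -Uri "http://localhost:5725" -PolicyConfig -PortalConfig -SchemaConfig | ConvertFrom-FIMResource -File (Join-Path $backupRoot "PortalExport\PortalExport.xml")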

Running the Function App I have limited output, but enough to see it run. The first part of the script runs very quickly. The Export-FIMConfig is what takes the majority of the time. That said, it takes less than a minute to get a nice point-in-time backup that is auto-magically executed nightly. Sorted.

Summary

The script itself can be run standalone and you could implement it as a Scheduled Task on your FIM/MIM Server. However I’m using Azure Functions for a number of things and having something that is easily portable and repeatable and centralised with other functions (pun not intended) keeps things organised.

I now have a daily backup of the configurations associated with my development environment. I’m sure this will save me some time in the near future.

Follow Darren on Twitter @darrenjrobinson

How to configure Paged Imports on the Granfeldt FIM/MIM PowerShell Management Agent

Introduction

In the last 12 months I’ve lost count of the number of PowerShell Management Agents I’ve written to integrate Microsoft Identity Manager with a plethora of environments. The majority though have not been of huge scale (<50k objects) and the import of the managed entities into the Connector Space/Metaverse runs through pretty quickly.

However this week I’ve been working on an AzureAD Groups PS MA for an environment with 40k+ groups. That in itself isn’t that large, but when you start processing Group Memberships as well, the Import process can take an hour for a Full Sync. During this time, before the results are passed to the Sync Engine, you don’t have any visibility of where the Import is up to (other than debug logging). And stopping the MA requires a restart of the Sync Engine Server.

I’ve wanted to mess with Paging the Imports for some time, but it hadn’t been a necessity. Now it is, so I set about working out how to achieve it. The background information on Paged Imports is available at the bottom of the PSMA documentation page here. However there are no working examples. I contacted Soren and he had misplaced his demo scripts for the time being. With some time at hand (in between coats of paint on the long weekend renovation) I therefore worked it out for myself. I detail how to implement Paged Imports in this blogpost.

This post uses an almost identical Management Agent to the one described in this post. Review that post to get an understanding of the AzureAD Differential Queries. I’m not going to cover those elements in this post or setting up the MA at all.

Getting Started

There are two things you need to do in preparation for enabling Paged Imports on your PowerShell Management Agent;

  1. Enable Paged Imports (if your Import.ps1 is checking for this setting)
  2. Configure Page Size on your Import Run Profiles

The first is as simple as clicking the checkbox on the Global Parameters tab on your PS MA as shown below.

The second is in your Run Profile. By default the Page Size will be 100. For my “let’s figure this out” process I dropped it to 50 on one Run Profile and 10 on another.

Import Script

With Paged Imports set up on the MA, the rest of the logic goes into your Import Script. In your param section at the start of the script, $usepagedimport and $pagesize are the variables that reflect the items from the two enablement components you did above.

$usepagedimport is either True or False. Your Import.ps1 script can check to see if it is set and process accordingly. In this example I’m not even checking if it is set and doing Paged Imports anyway. For completeness in a production example you should check to see what the intention of the MA is.

$pagesize is the page size from the Run Profile (100 by default, or whatever you changed yours to).

param (
    $Username,
    $Password,
    $Credentials,
    $OperationType,
    [bool] $usepagedimport,
    $pagesize
 )

 

An important consideration to keep in mind is that the Import.ps1 will be called multiple times (ie. #of_times = #ofObjects / pagesize).

So anything that in any other MA you would expect to be processed only once when the Import.ps1 runs, you need to limit here to running only once.

Essentially the way I’ve approached it is to retrieve all the objects that will be processed and put them in a Global variable. If the variable does not have any values/data then it is the first run, so go and get our source data. If the Global variable has values/data in it then we must be on a subsequent loop, so there is no need to process that part again; just page through our import.

In my example below this appears as;

if (!$global:tenantObjects) {
    # Authenticate
    # Search and get the users
    # Do some rationalisation on the results (if required)
    # setup some global variables so we know where we are with processing the data
} # Finish the one time tasks

As you’ll see in the full import.ps1 script below there are more lines that could be added into this section so they don’t get processed each time. In a production implementation I would.

For the rest of the Import.ps1 script we are expecting it to run multiple times. This is where we do our logic and process our objects to send through to the Sync Engine/Connector Space. We need to keep track of where we are up to processing the dataset and continue on from where we left off. We also need to know how many objects we have processed in relation to the ‘pagesize’ we get from the Run Profile so we know when we’ve finished.

When we reach the pagesize but know we have more objects to process, we set $global:MoreToImport to $true and break out of the foreach loop.

When we have processed all our objects we set $global:MoreToImport = $false and break out of the foreach loop to finish.
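As a minimal skeleton of that paging pattern (variable names other than $usepagedimport, $pagesize and $global:MoreToImport are illustrative, and Get-SourceObjects is a hypothetical stand-in for the AzureAD retrieval logic), each invocation of Import.ps1 ends up looking something like this:

# Skeleton of the paging pattern only - not the full working example below
if (!$global:tenantObjects) {
    # First invocation: authenticate, retrieve the source data and initialise a pointer
    $global:tenantObjects = Get-SourceObjects   # hypothetical retrieval function
    $global:nextIndex = 0
}

$processedThisPage = 0

foreach ($object in $global:tenantObjects[$global:nextIndex..($global:tenantObjects.Count - 1)]) {
    # Build the hashtable the Sync Engine expects and emit it to the pipeline
    $obj = @{}
    $obj.Add("objectClass","group")
    $obj.Add("Id",$object.objectId)
    # ... add the remaining attributes here ...
    $obj

    $global:nextIndex++
    $processedThisPage++

    # Page is full; tell the Sync Engine whether to call us again
    if ($processedThisPage -ge $pagesize) {
        $global:MoreToImport = ($global:nextIndex -lt $global:tenantObjects.Count)
        break
    }
}

# If we ran out of objects before filling the page we are done
if ($global:nextIndex -ge $global:tenantObjects.Count) {
    $global:MoreToImport = $false
}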

With that explanation out of the way here is a working example. I’ve left in debugging output to a log file so you can see what is going on.

You can get the associated relevant Schema.ps1 from the Management Agent described in this post. You’ll need to update your Tenant name on line 29, your directory paths on lines 10 and 47. If you are using a different version of the AzureADPreview PowerShell Module you’ll need to change line 26 as well.

Everything else is in the comments within the example script below and should make sense.

Summary

For managing a large number of objects on a PS MA we can now see progress as the import processes the objects, and we can now stop an MA if required.

I’m sure this will help someone else. Enjoy.

Follow Darren on Twitter @darrenjrobinson