A workaround for the Microsoft Identity Manager limitation of not allowing simultaneous Management Agents running Synchronisation Profiles

Why?

For those of you that may have missed it, in early 2016 Microsoft released a hotfix for Microsoft Identity Manager that included a change that removed the ability for multiple management agents on a Microsoft Identity Manager Synchronization Server to simultaneously run synchronization run profiles. I detailed the error you get in this blog post.

At the time it didn’t hurt me too much as I didn’t require any of the other fixes incorporated into that hotfix (or the subsequent one). That all changed with MIM 2016 SP1, which includes functionality that I do require. The trade-off, however, is that I’m now up against the inability to run delta/full synchronization run profiles simultaneously.

Having had this functionality in every version of the product I’ve worked with over the last 15+ years, you don’t realise how much you rely on it until it’s gone. I understand Microsoft introduced this constraint to protect the integrity of the system, but my argument is that it should be up to the implementer. Where I use this functionality a LOT is with Management Agents processing different object types from different connected systems (eg. Users, Groups, Contacts, Photos and other entity types). For speed of operation I want my management agents running synchronization run profiles simultaneously.

Overview of my workaround

First up, this is a workaround, not a solution, to running multiple sync run profiles at once. I really hope Microsoft removes this limitation in a future hotfix. What I’m doing is getting my run sequences to complete as quickly as possible.

What I’m doing is:

  • splitting my run profiles into individual steps
    • eg. instead of doing a Delta Import and a Delta Sync in a single multi-step run profile, I’m doing the Delta Import as one run profile and the Delta Sync as a separate one. The same applies for Full Import / Full Synchronization
  • running my imports in parallel. Essentially the Delta or Full imports are all running simultaneously
  • utilising the Lithnet MIIS Automation PowerShell Module for the Start-ManagementAgent and Wait-ManagementAgent cmdlets to run the synchronisation run profiles (see the sketch at the end of this section)


Example Script

This example script incorporates the three principles from above. The environment I built this example in has multiple Active Directory forests. The example queries the Sync Server for all of the Active Directory MAs, then performs the Imports and Syncs. This shows the concept; for your environment (potentially without multiple Active Directories) you will probably want to change the list of MAs used as input to the execution script.

Prerequisites

You’ll need the Lithnet MIIS Automation PowerShell Module installed on your MIM Synchronisation Server.

The following script takes each of the Active Directory Management Agents I have (via a dynamic query in Line 9) and performs a simultaneous Delta Import on them. If your Run Profiles for Delta Imports aren’t named DI then update Line 22 accordingly.

-RunProfileName DI


With the imports processed we want to kick off the synchronisation run profiles, ideally as soon as the first import has finished. The logic is;

  • We loop through each of the MAs looking for the first MA that has finished its Import (lines 24-37)
  • We process the Sync on the first MA that completed its Import and wait for it to complete (lines 40-44)
  • We enumerate the remaining MAs, verify each has finished its Import, and run its Sync Run Profile (lines 48-62)

If your Run Profiles for Delta Syncs aren’t named DS then update Lines 40 and 53 accordingly.

-RunProfileName DS
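
For reference, here is a minimal sketch of the pattern (not the full example script). It assumes the Lithnet module is named LithnetMiisAutomation, that Start-ManagementAgent accepts an MA name and blocks until the run profile completes, and that your AD MAs are identifiable by name; each import is wrapped in a background job to achieve the parallelism. Add error handling for production use.

Import-Module LithnetMiisAutomation

# Identify the MAs to run; adjust this filter for your environment
$mas = Get-ManagementAgent | Where-Object { $_.Name -like "*AD*" }

# Kick off a Delta Import on every MA simultaneously. Start-ManagementAgent
# blocks until the run profile finishes, so wrap each call in a background job.
$jobs = foreach ($ma in $mas) {
    Start-Job -Name $ma.Name -ScriptBlock {
        param($maName)
        Import-Module LithnetMiisAutomation
        Start-ManagementAgent -MA $maName -RunProfileName DI
    } -ArgumentList $ma.Name
}

# As soon as any import finishes, run that MA's Delta Sync. Syncs can no
# longer run simultaneously, so they execute one at a time.
$pending = @($jobs)
while ($pending.Count -gt 0) {
    $done = Wait-Job -Job $pending -Any | Select-Object -First 1
    Start-ManagementAgent -MA $done.Name -RunProfileName DS
    Remove-Job -Job $done
    $pending = @($pending | Where-Object { $_.Id -ne $done.Id })
}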

Summary

While we no longer have the luxury of choosing for ourselves when to execute synchronisation run profiles simultaneously (please give it back to us, Microsoft), we can come up with band-aid workarounds to get our synchronisation run profiles executing as quickly as possible.

Let me know what solutions you’ve come up with to work around this constraint.


How to create a PowerShell FIM/MIM Management Agent for AzureAD Groups using Differential Sync and Paged Imports

Introduction

I’ve been working on a project where I need visibility of a large number of Azure AD Groups in Microsoft Identity Manager.

In order to make this efficient I need to use the Differential Query function of the AzureAD Graph API, which I’ve detailed previously in How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries. Due to the number of groups, and the number of members in those groups, I also needed to implement Paged Imports on my favourite PowerShell Management Agent (the Granfeldt PowerShell MA). I’ve detailed that previously in How to configure Paged Imports on the Granfeldt FIM/MIM PowerShell Management Agent.

This post details using these concepts together specifically for AzureAD Groups.

Pre-Requisites

Read the two posts linked above; they detail Differential Queries and Paged Imports. My solution also utilises another of my favourite PowerShell modules, the Lithnet MIIS Automation PowerShell Module. Download and install that on the MIM Sync Server where you’ll be creating the MA.

Configuration

Now that you’re up to speed, all you need to do is create your Granfeldt PowerShell Management Agent. That’s also covered in the post linked above, How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries.

What you need is the Schema and Import PowerShell Scripts. Here they are.

Schema.ps1

There are two object classes on the MA, as we need the users that are members of the groups on the same MA because membership is a reference attribute. When you bring the Groups into the MetaVerse, and assuming you have an Azure AD Users MA using the same anchor attribute, you’ll get the reference link for the members along with their full object details.
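
Here is a hedged sketch of what such a Schema.ps1 can look like; the attribute selection, and objectId as the anchor, are illustrative only.

# Schema.ps1 sketch - two object classes, with group membership as a Reference
# attribute. Attribute names follow the Granfeldt "name|type" convention.
$group = New-Object -TypeName PSCustomObject
$group | Add-Member -MemberType NoteProperty -Name "Anchor-objectId|String" -Value "objectId"
$group | Add-Member -MemberType NoteProperty -Name "objectClass|String" -Value "group"
$group | Add-Member -MemberType NoteProperty -Name "displayName|String" -Value "displayName"
$group | Add-Member -MemberType NoteProperty -Name "member|Reference" -Value "member"
$group

$user = New-Object -TypeName PSCustomObject
$user | Add-Member -MemberType NoteProperty -Name "Anchor-objectId|String" -Value "objectId"
$user | Add-Member -MemberType NoteProperty -Name "objectClass|String" -Value "user"
$user | Add-Member -MemberType NoteProperty -Name "userPrincipalName|String" -Value "userPrincipalName"
$user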

Import.ps1

Here is my PSMA Import.ps1 that performs what is described in the overview: enumerate AzureAD for Groups and import the active ones along with their group membership.
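
As a hedged illustration of the shape of what that script emits (attribute names match the schema sketch above; the sample data is fabricated for the example), each group is returned to the Sync Engine as a hashtable:

# Illustrative only: one group as returned from the Graph API
$group = [PSCustomObject]@{
    objectId    = "00000000-0000-0000-0000-000000000001"
    displayName = "All Staff"
    members     = @("00000000-0000-0000-0000-0000000000aa")
}

# Emit a hashtable per object, keyed by the schema attribute names.
# "member" is the reference attribute holding the members' objectIds.
$obj = @{}
$obj.Add("objectId", $group.objectId)
$obj.Add("objectClass", "group")
$obj.Add("displayName", $group.displayName)
$obj.Add("member", @($group.members))
$obj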

Summary

This is one solution for managing a large number of Azure AD Groups with large memberships via a PS MA. Paged Imports show progress as the import runs, and Differential Queries keep subsequent delta-import/delta-sync run profiles quick.

I’m sure this will help someone else. Enjoy.

Follow Darren on Twitter @darrenjrobinson

An Azure Timer Function App to retrieve files via FTP and Remote PowerShell

Introduction

In an age of Web Services and APIs it’s an almost forgotten world where FTP Servers exist. However, recently I’ve had to travel back in time and interact with an FTP server to get a set of files that are produced by other systems on a daily basis. These files are needed for some flat-file imports into Microsoft Identity Manager.

Getting files off an FTP server is pretty simple. But needing to do it across a number of different environments (Development, Staging and Production) meant I was looking for an easy approach that I could also replicate quickly across multiple environments. As I already had Remote PowerShell set up on my MIM Servers for other Azure Function Apps, I figured I’d use an Azure Function for obtaining the FTP files as well.

Overview

My PowerShell Timer Function App performs the following:

  • Starts a Remote PowerShell session to my MIM Sync Server
  • Imports the PSFTP PowerShell Module
  • Creates a local directory to put the files into
  • Connects to the FTP Server
  • Gets the files and puts them into the local directory
  • Ends the session
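
A condensed sketch of those steps is below. It assumes the PSFTP cmdlets Set-FTPConnection, Get-FTPChildItem and Get-FTPItem, the Application Settings described later in this post, and a hypothetical MIM Sync Server hostname; add error handling before using it in anger.

# run.ps1 (Timer trigger) sketch
$pass = ConvertTo-SecureString $env:MIMSyncCredPassword -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ($env:MIMSyncCredUser, $pass)

# Local copies of the FTP app settings so they can cross into the remote session
$ftpServer = $env:FTPServer
$ftpUser   = $env:FTPUsername
$ftpPass   = $env:FTPPassword
$ftpSource = $env:FTPSourceDirectory
$ftpTarget = $env:FTPTargetDirectory

# Remote PowerShell session to the MIM Sync Server (hostname is hypothetical)
$session = New-PSSession -ComputerName "MIMSYNCSERVER" -Credential $cred

Invoke-Command -Session $session -ScriptBlock {
    Import-Module PSFTP

    # Local directory to put the files into, dated per run
    $target = Join-Path $using:ftpTarget (Get-Date -Format "dd-MM-yyyy")
    if (-not (Test-Path $target)) { New-Item -ItemType Directory -Path $target | Out-Null }

    # Connect to the FTP Server
    $secure  = ConvertTo-SecureString $using:ftpPass -AsPlainText -Force
    $ftpCred = New-Object System.Management.Automation.PSCredential ($using:ftpUser, $secure)
    Set-FTPConnection -Credentials $ftpCred -Server $using:ftpServer -Session FTP -UsePassive

    # Get the files and put them into the local directory
    Get-FTPChildItem -Session FTP -Path $using:ftpSource |
        ForEach-Object { Get-FTPItem -Session FTP -Path $_.FullName -LocalPath $target }
}

# End the session
Remove-PSSession $session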

Pre-requisites

From the overview above there are a number of pre-requisites. Other blog posts I’ve written detail the steps involved to appropriately set up and configure them, so I’m going to link to those. Namely;

  • Configure your Function App for your timezone so the schedule is correct for when you want it to run. Check out the WEBSITE_TIME_ZONE note in this post.

  • You’ll need to configure the server that you are going to put the files onto for Remote PowerShell. Follow the Enable PowerShell Remoting on the FIM/MIM Sync Server section of this blog post.
  • The credentials used to connect to the MIM Server are secured as detailed in the Using an Azure Function to query FIM/MIM Service section of this blog post.
  • Create a Timer PowerShell Function App. Follow the Creating your Azure App Service section of this post but choose a Timer Trigger PowerShell App.
    • I configured my Schedule for 1030 every day using the following CRON configuration
      0 30 10 * * *
  • On the server you’ll be connecting to in order to run the FTP processes you’ll need to copy the PSFTP Module files to the following directories. I unzipped the PSFTP files and copied the PSFTP folder and its contents to;
    • C:\Program Files\WindowsPowerShell\Modules
    • C:\Windows\System32\WindowsPowerShell\v1.0\Modules

Configuring the Timer Trigger Function App

With all the pre-requisites in place it’s time to configure the Timer Function App that you created in the pre-requisites.

The following settings are configured in the Function App Application Settings;

  • FTPServer (the server you will be connecting to, to retrieve files)
  • FTPUsername (username to connect to the FTP Server with)
  • FTPPassword (password for the username above)
  • FTPSourceDirectory (FTP directory to get the files from)
  • FTPTargetDirectory (the root directory under which the files will be put)


You’ll also need Application Settings for a username and password associated with a user that exists on the server you’ll be connecting to with Remote PowerShell. In my script below these Application Settings are MIMSyncCredUser and MIMSyncCredPassword.

Function App Script

Finally, here is the raw script. You’ll need to add appropriate error handling for your environment. You’ll also want to change lines 48 and 51 for the naming of the files you’re looking to acquire, and line 59 for the server name you’ll be executing the process on.

Summary

A pretty quick and simple little Azure Function App that runs each day and obtains daily/nightly extracts from an FTP server. Cleanup of the resulting folders and files I’m doing with other on-box processes.

This post is cross-blogged on both the Kloud Blog and Darren’s Blog.

Automate the nightly backup of your Development FIM/MIM Sync and Portal Servers Configuration

Last week in a customer development environment I had one of those oh-shit moments where I thought I’d lost a couple of weeks of work: a couple of weeks of development across multiple Management Agents, MV Schema changes and more. Luckily for me I had just connected to an older VM image, but it got me thinking. It would be nice to have an automated process that each night would;

  • Export each Management Agent on a FIM/MIM Sync Server
  • Export the FIM/MIM Synchronisation Server Configuration
  • Take a copy of the Extensions Folder (where I keep my PowerShell Management Agents scripts)
  • Export the FIM/MIM Service Server Configuration

And that is what this post covers.

Overview

My automated process performs the following;

  1. An Azure PowerShell Timer Function WebApp is triggered at 2330 each night
  2. The Azure Function App initiates a Remote PowerShell session to my Dev MIM Sync Server (which is also a MIM Service Server)
  3. In the Remote PowerShell session the script;
    1. Creates a new subfolder under c:\backup named with the current date and time (dd-MM-yyyy-hh-mm)
    2. Creates further subfolders for each of the backup elements
      • MAExports
      • ServerExport
      • MAExtensions
      • PortalExport
    3. Utilizes the Lithnet MIIS Automation PowerShell Module to;
      1. Enumerate each of the Management Agents on the FIM/MIM Sync Server and export each Management Agent to the MAExports folder
      2. Export the FIM/MIM Sync Server Configuration to the ServerExport folder
    4. Copies the Extensions folder and subfolder contents to the MAExtensions folder
    5. Utilizes the FIM/MIM Export-FIMConfig cmdlet to export the FIM Service configuration to the PortalExport folder
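
As a condensed sketch of the remote script body: the original uses the Lithnet module for the exports, whereas the sketch below uses it only to enumerate the MAs and shells out to the Sync Service’s own maexport.exe and svrexport.exe utilities, which achieve the same result. The install path shown is the default; adjust for your environment.

# Sketch of the backup logic run inside the Remote PowerShell session
$syncBin = "C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Bin"
$stamp   = Get-Date -Format "dd-MM-yyyy-hh-mm"
$root    = "C:\backup\$stamp"

# Create the backup folder structure
"MAExports","ServerExport","MAExtensions","PortalExport" |
    ForEach-Object { New-Item -ItemType Directory -Path (Join-Path $root $_) -Force | Out-Null }

# Export each Management Agent (enumerated via the Lithnet module)
Import-Module LithnetMiisAutomation
foreach ($ma in Get-ManagementAgent) {
    & "$syncBin\maexport.exe" $ma.Name "$root\MAExports\$($ma.Name).xml"
}

# Export the Sync Server configuration
& "$syncBin\svrexport.exe" "$root\ServerExport"

# Copy the Extensions folder (PowerShell MA scripts etc.)
Copy-Item -Path "$syncBin\..\Extensions" -Destination "$root\MAExtensions" -Recurse

# Export the MIM Service (Portal) configuration
Add-PSSnapin FIMAutomation
Export-FIMConfig -PolicyConfig -PortalConfig -SchemaConfig |
    ConvertFrom-FIMResource -File "$root\PortalExport\PortalExport.xml"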

Implementing the FIM/MIM Backup Process

The majority of the setup to get this to work I’ve covered in other posts, particularly around Azure PowerShell Function Apps and Remote PowerShell into a FIM/MIM Sync Server.

Pre-requisites

  • I created a C:\Backup Folder on my FIM/MIM Server. This is where the backups will be placed (you can change the path in the script).
  • I installed the Lithnet MIIS Automation PowerShell Module on my MIM Sync Server
  • I configured my MIM Sync Server to accept Remote PowerShell sessions. That involved enabling WinRM, creating a certificate, creating the listener, opening the firewall port and enabling the incoming port on the NSG. You can easily do all that by following my instructions here. From the same post I set up the encrypted password file, uploaded it to my Function App and set the Function App Application Settings for MIMSyncCredUser and MIMSyncCredPassword.
  • I created an Azure PowerShell Timer Function App. Pretty much the same as I show in this post, except choose Timer.
    • I configured my Schedule for 2330 every night using the following CRON configuration

0 30 23 * * *

  • I set the Azure Function App timezone to my timezone so that the nightly backup happened at the correct time relative to my timezone. I got my timezone index from here. I set the following variable in my Azure Function Application Settings to my timezone name, AUS Eastern Standard Time.

    WEBSITE_TIME_ZONE

The Function App Script

With all the pre-requisites met, the only thing left is the Function App script itself. Here it is. Update lines 2, 3 and 6 if your variables and password key file are different. The path to your password key file on line 6 will be different in any case.

Update line 25 if you want the backups to go somewhere else (maybe a DFS Share).
If your MIM Service Server is not on the same host as your MIM Sync Server change line 59 for the hostname. You’ll need to get the FIM/MIM Automation PS Modules onto your MIM Sync Server too. Details on how to achieve that are here.

Running the Function App I have limited output, but enough to see it run. The first part of the script runs very quickly; the Export-FIMConfig takes the majority of the time. That said, it’s less than a minute to get a nice point-in-time backup that is auto-magically executed nightly. Sorted.

Summary

The script itself can be run standalone, and you could implement it as a Scheduled Task on your FIM/MIM Server. However, I’m using Azure Functions for a number of things, and having something that is easily portable, repeatable and centralised with other functions (pun not intended) keeps things organised.

I now have a daily backup of the configurations associated with my development environment. I’m sure this will save me some time in the near future.

Follow Darren on Twitter @darrenjrobinson

How to configure Paged Imports on the Granfeldt FIM/MIM PowerShell Management Agent

Introduction

In the last 12 months I’ve lost count of the number of PowerShell Management Agents I’ve written to integrate Microsoft Identity Manager with a plethora of environments. The majority though have not been of huge scale (<50k objects), and the import of the managed entities into the Connector Space/Metaverse runs through pretty quickly.

However, this week I’ve been working on an AzureAD Groups PS MA for an environment with 40k+ groups. That in itself isn’t that large, but when you start processing group memberships as well, the Import process can take an hour for a Full Sync. During this time, before the results are passed to the Sync Engine, you don’t have any visibility of where the Import is up to (other than debug logging), and stopping the MA requires a restart of the Sync Engine Server.

I’ve wanted to mess with paging the Imports for some time, but it hadn’t been a necessity. Now it is, so I set about working out how to achieve it. The background information on Paged Imports is available at the bottom of the PSMA documentation page here. However, there are no working examples. I contacted Søren and he had misplaced his demo scripts for the time being. With some time at hand (in between coats of paint on the long-weekend renovation) I worked it out for myself. I detail how to implement Paged Imports in this blogpost.

This post uses an almost identical Management Agent to the one described in this post. Review that post to get an understanding of the AzureAD Differential Queries. I’m not going to cover those elements in this post or setting up the MA at all.

Getting Started

There are two things you need to do in preparation for enabling Paged Imports on your PowerShell Management Agent;

  1. Enable Paged Imports (if your Import.ps1 is checking for this setting)
  2. Configure Page Size on your Import Run Profiles

The first is as simple as clicking the checkbox on the Global Parameters tab on your PS MA as shown below.

The 2nd is in your Run Profile. By default the page size will be 100. For my “let’s figure this out” process I dropped the page size to 50 on one Run Profile and 10 on another.

Import Script

With Paged Imports set up on the MA, the rest of the logic goes into your Import Script. In the param section at the start of the script, $usepagedimport and $pagesize are the variables that reflect the two enablement components you configured above.

$usepagedimport is either True or False. Your Import.ps1 script can check to see if it is set and process accordingly. In this example I’m not even checking if it is set and am doing Paged Imports anyway. For completeness, in a production example you should check to see what the intention of the MA is.

$pagesize is the page size from the Run Profile (100 by default, or whatever you changed yours to).

param (
    $Username,
    $Password,
    $Credentials,
    $OperationType,          # Full or Delta
    [bool] $usepagedimport,  # True when Paged Imports are enabled on the MA
    $pagesize                # page size from the Import Run Profile (default 100)
 )


An important consideration to keep in mind is that the Import.ps1 will be called multiple times (ie. number of calls = number of objects / pagesize).

So anything that you would normally expect in any other MA to be processed once when the Import.ps1 runs, you need to limit to running only once.

Essentially the way I’ve approached it is: retrieve all the objects that will be processed and put them in a global variable. If the variable does not have any values/data, it is the first run, so go and get the source data. If the global variable does have values/data in it, we must be on a subsequent loop, so skip that part and just page through our import.

In my example below this appears as;

if (!$global:tenantObjects) {
    # Authenticate
    # Search and get the users
    # Do some rationalisation on the results (if required)
    # setup some global variables so we know where we are with processing the data
} # Finish the one time tasks

As you’ll see in the full import.ps1 script below there are more lines that could be added into this section so they don’t get processed each time. In a production implementation I would.

For the rest of the Import.ps1 script we expect it to run multiple times. This is where we do our logic and process our objects to send through to the Sync Engine/Connector Space. We need to keep track of where we are up to in the dataset and continue on from where we left off. We also need to know how many objects we have processed in relation to the ‘pagesize’ from the Run Profile so we know when each page is finished.

When we reach the pagesize but know we have more objects to process, we set $global:MoreToImport to $true and break out of the foreach loop.

When we have processed all our objects we set $global:MoreToImport = $false and break out of the foreach loop to finish.
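
Before the full example, here is a hedged sketch of just that paging pattern. $global:tenantObjects is populated by the one-time block above, and $global:index is a counter that the one-time block initialises to 0; names other than $usepagedimport, $pagesize and $global:MoreToImport are illustrative.

$global:MoreToImport = $false
$count = 0

while ($global:index -lt $global:tenantObjects.Count) {
    $object = $global:tenantObjects[$global:index]
    $global:index++

    # Build and emit a PSMA object for this entry (attributes illustrative)
    $obj = @{}
    $obj.Add("objectId", $object.objectId)   # anchor
    $obj.Add("objectClass", "group")
    $obj.Add("displayName", $object.displayName)
    $obj

    $count++
    if (($count -eq $pagesize) -and ($global:index -lt $global:tenantObjects.Count)) {
        # Page is full but objects remain: have the Sync Engine call us again
        $global:MoreToImport = $true
        break
    }
}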

With that explanation out of the way here is a working example. I’ve left in debugging output to a log file so you can see what is going on.

You can get the associated Schema.ps1 from the Management Agent described in this post. You’ll need to update your tenant name on line 29 and your directory paths on lines 10 and 47. If you are using a different version of the AzureADPreview PowerShell Module you’ll need to change line 26 as well.

Everything else is in the comments within the example script below and should make sense.

Summary

For managing a large number of objects on a PS MA we can now see progress as the import processes the objects, and we can now stop an MA if required.

I’m sure this will help someone else. Enjoy.

Follow Darren on Twitter @darrenjrobinson


How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries

Introduction

In August 2016 I wrote this post on how to use PowerShell to leverage the Microsoft GraphAPI and use Differential Queries. The premise behind that post was that I required a Microsoft Identity Manager Management Agent to synchronize identity information from AzureAD into Microsoft Identity Manager. However, the environment it was intended for has a large AzureAD implementation, and performing a Full Sync every time is just too time consuming. Even more so with this limitation that still exists today in MIM 2016 with SP1.

In this blog post I’ll detail how to implement a PowerShell Management Agent for FIM/MIM to use the MS GraphAPI to synchronize objects into FIM/MIM, supporting Delta and Full Synchronization run profiles. I’m also using my favourite PowerShell Management Agent, the Granfeldt PowerShell Management Agent.

Pre-requisites

I’m using the ADAL helper library from the AzureADPreview PowerShell Module. Install that module on your MIM Sync Server via PowerShell (WMF5 or later) using the PowerShell command;

Install-Module AzureADPreview

Getting Started with the Granfeldt PowerShell Management Agent

If you don’t already have it, what are you waiting for? Go get it from here. Søren’s documentation is pretty good, but it does assume you have a working knowledge of FIM/MIM, and this blog post is no different.

Three items I had to work out that I’ll save you the pain of are;

  • You must have a Password.ps1 file. Even though we’re not doing password management on this MA, the PS MA configuration requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it. It just needs to be present
  • The credentials you give the MA to run as are the credentials for the account that has permissions to the AzureAD/Office365 Tenant. Just a normal account is enough to enumerate it, but you’ll need additional permissions if you intend to write back to AzureAD.
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use dir /x to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\AzureAD

Schema.ps1

My Schema is based around enumerating and managing users from AzureAD. You’ll need to create a number of corresponding attributes in the Metaverse Schema on the Person ObjectType to flow the attributes into. Use the Schema info below for a base set of attributes that will get you started. You can add more as required. I’ve prefixed most of them with AAD for my implementation.
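
As a hedged sketch, the user object class in Schema.ps1 could look like the following; the attribute selection and the objectId anchor are illustrative, with the AAD prefix applied as described above.

# Schema.ps1 sketch for the AzureAD Users MA (attributes illustrative)
$obj = New-Object -TypeName PSCustomObject
$obj | Add-Member -MemberType NoteProperty -Name "Anchor-objectId|String" -Value "objectId"
$obj | Add-Member -MemberType NoteProperty -Name "objectClass|String" -Value "user"
$obj | Add-Member -MemberType NoteProperty -Name "userPrincipalName|String" -Value "userPrincipalName"
$obj | Add-Member -MemberType NoteProperty -Name "AADAccountEnabled|Boolean" -Value $true
$obj | Add-Member -MemberType NoteProperty -Name "AADDisplayName|String" -Value "displayName"
$obj | Add-Member -MemberType NoteProperty -Name "AADMail|String" -Value "mail"
$obj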

If you want to manage Groups or Contacts or a combination of object types, you will need to update the Schema.ps1 script accordingly.

Import.ps1

The logic that the Import.ps1 implements is the same as detailed here in my post using Differential Queries. Essentially, perform a full import and create a file with the cookie/watermark. Allow Delta Sync run profiles to be performed by requesting the GraphAPI to return only changes since the cookie/watermark.

You’ll need to update the script for your AzureAD tenant name on line 28, and also the paths for the cookie file and the debug file if yours differ from mine: lines 11, 46 and 47.

Importing of the attributes is based around the names in the Schema.ps1 scripts. Any changes you made there will need to be reflected in the import.ps1.
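
A hedged sketch of the cookie/watermark handling follows; token acquisition via ADAL is omitted ($headers is assumed to hold a valid Bearer token), and the cookie path is illustrative.

# Differential query sketch: replay the saved deltaLink on Delta runs
$cookieFile = "C:\Temp\AzureADUsersDeltaLink.txt"   # illustrative path
$baseUri    = "https://graph.windows.net/$tenant/users?api-version=1.6"

if (($OperationType -eq "Delta") -and (Test-Path $cookieFile)) {
    # Delta run: the saved watermark returns only the changes since last time
    $uri = Get-Content $cookieFile
} else {
    # Full run: an empty deltaLink parameter initiates a differential query
    $uri = "$baseUri&deltaLink="
}

$result = Invoke-RestMethod -Uri $uri -Headers $headers -Method Get
# ... emit $result.value objects to the Sync Engine here; a full implementation
# also follows $result.'aad.nextLink' pages until 'aad.deltaLink' appears ...

# Persist the new watermark for the next Delta Import
if ($result.'aad.deltaLink') { $result.'aad.deltaLink' | Out-File $cookieFile }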

Password Script (password.ps1)

Empty as not implemented

Export.ps1

Empty as not implemented in this example. If you are going to write information back to AzureAD you’ll need to put the appropriate logic into this script.

Management Agent Configuration

With the Granfeldt PowerShell Management Agent installed on your FIM/MIM Synchronisation Server, in the Synchronisation Server Manager select Create Management Agent and choose “PowerShell” from the list of Management Agents to create.

As this example is for Users, I’ve named my MA accordingly.

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using an Office 365 account to connect to AzureAD/Office365 and import the user data.

Paths to the Import, Export and Password scripts. Note: the Export and Password PS1 scripts files exist but are empty.

Object Type as configured in the Schema.ps1 file.

Attributes as configured in the Schema.ps1 file.

Anchor as per the Schema.ps1 file.

The rest of the MA configuration is up to your implementation. What you are going to join on and what attributes flow into the MV will vary based on your needs and solution. At a minimum you’d probably be looking to do a join on immutableID (after some manipulation) or UPN and flow in attributes such as AADAccountEnabled etc.

Completing the Configuration

To finalise the MA you’ll need to do the usual tasks of creating run profiles, staging the connector space from AzureAD/Office365 and syncing into the Metaverse. Once you’ve done your initial Stage/Full Sync you can perform Delta Syncs.

Summary

A “Full Import” on a small AzureAD (~8500 Users) took 2 minutes.
A subsequent “Delta Import” with no changes took 6 seconds.

A similar implementation of the logic, but for Groups gives similar results/performance.
A  “Full Import” on a small AzureAD (~9800 Groups) took 5 minutes.
A subsequent “Delta Import” with 7 Adds (new Groups) and 157 Updates took 1 minute.


Follow Darren on Twitter @darrenjrobinson

How to configure a Graphical PowerShell Dev/Admin/Support User Interface for Azure/Office365/Microsoft Identity Manager

During the development of an identity management solution I find myself with multiple PowerShell/RDP sessions connected to multiple environments, using different credentials, often just to obtain trivial data/information. It is easy to trip yourself up with remote PowerShell sessions to differing environments. If only there were a simple UI that could front-end a set of PowerShell modules and make those simple queries quick and painless, and likewise allow support staff to execute a canned set of queries without providing them elevated permissions.

I figured someone would have already solved this problem, and after some searching with the right keywords I found the powershell-command-executor-ui from bitsofinfo. Looking into it, he had solved a lot of the issues with building a UI front-end for PowerShell with the powershell-command-executor and the stateful-process-command-proxy. That solution provided the framework for what I was thinking. The ability to provide a UI for PowerShell using PowerShell modules, including remote PowerShell, was exactly what I was after. And it is built on NodeJS and AngularJS, so it’s simple enough to customise.

Introduction

In this blog post I’ll detail how I’ve leveraged the projects listed above for integration with Azure/Office365 and Microsoft Identity Manager.

Initially I had a vision of serving up the UI from an Azure WebApp. NodeJS on Azure WebApps is supported, however with all the solution dependencies I just couldn’t get it working.

My fallback was to then look to serve up the UI from a Windows Server 2016 Nano Server. I learnt from my efforts that a number of the PowerShell modules I was looking to provide a UI for, have .NET Framework dependencies. Nano Server does not have full .NET Framework support. Microsoft state to do so would mean the server would no longer be Nano.

For now I’ve deployed an Azure Windows Server 2016 Server secured by an Azure NSG to only allow my machine to access it. More on security later.

Overview

Simply put, the details in GitHub for the powershell-command-executor provide the architecture and integration. What I will detail is the modifications I’ve made to utilize the more recent AzureADPreview PowerShell Module over the MSOL PowerShell Module. I also updated the dependencies of the solution to the latest versions and hooked it into Microsoft Identity Manager, and made a few changes to allow different credentials to be used for Azure and Microsoft Identity Manager.

Getting Started

I highly recommend you start with your implementation on a local development workstation/development virtual machine. When you have a working version you’re happy with you can then look at other ways of presenting and securing it.

NodeJS

NodeJS is the webserver for this solution. Download NodeJS for your Windows host here. I’m using the 64-bit version, but have also implemented the solution on 32-bit. Install NodeJS on your local development workstation/development virtual machine.

You can accept all the defaults.

Following the installation of NodeJS, download the powershell-command-executor-ui from GitHub. Select Clone or Download, Download ZIP and save it to your machine.

Right click the download when it has finished and select Extract All. Select Browse and create a folder at the root of C:\ named nodejs. Extract powershell-command-executor-ui.

Locate the c:\nodejs\powershell-command-executor-ui-master\package.json file.

Using an editor such as Notepad++ update the package.json file ……

…… so that it looks like the following. This will utilise the latest versions of the dependencies for the solution.

From an elevated (Administrator) command prompt in the c:\nodejs\powershell-command-executor-ui-master directory run “c:\program files\nodejs\npm” install. This will read the package.json file you edited and download the dependencies for the solution.

You can see in the screenshot below NodeJS has downloaded all the items in package.json including the powershell-command-executor and stateful-process-command-proxy.

When you now list the directories under C:\nodejs\powershell-command-executor-ui-master\node_modules you will see those packages and all their dependencies.

We can now test that we have a working PowerShell UI NodeJS website. From an elevated command prompt whilst still in the c:\nodejs\powershell-command-executor-ui-master directory run “c:\Program Files\nodejs\node.exe” bin\www

Open a browser on the same host and go to “http://localhost:3000”. You should see the default UI.

Configuration and Customization

Now it is time to configure and customize the PowerShell UI for our needs.

The files we are going to edit are:

  • C:\nodejs\powershell-command-executor-ui-master\routes\index.js
    • Update Paths to the encrypted credentials files used to connect to Azure, MIM. We’ll create the encrypted credentials files soon.
  • C:\nodejs\powershell-command-executor-ui-master\public\console.html
    • Update for your customizations for CSS etc.
  • C:\nodejs\powershell-command-executor-ui-master\node_modules\powershell-command-executor\O365Utils.js
    • Update for PowerShell Modules to Import
    • Update for Commands to make available in the UI

We also need to get a couple of PowerShell modules installed on the host so they are available to the site. The two I’m using I’ve mentioned earlier. With WMF5 installed, using PowerShell we can simply install them as per the commands below.

Install-Module AzureADPreview
Install-Module LithnetRMA

In order to connect to our Microsoft Identity Manager Synchronization Server we are going to need to enable Remote PowerShell on it. This post I wrote here details all the setup tasks to make that work. Test that you can connect via RPS to your MIM Sync Server before updating the scripts below.

Likewise for the Microsoft Identity Manager Service Server. Make sure after installing the LithnetRMA Powershell Module you can connect to the MIM Service using something similar to:

# Import LithnetRMA PS Module
import-module lithnetrma

# MIM AD User Admin
$username = "mimadmin@mim.mydomain.com"
# Password 
$password = "Secr3tSq1rr3l!" | convertto-securestring -AsPlainText -Force
# PS Creds
$credentials = New-Object System.Management.Automation.PSCredential $Username,$password

# Connect to the FIM service instance
# Will require an inbound rule for TCP 5725 (or your MIM Service Server port) in your Resource Group Network Security Group config
Set-ResourceManagementClient -BaseAddress http://mymimportalserver:5725 -Credentials $credentials


\routes\index.js

This file details the encrypted credentials the site uses. You will need to generate the encrypted credentials for your environment. You can do this using the powershell-credentials-encryption-tools. Download that script to your workstation and unzip it. Open the credentialEncryptor.ps1 script using an Administrator PowerShell ISE session.

I’ve changed the index.js to accept two sets of credentials. This is because your Azure Admin Credentials are going to be different from your MIM Administrator Credentials (both in name and password). The username for my Azure account looks something like myname@mycompany.com whereas for MIM it is Domainname\Username.

Provide an account name for your Azure environment and the associated password.

The tool will create the encrypted credential files.

Rename the encrypted.credentials file to whatever makes sense for your environment. I’ve renamed it creds1.encrypted.credentials.

Now we re-run the script to create another set of encrypted credentials. This time for Microsoft Identity Manager. Once created, rename the encrypted.credentials file to something that makes sense in your environment. I’ve renamed the second set to creds2.encrypted.credentials.

We now need to copy the following files to your UI Website C:\nodejs\powershell-command-executor-ui-master directory:

  • creds1.encrypted.credentials
  • creds2.encrypted.credentials
  • decryptUtil.ps1
  • secret.key

Navigate back to routes\index.js and open the file in an editor such as Notepad++

Update the index.js file for the path to your credentials files. We also need to add in the additional credentials file.

The changes to the file are the paths to the files we just copied above, along with the addition of var PATH_TO_ENCRYPTED_RPSCREDENTIALS_FILE for the second set of credentials used for Microsoft Identity Manager.

var PATH_TO_DECRYPT_UTILS_SCRIPT = "C:\\nodejs\\powershell-command-executor-ui-master\\decryptUtil.ps1";
var PATH_TO_ENCRYPTED_CREDENTIALS_FILE = "C:\\nodejs\\powershell-command-executor-ui-master\\creds1.encrypted.credentials";
var PATH_TO_ENCRYPTED_RPSCREDENTIALS_FILE = "C:\\nodejs\\powershell-command-executor-ui-master\\creds2.encrypted.credentials";
var PATH_TO_SECRET_KEY = "C:\\nodejs\\powershell-command-executor-ui-master\\secret.key";


We also update initCommands to pass through the additional credentials file;


initCommands: o365Utils.getO365PSInitCommands(
 PATH_TO_DECRYPT_UTILS_SCRIPT,
 PATH_TO_ENCRYPTED_CREDENTIALS_FILE,
 PATH_TO_ENCRYPTED_RPSCREDENTIALS_FILE,
 PATH_TO_SECRET_KEY,
 10000,30000,3600000),

Here is the full index.js file for reference.


public/console.html

The public/console.html file is for formatting and associated UI components. The key things I’ve updated are the Bootstrap and AngularJS versions. Those are contained in the top of the html document. A summary is below.

<script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.6.1/angular.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/angular.js/1.6.1/angular-resource.min.js"></script>
<script src="javascripts/ui-bootstrap-tpls-2.4.0.min.js"></script>
<script src="javascripts/console.js"></script>
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap-theme.min.css">

You will also need to download the updated Bootstrap UI (ui-bootstrap-tpls-2.4.0.min.js). I’m using v2.4.0 which you can download from here. Copy it to the javascripts directory.

I’ve also updated the table types, buttons, colours, header, logo etc in the appropriate locations (CSS, Tables, Div’s etc). Here is my full file for reference. You’ll need to update for your colours, branding etc.

powershell-command-executor\O365Utils.js

Finally the O365Utils.js file. This contains the commands that will be displayed along with their options, as well as the connection information for your Microsoft Identity Manager environment.

You will need to change:

  • Line 52 for the address of your MIM Sync Server
  • Line 55 for the addresses of your MIM Service Server
  • Line 141 on-wards for what commands and parameters for those commands you want to make available in the UI

Here is an example with a couple of AzureAD commands, a MIM Sync and a MIM Service command.

Show me my PowerShell UI Website

Now that we have everything configured, let’s start the site and browse to it. If you haven’t stopped the NodeJS site from earlier, go to the command window and press Ctrl+C a couple of times. Run “c:\Program Files\nodejs\node.exe” bin\www again from the C:\nodejs\powershell-command-executor-ui-master directory, unless you have restarted the host and now have NodeJS in your environment path.

In a browser on the same host go to http://localhost:3000 again and you should see the site as it is below.

Branding and styling come from console.html, menu options from o365Utils.js, and when you select a command and execute it, data comes from the associated service ……

… you can see results. From the screenshot below a Get-AzureADUser command for the associated search string executed in milliseconds.


Summary

The powershell-command-executor-ui from bitsofinfo is a very extensible and powerful NodeJS website as a front-end to PowerShell.

With a few tweaks and updates the look and feel can be easily changed along with the addition of any powershell commands that you wish to have a UI for.

As it sits though keep in mind you have a UI with hard-coded credentials that can do whatever commands you expose.

Personally I am running one for my use only, and I have it hosted in Azure in its own Resource Group with an NSG allowing outgoing traffic to Azure and my MIM environment. Incoming traffic is only allowed from my personal management workstation’s IP address. I also needed to allow port 3000 into the server on the NSG as well as on the firewall on the host. I did that quickly using the command below.

# Enable the WebPort NodeJS is using on the firewall 
netsh advfirewall firewall add rule name="NodeJS WebPort 3000" dir=in action=allow protocol=TCP localport=3000

Follow Darren on Twitter @darrenjrobinson

How to assign and remove user Office365 licenses using the AzureADPreview Powershell Module

A couple of months ago the AzureADPreview module was released. The first cmdlet I experimented with was Set-AzureADUserLicense. It didn’t work, there were no working examples, and I gave up and used the GraphAPI instead.

Since then the AzureADPreview module has gone through a number of revisions, and I’ve been messing around a little with each update. The Set-AzureADUserLicense cmdlet has been my litmus test. Now that I have both removing and assigning Office 365 licenses working, I’ll save others the pain of working it out and give a couple of working examples.


If, like me, you have been experimenting with the AzureADPreview module, you’ll need to force the install of the newest version. For whatever reason I was getting an error informing me that it wasn’t signed; as I’m messing around in my dev sandpit I skipped the publisher check.

Install-Module -Name AzureADPreview -MinimumVersion 2.0.0.7 -Force -SkipPublisherCheck
Import-Module AzureADPreview -RequiredVersion 2.0.0.7

Removing an Office 365 License from a User

Removing a license with Set-AzureADUserLicense looks something like this.

What if there are multiple licenses? Similar concept, just looping through each one to remove.
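
As a hedged sketch (the UPN and SKU name are examples; for multiple licenses, add each SkuId to RemoveLicenses or loop as described):

# Remove an Office 365 license from a user (example UPN and SKU)
$user  = Get-AzureADUser -ObjectId "jane.citizen@contoso.com"
$skuId = (Get-AzureADSubscribedSku | Where-Object { $_.SkuPartNumber -eq "ENTERPRISEPACK" }).SkuId

$licenses = New-Object -TypeName Microsoft.Open.AzureAD.Model.AssignedLicenses
$licenses.RemoveLicenses = $skuId
Set-AzureADUserLicense -ObjectId $user.ObjectId -AssignedLicenses $licenses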

Assigning an Office 365 License to a User

Now that we have the removal of licenses sorted, how about adding licenses?

Assigning a license with Set-AzureADUserLicense looks something like this;
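
Again a hedged sketch with example values:

# Assign an Office 365 license to a user (example UPN and SKU)
$user = Get-AzureADUser -ObjectId "jane.citizen@contoso.com"

$license = New-Object -TypeName Microsoft.Open.AzureAD.Model.AssignedLicense
$license.SkuId = (Get-AzureADSubscribedSku | Where-Object { $_.SkuPartNumber -eq "ENTERPRISEPACK" }).SkuId

$licenses = New-Object -TypeName Microsoft.Open.AzureAD.Model.AssignedLicenses
$licenses.AddLicenses = $license
Set-AzureADUserLicense -ObjectId $user.ObjectId -AssignedLicenses $licenses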

Moving forward, this AzureAD PowerShell Module will replace the older MSOL module, as I wrote about here. If you’re writing new scripts it’s a good time to start using the new modules.

Follow Darren on Twitter @darrenjrobinson


Enumerating all Users/Groups/Contacts in an Azure tenant using PowerShell and the Azure Graph API ‘odata.nextLink’ paging function

Recently I posted about using PowerShell and the Azure Active Directory Authentication Library to connect to Azure AD here. Whilst that post detailed performing simple tasks like updating an attribute on a user, in this post I’ll use the same method to connect to Azure AD via PowerShell but cover;

  • enumerate users, contacts or groups
  • where the number of objects is greater than the maximum results per page, get all remaining pages of results
  • limit results based on filters

The premise of my script was one that could be executed without prompts. As such, the script contains the ‘username’ and ‘password’ used to perform the query. No special access is required for this script; any standard user account has ‘read’ permission to Azure AD and will return results.

Here is the base script to return all objects of a given type from a tenant. For your environment;

  • change line 7 for your tenant name
  • change line 11 for your account in your tenant
  • change line 12 for the password associated with the account specified in line 11
  • change line 18 for the object type (eg. Users, Groups, Contacts)

I’ve hardcoded the number of results to return per page on both lines 39 and 64 to the maximum of 999. The default is 100. I wanted to return all objects as quickly as possible.

The first query, along with returning 999 results, also returns a value for $query.’odata.nextLink’ if there are more than 999 results. We then use the nextLink value in subsequent API calls to return the remaining pages until we have returned all objects.
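
A minimal sketch of that paging loop (authentication omitted; $headers is assumed to carry a valid Bearer token, and the tenant name is an example):

$tenant  = "contoso.onmicrosoft.com"
$objects = @()
$uri = "https://graph.windows.net/$tenant/users?api-version=1.6&`$top=999"

do {
    $query    = Invoke-RestMethod -Uri $uri -Headers $headers -Method Get
    $objects += $query.value
    $nextLink = $query.'odata.nextLink'
    if ($nextLink) {
        # nextLink is relative to the tenant; re-append api-version and page size
        $uri = "https://graph.windows.net/$tenant/$nextLink&api-version=1.6&`$top=999"
    }
} while ($nextLink)

"Returned $($objects.Count) objects"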

Brilliant. So we can now simply change line 18 to a different object type (Users, Groups, Contacts) if required. But what if we want to filter on other criteria such as attribute values?

Here is a slightly modified version (of the URI) to include a query filter. Lines 19-24 have a couple of examples of query filters.
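
For example (the filters themselves are illustrative):

# Only enabled accounts
$uri = "https://graph.windows.net/$tenant/users?api-version=1.6&`$filter=accountEnabled eq true"
# Users whose displayName starts with 'Darren'
$uri = "https://graph.windows.net/$tenant/users?api-version=1.6&`$filter=startswith(displayName,'Darren')"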

So there you have the basics of returning large numbers of objects from Azure AD via the Azure Graph API from PowerShell. Hopefully the time I spent working out the syntax for the URIs helps someone else out, as there weren’t any examples I could find whilst working this out.

Follow Darren on Twitter @darrenjrobinson


Goodbye Set-MsolUser, Hello Set-AzureADUser & Azure Graph API

Recently Microsoft released the preview of the v2.0 Azure AD PowerShell cmdlets. https://azure.microsoft.com/en-us/updates/azure-ad-new-powershell-cmdlets-preview/

I’ve got a project coming up where I’m looking to change my approach for managing users in Azure using Microsoft Identity Manager. Good timing to do a quick proof of concept to manage users with the new cmdlets and directly via the Graph API, in preparation for moving away from the msol cmdlets.

New Modules

First up, the Azure AD v2.0 PowerShell module was released in public preview on July 13, 2016. There will likely be changes before it becomes GA, so keep that in mind.

The v2.0 Azure AD PowerShell modules themselves are available for download from here https://www.powershellgallery.com/packages/AzureADPreview/1.1.143.0

If you have Windows Management Framework v5 installed you can download and install from PowerShell (as below).
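
With WMF5 that’s simply;

Install-Module AzureADPreview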

Once installed, pretty quickly you can import the module, authenticate to your tenant, retrieve a user and update a few attributes (as below).
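
A hedged sketch of that sequence (the UPN and attribute values are examples):

Import-Module AzureADPreview
Connect-AzureAD    # prompts for your tenant credentials

# Retrieve a user and update a couple of attributes
$user = Get-AzureADUser -ObjectId "jane.citizen@contoso.com"
Set-AzureADUser -ObjectId $user.ObjectId -Department "Identity" -JobTitle "Consultant"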

Whilst functional, it doesn’t really work for how we need to interact with Azure from an Identity Management perspective. So how can we still use PowerShell but enumerate and manipulate identities in Azure?

Now that we have the AzureAD v2.0 module installed, we can reference the Active Directory Authentication Library it installs (Microsoft.IdentityModel.Clients.ActiveDirectory.dll), authenticate to our tenant, retrieve users and update them. That’s exactly what is shown in the commands below.
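
A hedged sketch of that approach via the REST endpoint ($accessToken is assumed to hold a token acquired with the ADAL library for https://graph.windows.net; the tenant and user names are examples):

$tenant  = "contoso.onmicrosoft.com"
$headers = @{ Authorization = "Bearer $accessToken"; "Content-Type" = "application/json" }

# Retrieve a user via the Graph API
$uri  = "https://graph.windows.net/$tenant/users/jane.citizen@contoso.com?api-version=1.6"
$user = Invoke-RestMethod -Uri $uri -Headers $headers -Method Get

# Update an attribute with a PATCH
$body = @{ department = "Identity" } | ConvertTo-Json
Invoke-RestMethod -Uri $uri -Headers $headers -Method Patch -Body $body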

Where interacting with the GraphAPI directly really shines, however, is at the directory services layer and the Differential Query functionality. https://msdn.microsoft.com/en-us/Library/Azure/Ad/Graph/howto/azure-ad-graph-api-differential-query

As such this is the approach that I’ll be taking for integrating Azure with Microsoft Identity Manager for managing users and entitlements (such as Azure licensing).
I hope this also saves a few people time in working out how to use PowerShell to manage Azure objects via the Graph API (using both the PowerShell Module and the REST API).