Adding Delta Sync Support to the Microsoft Identity Manager PowerShell Management Agent for Workday HR

Recently I posted a sample Microsoft Identity Manager Management Agent for Workday HR. Subsequently I also posted about some updates I made to the WorkdayAPI PowerShell Module that enable you to specify the time period to return changes for. This post details updating my sample Workday Management Agent to support Delta Synchronisation.

WorkdayAPI PowerShell Module

First up you will need the updated WorkdayAPI PowerShell Module that provides the Get-WorkdayWorkerAdv cmdlet, which can take a time period to return information for. Get the updated WorkdayAPI PowerShell Module from here.

Update the PowerShell Module on the MIM Sync Server. By default the module will be in the C:\Program Files\WindowsPowerShell\Modules\WorkdayApi folder.

You will need to unblock the new files.

Get-ChildItem 'C:\Program Files\WindowsPowerShell\Modules\WorkdayApi' | Unblock-File
Get-ChildItem 'C:\Program Files\WindowsPowerShell\Modules\WorkdayApi\scripts' | Unblock-File

Updated Schema

In the updated Management Agent I’m also bringing into MIM the additional attributes from the other enhancements I made to the PowerShell Module: HireDate, StartDate, EndDate, Supplier and WorkdayActive. The updates to Schema.ps1 are shown below.

$obj | Add-Member -Type NoteProperty -Name "HireDate|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "StartDate|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "EndDate|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Supplier|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkdayActive|Boolean" -Value $True

The full updated Schema Script is below;

With the Schema Script updated, refresh the Management Agent Schema.

Update Schema

You can then select the new attributes in the Workday MA under Select Attributes.

Select New Attributes.PNG

Then select Ok.

Attributes Selected.PNG

Updated Import Script

The Import Script has a number of changes to handle creating and updating a WaterMark file that stores the date stamp of the last run (a sketch of this pattern follows the update notes below). It has also been updated to use the Get-WorkdayWorkerAdv cmdlet instead of the Get-WorkdayWorker cmdlet so that a time period can be specified, and to retrieve the additional attributes we just added to the schema.

Update:

  • Line 11 for the path and name of the Watermark File you wish to use
  • Line 31 for the URI of your Workday Tenant
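The embedded script contains the full logic; a minimal sketch of the WaterMark pattern it implements is shown below. The file path and the Get-WorkdayWorkerAdv parameter names here are illustrative assumptions, not the actual script.

# Illustrative sketch only - path and parameter names are assumptions
$waterMarkFile = 'C:\MAData\Workday\WaterMark.txt'
$thisRun = (Get-Date).ToUniversalTime().ToString('o')

if (Test-Path $waterMarkFile) {
    # Delta: only retrieve workers changed since the last run
    $lastRun = Get-Content $waterMarkFile
    $workers = Get-WorkdayWorkerAdv -FromDateTime $lastRun -ToDateTime $thisRun
}
else {
    # No WaterMark file yet, so fall back to a Full Import
    $workers = Get-WorkdayWorkerAdv
}

# Store the time of this run as the WaterMark for the next Delta Import
$thisRun | Out-File $waterMarkFile -Force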

Executing the Management Agent using a Delta Import Delta Sync Run Profile

After creating a Delta Import Delta Sync Run Profile we can now run a Delta Sync. The following graphic is from after seeding the WaterMark file with the last run time in a format like 2018-10-29T22:09:08.3628953+00:00. Without the WaterMark file present the MA performs a Full Import by default, as it has no watermark to base the import time period on.
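To seed the WaterMark file yourself, writing the current time in that round-trip ('o') format is enough; for example (the path is a placeholder):

(Get-Date).ToString('o') | Out-File 'C:\MAData\Workday\WaterMark.txt' -Force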

The changed records in Workday HR are then identified and those records obtained, imported and synchronised via the Management Agent.

Delta Sync.PNG

Summary

Using the Delta Synchronisation functionality from Workday HR allows for much quicker synchronisation from Workday HR to Microsoft Identity Manager.

Updated: Azure AD B2B Guest Invitations Microsoft Identity Manager Management Agent

In August I posted this that detailed Automating Azure AD B2B Guest Invitations using Microsoft Identity Manager. More recently Microsoft updated the Microsoft Graph to include additional information about Azure AD B2B Guest users and I wrote this that creates HTML Reports based off these new attributes.

That information is also handy when managing the lifecycle of Azure AD B2B Users. As we do that using Microsoft Identity Manager, I’ve updated my Azure AD B2B Guest Invitation Management Agent for these attributes so they can be used in the lifecycle logic.

Updated Schema

I’ve updated the Schema script to include three new attributes, corresponding to the creationType, userState and userStateChangedOn values in the extract from the Microsoft Graph below.

odata.type : Microsoft.DirectoryServices.User
objectType : User
objectId : 38154c4c-a539-4920-a656-b5f8413768b5
deletionTimestamp : 
accountEnabled : True
creationType : Invitation
displayName : Rick Sanchez
givenName : Rick
mail : Rick.Sanchez@customer.com.au
mailNickname : Rick.Sanchez_customer.com.au#EXT#
otherMails : {Rick.Sanchez@customer.com.au}
proxyAddresses : {SMTP:Rick.Sanchez@customer.com.au}
refreshTokensValidFromDateTime : 2018-08-26T02:05:36Z
showInAddressList : False
surname : Sanchez
userPrincipalName : Rick.Sanchez_customer.com.au#EXT#@corporationone.onmicrosoft.com
userState : PendingAcceptance
userStateChangedOn : 2018-08-26T02:05:36Z
userType : Guest

Each is a String attribute and I’ve named them as follows (the corresponding Schema.ps1 additions are shown after the list);

  • B2BCreationType
  • B2BUpdatedDateTime
  • B2BExternalUserState
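The three additions follow the same Add-Member pattern used for the Workday schema updates earlier in this post:

$obj | Add-Member -Type NoteProperty -Name "B2BCreationType|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "B2BUpdatedDateTime|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "B2BExternalUserState|string" -Value "string"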

Here is the full updated Schema.ps1 Script.

Updated Import Script

The Import Script requires the following changes to bring in the B2B User State attributes.

 # B2B External User State for B2B Users from other AAD's 
if ($user.creationType) {$obj.Add("B2BCreationType", $user.creationType)} 
[string]$B2BUpdatedDateTime = $null 
if ($user.userStateChangedOn) {$B2BUpdatedDateTime = get-date($user.userStateChangedOn); $obj.Add("B2BUpdatedDateTime", $B2BUpdatedDateTime)} 
if ($user.userState) {$obj.Add("B2BExternalUserState", $user.userState)}

The full script with these additions is below. As per this post, make the following updates;

  • Change line 10 for your file path
  • Change line 24 for the version of an AzureAD or AzureADPreview PowerShell Module that you have installed on the MIM Sync Server so that the AuthN Helper Lib can be used. Note if using a recent version you will also need to change the AuthN calls as well as the modules change. See this post here for details.
  • Change line 27 for your tenant name
  • Change line 47/48 for a sync watermark file
  • The Import script also contains an attribute from the MA Schema named AADGuestUser that is a boolean attribute. I create the corresponding attribute in the MetaVerse and MIM Service Schemas for the Person/User objectClasses. This is used to determine when a Guest has been successfully created so their naming attributes can then be updated (using a second synchronisation rule).

Updating the Management Agent

With the updated Schema.ps1 and Import.ps1 scripts in place on the Synchronisation Server, using the Microsoft Identity Manager Synchronisation Service Manager right-click on the B2B Invitation PSMA and select Refresh Schema.

Refresh B2B MA Schema.PNG

Select the Properties of the MA and choose Select Attributes.  Select the new Attributes.

Select New Attributes

Select Ok.

Select New Attributes - Selected

With the Schema updated and the Attributes selected a Stage/Full Sync can be performed. We now see the External User State, User Creation Type and External User Updated DateTime.

Updates with B2B External State.PNG

Summary

With a change to the Schema and Import B2B Invitation PSMA scripts we can now leverage the new B2B Attributes from the Microsoft Graph for use in our lifecycle management logic.

Building a Microsoft Identity Manager PowerShell Management Agent for Workday HR

Before I even get started with this post, let me state that the integration I describe here is not a standalone solution. Integrating with Workday for any organisation of significant size will require multiple integration points each providing coverage for the scenarios for your implementation. I list a few in this post, but Alexander Filipin has already done an awesome job here.

You may say that there is, of course, the Azure Active Directory Provisioning Service for Workday. But what if you need more granular customisation than it provides, or you have requirements to get that data to a number of other systems and want connectivity to the authoritative source? Those are requirements I had, and why I built a Management Agent for Workday to consume Workday HR data directly.

As the title implies it uses the ever versatile Granfeldt PowerShell Management Agent. The other key component is a PowerShell Module that eases the integration with Workday’s SOAP API; specifically the Workday API PowerShell Module available here.

Enabling the Workday (Get_Workers) API

In order to access the Workday API you need to have an API account created. I pointed the Workday Support guys to this Microsoft Azure Inbound Workday Provisioning Documentation; specifically the ‘Configure a system integration user in Workday’ section in that link.

Once enabled they were able to give me a Service and Tenant name along with a Username and Password.

  • when using this information your Username is the username and the tenant. So if the username is ‘API User’ and the Tenant is ‘Identity_Corp’ then the loginID for our purposes is ‘API User@Identity_Corp’
  • the URL you are provided will combine the Service and Tenant names. It will look something like this for the Human Resources Endpoint https://wd3-impl-services1.workday.com/ccx/service/TENANTNAME/Human_Resources/v30.2
    • where wd3-impl-services1 is the Service Name

Install the WorkdayAPI PowerShell Module

On your FIM/MIM Sync Server you will need to install the Workday API PowerShell Module available here. You will need to install it using an Elevated PowerShell session.
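If the module is published to the PowerShell Gallery (check the link above for the author's preferred install method), installation from that elevated session can be as simple as:

Install-Module -Name WorkdayApi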

Unblock the PowerShell Module and Scripts

After installing the Workday API PowerShell Module it should be located in ‘C:\Program Files\WindowsPowerShell\Modules\WorkdayApi’. You will need to unblock the module and scripts. Run the following two commands in an elevated PowerShell session.

Get-ChildItem 'C:\Program Files\WindowsPowerShell\Modules\WorkdayApi' | Unblock-File
Get-ChildItem 'C:\Program Files\WindowsPowerShell\Modules\WorkdayApi\scripts' | Unblock-File

Verify your Execution Policy

As the PowerShell Module is unsigned you might need to do something similar to the following. The Get-ExecutionPolicy -List command will show you what the Execution Policy settings currently are.

Set-ExecutionPolicy "Unrestricted" -Scope Process -Confirm:$false
Set-ExecutionPolicy "Unrestricted" -Scope LocalMachine -Confirm:$false

Import Analytics

50k records with just the base profile (no -include work or -include personal options) takes ~7 minutes to ‘stage’ into the connector space. 50k records WITH work and personal metadata takes ~32 hours at a pretty consistent rate of ~20 mins/500 user records.

If you are retrieving just the Base record then the networking receive bandwidth consumption is ~240kbps. When retrieving the full records as a batch process the networking receive bandwidth consumption is ~3Mbps as shown below.

Full Object Network Graph

Why is this important?

The first “FULL” Sync, depending on how many records you have in Workday, will alter the approach you need to take in order to obtain them all. I found that trying to retrieve full records in one call for anything over ~5000 records became inconsistent: I wouldn’t get the full dataset and the machine running it would start to run out of resources (processing power and memory). If you have only a few thousand records, requesting full records in one call will probably suffice.

Now I have ~100k records to return. What I found worked best is to get just the base record for all users, then the full record for each user using pagination (via PSMA Paged Imports; I have mine set to 500). The PSMA Paged Imports feature will then process the objects through the MA 500 at a time. That way you’re not stressing the host running the Sync Engine to the maximum and you don’t have to wait an hour+ to see any processing of objects on the MA. A sketch of that two-phase retrieval is below.
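Conceptually the two phases look something like the following. The -WorkerId, -IncludeWork and -IncludePersonal parameter/switch names and the $pageOfWorkers variable are assumptions based on the options mentioned above, not the actual Import.ps1.

# Phase 1: base records only - quick, and enough to enumerate every worker
$allWorkers = Get-WorkdayWorker

# Phase 2: for each PSMA page of (in my case) 500 workers, retrieve the full record
foreach ($worker in $pageOfWorkers) {
    # -WorkerId, -IncludeWork and -IncludePersonal are assumed parameter names
    $fullWorker = Get-WorkdayWorker -WorkerId $worker.WorkerId -IncludeWork -IncludePersonal
    # ...build the PSMA import object from $fullWorker and emit it to the connector space...
}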

Once you have completed a Full Sync, and you are at any significant scale, you will want to perform Delta Syncs for the objects that have changed since your last sync. I’m not going to cover that in this post, but in a separate one in the future.

Here is a screenshot showing the time taken for a Stage (Import) of 50k objects. Just under 33 hours.

Import - Stage Only.PNG

Other Options for Scale

If you are a large organisation this solution isn’t necessarily a valid one in isolation, as I indicated in the opening paragraph. Consider it ancillary augmentation to a multi-pronged implementation (as described nicely by Alexander Filipin here). Potentially something like;

  • Azure Active Directory Inbound Provisioning for object creation
  • A Management Agent such as the one I describe in this post for certain aspects
    • and a modification or two to identify new accounts from a Base Workday discovery and only import the full object for them on week days, with a full sync on the weekends; or
    • delta syncs using the Workday Transaction Log Criteria Data and Transaction_Date_Range_Data
      • I’ll cover this in a future post, but essentially on every sync I store a cookie-file with the watermark of the time of the sync. On the next delta sync I retrieve the cookie-file with the timestamp and make a call to get all objects changed since the previous sync up to the current time

PSMA Workday Management Agent Script Files

Wow, what a lot of caveats and clarifications. But with all that said, below are base Schema and Import Script examples for the Granfeldt PowerShell Management Agent that leverage the Workday API PowerShell module.

Schema.ps1

The schema is the base schema for my tenant. You shouldn’t have to change anything here unless you are retrieving additional attributes you want in MIM.

Import.ps1

The import script leverages AuthN creds from the MA config. Make sure the Username is in the format of UserID@TenantName. Also update;

  • Line 10 for the location you put your extension as well as the 8.3 format path to the MA Debug folder
  • Line 30 for the correct Service and Tenant info
  • Make sure you have Paged Imports selected on the Global Parameters screen of the MA Configuration

Export.ps1

I haven’t provided an example. The Workday API PowerShell Module has examples for updating Email, Phone and Photos. You can implement what you require.

Summary

The sample Workday MA Config in this post will give you a base integration with Workday. It is unlikely that it will give you everything you need and there isn’t a single solution that probably will, unless your organisation is quite small. There are other options as mentioned in this post and also the Workday Reports REST API. But those are topics for future posts.

 

Automate the Generation of a Granfeldt PowerShell Management Agent Schema Definition File

Generating Schema.ps1 for the Granfeldt FIM/MIM PowerShell Management Agent

Getting started writing your first Forefront/Microsoft Identity Manager Granfeldt PowerShell Management Agent can be a bit daunting. Before you can do pretty much anything you need to define the schema for the PSMA. Likewise if you have written many, the generation of the schema file often seems to take longer than it should and can be a little tedious when all you want to do is write the logic for the Import and Export scripts.

After a few chats with Soren around enhancements for the PSMA I suggested it would be awesome if the generation of the schema.ps1 file could be (semi)automated. So here is my first stab at doing just that.

My approach is;

  • Using PowerShell, get an object that represents the type of object that will be managed by the PSMA
  • Enumerate the Properties of the PSObject and generate the Schema script accordingly
  • All that is left to do afterwards is;
    • define your anchor
    • define the name of the ObjectType
    • combine multiples if your MA will have multiple ObjectClasses
      • Update $obj to $obj2 etc for any additional object classes residing in the same schema file

Below I provide four examples covering the script that generates the schema definition along with the output. The four examples cover;

  • Azure AD User
  • Azure AD Group
  • Workday User
  • Flat File CSV

Example 1: Azure Active Directory User

The example below utilises the AzureAD PowerShell Module to connect to Azure AD. It then gets a User Object (update line 7 for a user to retrieve) and enumerates the properties of the User to generate the Schema file.
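The generator script itself is embedded in the original post. As a rough sketch of the approach (the UPN on line 7 is a placeholder and the property-type mapping is a simplification), it enumerates the object's properties with Get-Member and emits one Add-Member schema line per property:

# Rough sketch only - the embedded script in the original post is the authoritative version
Import-Module AzureAD
Connect-AzureAD

# Line 7: update for a user to retrieve (UPN below is a placeholder)
$user = Get-AzureADUser -ObjectId 'someone@yourtenant.onmicrosoft.com'

# Standard Granfeldt PSMA schema header lines
$schema = @()
$schema += '$obj = New-Object -Type PSCustomObject'
$schema += '$obj | Add-Member -Type NoteProperty -Name "Anchor-YourAnchorATTR|String" -Value ""'
$schema += '$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "YourObjectClass"'

# One schema line per property, typed as String[], boolean or string
foreach ($prop in ($user | Get-Member -MemberType Property)) {
    $name = $prop.Name
    if ($prop.Definition -like '*System.Collections*') {
        $schema += "`$obj | Add-Member -Type NoteProperty -Name `"$name|String[]`" -Value (`"`",`"`")"
    }
    elseif ($prop.Definition -like '*bool*') {
        $schema += "`$obj | Add-Member -Type NoteProperty -Name `"$name|boolean`" -Value `$true"
    }
    else {
        $schema += "`$obj | Add-Member -Type NoteProperty -Name `"$name|string`" -Value `"string`""
    }
}
$schema    # copy the output into your schema.ps1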

The output looks like this:

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-YourAnchorATTR|String" -Value ""
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "YourObjectClass"
$obj | Add-Member -Type NoteProperty -Name "AccountEnabled|boolean" -Value $true
$obj | Add-Member -Type NoteProperty -Name "AgeGroup|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "AssignedLicenses|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "AssignedPlans|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "City|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "CompanyName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ConsentProvidedForMinor|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Country|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "CreationType|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "DeletionTimestamp|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Department|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "DirSyncEnabled|boolean" -Value $true
$obj | Add-Member -Type NoteProperty -Name "DisplayName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ExtensionProperty|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "FacsimileTelephoneNumber|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "GivenName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ImmutableId|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "IsCompromised|boolean" -Value $true
$obj | Add-Member -Type NoteProperty -Name "JobTitle|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "LastDirSyncTime|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "LegalAgeGroupClassification|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Mail|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "MailNickName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Mobile|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ObjectId|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ObjectType|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "OnPremisesSecurityIdentifier|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "OtherMails|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "PasswordPolicies|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "PhysicalDeliveryOfficeName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "PostalCode|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "PreferredLanguage|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ProvisionedPlans|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "ProvisioningErrors|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "ProxyAddresses|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "RefreshTokensValidFromDateTime|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ShowInAddressList|boolean" -Value $true
$obj | Add-Member -Type NoteProperty -Name "SignInNames|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "SipProxyAddress|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "State|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "StreetAddress|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Surname|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "TelephoneNumber|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "UsageLocation|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "UserPrincipalName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "UserType|string" -Value "string"

Update the Anchor with the attribute you’d like to use (I recommend ObjectId), give the ObjectClass a name for how you’d like it represented on your MA (User, AADUser or similar), and save it as something like schema.ps1 in your MA folder and you can get started.

Example 2: Azure Active Directory Group

The example below utilises the AzureAD PowerShell Module to connect to Azure AD. It then gets a Group Object (update line 7 for a group to retrieve) and enumerates the properties of the Group to generate the Schema file.

The output looks like this:

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-YourAnchorATTR|String" -Value ""
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "YourObjectClass"
$obj | Add-Member -Type NoteProperty -Name "DeletionTimestamp|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Description|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "DirSyncEnabled|boolean" -Value $true
$obj | Add-Member -Type NoteProperty -Name "DisplayName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "LastDirSyncTime|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Mail|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "MailEnabled|boolean" -Value $true
$obj | Add-Member -Type NoteProperty -Name "MailNickName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ObjectId|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ObjectType|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "OnPremisesSecurityIdentifier|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ProvisioningErrors|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "ProxyAddresses|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "SecurityEnabled|boolean" -Value $true

Update the Anchor with the attribute you’d like to use (I recommend ObjectId), give the ObjectClass a name for how you’d like it represented on your MA (Group, AADGroup or similar), and save it as something like schema.ps1 in your MA folder and you can get started.

Example 3: Workday User

The example below utilises the Workday PowerShell Module to connect to Workday. It then gets a User Object (update line 7 for a user to retrieve) and enumerates the properties of the User to generate the Schema file.

Update

  • Line 6 for your ServiceName and Tenant.
  • Line 13 for an object to retrieve

This script differs from the AAD User and Group examples above in that the PowerShell Object returned uses NoteProperty as the member type, so I updated Line 14 for that. Also, the attribute when parsed by Get-Member includes a value, so I had to take a substring of the result to get the attribute name. That is what this change does:

$d[1].substring(0,$d[1].indexof("="))

The output looks like this:

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-YourAnchorATTR|String" -Value ""
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "YourObjectClass"
$obj | Add-Member -Type NoteProperty -Name "BusinessTitle|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "FirstName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "JobProfileName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "LastName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Location|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "PreferredName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "UserId|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerDescriptor|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerId|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerType|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerTypeReference|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkSpace|string" -Value "string"

Example 4: Flat File CSV

The example below utilises a sample CSV file with headers. It uses the Header row to generate the Schema file. It defaults all columns to strings.

Update;

  • Line 2 for your CSV File Name
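The CSV generator script is embedded in the original post; a minimal sketch of the same idea (the CSV path below is a placeholder) reads the header row and emits a string-typed schema line per column:

# Rough sketch only - the embedded script in the original post is the authoritative version
# Update the path below for your CSV File Name
$csv = Import-Csv 'C:\Temp\Users.csv'

$schema = @()
$schema += '$obj = New-Object -Type PSCustomObject'
$schema += '$obj | Add-Member -Type NoteProperty -Name "Anchor-YourAnchorATTR|String" -Value ""'
$schema += '$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "YourObjectClass"'

# Every CSV column becomes a string attribute
foreach ($column in $csv[0].PSObject.Properties.Name) {
    $schema += "`$obj | Add-Member -Type NoteProperty -Name `"$column|string`" -Value `"string`""
}
$schema    # copy the output into your schema.ps1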

The output looks like this (for my CSV File):

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-YourAnchorATTR|String" -Value ""
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "YourObjectClass"
$obj | Add-Member -Type NoteProperty -Name "id|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "name|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "displayName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "comments|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "created|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "endDate|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "lastLogon|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "modified|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "startDate|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "status|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "type|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "groups|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "costCenter|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "country|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "department|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "division|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "email|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "employeeNumber|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "familyName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "givenName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "honorificPrefix|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "honorificSuffix|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "locale|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "location|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "manager|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "middleName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "organization|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "phoneNumber|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "preferredLanguage|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "preferredName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "secondaryEmail|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "secondaryPhoneNumber|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "timezone|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "title|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "risk|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerWid|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerDescriptor|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerId|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "OtherId|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "JobProfileName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkSpace|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerTypeReference|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerType|string" -Value "string"
$obj

Summary

Using a simple script and an example object we can quickly create the basis for a Granfeldt PSMA Schema Definition script.
As shown with the Workday example a minor tweak was required, but it was still a lot quicker than generating manually.

Hopefully this helps you get started quickly with your first, or next PSMA that you are building.

Automating Azure AD B2B Guest Invitations using Microsoft Identity Manager

Update: Oct 30 '18
Also see this post that adds support for Microsoft's updates to the Microsoft Graph to include additional information about Azure AD B2B Guest users.

Introduction

Earlier this year Microsoft released the Microsoft Identity Manager Azure AD B2B Management Agent. I wrote about using it to write to Azure AD in this post here. As detailed in that post my goal was to write to Azure AD using the MA, and I provided an incomplete example of doing that for Guests. This post fills in the gap and, unlike the note preceding that post indicates, I’ve updated my MA to use the Graph API rather than the Azure AD PowerShell Module. It does though work in unison with the Microsoft Azure AD B2B Management Agent.

Overview

The process is;

  • Using the Microsoft Azure B2B Management Agent connect to an Azure AD Tenant that contains users that you want to invite as Guests to your Tenant. Flow in the naming information for users and their email address and any other metadata that you need to drive the logic for who you wish to invite
  • Use my Azure AD B2B Invitation Management Agent to automate the invitation of users to your Azure AD Tenant using Azure AD B2B

My Azure AD B2B Invitation Management Agent works in two phases;

  1. Invitation of Users as Guests
  2. Update of Guests with naming information (Firstname, Lastname, DisplayName)

The Azure AD B2B Invite Management Agent uses my favorite PowerShell Management Agent (the Granfeldt PSMA). I’ve posted many times on how to configure it. See these posts here if you are new to it.

Prerequisites

Setting up an Azure AD User Account for the Management Agent

In your Azure AD create a New User that will be used by the Management Agent to invite users to your Azure AD. I named mine B2B Inviter as shown below.

Inviter Account.PNG

You then want to assign it the Guest inviter role as shown below. This provides enough permission to invite users to the Azure AD.

Inviter Role.PNG

However, depending on how you want these invitees to look, you probably also want their names to be kept consistent with their home Azure AD. To enable the Management Agent to do that you need to also assign the User administrator role as shown below.

Add User Admin Role.PNG

Now log in to Azure AD using that account and change the password. The account is then ready to go.

Management Agent Scripts

The Management Agent uses the Granfeldt PowerShell Management Agent. This is a cut down version of my MIM Azure AD Management Agent. 

Schema Script

I’ve kept the schema small with just enough interesting info to facilitate the functionality required. Expand it if you need additional attributes and update the import.ps1 accordingly.

Import.ps1

The Import script imports users from the Azure AD Tenant that you will be inviting remote Azure AD users to (as Guests).

  • Change line 10 for your file path
  • Change line 24 for the version of an AzureAD or AzureADPreview PowerShell Module that you have installed on the MIM Sync Server so that the AuthN Helper Lib can be used. Note if using a recent version you will also need to change the AuthN calls as well as the modules change. See this post here for details.
  • Change line 27 for your tenant name
  • Change line 47/48 for a sync watermark file
  • The Import script also contains an attribute from the MA Schema named AADGuestUser that is a boolean attribute. I create the corresponding attribute in the MetaVerse and MIM Service Schemas for the Person/User objectClasses. This is used to determine when a Guest has been successfully created so their naming attributes can then be updated (using a second synchronisation rule).

Export.ps1

The Export script handles the creation (invitation) of users from another Azure AD Tenant as Guests, as well as synchronising their naming data. It doesn’t include deletion logic, but that is simple enough to include with a deletion API call based on your MA Deprovisioning logic and requirements. A sketch of the underlying Graph invitation call follows the update notes below.

  • By default I’m not sending invitation notifications. If you want to send invitation notifications change “sendInvitationMessage“= $false to $true on Line 129. You should then also change the Invitation Reply URL on line 55 to your Tenant/Application.
  • Change Line 10 for the path for the debug logging
  • Change Line 24 as per the Import Script if you are using a different version of the help lib
  • Change Line 27 for your Azure AD Tenant Name
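The embedded Export.ps1 contains the full logic. As a hedged sketch, the invitation itself is a POST to the Microsoft Graph invitations endpoint, roughly as below; the $accessToken variable, the email address and the redirect URL are placeholders rather than values from the actual script.

$invitation = @{
    invitedUserEmailAddress = 'Rick.Sanchez@customer.com.au'   # the guest being invited
    inviteRedirectUrl       = 'https://myapps.microsoft.com'   # your Tenant/Application reply URL
    sendInvitationMessage   = $false                           # $true to send the notification email
} | ConvertTo-Json

$headers = @{ Authorization = "Bearer $accessToken" }          # token obtained via the AuthN helper lib
Invoke-RestMethod -Method Post -Uri 'https://graph.microsoft.com/v1.0/invitations' `
    -Headers $headers -ContentType 'application/json' -Body $invitation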

Declarative Sync Rules

I’m not going to cover import flow configurations on the MS Azure AD B2B MA here. See here for that. Below details how I’ve configured my Invitation MA for the Creation/Export functions. My join rule (configured in the Sync Engine Invitation MA Config) is email address as shown below. Not the best anchor as it isn’t immutable. But as we don’t know what the DN is going to be until after it is created this is the next best thing.

Join Rule.PNG

Creation Sync Rule

Here are the three attributes I’m syncing to the B2B Invite Management Agent to perform the invitation. I’m using the mail attribute for the DN as it matches the anchor for the schema. We don’t know what objectID will be assigned until the directory service does it. By using email/upn once created we will get the join and won’t end up with two objects on the MA for the same user.

Outbound Flow for Create 2.PNG

For Inbound I’m flowing in the AADGuestUser boolean value. You will need to create this attribute in the MetaVerse and then the MIM Service. In the MIM Service allow the Sync Service Account to manage the attribute and change the MIM Service Filter Permissions to allow Admins to use the attribute in Sets. Also on the MIM Service MA add an Export flow from the MV to the MIM Service for AADGuestUser.

Inbound Flow Create.PNG

Naming Update Sync Rule

The second Sync Rule is to update the guests GivenName, Surname and DisplayName. This sync rule has a dependency on the creation sync rule and has a corresponding Set, Workflow and MPR associated with value of the AADGuestUser boolean attribute populated by the Import script. If True (which it will be after successful creation and the confirming import) the second synchronization rule will be applied.

Sync Naming Synchronisation Rule.PNG

This will trigger an export flow for the three naming attributes.

Outbound Flow for Naming.PNG

Example of Inviting Guests

In this example Rick Sanchez is a member of a guest organisation and meets the criteria of my rules to be invited as a guest to our Tenant. We then see that we get an Add for Rick on the B2B Invite MA.

Create Rick Sanchez.PNG

On export, Rick is created in our Azure AD as a Guest User.

Rick Created Sync Engine.PNG

Rick appears in Azure AD as a Guest via the Azure Portal.

Rick Created AzureAD.PNG

Following the confirming import our second sync rule fires and flows through an update to DisplayName and adds GivenName and Surname.

Update Rick.PNG

The naming attributes are then successfully exported.

Success Export.PNG

Going to the Azure AD Portal we see that Rick has indeed been updated.

Rick Updated.PNG

Notification Emails

If you enable notification emails a generic notification email is sent as shown below. The Export.ps1 MA script has them disabled by default.

Email Invite Notification2.PNG

Summary

Using a combination of the Microsoft Azure AD B2B Management Agent and my Azure AD B2B Invitation Management Agent you can automate the invitation of Guest users to your Azure AD Tenant.

A Voice Assistant for Microsoft Identity Manager

This is the third and final post in my series around using your voice to query/search Microsoft Identity Manager or as I’m now calling it, the Voice Assistant for Microsoft Identity Manager.

The two previous posts in this series detail some of my steps and processes in developing and fleshing out this concept. The first post detailed the majority of the base functionality whilst the second post detailed the auditing and reporting aspects into Table Storage and Power BI.

My final architecture is depicted below.

Identity Manager integration with Cognitive Services and IoT Hub 4x3
Voice Assistant for Microsoft Identity Manager Architecture

I’ve put together more of an overview in a presentation format embedded here.

GitPitch Presents: github/darrenjrobinson/MIM-VoiceAssistant/presentation


If you’re interested in building the solution checkout the Github Repo here which includes the Respeaker Python Script, Azure Function etc.

Let me know how you go @darrenjrobinson

Using your Voice to Search Microsoft Identity Manager – Part 2

Introduction

Last month I wrote this post that detailed using your voice to search/query Microsoft Identity Manager. That post demonstrated a working solution (GitHub repository coming next month) but was still incomplete if it was to be used in production within an Enterprise. I hinted then that there were additional enhancements I was looking to make. One is an Auditing/Reporting aspect and that is what I cover in this post.

Overview

The one element of the solution that has visibility of each search scenario is the IoT Device. As a potential future enhancement this could also be a Bot. For each request I wanted to log/audit;

  • Device the query was initiated from (it is possible to have many IoT devices; physical or bot leveraging this function)
  • The query
  • The response
  • Date and Time of the event
  • User the query targeted

To achieve this my solution is to;

  • On my IoT Device the query, target user and date/time are held during the query event
  • At the completion of the query the response along with the earlier information is sent to the IoT Hub using the IoT Hub REST API
  • The event is consumed from the IoT Hub by an Azure Event Hub
  • The message containing the information is processed by Stream Analytics and put into Azure Table Storage and Power BI.

Azure Table Storage provides the logging/auditing trail of what requests have been made and the responses. Power BI provides the reporting aspect. These two services provide visibility into what requests have been made, against whom, when etc. The graphic below shows this in the bottom portion of the image.

Auditing Reporting Searching MIM with Speech.png
Voice Search for Microsoft Identity Manager Auditing and Reporting

Sending IoT Device Events to IoT Hub

I covered this piece in a previous post here in PowerShell. I converted it from PowerShell to Python to run on my device. For initial end-to-end testing while developing the solution, though, the PowerShell to build the message body and send it looks like this (a sketch of the endpoint and SAS token setup follows the snippet);

# $deviceID, $Headers (with the authorisation SAS token) and $iotHubRestURI
# are set up earlier, as per the previous post referenced above
[string]$datetime = get-date
$datetime = $datetime.Replace("/","-")
$body = @{
 deviceId = $deviceID
 messageId = $datetime
 messageString = "$($deviceID)-to-Cloud-$($datetime)"
 MIMQuery = "Does the user Jerry Seinfeld have an Active Directory Account"
 MIMResponse = "Yes. Their LoginID is jerry.seinfeld"
 User = "Jerry Seinfeld"
}

# Convert to JSON and send to the IoT Hub REST endpoint
$body = $body | ConvertTo-Json
Invoke-RestMethod -Uri $iotHubRestURI -Headers $Headers -Method Post -Body $body
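For completeness, a hedged sketch of building $iotHubRestURI and a device-scoped $Headers SAS token is below; $iotHubName, $deviceID and $deviceKey are placeholders from your IoT Hub device registration, and the earlier post linked above remains the reference for how I actually did it.

Add-Type -AssemblyName System.Web

# IoT Hub device-to-cloud messaging endpoint
$iotHubRestURI = "https://$iotHubName.azure-devices.net/devices/$deviceID/messages/events?api-version=2018-06-30"

# Build a SAS token (valid for one hour) signed with the device key
$expiry = [int]((New-TimeSpan -Start (Get-Date '1970-01-01') -End (Get-Date).ToUniversalTime()).TotalSeconds + 3600)
$resourceUri = [System.Web.HttpUtility]::UrlEncode("$iotHubName.azure-devices.net/devices/$deviceID")
$hmac = New-Object System.Security.Cryptography.HMACSHA256
$hmac.Key = [Convert]::FromBase64String($deviceKey)
$signature = [Convert]::ToBase64String($hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes("$resourceUri`n$expiry")))
$sasToken = "SharedAccessSignature sr=$resourceUri&sig=$([System.Web.HttpUtility]::UrlEncode($signature))&se=$expiry"

$Headers = @{ 'Authorization' = $sasToken; 'Content-Type' = 'application/json' }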

Event Hub and IoT Hub Configuration

First I created an Event Hub. Then on my IoT Hub I added an Event Subscription and pointed it to my Event Hub.

IoTHub Event Hub.PNG
Azure IoT Hub Events

Streaming Analytics

I then created a Stream Analytics Job. I configured two Inputs. One each from my IoT Hub and from my Event Hub.

Stream Analytics Inputs.PNG
Azure Stream Analytics Inputs

I then created two Outputs. One for Table Storage for which I used an existing Storage Group for my solution, and the other for Power BI using an existing Workspace but creating a new Dataset. For the Table storage I specified deviceId for Partition key and messageId for Row key.

Stream Analytics Outputs.PNG
Azure Stream Analytics Outputs

Finally as I’m keeping all the data simple in what I’m sending, my query is basically copying from the Inputs to the Outputs. One is to get the events to Table Storage and the other to get it to Power BI. Therefore the query looks like this.

Stream Analytics Query.PNG
Azure Stream Analytics Query

Events in Table Storage

After sending through some events I could see rows being added to Table Storage. When I added an additional column to the data the schema-less Table Storage obliged and dynamically added another column to the table.

Table Storage.PNG
Table Storage Events

A full record looks like this.

Full Record.PNG
Voice Search Table Storage Audit Record

Events in Power BI

Just like in Table Storage, in Power BI I could see the dataset and the table with the event data. I could create a report with some nice visuals just as you would with any other dataset. When I added an additional field to the event being sent from the IoT Device it magically showed up in the Power BI Dataset Table.

PowerBI.PNG
PowerBI Voice Search Analytics

Summary

Using the Azure IoT Hub REST API I can easily send information from my IoT Device and then have it processed through Stream Analytics into Table Storage and Power BI. Instant auditing and reporting functionality.

Let me know what you think on twitter @darrenjrobinson

Using your Voice to Search Microsoft Identity Manager – Part 1

Introduction

Yes, you’ve read the title correctly. Speaking to Microsoft Identity Manager. The concept behind this was born off the back of some other work I was doing with Microsoft Cognitive Services. I figured it shouldn’t be that difficult if I just break down the concept into individual elements of functionality and put together a proof of concept to validate the idea. That’s what I did and this is the first post of the solution as an overview.

Here’s a quick demo.

 

Overview

The diagram below details the basis of the solution. There are a few extra elements I’m still working on that I’ll cover in a future post if there is any interest in this.

Searching MIM with Speech Overview

The solution works like this;

  1. You speak to a microphone connected to a single board computer with the query for Microsoft Identity Manager
  2. The spoken phrase is converted to text using Cognitive Speech to Text (Bing Speech API)
  3. The text phrase is;
    1. sent to Cognitive Services Language Understanding Intelligent Service (LUIS) to identify the target of the query (firstname lastname) and the query entity (e.g. Mailbox)
    2. Microsoft Identity Manager is queried via API Management and the Lithnet REST API for the MIM Service
  4. The result is returned to the single board computer as a text result phrase, which it then converts to audio using Cognitive Services Text to Speech
  5. The result is spoken back

Key Functional Elements

  • The microphone array I’m using is a ReSpeaker Core v1 with a ReSpeaker Mic Array
  • All credentials are stored in an Azure Key Vault
  • An Azure Function App (PowerShell) interfaces with the majority of the Cognitive Services being used
  • Azure API Management is used to front end the Lithnet MIM Webservice
  • The Lithnet REST API for the MIM Service provides easy integration with the MIM Service

 

Summary

Leveraging a lot of Serverless (PaaS) Services, a bunch of scripting (Python on the ReSpeaker and PowerShell in the Azure Function) and the Lithnet REST API it was pretty simple to integrate the ReSpeaker with Microsoft Identity Manager. An alternative to MIM could be any other service you have an API interface into. MIM is obviously a great choice as it can aggregate from many other applications/services.

Why a female voice? From the small number of responses I received, it was the popular majority.

Let me know what you think on twitter @darrenjrobinson

Implementing Azure API Management with the Lithnet Microsoft Identity Manager Rest API

Introduction

Earlier this week I wrote this post that detailed implementing the Lithnet REST API for FIM/MIM Service. I also detailed using PowerShell to interact with the API Endpoint.

Now let’s imagine you are looking to have a number of Azure Serverless features leverage your Rest API enabled Microsoft Identity Manager environment, or even offer it “as-a-Service”. You’ll want to have some visibility as to how it is performing, and you’ll probably want to implement features such as caching and rate limiting, let alone putting more security controls around it. Enter Azure API Management, which provides all those functions and more.

In this post I detail getting started with Azure API Management by using it to front-end the Lithnet FIM/MIM Rest API.

Overview

In this post I will detail;

  • Enabling Azure API Management
  • Configuring the Lithnet FIM/MIM Rest API integration with Azure API Management
  • Accessing MIM via Azure API Management and the Lithnet FIM/MIM Rest API using PowerShell
  • Reporting

Prerequisites

For this particular scenario I’m interfacing Azure API Management with a Rest API that uses Digest Authentication. So even though it is a Windows WCF Webservice, you could do something similar with any comparable API Endpoint. If the backend API endpoint is using SSL it will need to have a valid certificate. Even though Azure API Management allows you to add your own certificates, I had issues with Self Signed Certificates; I have it working fine with Lets Encrypt issued certificates. Obviously you’ll need an Azure Subscription as well as an App/Service with an API.

Enabling Azure API Management

From the Azure Portal select Create a resource and search for API management and select it.

Add API Mgmt.PNG

Select Create

Create API Mgmt.PNG

Give your API Management Service a name, select a subscription, resource group etc and select Create.

API Mgmt Config 1.PNG

Once you select Create it will take about 30 minutes to be deployed.

Configuring the Lithnet FIM/MIM Rest API integration with Azure API Management

Once your new API Management service has been deployed, from the Azure Portal select the API Management services blade and select the API Management service that you just created. Select APIs.

API Config 1.PNG

Select Add API and then select Add a new API

API Mgmt Config 2.PNG

Give the API a name, description, enter the URI for your API EndPoint, and select HTTPS. I’m going to call this MIMSearcher so have entered that under API URL Suffix. For initial testing under Products select starter. Finally select Create.

API Mgmt Config 4.PNG

We now have our base API setup. From the Backend tile select the Edit icon.

API Mgmt Config 5.PNG

As the backend is authenticated using Basic Authentication, select Basic in Gateway credentials and enter the details of an account with access that will be used by the API Gateway. Select Save.

API Mgmt Config 6.PNG

Now from our API Configuration select Add operation.

API Mgmt Config 7.PNG

First we will create a test operation for the Help page on the Lithnet FIM/MIM Rest API. Provide a Display name, and for the URL add /v2/help. Give it a description and select Create.

Note: I could have had v2 as part of the base URI for the API in the previous steps. I didn’t as I will be using API’s from both v1 and v2 and didn’t want to create multiple operations.

API Mgmt Config 8.PNG

Select the new Operation (Help)

API Mgmt Config 9.PNG

Select the Test menu. Select Send.

API Mgmt Config 10.PNG

If everything is set up correctly you will get a 200 Success OK response as below.

API Mgmt Config 11.PNG

Accessing MIM via Azure API Management and the Lithnet FIM/MIM Rest API using PowerShell

Head over to your API Portal. The URL is https://<yourAPIMgmtServiceName>.portal.azure-api.net/ where <yourAPIMgmtServiceName> is the name you gave your API Management Service, shown in the third screenshot at the top of this post. If you are doing this from the browser you used to create the API Management Service you should be signed in already. From the Administrator menu on the right select Profile.

Test API Mgmt 1.PNG

Click on Show under one of the keys and record its value.

Test API Mgmt 2.PNG

Using PowerShell ISE or VSCode update the following Code Snippet and test.

$APIURL = 'https://<yourAPIMgmtServiceName>.azure-api.net/<yourAPIURLSuffix>/v2/help'
$secret = 'yourSecret'
$Headers = @{'Ocp-Apim-Subscription-Key' = $secret} 
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

$response = Invoke-RestMethod -Uri $APIURL -Headers $Headers -ContentType "application/json" -UseBasicParsing -Method Get
$response

The snippet will create a Web Request to the new API and display the results.

Test API Mgmt 3.PNG

Querying the Lithnet Rest API via Azure API Management

Now that we have a working solution end-to-end, let’s do something useful with it. Looking at the Lithnet Rest API, the Resources URI is the key one exposing Resources from the MIM Service.

Resources.PNG

Let’s create a new Operation for Resources similar to what we did for the Help. After selecting Create configure the Backend for Basic Authentication like we did for Help.

Get Resources.PNG

Testing out the newly exposed endpoint is very similar to before. Just a new APIURL with the addition of /?Person to return all Person Resources from the MIM Portal. It lets us know it’s returned 7256 Person Objects, and the Results are still paged (100 by default).

Get Persons.PNG

Let’s now Search for just a single user. Search for a Person object whose Display Name is ‘darrenjrobinson’.

$query = "Person[DisplayName='darrenjrobinson']"
$queryEncoded = [System.Web.HttpUtility]::UrlEncode($query)

$APIURL = "https://.azure-api.net//v2/resources/?filter=/$($queryEncoded)" 
$secret = 'yourSecret'
$Headers = @{'Ocp-Apim-Subscription-Key' = $secret} 
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

$user = Invoke-RestMethod -Uri $APIURL -Headers $Headers -ContentType "application/json" -UseBasicParsing -Method Get
$user

Executing, we get a single user returned.

Search for User.PNG

Reporting

Using the Publisher Portal we can get some Stats on what is happening with our API Management implementation.

Go to https://<yourAPIMgmtServiceName>.portal.azure-api.net/admin and select Analytics.

We then have visibility of what has been using the API Management Service. At a Glance gives an overview, and you can drill down into;

  • Top Users
  • Top Products
  • Top subscriptions
  • Top APIs
  • Top Operations

At a glance looks like this;

At a Glance Stats.PNG

And Top Operations looks like this;

Top Operations.PNG

Summary

That is a quick start guide to implementing Azure API Management in front of a Rest API and using PowerShell to integrate with it. Next steps would be to enable caching, and getting into more of the advanced features. Enjoy.

Getting started with the Lithnet REST API for the Microsoft Identity Manager Service

Introduction

A common theme with my posts on Microsoft Identity is its extensibility, particularly with the Lithnet tools that Ryan has released.

One such tool that I’ve used but never written about is the Lithnet REST API for the Microsoft Identity Manager Service. For a small proof of concept I’m working on I was again using this REST API, and I needed to update it as Ryan has recently added some new functionality. I realised I hadn’t set it up in a while, and while Ryan’s documentation is very good it was written some time ago when IIS Manager looked a little different. So here are a couple of screenshots and a little extra info to get you started if you haven’t used it before, to supplement Ryan’s documentation located here.

Configuring the Lithnet REST API for the Microsoft Identity Manager Service

You can download the Lithnet REST API for the FIM/MIM Service from here

If you are using the latest version of the Lithnet Rest API you will need to make sure you have .NET 4.6.1 installed. If you are running Windows Server 2012 R2 you can get it from here.

When configuring your WebSite make sure you choose .NET v4.5 Classic for the Application Pool.

WebSite AppPool Settings.PNG

The web.config must match your MIM version. Currently the latest is 4.4.1749.0 as detailed here. That therefore looks like this.

WebConfig Resource Management Version.PNG

Finally you’ll need an SSL Certificate. For development environments a Self-Signed Certificate is fine. Personally I use this Cert Generator. Make sure you put the certificate in the cert store on the machine you will be testing access with. Here’s an example of my command line for generating a cert.

Cert Generation.PNG

You could also use Lets Encrypt.

In your bindings in IIS have the Host Name match your certificate.

Bindings.PNG

If you’ve done everything right you will be able to hit the v2 endpoint help. By default with Basic Auth enabled you’ll be prompted for a username and password.

v2 EndPoint.PNG

Using PowerShell to query MIM via the Lithnet Rest API

Here is an example script to query MIM via the Lithnet MIM Rest API. Update it for your credentials (Lines 2 and 3), the URL of the server running the API Endpoint (Line 11) and what you are querying for (Line 14). My script takes into account Self Signed Certs in a Development environment.
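The script itself is embedded in the original post; a hedged sketch following the same pattern (the server name, credentials and query below are placeholders) looks like this:

# Lines 2 and 3: update for your credentials
$username = 'DOMAIN\mimservice-api'
$password = ConvertTo-SecureString 'yourPassword' -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($username, $password)

# Development environments only: trust the Self Signed Certificate and force TLS 1.2
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

# Line 11: the server running the API Endpoint
$baseURI = 'https://mimsvc.yourdomain.com'

# Line 14: what you are querying for
Add-Type -AssemblyName System.Web
$filter = [System.Web.HttpUtility]::UrlEncode("Person[DisplayName='darrenjrobinson']")

$results = Invoke-RestMethod -Uri "$baseURI/v2/resources/?filter=/$filter" -Credential $credential -ContentType 'application/json'
$results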

Example output from a query is shown below.

Example Output.PNG

Summary

Hopefully that helps you quickly get started with the Lithnet REST API for the FIM/MIM Service. I showed an example using PowerShell directly, but using an Azure Function is also a valid pattern. I’ve covered similar functionality in the past.