Update: Dec 20, 2018. See this post that details the changes to the Azure AD Reports and Events REST API.
Hybrid Reporting is a great little feature of Microsoft Identity Manager. A small agent installed on the MIM Sync Server will send reporting data to Azure for MIM SSPR and MIM Group activities. See how to install and configure it here.
But what if you want to get the reporting data without going to the Azure Portal and looking at the Audit Reports? Enter the Azure AD Reports and Events REST API, currently in preview. It took me a couple of cracks to get this working, because the documentation is a little vague, especially when accessing it via PowerShell and oAuth2. So I’ve written it up and hope it helps anyone else looking to go down this route.
Accessing the Reports via the API has a couple of caveats that I had to work through:
Having the correct permissions to access the report data. Pretty much everything you read tells you that you need to be a Global Admin. Once I had my oAuth tokens I messed around a little and was able to get the following back from the API when purposely using an identity that didn’t have the right permissions. The key piece is “Api request is not from global admin or security admin or security reader role”. I authorized the WebApp using an account that is in the Security Reader Role, and can successfully access the report data.
Reading the documentation here on MSDN I incorrectly assumed each category was the report name. Only when I called “https://graph.windows.net//$metadata?api-version=beta” and looked at the list of reports did I notice each report name was plural. The three that I wanted to access (and report on) are obviously the MIM Hybrid Reports;
Get an oAuth2 Authentication Code using an account that is either Global Admin or in the Security Admin or Security Reader Azure Roles
Use your Bearer and Refresh tokens to query for the reports you’re interested in
Register your WebApp
In the Azure Portal create a new Web app/API app and assign it https://localhost as the Reply URL. Record the Application ID for use in the PowerShell script.
Assign the Read Directory data permission as shown below
Obtain a key from the Keys option on your new Web App. Record it for use in the PowerShell script.
Generate an Authentication Code, get a Bearer and Refresh Token
Update the following script, changing Lines 5 & 6 for the ApplicationID/ClientId and Client Secret for the WebApp you created above.
Run the script and you will be prompted to authenticate. Use an account in the tenant where you created the Web App that is a Global Admin or in the Security Admin or Security Reader Azure Roles. You will need to change the location where you want the refresh.token stored (line 18).
If you’ve done everything correctly you will have authenticated and received an AuthCode, which was then used to get your Authorization Tokens. The value of the $Authorization variable should look similar to this;
Now you can use the Refresh token to generate new Authorization Tokens when they time out, simply by calling the Get-NewTokens function included in the script above.
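For reference, here is a minimal sketch of that flow: interactive AuthN to capture an AuthCode, the token exchange, and a Get-NewTokens function. This is a sketch against the Azure AD v1 OAuth2 endpoints, not the exact script from the post; the client details and token file path are placeholders to update.

```powershell
# Minimal sketch: capture an AuthCode via a login dialog, exchange it for tokens,
# then refresh on demand. Placeholder values below come from your WebApp registration.
Add-Type -AssemblyName System.Windows.Forms

$clientId     = "<ApplicationID>"
$clientSecret = "<WebApp Key>"
$redirectUri  = "https://localhost"
$resource     = "https://graph.windows.net"
$tokenUri     = "https://login.microsoftonline.com/common/oauth2/token"

# Interactive login window to capture the returned AuthCode
$authUri = "https://login.microsoftonline.com/common/oauth2/authorize?response_type=code" +
           "&client_id=$clientId&resource=$([uri]::EscapeDataString($resource))" +
           "&redirect_uri=$([uri]::EscapeDataString($redirectUri))"
$form = New-Object System.Windows.Forms.Form -Property @{ Width = 460; Height = 660 }
$web  = New-Object System.Windows.Forms.WebBrowser -Property @{ Width = 440; Height = 620; Url = $authUri }
$web.Add_DocumentCompleted({ if ($web.Url.AbsoluteUri -match "code=") { $form.Close() } })
$form.Controls.Add($web)
[void]$form.ShowDialog()
$authCode = [regex]::Match($web.Url.AbsoluteUri, "code=([^&]+)").Groups[1].Value

# Exchange the AuthCode for Bearer and Refresh tokens
$global:Authorization = Invoke-RestMethod -Method Post -Uri $tokenUri -Body @{
    grant_type = "authorization_code"; client_id = $clientId; client_secret = $clientSecret;
    code = $authCode; redirect_uri = $redirectUri; resource = $resource
}

# Persist the refresh token for re-use (path is an assumption; change to suit)
$Authorization.refresh_token | Out-File "C:\temp\refresh.token"

function Get-NewTokens {
    # Redeem the stored refresh token for fresh Bearer/Refresh tokens
    $global:Authorization = Invoke-RestMethod -Method Post -Uri $tokenUri -Body @{
        grant_type = "refresh_token"; client_id = $clientId; client_secret = $clientSecret;
        refresh_token = (Get-Content "C:\temp\refresh.token"); resource = $resource
    }
}
```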
Querying the Reporting API
Now that you have the necessary prerequisites sorted you can query the Reporting API.
Here are a couple of simple queries to return some data to get you started. Update the script for the tenant name of your AzureAD. With the $Authorization values from the script above you can get data for the MIM Hybrid Reports.
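For example, a query like the following returns the SSPR activity events (a sketch; the tenant is a placeholder and the plural report names should be verified against your own $metadata call):

```powershell
# Query a MIM Hybrid report using the tokens from the script above
Get-NewTokens   # refresh the Bearer token first if it may have expired

$tenant  = "<yourtenant>.onmicrosoft.com"
$headers = @{ Authorization = "Bearer $($Authorization.access_token)" }

$url = "https://graph.windows.net/$tenant/reports/mimSsprActivityEvents?api-version=beta"
(Invoke-RestMethod -Method Get -Uri $url -Headers $headers).value | Format-Table
```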
This is Part Two in the two-part blog post on managing users’ profile photos with Microsoft FIM/MIM. Part one here detailed managing users’ Azure AD/Active Directory profile photos. This post delves deeper into photos, specifically around Office 365, and the reasons why you may want to manage these via FIM/MIM.
User profile photos should be simple to manage. But in a rapidly moving hybrid cloud world it can be a lot more complex than it needs to be. The best summary I’ve found of this evolving moving target is from Paul Ryan here.
Using Paul’s sound advice, we too are advising our customers to let users manage their profile photo (within corporate guidelines) via Exchange Online. However, as described in this article, photos managed in OnPremise Active Directory are synchronized to Azure AD and on to other Office365 services only once. And of course we want them to be consistent across AD DS, Azure AD, Exchange Online and all other Office365 services.
This post details synchronizing user profile photos from Exchange Online to MIM for further synchronization to other systems. The approach uses a combination of Azure GraphAPI and Exchange Remote PowerShell to manage Exchange Online User Profile Photos.
Users historically had a photo in Active Directory. DirSync/ADSync/AzureADConnect then synchronized that to Azure AD (and once only into Office 365).
Users update their photo in Office365 (via Exchange Online and Outlook Web Access)
the photo is synchronized across Office365 Services
An extension of the Current State is the requirement to be able to take the image uploaded by users in Exchange Online, and synchronize it back to the OnPremise AD, and any other relevant services that leverage a profile photo
Have AzureADConnect keep AzureAD consistent with the new photo obtained from Office365 that is synchronized to the OnPrem Active Directory
Sync the current photo to the MIM Portal
Synchronizing Office365 Profile Photos
Whilst Part-one dealt with the AzureAD side of profile photos as an extension to an existing AzureAD PowerShell Management Agent for FIM/MIM, I’ve separated out the Office365 side to streamline it and make it as efficient as possible. More on that later. As such I’ve created a new PowerShell Management Agent specifically for Office365 User Profile Photos.
I’m storing the Exchange Online photo in the MIM Metaverse as a binary object just as I did for the AzureAD photo (but in a different attribute). I’m also storing a checksum of the photos (as I did for the AzureAD photo, again in a different attribute) to make it easier to compare what is in Azure AD and Exchange Online, which is then used to determine if changes have been made (eg. a user updated their profile photo).
For generating the hash of the profile photos I’m using Get-Hash from the PowerShell Community Extensions. Whilst PowerShell has Get-FileHash, I don’t want to write the profile photos out to disk and read them back in just to get the checksum; that slows the process down by 25%. You can get the checksum using a number of different methods and algorithms. Just be consistent and use the same method across both profile photos and you’ll be comparing apples with apples, and the comparison logic will work.
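As a sketch, hashing the bytes in memory looks like the following. I’ve shown the plain .NET equivalent here, which behaves the same as PSCX’s Get-Hash without the disk round-trip; $photo is assumed to already hold the image as a byte array.

```powershell
# Hash the photo bytes in memory; no need to write the image to disk first
$md5      = [System.Security.Cryptography.MD5]::Create()
$checksum = [System.BitConverter]::ToString($md5.ComputeHash([byte[]]$photo)).Replace("-", "")
```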
Some notes on Photos and Exchange Online (and MFA)
This is where things went off on a number of tangents. Initially I tried accessing the photos using Exchange Online Remote PowerShell.
CAVEAT 1: If your Office365 Tenant is enabled for Multi-Factor Authentication (which it should be) you will need to get the Exchange Online Remote PowerShell Module as detailed here. Chances are you won’t have full Office365 Admin access though, so as long as the account you will be using is in the Recipient Management Role you should be able to go to the Exchange Control Panel using a URL like https://outlook.office365.com/ecp/?realm=<tenantname>&wa=wsignin1.0 where tenantname is something like customer.com.au. From the Hybrid menu in the right-hand pane you will then be able to download the Microsoft.Online.CSE.PSModule.Client.application. I had to use Internet Explorer to download the file and get it installed successfully. Once installed I used a few lines from this script here to load the Function and start my RPS session from within PowerShell ISE during solution development.
CAVEAT 2: The EXO RPS MFA PS Function doesn’t allow you to pass it your account password. You can pass it the identity you want to use, but not the password. That makes scheduled process automation with it impossible.
CAVEAT 3: The RPS session exposes the Get-UserPhoto cmdlet, which is great. But the RPS session leverages the GraphAPI, and the RPS PS Module doesn’t refresh its tokens, so if the import takes longer than 60 minutes then using this method you’re a bit stuffed.
CAVEAT 4: Using the Get-UserPhoto cmdlet detailed above, the syncing of photos is slow. As in I was only getting ~4 profile photos per minute slow. This also goes back to the token refresh issue: for pretty much any environment of the size I deal with, this is too slow and will time out.
CAVEAT 5: You can whitelist the IP Address (or subnet) of your host so MFA is not required, using Contextual IP Addressing Whitelisting. At that point there isn’t really a need to use the MFA-enabled PREVIEW EXO RPS function anyway. That said, I still needed to whitelist my MIM Sync Server(s) from MFA to allow integration into the Graph API. I configured just the single host. The whitelist takes CIDR format, so that looks like /32 (eg. 188.8.131.52/32).
As I mentioned above,
using the Get-UserPhoto cmdlet was slow. ~4 per minute slow
using the GraphAPI into Exchange Online and looking at each user and determining if they had a photo then downloading it, was also slow. Slow because at this customer only ~50% of their users have a photo on their mailbox. As such I was only able to retrieve ~145 photos in 25 minutes. *Note: all timings listed above were during development and actually outputting the images to disk to verify functionality.
After all my trial and error on this, here is my final approach and working solution (a condensed sketch follows the list below);
Use the Exchange Online Remote PowerShell (non-MFA version) to query and return a collection of all mailboxes with an image. *Note: add an exception for your MIM Sync host to the white-listed hosts for MFA (if your Office365 Tenant is enabled for MFA) so the process can be automated
Use the Graph API to obtain those photos
with this I was able to retrieve ~1100 profile photos in ~17* minutes (after ~2 minutes to query and get the list of mailboxes with a profile photo)
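A condensed sketch of those two steps is below. The HasPicture filter and the Outlook REST photo endpoint are my assumptions of the mechanics; $cred is the whitelisted, non-MFA account in the Recipient Management Role, and $Authorization holds the tokens from the WebApp authentication covered later in this post.

```powershell
# Step 1: EXO Remote PowerShell (non-MFA) to list only the mailboxes that have a photo
$session = New-PSSession -ConfigurationName Microsoft.Exchange `
    -ConnectionUri "https://outlook.office365.com/powershell-liveid/" `
    -Credential $cred -Authentication Basic -AllowRedirection
Import-PSSession $session -CommandName Get-Mailbox -AllowClobber | Out-Null
$mailboxesWithPhoto = Get-Mailbox -ResultSize Unlimited |
    Where-Object { $_.HasPicture } | Select-Object -ExpandProperty UserPrincipalName

# Step 2: pull each photo as raw bytes via the API using a WebClient
$photos = @{}
foreach ($upn in $mailboxesWithPhoto) {
    $wc = New-Object System.Net.WebClient
    $wc.Headers.Add("Authorization", "Bearer $($Authorization.access_token)")
    # The $value segment of the photo endpoint returns the binary image
    $photos[$upn] = $wc.DownloadData("https://outlook.office.com/api/v2.0/users/$upn/photo/`$value")
}
Remove-PSSession $session
```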
There’s a lot of info above, so let me summarize the prerequisites;
The Granfeldt PowerShell MA
Whitelist your FIM/MIM Sync Server from MFA (if your Office 365 environment is enabled for MFA)
Add the account you will run the MA as (which will in turn connect to EXO via RPS) to the Recipient Management Role
Create a WebApp for the PS MA to use to access users Profile Photos via the Graph API (fastest method)
Powershell Community Extensions to generate the image checksum
Creating the WebApp to access Office365 User Profile Photos
Go to your Azure Portal and select the Azure Active Directory Blade from the Resource Menu bar on the left. Then select App Registrations from the Manage section of the Azure Active Directory menu, and finally, from the top of the main pane, select “New Application Registration“.
Give it a name and select Web app/API as the type of app. Make the sign-in URL https://localhost and then select Create.
Record the ApplicationID that you see in the Registered App Essentials window. You’ll need this soon.
Now select All Settings => Required Permissions. Select Read all users’ basic profiles in addition to Sign in and read user profile. Select Save.
Under Required Permissions select Add and then select 1 Select an API, and select Office 365 Exchange Online then click Select.
Choose 2 Select Permissions and then select Read user profiles and Read all users’ basic profiles. Click Select.
Select Grant Permissions
From Settings select Keys, give your key a Description, choose a key lifetime and select Save. RECORD the key value. You’ll need this along with the WebApp ApplicationID/ClientID for the Import.ps1 script.
Using the information from your newly registered WebApp, we need to perform the first authentication (and authorization of the WebApp) to the Graph API. Take your ApplicationID, Key (Client Secret) and the account you will use on the Management Agent (which you have assigned the Recipient Management Role in Exchange Online) and run the script detailed in this post here. It will authenticate you to your new WebApp via the GraphAPI, after asking you to provide the account you will use on the MA and authorizing the permissions you selected when registering the app. It will also create a refresh.token file which we will give to the MA to automate our connection. The Authorization dialog looks like this.
As detailed above the PSMA will leverage the WebApp to read users Exchange Profile Photos via the Graph API. The Import script also leverages Remote Powershell into Exchange Online (for reasons also detailed above). The account you run the Management Agent as will need to be added to the Recipient Management Role Group in order to use Remote PowerShell into Exchange Online and get the information required.
Take the Import.ps1 script below and update;
Update lines 11, 24 and 42 for the path to where you have put your PSMA. Mine is under the Extensions directory in a directory named EXOPhotos.
Copy the refresh.token generated when authenticating and authorizing the WebApp earlier into the directory you specified in line 42 above.
Create a Debug directory under the directory you specified in lines 11, 24 and 42 above so you can see what the MA is doing as you implement and debug it the first few times.
I’ve written the Import to use Paged Imports, so make sure you tick the Paged Imports checkbox on the configuration of the MA
Update Lines 79 and 80 with your ApplicationID and Client Secret that you recorded when creating your WebApp
Running the Exchange User Profile Photos MA
Now that you have created the MA, you should have selected the EXOUser ObjectClass and the attributes defined in the schema. You should also create the EXOPhoto (as Binary) and EXOPhotoChecksum (as String) attributes in the Metaverse on the person ObjectType (assuming you are using the built-in person ObjectType).
Configure your flow rules to flow the EXOPhoto and EXOPhotoChecksum on the MA to their respective attributes in the MV.
Create a Stage Only run profile and run it. If you have done everything correctly you will see photos come into the Connector Space.
Looking at the Connector Space, I can see EXOPhoto and EXOPhotoChecksum have been imported.
After performing a Synchronization to get the data from the Connector Space into the Metaverse it is time to test the image that lands in the Metaverse. That is quick and easy via PowerShell and the Lithnet MIIS Automation PowerShell Module.
The file is output to the directory with the filename specified.
Opening the file correctly reveals my Profile Photo.
In Part one we got the AzureAD/Active Directory photo. In this post we got the Office365 photo.
Now that we have the images from Office365 we need to synchronize any photo updates to Active Directory (and in turn, via AADConnect, to Azure AD). Keep in mind the image size limits for Active Directory, and that we retrieved the largest photo available from Office365 when synchronizing the photo onwards. There are a number of PowerShell modules for photo manipulation that will allow you to resize accordingly.
Whilst Microsoft FIM/MIM can be used to do pretty much anything your requirements dictate, dealing with object types other than text and references can be a little tricky when manipulating them the first time. User Profile Photos fall into that category as they are stored in the directory as binary objects. Throw in Azure AD and obtaining and synchronizing photos can seem like adding a double back-flip to the scenario.
This post is Part 1 of a two-part post. Part two is here. This is essentially the introduction to the how-to piece before extending the solution past a user’s Active Directory Profile Photo to their Office 365 Profile Photo. Underneath, the synchronization and method for dealing with the binary image data is the same, but the APIs and methods used are different when you are looking to implement the solution at any scale.
As for why you would want to do this, refer to Part two here; it details the reasons.
As always I’m using my favourite PowerShell Management Agent (the Granfeldt PSMA). I’ve updated an existing Management Agent I had for Azure AD that is described here. I highly recommend you use that as the basis for the extra photo functionality that I describe in this post. Keep in mind the AzureADPreview, now AzureAD, PowerShell Module has changed the ADAL Helper Libraries. I detail the changes here so you can get AuthN to work with the new libraries.
Therefore the changes to my previous Azure AD PowerShell MA are to add two additional attributes to the Schema script, and include the logic to import a user’s profile photo (if they have one) in the Import script.
The AADPhoto attribute of type Binary is where we will store the photo. The AADPhotoChecksum attribute of type String is where we will store a checksum of the photo for use in logic if we need to determine if images have changed easily during imports.
Add the following lines into the Import.ps1 in the section where we are creating the object to pass to the MA. After the $obj.Add(“AADCity”,$user.city) line is a good spot.
What the script below does is create a WebClient, rather than use Invoke-RestMethod or Invoke-WebRequest, to get the user’s Azure AD Profile image, and only if the ‘thumbnailPhoto@odata.mediaContentType’ attribute exists (which indicates the user has a profile photo). I’m using the WebClient over the PowerShell Invoke-RestMethod or Invoke-WebRequest functions so that the returned object is in binary format (rather than being returned as a string), which saves having to convert it to binary or output it to a file and read it back in. The WebClient is also faster for transferring images/data.
Once the image has been returned (line 8 below) the image is added to the object as the attribute AADPhoto to be passed to the MA (line 11)
Line 14 gets the checksum for the image and adds that to the AADPhotoChecksum attribute in line 16.
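Reconstructed from that description, the addition looks something like this. The line numbers mentioned above refer to the original script, not this sketch; $tenant is assumed to be set earlier in the MA’s Import.ps1, and I’ve shown a plain .NET hash where the original used PSCX’s Get-Hash.

```powershell
# Only fetch a photo if the media content type attribute indicates one exists
if ($user.'thumbnailPhoto@odata.mediaContentType') {
    # WebClient returns the image as a byte array (no string conversion needed)
    $wc = New-Object System.Net.WebClient
    $wc.Headers.Add("Authorization", "Bearer $($Authorization.access_token)")
    $photo = $wc.DownloadData("https://graph.windows.net/$tenant/users/$($user.objectId)/thumbnailPhoto?api-version=1.6")
    $obj.Add("AADPhoto", $photo)

    # Checksum of the image bytes for cheap change detection on later imports
    $md5 = [System.Security.Cryptography.MD5]::Create()
    $obj.Add("AADPhotoChecksum", [System.BitConverter]::ToString($md5.ComputeHash($photo)).Replace("-", ""))
}
```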
Now that you’ve updated the Schema and Import scripts, you will need to;
Refresh your schema on your Azure AD PSMA to get the new attributes (AADPhoto and AADPhotoChecksum) added
Select the two new attributes in the Attributes section of your Azure AD PSMA
Create in your MetaVerse via the MetaVerse Designer two new attributes on the person (or whatever ObjectType you are using for users), for AADPhoto and AADPhotoChecksum. Make sure that AADPhoto is of type Binary and AADPhotoChecksum is of type string.
Configure your Attribute Flow on your Azure AD PSMA to import the AADPhoto and AADPhotoChecksum attributes into the Metaverse. Once done and you’ve performed an Import and Sync you will have Azure AD Photos in your MV.
How do you know they are correct? Let’s extract one from the MV, write it to a file and have a look at it. This small script using the Lithnet MIIS Automation PowerShell Module makes it easy. First I get my user object from the MV. I then have a look at the text string version of the image (to make sure it is there), then output the binary version to a file in the C:\Temp directory.
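A sketch of that verification is below. The cmdlet names are from the Lithnet MIIS Automation module; the account name is a placeholder, and the value-access property is an assumption to check against your module version.

```powershell
# Pull a user from the MV and write the photo to disk to eyeball it
Import-Module LithnetMiisAutomation

$query  = New-MVQuery -Attribute accountName -Operator Equals -Value "<accountName>"
$mvUser = Get-MVObject -Queries @($query)

# Text string version of the image attribute (just to confirm it's populated)
$mvUser.Attributes["AADPhoto"].Values

# Binary version out to a file (property name assumed; inspect the object to confirm)
$photoBytes = $mvUser.Attributes["AADPhoto"].Values[0].ValueBinary
[System.IO.File]::WriteAllBytes("C:\Temp\AADPhoto.jpg", $photoBytes)
```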
Since that point in time I’ve found myself doing considerably more via PowerShell and the Graph API using oAuth. I regularly find myself leveraging previous scripts to generate a new script for the initial connection. To the point that I decided to make this simpler and provide a nice clean starting point for new scripts.
This blog post details a simple script to generate a couple of PowerShell Functions that can be the basis for integration with Graph API using PowerShell via a WebApp using oAuth2.
This script will request the necessary information required to call into the Graph API and establish a session. Specifically;
The API Endpoint. Historically there were many different API endpoints depending on what you are integrating with. Microsoft is moving to simplify this (great article here about the evolving API), but it is still a work in progress. For this example I’ll be using graph.microsoft.com, which is where Microsoft are heading. If you need access to an API not currently on the Graph API, see here to work out which API endpoint fits your app’s requirements. In short though, typically all that changes between APIs is the Resource (API endpoint) and the scope (what permissions your app will have). Variations to the primary Graph API endpoint are when you are integrating with applications such as OneNote (https://www.onenote.com/api), Office 365 Discovery Service (https://api.office.com/discovery/), OneDrive etc.
The Scope of the WebApp. To make it seamless this should be done via the WebApp registration in the Application Registration Portal and configured as part of the PowerShell web requests
Armed with this information the shell of a PowerShell script will be created that will;
Authenticate a user to Graph API via Powershell and oAuth 2.0
Request Authorization for the WebApp to access the Scope provided (if an Admin-approval scope is requested and the AuthN is performed by a non-admin, an authorization failure message will appear detailing that an Administrator must authorize).
Obtain an Authorization Code, which is then exchanged for the Bearer Token and Refresh Token.
The Bearer token can be used to make Graph API calls for up to 1 hour.
The Refresh token will allow you to request a new token and allow your script to be used again to interact via Graph API without going through the Authentication process again.
The following graphic shows this flow.
Create/Register your Application
Go to the Application Registration Portal https://apps.dev.microsoft.com/ and sign in. This is the new portal for registering your apps. It will show any previous apps you registered within AzureAD and any of the new “Converged Apps” you’ve created via the new Application Registration Portal.
Select Add an app from the Converged applications list.
Give your app a name and select Create
Record the Application ID (previously known as the Client ID) and select Generate New Password.
You will be provided your Client Secret. Record this now as it is the only time you will see it. Select Ok.
By default you will get User.Read permissions on the API. That is enough for this sample. Depending on what you will do with the API, you will probably need to come back and change the permissions, or do it dynamically via the values you supply to the $resource setting in your API calls.
Select Platforms, select Web and add a reply URL of https://localhost
Scroll to the bottom of the Registration windows and select Save.
Generate your PowerShell Graph API oAuth Script
Copy the following script and put it into an Administrator PowerShell/PowerShell ISE session and run it.
It will ask you to choose a folder to output the resultant PowerShell script to. You can create a new folder through this dialog window if required.
The script will prompt you for the Client/Application ID, Client Secret and the Reply URL you obtained when registering the Web App in the steps above.
The script will be written out to the folder you chose in the first step and it will be executed. It will prompt you to authenticate. Provide the credentials you used when you created the App in the Application Registration Portal.
You will be prompted to Authorize the WebApp. Select Accept
If you’ve executed the previous steps correctly you’ll receive an AuthCode in your PowerShell output window
You’ll then see the output for a sample query for your user account and below that the successful call for a refresh of the tokens.
In the folder you chose you will find a PowerShell script with the name Connect-to-Microsoft-Graph.ps1. You will also find a file named refresh.token. You can use the script to authenticate with your new app, but more simply use the Get-NewTokens function to refresh your tokens and then write your own API queries to your app using the tokens. Unless you change the scope you don’t need to run Get-AzureAuthN again. Just use Get-NewTokens before your API calls.
Change the scope of your app to get more information. If you add a scope that requires Admin consent (and you’re not an admin), when prompted to authenticate you will need to get an Admin to authenticate and authorize the scope. Because you’ve changed the scope you will need to run the Get-AzureAuthN function again after updating $scope (as per below) and the dependent $scopeEncoded.
As the screenshot below shows, I added the Mail.Read permission. I changed the $scope in the script so that it reflected the change, e.g.
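Something like this, where the variable names follow the generated script and the encoding call is my assumption of how $scopeEncoded is derived:

```powershell
# Updated scope including Mail.Read; $scopeEncoded feeds the subsequent web requests
$scope        = "User.Read Mail.Read"
$scopeEncoded = [uri]::EscapeDataString($scope)
```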
I hope that makes getting started with the oAuth2 Graph API via PowerShell a lot simpler than it was for me initially, with the differing endpoints, evolving API and the associated documentation somewhere in-between.
How many times have you wanted a consolidated report out of FIM/MIM for an object? What connectors does it have, what are the values of the attributes, which Management Agent contributed the value(s) and when? Individually, of course, you can get that info using the Metaverse Search and looking at the object in the MIM Portal. But what if you wanted it all with a single query? This blog post provides an approach to doing just that. The graphic above shows a screenshot of a sample output. Click this Sample Report for a full resolution version of the screenshot above. Note: The updated version of the script below outputs DisplayName for the ExpectedRulesList attribute so it actually provides valuable information.
The approach is quite simple. It is;
Query the FIM/MIM Metaverse for an object
Take the response from the Metaverse to build the Connectors and Metaverse Hologram reports
Use the connector information to query the MIM Service MA (this example assumes it is on the same server; if not, add the connection line into the script with the appropriate values, as per the sketch after this list) and get the object’s MIM Service Connector Space info
Take the information retrieved above and query the MIM Service to return the information for the object.
Format all the output for HTML, apply a simple style sheet, output to file and display in the default browser
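For point 3, the connection line referred to is the Lithnet Resource Management client configuration, along these lines (the host name is a placeholder; 5725 is the default MIM Service port):

```powershell
# Point the Lithnet Resource Management client at a remote MIM Service
Set-ResourceManagementClient -BaseAddress "http://mimservicehost:5725"
```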
NOTE: If you combine this with the Get-MVObject query building script detailed here it can be a relatively simple solution. That script even uses the same variables $queries and $query as outputs from the search and input into the HTML Report.
Here it is. Lines 23 and 24 contain a hard-coded query. Update it for your search criteria, or as detailed above combine this with the Get-MVObject query building script detailed here. The Output directory specified in line 7 is where the stylesheet and the resultant HTML file will be placed. Update it for your needs.
For the Expected Rules List (unlike the screenshot as I’ve modified the script afterwards), the script gets the DisplayName for them and puts that in the report. DisplayName is more valuable than an ERE ObjectID.
It probably seems obvious by now, but I seem to live in PowerShell and Microsoft Identity Manager. I’m forever looking into the Microsoft Identity Manager Metaverse for objects.
However, sometimes I get tripped up by the differences in Object Classes between the FIM/MIM Service and the Metaverse, the names of the Object Classes (obviously not Person, Group and Contact) and in situations where they are case-sensitive. If you’re using the Sync Service Manager Metaverse Search function though you get a pick list. But getting the data out to do something else with isn’t an option.
I’ve looked to quickly provide a similar function to the pick lists in the Metaverse Search GUI via PowerShell, which then gets executed by the Get-MVObject cmdlet from the Lithnet MIIS Automation PowerShell Module.
I’ve defaulted the ObjectClass to Person so you can just press enter. But if you have custom ObjectClasses in your Metaverse you may need to change the index number in Line 48 from 5 to whichever index Person appears in your environment. Same goes for the default attribute of AccountName in the Attribute list. It appears at index 5 (Line 77) in my attribute list.
The Script itself will query the FIM/MIM MV Schema and return a list of Object Classes. As detailed above, in line 48 of the script I have index 5 as the default, which in my environment is Person, and as such you can just hit enter if that is the Object Class you want to choose attributes from in the next step. Otherwise type the name of the ObjectClass you want. You don’t have to worry about case sensitivity as the script handles that. You can only choose a single ObjectClass obviously, but the menu UI I’ve used allows for multiple selections. Just press enter when prompted for another option for ObjectClass.
You’ll then be presented with a list of attributes from the chosen Object Class above. Again, as detailed above, I have it defaulting to ‘accountName’, which is index 5 in my list. Change line 77 for the default you want. This means you can just hit enter if accountName is what you’re querying on (which is common), or choose another option. This then also allows you to choose multiple attributes (which will be added to an array), meaning you can use this for complex queries such as;
accountName startsWith 'dar'
sn startsWith 'rob'
mail contains '@kloud'
If you want to choose multiple attributes for your query and one of them is the default option, make sure you specify one of the non-default attributes first so that you get the option to specify more. When you’ve chosen all the attributes you are going to use in your query, hit enter and the script will take an empty response as the end of your choices.
Now for each attribute chosen you will be prompted for an Operator. Pretty simple; just choose from the available options. Note: all operators are shown, but not all operators can be used for all attribute types. e.g. Don’t select ‘EndsWith’ for a Boolean attribute type and expect it to work. If you choose an operator other than the default (equals in my example), hit enter when prompted for the second time and the script will take an empty response as the end of your choices.
Finally, provide the value for the search term for each attribute. If the value has spaces, don’t worry about putting it in quotes; the script takes care of that.
The last two steps will iterate through, for queries where you have chosen multiple attributes.
And you’re done. $query is the variable that contains the results. In line 115 I’m using Show-Object from the PowerShellCookbook module. That gives you a GUI representation of the result, as shown below. If the query returns multiple results this will only show the last.
Line 114 outputs the value of the attributes ($query.attributes) to the console as well. If you have multiple objects returned $query will show them as shown below.
Finally if you want to run the query again, or just make a subtle change, you shouldn’t have to go through that again. Get the value of $querytxt and you’ll get the query and the command to execute it. $querytxt is also output to the console as shown below. Copy and paste it into Powershell ISE, update and execute.
Here is the raw script. Hardly any error handling etc, but enough to get you started and tailor it for your requirements. Enjoy.
Yes, that title is quite a mouthful. And this post is going to be quite long. But worth the read if you are having to create a number of rules in Microsoft/Forefront Identity Manager, or even more so the same rule in multiple environments (eg. Dev, Staging, Production).
In this post I detail using Import-RMConfig to create a Set, Workflow, Synchronization Rule and Management Policy Rule to populate a Development AD Domain with Users from a Production AD Domain. This process is designed to run on a combined MIM Service/Sync Server. If your roles are separated (as they likely will be in a Production environment) you will need to run these scripts on the MIM Sync Server (so it can query the Management Agents), and you will need to add a line to connect to the MIM Service (eg. Set-ResourceManagementClient) at the beginning of the script.
In my environment I have two Active Directory Management Agents, each connected to an AD Forest as shown below.
On each of the AD MA’s I have a Constant Flow Attribute (named Source) configured to flow in a value representing the source AD Forest. I’m doing this in my environment as I have more than one production forest (hence the need for automation). You could simply use the Domain attribute for the Set criteria. That attribute is used in the Set later on; I’m mentioning it up front so it makes sense.
The Import-RMConfig cmdlet uses XML and XOML files that contain the configuration required to create the Set, Workflow, Sync Rule and MPR in the FIM/MIM Service. The order that I approach the creation is, Sync Rule, Workflow, Set and finally the MPR.
Each of these objects, as indicated above, leverages an XML and/or XOML input file. I’ve simplified base templates and included them in the scripts.
The Sync Rule Script includes a prompt to choose a folder (you can create one through the GUI presented) to store the XML/XOML files to allow the Import-RMConfig to use them. Once generated you can simply reference the files with Import-RMConfig to replicate the creation in another environment.
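Once generated, re-applying the files in another environment is a single cmdlet call per file, e.g. (the path is a placeholder; -Preview validates without applying):

```powershell
# Validate first, then apply the generated configuration
Import-RMConfig -File "C:\ProvisionDevAD\SyncRule.xml" -Preview -Verbose
Import-RMConfig -File "C:\ProvisionDevAD\SyncRule.xml" -Verbose
```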
Creating the Synchronization Rule
For creation of the Sync Rule we need to define which Management Agent will be the target for our Sync Rule. In my script I’ve automated that too (as I have a number to do). I’m querying the MIM Sync Server for all its Active Directory MA’s and then providing a dialog to allow you to choose the target MA for the Sync Rule. That dialog simply looks like the one below.
Creating the Sync Rule will finally ask you to give the Rule a name. This name will then be used as the base Display Name for the Set, MPR and Workflow (and a truncated version as the Rule IDs).
The script below in the $SyncRuleXML section defines the rules of the Sync Rule. Mine is an Outbound Sync Rule with a base set of attributes, transforming the user’s UPN and DN (for the differing Development AD namespace). Update lines 42 and 45 for the user’s UPN and DN in your namespace.
Creating the Workflow
The Workflow script is pretty self-explanatory. A simple Action based workflow and is below.
Creating the Set
The Set is the group of objects that will be synchronized to the target management agent. As my Sync Rule is only for Users, my Set also only contains users. As stated in the Overview, I have an attribute that defines the authoritative source for the objects; I’m also using that in my Set criteria.
Creating the Management Policy Rule
The MPR ties everything together. Here’s that part of the script.
Tying them all together
Here is the end to end automation, and the raw script that you could use as the basis for automating similar rule generation. The Sync Rule could easily be updated for Contacts or Groups. Remember the attributes and object classes are ‘case sensitive’.
Through the Browse for Folder dialog I created a new folder named ProvisionDevAD
I provided a Display Name for the rules
I chose the target Management Agent
The SyncRule, Workflow, Set and MPR are created. The whole thing takes seconds.
Let’s take a look at the completed objects through the MIM Portal.
The Sync Rule is present as we named it. Including the !__ prefix so it appears at the top of the list.
Outbound Sync Rule based on a MPR, Set and Workflow
The Resources will be created and, if deleted, de-provisioned.
And our base attribute flows.
Our Set was created.
Our naming aligns with what we input
And a Criteria based Set. As per the Overview, I have an attribute populated by a Constant flow rule that I based my Set on. You’ll want to update this for your environment.
The Action Workflow was created
All looks great
And it applies our Sync Rule
And finally our MPR. Created as a Transition In MPR with Action Workflow
Set Transition and naming all aligned
The Transition Set configured for the Set that was created
And the Workflow configured with the Workflow that was just created
When you have a lot of Sync Rules to create, or you know you will need to re-create them numerous times, potentially in different environments, automation is key. This just scratches the surface on what can be achieved, made so much easier using the Lithnet PowerShell Modules.
Here is the full script. Note: you’ll need to make a couple of minor changes as indicated earlier, but you should be able to create a Provisioning Rule end to end pretty quickly to validate the process. Then customize accordingly for your environment and requirements. Enjoy.
Recently I’ve migrated a bunch of Virtual Box Virtual Machines to Azure as detailed here. These VM’s are in Resource Groups with a Network Security Group associated that restricts access to them for RDP based on a source TCPIP address. All good practice. However from a usability perspective, when I want to use these VM’s, I’m not always in the same location, and rarely on a connection with a static IP address.
This post details a simple little script that;
Has a couple of variables associated with a Resource Group, Network Security Group, Virtual Machine Name and an RDP Configuration File associated with the VM
Gets the public IP Address of the machine I’m running the script from
Prompts for Authentication to Azure, and retrieves the NSG associated with the Resource Group
Compares the Source IP Address in the ‘RDP’ Inbound Rule to my current IP Address. If they aren’t a match it updates the Source IP Address to be my current public IP Address
Starts the Virtual Machine configured at the start of the script
Launches Remote Desktop using the RDP Configuration file
Here’s the raw script. Update lines 2-8 for your environment and away you go. Simple but useful as is often the way.
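A condensed sketch of that flow is below, using the AzureRM module cmdlets of the time; the resource names in the variables are placeholders, and the inbound rule is assumed to be named ‘RDP’.

```powershell
# Placeholders to update for your environment (lines 2-8 of the original script)
$resourceGroup = "MyResourceGroup"
$nsgName       = "MyNSG"
$vmName        = "MyVM"
$rdpFile       = "C:\RDP\MyVM.rdp"

# Current public IP of the machine running the script
$myIp = (Invoke-RestMethod -Uri "https://api.ipify.org?format=json").ip

# Authenticate and retrieve the NSG and its RDP rule
Login-AzureRmAccount
$nsg     = Get-AzureRmNetworkSecurityGroup -Name $nsgName -ResourceGroupName $resourceGroup
$rdpRule = Get-AzureRmNetworkSecurityRuleConfig -NetworkSecurityGroup $nsg -Name "RDP"

# Update the source address only if it no longer matches
if ($rdpRule.SourceAddressPrefix -notcontains $myIp) {
    Set-AzureRmNetworkSecurityRuleConfig -NetworkSecurityGroup $nsg -Name "RDP" `
        -Protocol Tcp -Direction Inbound -Access Allow -Priority $rdpRule.Priority `
        -SourceAddressPrefix $myIp -SourcePortRange * -DestinationAddressPrefix * `
        -DestinationPortRange 3389 | Out-Null
    Set-AzureRmNetworkSecurityGroup -NetworkSecurityGroup $nsg | Out-Null
}

# Start the VM and launch Remote Desktop with the saved RDP configuration
Start-AzureRmVM -ResourceGroupName $resourceGroup -Name $vmName
mstsc $rdpFile
```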
I have a complex customer environment where Microsoft Identity Manager is managing identities across three Active Directory Forests. The Forests all serve different purposes and are contained in different network zones. Accordingly there are firewalls between the zone where the MIM Sync Server is located and two of the other AD Forests as shown in the graphic below.
As part of the project the maintainers of the network infrastructure had implemented rules to allow the MIM Sync server to connect to the other two AD Forests. I had successfully been able to create the Active Directory Management Agents for each of the Forests and perform synchronization imports.
The Error ‘kerberos-no-logon-server’
Everything was going well right up to the point I went to export changes to the two AD Forests that were separated by firewalls. I received the ‘kerberos-no-logon-server’ error as shown below from the run profile output.
I started investigating the error as I hadn’t encountered this one before. There were a few posts on the possibilities, mainly dealing with properties of the AD MA’s configuration. But I did stumble on a mention of Kerberos being used when provisioning users to Active Directory and setting the initial password. That aligned with what I was doing. I had provided the networking engineers with my firewall port requirements. Those are (no PCNS required for this implementation);
389 TCP – LDAP
636 TCP – LDAPS
88 TCP – Kerberos
464 TCP/UDP – Kerberos
53 TCP – DNS
3268 TCP/UDP – Global Catalog
3269 TCP/UDP – Global Catalog
135 TCP – RPC
My old school immediate thought was to Telnet to each of the ports to see if the firewall was allowing me through. But with a couple of forests to test against, and UDP ports as well, it wasn’t going to be that easy. I found a nice little Test-Port function that did both TCP and UDP, and I already had an older script for testing TCP ports via PowerShell. So I combined them.
Identifying the cause
As suspected, connectivity to the forest where my MIM Sync Server was located was all good. The other two, not so much. GC connectivity wouldn’t give me the Kerberos error, but not having Kerberos port 464 certainly would.
In the backwards and forwards with the networking team I had to test connectivity many times so I added a running output as well as a summary output. The running output highlighting ports that weren’t accessible.
Here’s the raw script if you’re in a similar situation. Get the Test-Port Function from the URL in line 1 to test UDP Port connectivity. Add additional ports to the arrays if required (eg. for PCNS), and update the forest names in lines 21-23.
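A condensed sketch of the combined test is below. Test-Port is the function linked in line 1 of the script (its parameter and property names may differ slightly in the version you download); the forest names are placeholders.

```powershell
# Ports from the firewall requirements list above
$forests  = @("forest1.customer.com", "forest2.customer.com", "forest3.customer.com")
$tcpPorts = @(53, 88, 135, 389, 464, 636, 3268, 3269)
$udpPorts = @(464, 3268, 3269)

$results = foreach ($forest in $forests) {
    foreach ($port in $tcpPorts) {
        $test = Test-Port -computer $forest -port $port -TCP
        if (-not $test.Open) { Write-Host "$forest TCP $port NOT accessible" -ForegroundColor Red }
        $test
    }
    foreach ($port in $udpPorts) {
        $test = Test-Port -computer $forest -port $port -UDP
        if (-not $test.Open) { Write-Host "$forest UDP $port NOT accessible" -ForegroundColor Red }
        $test
    }
}

# Summary output at the end
$results | Format-Table Server, Port, TypePort, Open -AutoSize
```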
I’m sure this is going to become more relevant in a Cloud/Hybrid world where MIM Servers will be in Azure, Active Directory Forests will be in different networks and separated by firewalls and Network Security Groups.
As often happens in development environments, data changes, configurations change, and at some point you end up with a whole bunch of objects that are in no-man’s land. This happened to me today. I had thousands of objects that were basically empty but had previously been triggered for export to the MIM Service prior to being deleted from the source management agent.
An example of one of the objects. A group with a Pending Export to the MIM Service.
A closer look at the object and there is no attribute data present as the source object had been removed.
And only a single connector, to the MIM Service which it will never reach as it doesn’t contain the mandatory attributes.
Normally to clean up such a mess you’d probably be looking at deleting the Connector Space for the MIM Service and then refreshing it from the MIM Service, and these objects would be gone. However, this development environment is rather large, and that wasn’t something I had time for, or was prepared to do, at this time. So here’s how I worked around the issue.
Deleting spurious objects from the Connector Space
There are two approaches;
Select each of the errors, select the MIM Service Connector and select delete. That would work but I had thousands.
Automate the process described in point 1. That’s the approach I took
Using the ever versatile Lithnet MIM Sync PowerShell Module I retrieved the last run details for my MIM Service MA. I grabbed all the errors, inspected them for the ones that were failing creation in the MIM Service, and then deleted the CSObject for each orphan.
Here’s where it got more than a little *clink clink* cowboy-ish. The Disconnect-CSObject cmdlet requires confirmation to delete the CSObject. There is no switch to force the delete, or accept confirmation globally*. I wasn’t going to click Yes or press Enter 5000 times either.
So I wrote a small script that loops, checks for the Confirm disconnection dialog, and sends the Enter key to the window.
Here’s the two little scripts.
This first script retrieves the last run details and loops through the errors.
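A sketch of that loop is below; Get-LastRunDetails and the error-object property names are as I recall them from the Lithnet MIIS Automation module, so inspect the objects on your version to confirm.

```powershell
# Walk the last run's synchronization errors and disconnect each orphaned CS object
Import-Module LithnetMiisAutomation

$ma      = "MIM Service"
$lastRun = Get-LastRunDetails -MA $ma

foreach ($step in $lastRun.StepDetails) {
    foreach ($syncError in $step.SynchronizationErrors) {
        # Each disconnect prompts for confirmation; the second script answers it
        Get-CSObject -DN $syncError.DN -MA $ma | Disconnect-CSObject
    }
}
```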
This second script which I ran in a second separate PowerShell Runspace loops around and presses enter at the right time.
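A sketch of the dialog-watcher; the window title is an assumption to match against whatever the confirmation dialog actually shows.

```powershell
# Run in a second PowerShell runspace; sends Enter whenever the dialog appears
$wshell = New-Object -ComObject wscript.shell
while ($true) {
    Start-Sleep -Milliseconds 500
    # AppActivate returns $true when a window with this title takes focus
    if ($wshell.AppActivate("Confirm disconnection")) {
        $wshell.SendKeys("{ENTER}")
    }
}
```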
*I’ve submitted an enhancement request to Ryan to add a confirm parameter to Disconnect-CSObject. UPDATE: You can now use the -Force switch: Get-CSObject -DN $objDN -MA “MIM Service” | Disconnect-CSObject -Force