Integration of Microsoft Identity Manager with Azure Platform-as-a-Service Services

Overview

This isn't an out-of-the-box solution. This is a bespoke solution that takes a number of elements and puts them together in a unique way. I'm not expecting anyone to implement this specific solution (though you're more than welcome to), but rather to take inspiration from it to implement solutions relevant to your environment(s). This post supports a presentation I did to The MIM Team User Group on 14 June 2017.

This post describes a solution that;

  • Leverages an Azure WebApp (NodeJS) to present a simple website. That site can be integrated easily into the FIM/MIM Portal
  • The NodeJS website leverages an Azure Function App to get a list of users from the FIM/MIM Synchronization Server, and allows the user to use typeahead functionality to find the user they want to generate a FIM/MIM object report on
  • On selection of a user, a request is sent to another Azure Function App to generate and return the report to the user in a new browser window

This is shown graphically below.

 

Report Request UI

The NodeJS WebApp is integrated into the FIM/MIM Portal. Bootstrap Typeahead is used to find the user to generate a report on. The Typeahead userlist is fulfilled by an Azure Function that queries the MIM Sync Metaverse. The Generate Report button fires off a call to FIM/MIM via another Azure Function into the MIM Sync and MIM Service to generate the report.

The returned report opens in a new tab in the user's browser. The report contains details of the FIM/MIM connectors the user is represented on.

The values of all attributes for the user's hologram from the Metaverse are displayed, along with the MA the value came from and the last modified date.

Finally, the report displays the metadata from the MIM Service MA Connector Space and the MIM Service itself.

Prerequisites

These are numerous, but I've previously posted about them. I encourage you to digest those posts to understand how to configure the prerequisites for this solution.

Additional Solution Requirements

To bring all the individual components together, there are a few additional tasks to enable this solution.

  • Enable CORS on your Azure Function App Configuration (see details further below)
  • If you want to display user object photos as part of the report, you will likely need to synchronize them into FIM/MIM from an authoritative source (e.g. Office 365/Exchange Online). Check out this post, and see additional details further below
  • In order to embed the NodeJS WebApp into the FIM/MIM Portal, this post provides the details. Change the target URL from the Power BI URL to your NodeJS site
  • Object Report Request WebApp (see below for sample site)

Azure Functions Cross Origin Resource Sharing (CORS)

You will need to configure CORS to allow the NodeJS WebApp to access the Azure Functions (from both local and Azure). Reflect your port number if it is different from 3000, and use the DNS name for your Azure WebApp.

Sample UI NodeJS HTML

Here is a sample HTML file for your NodeJS WebApp. It provides the UI with the LoginID input, which is fulfilled by the NodeJS JavaScript file further below.

Sample UI NodeJS JavaScript

The following NodeJS JavaScript supports the HTML UI above. It populates the LoginID typeahead box and wires the Generate Report button up to fulfill the report for the desired object(s). If you use the UI to (individually) select multiple different objects, each report will be returned in its own output window.

As the HTML file above indicates, you will need to obtain the typeahead.bundle.js library and make it available as part of your NodeJS project.

Azure PowerShell Trigger Function App for AccountNames Lookup

The following Azure Function takes the call from the load of the NodeJS WebApp to populate the typeahead userlist.
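The function script itself isn't embedded here, so below is a minimal sketch of the approach. It assumes the Remote PowerShell and Lithnet MIIS Automation prerequisites from my earlier posts; the server name, the credential Application Setting names and the exact Lithnet query cmdlet usage are assumptions to validate against your environment and the module documentation.

# Sketch only. Credentials come from (hypothetical) Application Settings
$username = $env:MIMSyncCredUser
$password = ConvertTo-SecureString $env:MIMSyncCredPassword -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($username, $password)

# Remote PowerShell into the MIM Sync Server and query the Metaverse
$session = New-PSSession -ComputerName mimsyncserver.yourdomain.com -Credential $credential
$accountNames = Invoke-Command -Session $session -ScriptBlock {
    Import-Module LithnetMiisAutomation
    # Query cmdlets per the Lithnet MIIS Automation module; adjust for your schema
    $query = New-MVQuery -Attribute domain -Operator Equals -Value "YOURDOMAIN"
    Get-MVObject -ObjectType person -Queries @($query) |
        ForEach-Object { $_.Attributes["accountName"].Values.ValueString }
}
Remove-PSSession $session

# Return the list as JSON for the typeahead control
Out-File -Encoding Ascii -FilePath $res -InputObject ($accountNames | ConvertTo-Json)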

Azure PowerShell Trigger Function App for User Object Report

Similar in structure to the Username List Lookup Azure Function above, but in the ScriptBlock you embed the Report Generation Script that is detailed here. Modify it for what you want to report on.

Photos in the Report

If you want to display images in your report, you will need to determine whether the user has an image during the MV metadata report generation part of the script. Add the following lines (updating for the name of your image attribute; mine is named EXOPhoto) after the Try {} Catch {} in the section that begins $obj = @() ; foreach ($attr in $attributes.Keys)

 # Display the Objects Photo rather than Base64 string
 if ($attr.equals("EXOPhoto")){
     $objectphoto = "<img src=$([char]0x22)data:image/jpeg;base64,$($attributes.$attr.Values.Valuestring)$([char]0x22)>"
     $val = "System.Byte[]"
 }

Then, in the output of the HTML report at the end of the report generation, insert the $objectphoto variable into the HTML stream.

# Output MIM Service Object Data
 $MIMServiceObjOut = $MIMServiceObjectMetaData | Sort-Object -Property Attribute | ConvertTo-Html -Fragment
 $htmlreport = ConvertTo-HTML -Body "$htmlcss<h1>Microsoft Identity Manager User Object Report</h1><h2>Query</h2>$sourcequery</br><b><center>$objectphoto</br>NOTE: Only attributes with values are displayed.</center></b><h2>Connector(s) Summary</h2>$connectorsummary<h2>MetaVerse Data</h2>$objectmetadata <h2>MIM Service CS Object Data</h2>$MIMServiceCSobjectmetadata <h2>MIM Service Object Data</h2>$MIMServiceObjOut" -Title "MIM Object Report"

As you can see above, I've also injected the CSS ($htmlcss) into the output stream at the beginning of the Body section. Somewhere in your script block you will need to define your CSS values, e.g.

 # StyleSheet for nice pretty output
 $htmlcss = "<style>
    h1, h2, th { text-align: center; }
    table { margin: auto; font-family: Segoe UI; box-shadow: 10px 10px 5px #888; border: thin ridge grey; }
    th { background: #0046c3; color: #fff; max-width: 400px; padding: 5px 10px; }
    td { font-size: 11px; padding: 5px 20px; color: #000; }
    tr { background: #b8d1f3; }
    tr:nth-child(even) { background: #dae5f4; }
    tr:nth-child(odd) { background: #b8d1f3; }
 </style>"

Summary

An interesting solution integrating Azure PaaS Services with Microsoft Identity Manager via PowerShell and the extremely versatile Lithnet FIM/MIM PowerShell Modules.

Please share your implementations enhancing your FIM/MIM Solution.

An Azure Timer Function App to retrieve files via FTP and Remote PowerShell

Introduction

In an age of Web Services and APIs, FTP Servers are an almost forgotten world. Recently, however, I've had to travel back in time and interact with an FTP server to get a set of files that are produced by other systems on a daily basis. These files are needed for some flat-file imports into Microsoft Identity Manager.

Getting files off an FTP server is pretty simple. But needing to do it across a number of different environments (Development, Staging and Production) meant I was looking for an easy approach that I could replicate quickly across multiple environments. As I already had Remote PowerShell set up on my MIM Servers for other Azure Function Apps, I figured I'd use an Azure Function for obtaining the FTP files as well.

Overview

My PowerShell Timer Function App performs the following:

  • Starts a Remote PowerShell session to my MIM Sync Server
  • Imports the PSFTP PowerShell Module
  • Creates a local directory to put the files into
  • Connects to the FTP Server
  • Gets the files and puts them into the local directory
  • Ends the session

Pre-requisites

From the overview above there are a number of pre-requisites. Other blog posts I've written detail the steps involved to appropriately set them up and configure them, so I'm going to link to those. Namely;

  • Configure your Function App for your timezone so the schedule is correct for when you want it to run. Checkout the WEBSITE_TIME_ZONE note in this post.

  • You’ll need to configure your Server that you are going to put the files onto for Remote PowerShell. Follow the Enable Powershell Remoting on the FIM/MIM Sync Server section of this blogpost.
  • The credentials used to connect to the MIM Server are secured as detailed in the Using an Azure Function to query FIM/MIM Service section of this blog post.
  • Create a Timer PowerShell Function App. Follow the Creating your Azure App Service section of this post but choose a Timer Trigger PowerShell App.
    • I configured my Schedule for 10:30 every day using the following CRON configuration
      0 30 10 * * *
  • On the Server you’ll be connecting to in order to run the FTP processes you’ll need to copy the PSFTP Module and files to the following directories. I unzipped the PSFTP files and copied the PSFTP folder and its contents to;
    • C:\Program Files\WindowsPowerShell\Modules
    • C:\Windows\System32\WindowsPowerShell\v1.0\Modules

Configuring the Timer Trigger Function App

With all the pre-requisites in place it’s time to configure the Timer Function App that you created in the pre-requisites.

The following settings are configured in the Function App Application Settings;

  • FTPServer (the server you will be connecting to, to retrieve files)
  • FTPUsername (username to connect to the FTP Server with)
  • FTPPassword (password for the username above)
  • FTPSourceDirectory (FTP directory to get the files from)
  • FTPTargetDirectory (the root directory under which the files will be put)

ApplicationSettings

You'll also need Application Settings for a Username and Password associated with a user that exists on the Server that you'll be connecting to with Remote PowerShell. In my script below these application settings are MIMSyncCredUser and MIMSyncCredPassword.

Function App Script

Finally, here is the raw script. You'll need to add appropriate error handling for your environment. You'll also want to change lines 48 and 51 for the naming of the files you are looking to acquire, and line 59 for the server name you'll be executing the process on.
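The script isn't reproduced here, but a rough sketch of its shape is below so you can see how the pieces fit together. It is a sketch only: the server name, file filter, directory layout and PSFTP cmdlet usage are illustrative, and the line numbers referenced above relate to the original script rather than this sketch.

# Sketch only; no error handling
$username = $env:MIMSyncCredUser
$password = ConvertTo-SecureString $env:MIMSyncCredPassword -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($username, $password)

# Remote PowerShell session to the MIM Sync Server
$session = New-PSSession -ComputerName yourmimsyncserver.yourdomain.com -Credential $credential
Invoke-Command -Session $session -ScriptBlock {
    param($ftpServer, $ftpUser, $ftpPass, $sourceDir, $targetDir)
    Import-Module PSFTP

    # Create a local directory to put the files into
    $localDir = Join-Path $targetDir (Get-Date -Format yyyyMMdd)
    if (!(Test-Path $localDir)) { New-Item -ItemType Directory -Path $localDir | Out-Null }

    # Connect to the FTP Server
    $ftpCred = New-Object System.Management.Automation.PSCredential ($ftpUser, (ConvertTo-SecureString $ftpPass -AsPlainText -Force))
    Set-FTPConnection -Credentials $ftpCred -Server $ftpServer -Session MIMFTP -UsePassive
    $ftp = Get-FTPConnection -Session MIMFTP

    # Get the files (placeholder filter) and put them into the local directory
    Get-FTPChildItem -Session $ftp -Path $sourceDir -Filter "*.csv" |
        Get-FTPItem -Session $ftp -LocalPath $localDir
} -ArgumentList $env:FTPServer, $env:FTPUsername, $env:FTPPassword, $env:FTPSourceDirectory, $env:FTPTargetDirectory

# End the session
Remove-PSSession $session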

Summary

A pretty quick and simple little Azure Function App that will run each day and obtain daily/nightly extracts from an FTP Server. Cleanup of the resulting folders and files I’m doing with other on-box processes.

This post is cross-blogged on both the Kloud Blog and Darren’s Blog.

How to create an Azure Function App to Simultaneously Start|Stop all Virtual Machines in a Resource Group

Just on a year ago I wrote this blog post that detailed a method to “Simultaneously Start|Stop all Azure Resource Manager Virtual Machines in a Resource Group”. It’s a simple script that I use quite a lot and I’ve received a lot of positive feedback on it.

One year on though and there are a few enhancements I’ve been wanting to make to it. Namely;

  • host the script in an environment that is a known state. Often I'm authenticated to different Azure Subscriptions: my personal, my employer's and my customers'.
  • prioritize the order the virtual machines startup|shutdown
  • allow for a delay between starting each VM (to account for environments where the VMs have roles with cross dependencies; e.g. a Domain Controller, a SQL Server, Application Servers). You want the DC to be up and running before the SQL Server, and so forth
  • and if I do all those, the most important;
    • secure it so not just anyone can start|stop my environments at their whim

Overview

This blog post executes the first of those enhancements: implementing the script in an environment that is a known state, aka implementing it as an Azure Function App. This won't be a perfect implementation as you will see, but it will set the foundation for the other enhancements. Subsequent posts (as I make time to develop the enhancements) will add the new functionality. This post covers;

  • Creating the Azure Function App
  • Creating the foundation for automating management of Virtual Machines in Azure using Azure Function Apps
  • Starting | Stopping all Virtual Machines in an Azure Resource Group

Create a New Azure Function App

First up we are going to need a Function App. Through your Azure Resource Manager Portal create a new Function App.

For mine I’ve created a new Resource Group and a new Storage Account as this solution will flesh out over time and I’d like to keep everything organised.

Now that we have the Azure App Plan set up, create a New PowerShell HTTP Trigger Function App.

Give it a name and hit Create.

 

Create Deployment Credentials

In order to get some of the dependencies into the Azure Function we need to create deployment credentials so we can upload them. Head to the Function App Settings and choose Go to App Service Settings.

Create a login and give it a password. Record the FTP/Deployment username and the FTP hostname along with your password as you’ll need this in the next step.

Upload our PowerShell Modules and Dependencies

Just as my original PowerShell script did I’m using the brilliant Invoke Parallel Powershell Script from Rambling Cookie Monster. Download it from that link and save it to your local machine.

Connect to your Azure Function App using your favourite FTP Client using the credentials you created earlier. I’m using WinSCP. Create a new sub-directory under /site/wwwroot/ named “bin” as shown below.

Upload the Invoke-Parallel.ps1 file from wherever you extracted it to on your local machine to the bin folder you just created in the Function App.

We are also going to need the AzureRM Powershell Modules. Download those via Powershell to your local machine (e.g. Save-Module -Name AzureRM -Path c:\temp\azurerm). There are a lot of modules obviously and you're not going to need them all. At a minimum for this solution you'll need;

  • AzureRM
  • AzureRM.profile
  • AzureRM.Compute

Upload them under the bin directory also as shown below.

Test that our script dependencies are accessible

Now that we have our dependent modules uploaded, let's test that we can load and utilise them. Below are the commands to load the Invoke-Parallel script and test that it has loaded by getting its help.

# Load the Invoke-Parallel Powershell Script
. "D:\home\site\wwwroot\RG-Start-Stop-VirtualMachines\bin\Invoke-Parallel.ps1"

# See if it is loaded by getting some output
Get-Help Invoke-Parallel -Full

Put those lines into the code section, hit Save and Run and select Logs to see the output. If successful you’ll see the help. If you don’t you probably have a problem with the path to where you put the Invoke-Parallel script. You can use the Kudu Console from the Function App Settings to get a command line and verify your path.

Mine worked successfully. Now to test our AzureRM Module Loads. Update the Function to load the AzureRM Profile PSM as per below and test you have your path correct.

# Import the AzureRM Powershell Module
import-module 'D:\home\site\wwwroot\RG-Start-Stop-VirtualMachines\bin\AzureRM.profile\2.4.0\AzureRM.Profile.psm1'
Get-Help AzureRM

Success. Fantastic.

Create an Azure Service Principal

In order to automate the access and control of the Azure Virtual Machines we are going to need to connect to Azure using a Service Principal with the necessary permissions to manage the Virtual Machines.

The following script does just that. You only need to run this as part of the setup for the Azure Function so we have an account we can use for our automation tasks. Update line 6 for your naming and the password you want to use. I’m assigning the Service Principal the “DevTest Labs User” Azure Role (Line 17) as that allows the ability to manage the Virtual Machines. You can find a list of the available roles here.
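The setup script isn't embedded here, but a sketch of the key steps looks like the following. The display name, URIs and password are placeholders, and the AzureRM cmdlet usage is as I recall it for the module versions of the time, so verify against your installed version.

# Sketch only; run this once as setup, not in the Function App
Login-AzureRmAccount

# Create the AAD Application and Service Principal (naming and password are placeholders)
$password = "YourSPPassword!"
$app = New-AzureRmADApplication -DisplayName "RGVMStartStop" `
    -HomePage "https://yourtenant.example.com/rgvmstartstop" `
    -IdentifierUris "https://yourtenant.example.com/rgvmstartstop" `
    -Password $password
New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId

# Give the new Service Principal a little time to propagate
Start-Sleep -Seconds 15

# Assign the Azure Role that allows managing the Virtual Machines
New-AzureRmRoleAssignment -RoleDefinitionName "DevTest Labs User" `
    -ServicePrincipalName $app.ApplicationId

# Key output to note (the TenantID is also shown when you log in)
"ApplicationID: $($app.ApplicationId)"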

Take note of the key outputs from this script. You will need to note the;

  • ApplicationID
  • TenantID

I’m also securing the credential that has the permissions to Start|Stop the Virtual Machines using the example detailed here in Tao’s post.

For reference here is an example to generate the keyfile. Update your path in line 5 if required, and make sure the password you supply in line 18 matches the password you supplied in the script (line 6) when creating the Service Principal.
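As a sketch (following Tao's technique), generating the keyfile looks something like the following; the path and password are placeholders, and the line numbers above refer to the original script rather than this sketch.

# Sketch only: generate a 256-bit AES key and save it to a keyfile
$keyFilePath = "C:\temp\AES.key"
$AESKey = New-Object Byte[] 32
[Security.Cryptography.RNGCryptoServiceProvider]::Create().GetBytes($AESKey)
$AESKey | Out-File $keyFilePath

# Encrypt the Service Principal password with the key. This must match the
# password supplied when creating the Service Principal above
$securePassword = ConvertTo-SecureString "YourSPPassword!" -AsPlainText -Force
$encryptedPassword = $securePassword | ConvertFrom-SecureString -Key $AESKey

# This string becomes the AzureAutomationPWD Application Setting; the AES.key
# file is what gets uploaded to the Function App 'keys' directory
$encryptedPassword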

Take note of the password encryption string from the end of the script to pair with the ApplicationID and TenantID from the previous steps. You’ll need these shortly in Application Settings.

Additional Dependencies

I created another sub-directory under the function app site named ‘keys’ again using WinSCP. Upload the passkey file created above into that directory.

Whilst we’re there I also created a “logs” directory for any erroneous output (aka logfiles created when you don’t specify them) from the invoke-parallel script.

Application Variables

Using the identity information you have created and generated, we will populate variables in the Function App's Application Settings that we can then leverage in our Function App. Go to your Azure Function App, Application Settings and add an application setting (with the respective values you have gathered in the previous steps) for;

  • AzureAutomationPWD
  • AzureAutomationAppID
  • AzureAutomationTennatID (bad speed typing there)

Don’t forget to click Save up the top of the Application Settings screen.

 

The Function App Script

Below is the sample script for your testing purposes. If you plan to use something similar in a production environment you’ll want to add more logging and error handling.
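The full script isn't reproduced here, but a sketch of its structure under the assumptions of the previous steps is below; the paths, version folders and keyfile location are illustrative.

# Sketch only; add logging and error handling for production use
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
$mode = $requestBody.mode                   # start | stop
$resourceGroup = $requestBody.resourcegroup

# Load the dependencies uploaded to the bin directory earlier
. "D:\home\site\wwwroot\RG-Start-Stop-VirtualMachines\bin\Invoke-Parallel.ps1"
Import-Module "D:\home\site\wwwroot\RG-Start-Stop-VirtualMachines\bin\AzureRM.profile\2.4.0\AzureRM.Profile.psm1"
Import-Module "D:\home\site\wwwroot\RG-Start-Stop-VirtualMachines\bin\AzureRM.Compute\2.4.0\AzureRM.Compute.psm1"

# Build the Service Principal credential from the Application Settings
# (setting names match those above, including the mis-typed TennatID)
$keyFile = "D:\home\site\wwwroot\RG-Start-Stop-VirtualMachines\keys\AES.key"
$securePassword = $env:AzureAutomationPWD | ConvertTo-SecureString -Key (Get-Content $keyFile)
$credential = New-Object System.Management.Automation.PSCredential ($env:AzureAutomationAppID, $securePassword)
Login-AzureRmAccount -ServicePrincipal -TenantId $env:AzureAutomationTennatID -Credential $credential

# Enumerate the VMs in the Resource Group and start|stop them in parallel
$vms = Get-AzureRmVM -ResourceGroupName $resourceGroup
$vms | Invoke-Parallel -ImportModules -RunspaceTimeout 30 -Parameter $mode -ScriptBlock {
    if ($parameter -eq "start") {
        Start-AzureRmVM -Name $_.Name -ResourceGroupName $_.ResourceGroupName
    }
    else {
        Stop-AzureRmVM -Name $_.Name -ResourceGroupName $_.ResourceGroupName -Force
    }
}

Out-File -Encoding Ascii -FilePath $res -InputObject "Executed $mode against Resource Group $resourceGroup"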

Testing the Function

Select the Test option from the right-hand side pane and update the request body for what the Function takes (mode and resourcegroup) as below. Select Run and watch the logs. You will need to select Expand to get more screen real estate for them.

You will see the VMs enumerate then the script starting them all up. My script has a 30 second timeout for the Invoke-Parallel Runspace as the VMs will take longer than 30 seconds to startup. And you pay for use, so we want to keep this lean. Increase the timeout if you have more VMs or latency that doesn't see all your VMs' state transitioning.

Checking in the Azure Portal I can see my VMs all starting up (too fast on the screenshot for the spfarm-mim host).

 

Sample Remote PowerShell Invoke Script

Below is a sample PowerShell script that remotely calls the Azure Function, providing the info the Function takes (mode and resourcegroup) the same as we did in the Test Request Body in the Azure Function Portal, this time to stop the VMs.
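Something like the following will do it; the Function URL and Function Key are placeholders for your own.

# Sketch only: substitute your Function URL and Function Key
$body = @{ mode = "stop"; resourcegroup = "MyTestResourceGroup" } | ConvertTo-Json
$functionURL = "https://yourfunctionapp.azurewebsites.net/api/RG-Start-Stop-VirtualMachines?code=yourfunctionkey"
Invoke-RestMethod -Uri $functionURL -Method Post -Body $body -ContentType "application/json"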

Looking in the Azure Portal, we can see all the VMs shutting down.

 

Summary

A foundational implementation of an Azure Function App to perform orchestration of Azure Virtual Machines.

The Function App is rudimentary in that the script exits (as described in the Runspace timeout) after 30 seconds, which is prior to the VMs fully returning after starting|stopping. This is because the Function App will time out after 5 mins anyway.

Now to workout the enhancements to it.

Finally, yes I have renewed/changed the Function Key so no-one else can initiate my Function 🙂

Follow Darren Robinson on Twitter

How to use a Powershell Azure Function to Tweet IoT environment data

Overview

This blog post details how to use a Powershell Azure Function App to get information from a RestAPI and send a social media update.

The data can come from anywhere, and in the case of this example I’m getting the data from WioLink IoT Sensors. This builds upon my previous post here that details using Powershell to get environmental information and put it in Power BI.  Essentially the difference in this post is outputting the manipulated data to social media (Twitter) whilst still using a TimerTrigger Powershell Azure Function App to perform the work and leverage the “serverless” Azure Functions model.

Prerequisites

The following are prerequisites for this solution;

Create a folder on your local machine for the Powershell Module, then save the module to your local machine using the Powershell command Save-Module as per below.

Save-Module -Name InvokeTwitterAPIs -Path c:\temp\twitter

Create a Function App Plan

If you don’t already have a Function App Plan create one by searching for Function App in the Azure Management Portal. Give it a Name, Select Consumption so you only pay for what you use, and select an appropriate location and Storage Account.

Create a Twitter App

Head over to http://dev.twitter.com and create a new Twitter App so you can interact with Twitter using their API. Give your Twitter App a name. Don't worry about the URL too much or the need for the Callback URL. Select Create your Twitter Application.

Select the Keys and Access Tokens tab and take a note of the API Key and the API Secret. Select the Create my access token button.

Take a note of your Access Token and Access Token Secret. We’ll need these to interact with the Twitter API.

Create a Timer Trigger Azure Function App

Create a new TimerTrigger Azure Powershell Function. For my app I'm changing from the default of a 5 min schedule to hourly on the top of the hour. I did this after I'd already created the Function App as shown below. To update the schedule I edited the Function.json file and changed the schedule to "schedule": "0 0 * * * *"

Give your Function App a name and select Create.

Configure Azure Function App Application Settings

In your Azure Function App select “Configure app settings”. Create new App Settings for your Twitter Account, Twitter Account AccessToken, AccessTokenSecret, APIKey and APISecret using the values from when you created your Twitter App earlier.

Deployment Credentials

If you haven’t already configured Deployment Credentials for your Azure Function Plan do that and take note of them so you can upload the Twitter Powershell module to your app in the next step.

Take note of your Deployment Username and FTP Hostname.

Upload the Twitter Powershell Module to the Azure Function App

Create a sub-directory under your Function App named bin and upload the Twitter Powershell Module using a FTP Client. I’m using WinSCP.

From the Applications Settings option start Kudu.

Traverse the folder structure to get the path to the Twitter Powershell Module and note it.

Validating our Function App Environment

Update the code to replace the sample from the creation of the Trigger Azure Function as shown below to import the Twitter Powershell Module. Include the get-help line for the module so we can see in the logs that the module was imported and the cmdlets it contains. Select Save and Run.
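That looks something like the following, assuming the module path you noted from Kudu. The version folder here is illustrative, and Invoke-TwitterRestMethod is the cmdlet I'd expect the InvokeTwitterAPIs module to expose, so verify with Get-Command.

# Import the Twitter Powershell Module (path and version from the Kudu step)
Import-Module "D:\home\site\wwwroot\MyTwitterFunction\bin\InvokeTwitterAPIs\2.1\InvokeTwitterAPIs.psm1"

# Confirm the module loaded and see the cmdlets it contains
Get-Command -Module InvokeTwitterAPIs
Get-Help Invoke-TwitterRestMethod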

Below is my output. I can see the output from the Twitter Module.

Function Application Script

Below is my sample script. It has no error handling etc., so it isn't production ready, but it gives a working example of getting data in from an API (in this case IoT sensors) and sending a tweet out to Twitter.
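In place of the full script, here's a sketch of the flow. The sensor URLs, Application Setting names, response properties, tweet format and the InvokeTwitterAPIs cmdlet parameters are all assumptions to adapt and verify for your environment.

# Sketch only
# Get the IoT environment data from the WioLink RestAPIs
$inside = Invoke-RestMethod -Uri $env:InsideSensorURL
$outside = Invoke-RestMethod -Uri $env:OutsideSensorURL

# Compose the tweet (hypothetical response properties)
$tweet = "Inside: $($inside.temperature)C, Outside: $($outside.temperature)C at $(Get-Date -Format HH:mm)"

# Import the Twitter Powershell Module uploaded earlier
Import-Module "D:\home\site\wwwroot\MyTwitterFunction\bin\InvokeTwitterAPIs\2.1\InvokeTwitterAPIs.psm1"

# OAuth settings from the Application Settings created earlier
$oauth = @{
    ApiKey            = $env:TwitterAPIKey
    ApiSecret         = $env:TwitterAPISecret
    AccessToken       = $env:TwitterAccessToken
    AccessTokenSecret = $env:TwitterAccessTokenSecret
}

# Send the status update via the Twitter REST API
Invoke-TwitterRestMethod -ResourceURL 'https://api.twitter.com/1.1/statuses/update.json' `
    -RestVerb 'POST' -Parameters @{ status = $tweet } -OAuthSettings $oauth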

Viewing the Tweet

And here is the successful tweet.

Summary

This shows how easy it is to utilise Powershell and Azure Function Apps to get data and transform it for use in other ways. In this example a social media platform. The input could easily be business data from an API and the output a corporate social platform such as Yammer.

Follow Darren on Twitter @darrenjrobinson

How to use a Powershell Azure Function App to get RestAPI IoT data into Power BI for Visualization

Overview

This blog post details using a Powershell Azure Function App to get IoT data from a RestAPI and update a table in Power BI with that data for visualization.

The data can come from anywhere, however in the case of this post I’m getting the data from WioLink IoT Sensors. This builds upon my previous post here that details using Powershell to get environmental information and put it in Power BI.  Essentially the major change is to use a TimerTrigger Azure Function to perform the work and leverage the “serverless” Azure Functions model. No need for a reporting server or messing around with Windows scheduled tasks.

Prerequisites

The following are the prerequisites for this solution;

  • The Power BI Powershell Module
  • Register an application for RestAPI Access to Power BI
  • A Power BI Dataset ready for the data to go into
  • AzureADPreview Powershell Module

Create a folder on your local machine for the Powershell Modules, then save the modules to your local machine using the Powershell command Save-Module as per below.

Save-Module -Name PowerBIPS -Path C:\temp\PowerBI
Save-Module -Name AzureADPreview -Path c:\temp\AzureAD 

Create a Function App Plan

If you don’t already have a Function App Plan create one by searching for Function App in the Azure Management Portal. Give it a Name, Select Consumption Plan for the Hosting Plan so you only pay for what you use, and select an appropriate location and Storage Account.

Register a Power BI Application

Register a Power BI App if you haven’t already using the link and instructions in the prerequisites. Take a note of the ClientID. You’ll need this in the next step.

Configure Azure Function App Application Settings

In this example I'm using Azure Functions Application Settings for the Azure AD AccountName, Password and the Power BI ClientID. In your Azure Function App select "Configure app settings". Create new App Settings for your UserID and Password for Azure (to access Power BI) and your Power BI Application ClientID. Select Save.

Not shown here, I've also placed the URLs for the RestAPIs that I'm calling to get the IoT environment data as Application Settings variables.

Create a Timer Trigger Azure Function App

Create a new TimerTrigger Azure Powershell Function App. The default of a 5 min schedule should be perfect. Give it a name and select Create.

Upload the Powershell Modules to the Azure Function App

Now that we have created the base of our Function App we’re going to need to upload the Powershell Modules we’ll be using that are detailed in the prerequisites. In order to upload them to your Azure Function App, go to App Service Settings => Deployment Credentials and set a Username and Password as shown below. Select Save.

Take note of your Deployment Username and FTP Hostname.

Create a sub-directory under your Function App named bin and upload the Power BI Powershell Module using a FTP Client. I’m using WinSCP.

To make sure you get the correct path to the Powershell module, start Kudu from the Application Settings option.

Traverse the folder structure to get the path to the Power BI Powershell Module and note the path and the name of the psm1 file.

Now upload the Azure AD Preview Powershell Module in the same way as you did the Power BI Powershell Module.

Again using Kudu validate the path to the Azure AD Preview Powershell Module. The file you are looking for is the Microsoft.IdentityModel.Clients.ActiveDirectory.dll file. My file after uploading is located in "D:\home\site\wwwroot\MyAzureFunction\bin\AzureADPreview\2.0.0.33\Microsoft.IdentityModel.Clients.ActiveDirectory.dll"

This library is used by the Power BI Powershell Module.

Validating our Function App Environment

Update the code to replace the sample from the creation of the Trigger Azure Function as shown below to import the Power BI Powershell Module. Include the get-help line for the module so we can see in the logs that the module was imported and the cmdlets it contains. Select Save and Run.
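That looks something like the following, assuming the path you noted from Kudu (the version folder is illustrative):

# Import the Power BI Powershell Module (path and version from the Kudu steps)
Import-Module "D:\home\site\wwwroot\MyAzureFunction\bin\PowerBIPS\1.0.0.9\PowerBIPS.psm1"

# Confirm the module loaded and see the cmdlets it contains
Get-Command -Module PowerBIPS
Get-Help Get-PBIAuthToken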

Below is my output. I can see the output from the Power BI Module get-help command. I can see that the module was successfully loaded.

Function Application Script

Below is my sample script. It has no error handling etc., so it isn't production ready, but it gives a working example of getting data in from an API (in this case IoT sensors) and putting the data directly into Power BI.
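As a sketch of that flow: the sensor URLs, dataset/table names and Application Setting names below are placeholders, and the PowerBIPS cmdlet parameters are assumptions to verify against the module's help.

# Sketch only
Import-Module "D:\home\site\wwwroot\MyAzureFunction\bin\PowerBIPS\1.0.0.9\PowerBIPS.psm1"

# Get the IoT environment data from the WioLink RestAPIs
$inside = Invoke-RestMethod -Uri $env:InsideSensorURL
$outside = Invoke-RestMethod -Uri $env:OutsideSensorURL

# Shape a row for the Power BI table
$row = @{
    Time        = (Get-Date).ToString("s")
    InsideTemp  = $inside.temperature
    OutsideTemp = $outside.temperature
}

# Authenticate to Power BI with the AAD account and the registered App ClientID
$securePassword = ConvertTo-SecureString $env:AzurePassword -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($env:AzureAccountName, $securePassword)
$authToken = Get-PBIAuthToken -ClientId $env:PowerBIClientID -Credential $credential

# Add the row to the dataset table created in the prerequisites
$dataSet = Get-PBIDataSet -AuthToken $authToken -DataSetName "IoTEnvironment"
@($row) | Add-PBITableRows -AuthToken $authToken -DataSetId $dataSet.id -TableName "Readings"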

Viewing the data in Power BI

In Power BI it is then quick and easy to select our Inside and Outside temperature readings referenced against time. This timescale is overnight so both sensors are reading quite close to each other.

Summary

This shows how easy it is to utilise Powershell and Azure Function Apps to get data and transform it for use in other ways. In this example a visualization of IoT data into Power BI. The input could easily be business data from an API and the output a real time reporting dashboard.

Follow Darren on Twitter @darrenjrobinson

 

 

 

 

Using an Azure Function to search the FIM/MIM Metaverse, create a Set and update the Set membership in the FIM/MIM Service

Introduction

This is the third and last post in this series of integrating Microsoft Identity Manager with Azure Functions.

The first detailed how to use an Azure Function to retrieve data from the MIM Service Server. The second detailed how to use an Azure Function to retrieve data from the MIM Sync (Metaverse) Server.

This third post combines the two and then performs an action in the MIM Service. The practical purpose of this could be functions like “find all users in location y” and “enable them for entitlement x” or “add an attribute value on each of their objects”.

Overview

The reasoning for the two stage approach is that in my experience it is a lot easier to search the Metaverse than the MIM Service to find an object(s), but also the Metaverse has all the information about objects, whereas the MIM Service is a ShadowVerse of the Metaverse containing a subset of the managed objects' metadata.

Moving forward then the architecture is a hybrid of the first two posts that introduced the concepts associated with integrating MIM with Azure Functions. As per the other two posts this is a base working example and concept.

Prerequisites

The prerequisites are the same as for the 1st and 2nd posts in this series. You’ll need to work through those examples to setup the dependencies and prerequisites. From there you can create one more Azure Function that brings everything together. That’s what I’m covering in this post.

Therefore the prerequisites are;

  • Azure Tenant and a Function Plan
  • Microsoft Identity Manager implementation
  • Remote Powershell configured for your MIM Sync Server
  • Lithnet FIM MIIS Automation Powershell Module installed on your MIM Sync Server
  • The necessary Firewall Rules on your MIM Sync Server and your Azure Network Security Group (assuming your MIM Infrastructure is in Azure) to allow Azure Functions to communicate with MIM Sync and Service Servers

This Example performs the following

In this example the HTTP Trigger Azure Function;

  • Takes input for ObjectType, Attribute, AttributeValue, SetName
  • Searches the MIM Sync Metaverse for the input ObjectType, with the AttributeValue in the Attribute
  • Connects to the MIM Service
  • Creates the Set based on the input SetName if it doesn't exist
  • Adds the objects from the search to the Set
  • Returns the objects added to the Set

In a real world implementation you'd do the above with a criteria-based Set. This post is a concept of search and find, then performing a create and an update. That has many practical applications.

Create your new Azure Function

Just like the other two posts, we’re going to create a new Powershell HTTP Trigger Azure Function.

Upload the Lithnet RMA PS Module to your new Azure Function (as per blog post 1 in this series). You should also be using protected credentials now, so upload your username/password encryption key as well.

Here is the Azure Function Powershell Script that performs the process detailed above.
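The script itself isn't embedded here; the sketch below shows its shape under the prerequisites above. The server names, paths, credential handling and the exact Lithnet cmdlet usage (New-MVQuery/Get-MVObject on the Sync side, and the Set's ExplicitMember handling on the Service side) are assumptions to validate in your environment.

# Sketch only; add validation and error handling for real use
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
$objectType = $requestBody.ObjectType
$attribute = $requestBody.Attribute
$attributeValue = $requestBody.AttributeValue
$setName = $requestBody.SetName

# Credentials for Remote PowerShell to the MIM Sync Server (protected as per post 1)
$keyFile = "D:\home\site\wwwroot\MIMMetaverseSearch\keys\AES.key"
$securePassword = $env:MIMSyncCredPassword | ConvertTo-SecureString -Key (Get-Content $keyFile)
$credential = New-Object System.Management.Automation.PSCredential ($env:MIMSyncCredUser, $securePassword)

# 1. Search the Metaverse on the MIM Sync Server
$session = New-PSSession -ComputerName mimsyncserver.yourdomain.com -Credential $credential
$mvObjects = Invoke-Command -Session $session -ScriptBlock {
    param($objectType, $attribute, $attributeValue)
    Import-Module LithnetMiisAutomation
    $query = New-MVQuery -Attribute $attribute -Operator Equals -Value $attributeValue
    Get-MVObject -ObjectType $objectType -Queries @($query)
} -ArgumentList $objectType, $attribute, $attributeValue
Remove-PSSession $session

# 2. Connect to the MIM Service with the Lithnet RMA module
Import-Module "D:\home\site\wwwroot\MIMMetaverseSearch\bin\LithnetRMA\1.0.6088\LithnetRMA.psd1"
Set-ResourceManagementClient -BaseAddress http://mimserviceserver.yourdomain.com:5725

# 3. Create the Set based on the input SetName if it doesn't exist
$set = Get-Resource -ObjectType Set -AttributeName DisplayName -AttributeValue $setName
if ($set -eq $null) {
    $set = New-Resource -ObjectType Set
    $set.DisplayName = $setName
}

# 4. Add the objects from the search to the Set and save it
foreach ($mvObject in $mvObjects) {
    $person = Get-Resource -ObjectType Person -AttributeName AccountName `
        -AttributeValue $mvObject.Attributes["accountName"].Values.ValueString
    $set.ExplicitMember.Add($person.ObjectID)
}
Save-Resource $set

# 5. Return the objects added to the Set
Out-File -Encoding Ascii -FilePath $res -InputObject ($mvObjects | ConvertTo-Json)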

Test it out. Looks good. 88 users matched the value of Sydney in their location attribute.

Verify that the Set was created and the membership updated.

Test calling the Azure Function remotely

Now that it is all working in the Azure Function, let's try doing it from Powershell remotely. This time I'm again looking for Person objects that have Sydney in their location attribute, and I'll create a set named Sydney-NSW and put them in it.

Brilliant, that works nicely. Let’s verify that the Set was created and has the correct number of users in it. Yes, a perfect match.

Summary

Putting Azure Functions and Powershell together along with the Lithnet Powershell Modules opens up a world of possibilities for automation and integration of the MIM Service without the need for any additional infrastructure or any considerable effort.

Experiment and let me know what you do with this style of integration.

Follow Darren on Twitter @darrenjrobinson

Get Users/Groups/Objects from Microsoft/Forefront Identity Manager with Azure Functions and the Lithnet Resource Management Powershell Module

Introduction

As an Identity Management consultant, if I had $1 for every time I've been asked "what is user x's current status in IDAM", "is user x active?", "does user x have an account in y?", "what is user x's primary email address?", particularly after Go Live of an IDAM solution, my holidays would be a lot more exotic.

From a Service Desk perspective IDAM implementations are often a black box in the middle of the network that for the most part do what they were designed and implemented to do. However as soon as something doesn’t look normal for a user the Service Desk are inclined to point their finger at that black box (IDAM Solution). And the “what is the current value of ..”, “does user x ..” type questions start flying.

What if we could give the Service Desk a simple query interface into FIM/MIM without needing to give them access to another complicated application?

This is the first (of potentially a series of) blog posts on leveraging community libraries and Azure PaaS services to provide visibility of FIM/MIM information. This first post really just introduces the concept with a working example in an easy-to-understand and replicable way. It is not intended for production implementation without additional security and optimisation.

Overview

The following graphic shows the concept of using Azure Functions to take requests from a client (web app, powershell, some other script), query the FIM/MIM Service and return the result. This post details the setup and configuration for the section in the yellow shaded box, with the process outlined in the numbers 1, 2 & 3. This post assumes you already have your FIM/MIM implementation set up and configured with your connected and integrated applications/services such as Active Directory. In my example my connected datasource is actually Twitter.

Prerequisites

The prerequisites for this solution are;

 

Creating your Azure App Service

First up you’ll need to create an Azure App Service in your Azure Tenant. To keep everything logically structured for this example I created an Azure App Service in the same resource group that contains my MIM IaaS infrastructure (MIM Sync Server, MIM Service Server, SQL Server, AD Domain Controller etc).

In the Azure Marketplace select New (+) and search for Function App. Select the Function App item from the results and select Create.

Give your Azure App Service a name and choose the Resource Group where you want to locate it. Choose Dynamic for the Hosting Plan. This means you don't have to worry about resource management for your Web App, and you only pay for execution time, which (unless you put this into production and go crazy with it) should be zero, as it will (should) be well under the free grant tier. Put the application in the appropriate location, such as close to the FIM/MIM resources it'll be interacting with, and select Create.

Now that you have your Azure App Service setup, you need to create your Azure Function.

Create an HTTP Trigger Powershell Function App

In the Azure Portal locate your App Services Blade and select the Function App created in the steps above. Mine was named MIMMetaverseSearch in the example above. Select PowerShell as the Language and HttpTrigger-Powershell as the Function type.

Give your Function a name. I’ve kept it simple in this example and named it the same as my App Service Plan. Select Create.

Adding the Lithnet Powershell Module into your Function App

As you'd expect, the Powershell Function App by default only has a handful of core Powershell Modules. As we're using something pretty specific, we'll need to put the module into our Function App so we can load it and use the library.

Download and save the Lithnet Resource Management Powershell Module to your local machine. Something like the Powershell command below will do that.
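(The module name as published on the PowerShell Gallery is LithnetRMA; the local path is just an example.)

# Save the Lithnet Resource Management Powershell Module locally
Save-Module -Name LithnetRMA -Path c:\temp\LithnetRMA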

Next follow this great blog post here from Tao to upload the Lithnet RMA PS Module you downloaded earlier into your function directory. I used WinSCP as my FTP client as I’ve shown below to upload the Lithnet RMA PS Module.

FTP to the host for your App Service and navigate to the /site/wwwroot/

Create a bin folder and upload via FTP the Lithnet RMA PS Module.

Using Kudu navigate to the path and version of the Lithnet RMA PS Module.
I'm using v1.0.6088 and my app is named MIMMetaverseSearch so my path is D:\home\site\wwwroot\MIMMetaverseSearch\bin\LithnetRMA\1.0.6088

Note: the Lithnet RMA PS Module is 64-bit so you’ll need to configure your Web App for 64-bit as per the info in the same blog you followed to upload the module here.

Test loading the Lithnet RMA PS Module in your Function App

In your Function App select </> Develop. Remove the sample script and in your first line import the Lithnet RMA PS Module using the path from the previous step. Then, to check that it loads add a line that references a cmdlet in the module. I used Help Get-Resource. Select Save then Run.
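Using my path from above, that looks like;

# Import the Lithnet RMA PS Module from the bin directory
Import-Module "D:\home\site\wwwroot\MIMMetaverseSearch\bin\LithnetRMA\1.0.6088\LithnetRMA.psd1"

# Reference a cmdlet in the module to check that it loaded
Help Get-Resource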

If you’ve done everything correct when you select Run and look at the Logs you’ll see the module was loaded and the Help Get-Resource command was run in the Logs.

Allow your Function App to access your FIM/MIM Service Server

Even though you have logically placed your Function App in the same Resource Group (if you did it like I have) you’ll need to actually allow the Function App that is running in a shared PaaS environment to connect to your FIM/MIM Service Server.

Create an inbound rule in your Network Security Group to allow access to your FIM/MIM Service Server. The example below isn’t as secure as it could (and should) be as it allows access from anywhere. You should restrict the source of the request(s) accordingly. I’m just showing how to quickly get a working example. TCP Port 5725 is required to access your MIM Service Server. Enter the details as per below and select Ok.

Using an Azure Function to query FIM/MIM Service

Note: Again, this is an example to quickly show the concept. In the script below your credentials are in the script in clear text (and of course those below are not valid). For anything other than validating the concept you must protect your credentials. A great example is available here in Tao’s post.

The PS Azure Function gets the incoming request and converts it from JSON. In my request which you’ll see in the next step I’m passing in “displayName” and “objectType”.

In this example I’m using Get-Resource from the Lithnet RMA to get an object from the FIM/MIM Service. First you need to open a connection to the FIM/MIM Service Server. On my Azure IaaS MIM Service Server I’ve configured a DNS name so you can see I’m using that name in line 17 to connect to it using the unsecured credentials from earlier in the script. If you haven’t set up a DNS name for your FIM/MIM Service Server you can use the Public IP Address instead.

Line 20 queries for the ObjectType and DisplayName passed into the Function (see calling the Function in the next step) and returns the response in line 22. Again this is just an example. There is no error checking, validation or anything. I’m just introducing the concept in this post.
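Pulling that together, the function looks something like the sketch below. The credentials and server name are examples only, and the line numbers mentioned above refer to the original script rather than this sketch.

# Sketch only: clear-text credentials for concept validation, not production
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
$displayName = $requestBody.displayName
$objectType = $requestBody.objectType

Import-Module "D:\home\site\wwwroot\MIMMetaverseSearch\bin\LithnetRMA\1.0.6088\LithnetRMA.psd1"

# Credentials for the FIM/MIM Service (example values only)
$securePassword = ConvertTo-SecureString "notmyrealpassword" -AsPlainText -Force
$credentials = New-Object System.Management.Automation.PSCredential ("yourdomain\svc-mimservice", $securePassword)

# Open a connection to the FIM/MIM Service Server (DNS name or Public IP)
Set-ResourceManagementClient -BaseAddress http://mimserviceserver.yourdomain.com:5725 -Credentials $credentials

# Query for the ObjectType and DisplayName passed into the Function and return the response
$resource = Get-Resource -ObjectType $objectType -AttributeName DisplayName -AttributeValue $displayName
Out-File -Encoding Ascii -FilePath $res -InputObject ($resource | ConvertTo-Json)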

Testing your Function App

Now that you have the function script saved, you can test it from the Function App itself. Select Test from the options up in the right from your function. Change the Request Body for what the Function is expecting. In my case displayname and objectType. Select Run and in the Logs if you’ve got everything configured correctly (like inbound network rules, DNS name, your FIM/MIM Service Server is online, your query is for a valid resource) you should see an object returned.

Calling the Function App from a Client

Now that we have our Function App all setup and configured (and tested in the Function App) let’s send a request to the Azure Function using the Powershell Invoke-RestMethod function. The following call I did from my laptop. It is important to note that there is no authN in this example and the function app will be using whatever credentials you gave it to execute the request. In a deployed solution you’ll need to scope who can make the requests, limit on the inbound network rules who can submit requests and of course further protect the account credentials used to connect to your FIM/MIM Service Server.
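A minimal example of that call looks like the following; the Function URL and key are placeholders.

# Sketch only: substitute your Function URL and Function Key
$body = @{ displayName = "Darren Robinson"; objectType = "Person" } | ConvertTo-Json
$functionURL = "https://mimmetaversesearch.azurewebsites.net/api/MIMMetaverseSearch?code=yourfunctionkey"
Invoke-RestMethod -Uri $functionURL -Method Post -Body $body -ContentType "application/json"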

Successful Response

The following screenshot shows calling the Function App and getting the responding object. Success. In a couple of lines I created a hashtable for the request, converted it to JSON and submitted it and got a response. How powerful is that!?

Summary

Using the awesome Lithnet Resource Management PowerShell Module with Azure Functions it is pretty quick and flexible to access a wealth of information we may want to expose for business benefit.

Now if only there was an affiliation program for Azure Functions that could deposit funds for each IDAM request to an Azure Functions App into my holiday fund.

Stay tuned for more posts on taking this concept to the next level.

Follow Darren on Twitter @darrenjrobinson