Implementing Azure API Management with the Lithnet Microsoft Identity Manager Rest API

Introduction

Earlier this week I wrote this post that detailed implementing the Lithnet REST API for FIM/MIM Service. I also detailed using PowerShell to interact with the API Endpoint.

Now let’s imagine you want a number of Azure Serverless features to leverage your REST API enabled Microsoft Identity Manager environment, or even to offer it “as-a-Service”. You’ll want visibility into how it is performing, and you’ll probably want to implement features such as caching and rate limiting, not to mention additional security controls. Enter Azure API Management, which provides all those functions and more.

In this post I detail getting started with Azure API Management by using it to front-end the Lithnet FIM/MIM Rest API.

Overview

In this post I will detail;

  • Enabling Azure API Management
  • Configuring the Lithnet FIM/MIM Rest API integration with Azure API Management
  • Accessing MIM via Azure API Management and the Lithnet FIM/MIM Rest API using PowerShell
  • Reporting

Prerequisites

For this particular scenario I’m interfacing Azure API Management with a REST API that uses Digest Authentication. So even though mine is a Windows WCF web service, you could do something similar with any comparable API endpoint. If the backend API endpoint is using SSL it will need a valid certificate. Azure API Management does allow you to add your own certificates, but I had issues with self-signed certificates; I have it working fine with Let’s Encrypt issued certificates. Obviously you’ll also need an Azure Subscription and an App/Service with an API.

Enabling Azure API Management

From the Azure Portal select Create a resource and search for API management and select it.

Add API Mgmt.PNG

Select Create

Create API Mgmt.PNG

Give your API Management Service a name, select a subscription, resource group etc and select Create.

API Mgmt Config 1.PNG

Once you select Create it will take about 30 minutes to be deployed.

Configuring the Lithnet FIM/MIM Rest API integration with Azure API Management

Once your new API Management service has been deployed, from the Azure Portal select the API Management services blade and select the API Management service that you just created. Select APIs.

API Config 1.PNG

Select Add API and then select Add a new API.

API Mgmt Config 2.PNG

Give the API a name and description, enter the URI for your API endpoint, and select HTTPS. I’m going to call this MIMSearcher, so I have entered that under API URL Suffix. For initial testing, under Products select Starter. Finally select Create.

API Mgmt Config 4.PNG

We now have our base API setup. From the Backend tile select the Edit icon.

API Mgmt Config 5.PNG

As the backend is authenticated using Basic Authentication, select Basic under Gateway credentials and enter the details of an account with access that will be used by the API Gateway. Select Save.

API Mgmt Config 6.PNG

Now from our API Configuration select Add operation.

API Mgmt Config 7.PNG

First we will create a test operation for the Help page on the Lithnet FIM/MIM Rest API. Provide a Display name, and for the URL add /v2/help. Give it a description and select Create.

Note: I could have had v2 as part of the base URI for the API in the previous steps. I didn’t, as I will be using APIs from both v1 and v2 and didn’t want to create multiple operations.

API Mgmt Config 8.PNG

Select the new Operation (Help)

API Mgmt Config 9.PNG

Select the Test menu. Select Send.

API Mgmt Config 10.PNG

If everything is set up correctly you will get a 200 OK response as below.

API Mgmt Config 11.PNG

Accessing MIM via Azure API Management and the Lithnet FIM/MIM Rest API using PowerShell

Head over to your API Portal. The URL is https://&lt;apimgmtservicename&gt;.portal.azure-api.net/ where &lt;apimgmtservicename&gt; is the name you gave your API Management Service (shown in the third screenshot at the top of this post). If you are doing this from the browser you used to create the API Management Service, you should be signed in already. From the Administrator menu on the right, select Profile.

Test API Mgmt 1.PNG

Click on Show under one of the keys and record its value.

Test API Mgmt 2.PNG

Using PowerShell ISE or VSCode update the following Code Snippet and test.

$APIURL = 'https://<apimgmtservicename>.azure-api.net/<apiurlsuffix>/v2/help'
$secret = 'yourSecret'
$Headers = @{'Ocp-Apim-Subscription-Key' = $secret} 
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

$response = Invoke-RestMethod -Uri $APIURL -Headers $Headers -ContentType "application/json" -UseBasicParsing -Method Get
$response

The snippet will create a Web Request to the new API and display the results.

Test API Mgmt 3.PNG

Querying the Lithnet Rest API via Azure API Management

Now that we have a working solution end-to-end, let’s do something useful with it. Looking at the Lithnet Rest API, the Resources URI is the key one exposing Resources from the MIM Service.

Resources.PNG

Let’s create a new Operation for Resources similar to what we did for the Help. After selecting Create configure the Backend for Basic Authentication like we did for Help.

Get Resources.PNG

Testing out the newly exposed endpoint is very similar to before: just a new $APIURL with the addition of /?Person to return all Person Resources from the MIM Portal. The response lets us know it returned 7256 Person objects, and the results are still paged (100 per page by default).
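For example, reusing the earlier snippet with just the URL changed (the service name and API URL suffix below are placeholders);

$APIURL = 'https://<apimgmtservicename>.azure-api.net/<apiurlsuffix>/v2/resources/?Person'
$response = Invoke-RestMethod -Uri $APIURL -Headers $Headers -ContentType "application/json" -UseBasicParsing -Method Get
$response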

Get Persons.PNG

Let’s now search for just a single user: a Person object whose DisplayName is ‘darrenjrobinson’.

$query = "Person[DisplayName='darrenjrobinson']"
$queryEncoded = [System.Web.HttpUtility]::UrlEncode($query)

$APIURL = "https://.azure-api.net//v2/resources/?filter=/$($queryEncoded)" 
$secret = 'yourSecret'
$Headers = @{'Ocp-Apim-Subscription-Key' = $secret} 
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

$user = Invoke-RestMethod -Uri $APIURL -Headers $Headers -ContentType "application/json" -UseBasicParsing -Method Get
$user

Executing, we get a single user returned.

Search for User.PNG

Reporting

Using the Publisher Portal we can get some Stats on what is happening with our API Management implementation.

Go to https://&lt;apimgmtservicename&gt;.portal.azure-api.net/admin and select Analytics.

We then have visibility of what has been using the API Management Service. At a Glance gives an overview, and you can drill down into;

  • Top Users
  • Top Products
  • Top subscriptions
  • Top APIs
  • Top Operations

At a glance looks like this;

At a Glance Stats.PNG

And Top Operations looks like this;

Top Operations.PNG

Summary

That is a quick start guide to implementing Azure API Management in front of a REST API, and using PowerShell to integrate with it. Next steps would be to enable caching and explore more of the advanced features. Enjoy.

Protecting Application Credentials when implementing Modular Azure Functions with Microsoft Flow

This weekend I was attempting to rework some older Azure Automation tasks I wrote some time ago that were a combination of PowerShell scripts and Azure Functions (PowerShell). I was looking to leverage Microsoft Flow so that I could have them handy as ‘Buttons’ in the Microsoft Flow mobile app.

Quite quickly I realized that Microsoft Flow didn’t have the capability to perform some of the automation I required, so I handed that off to an Azure Function. The Azure Function then needed to leverage a Registered AAD Application. That required an Application ID and Secret (or a certificate).  This wasn’t going the way I wanted so I took a step back.

The Goals I was attempting to achieve were;

  • A set of Azure Functions that perform small repetitive tasks that can be re-used across multiple Flows
  • Separation of permissions associated with function/object orientated Azure Functions

The Constraints I encountered were;

  • Microsoft Flow doesn’t currently have Azure Key Vault Actions
  • The Flows I was looking to build required functionality that isn’t currently covered by available Actions within Flow

With my goal to have a series of Functions that can be re-used for multiple subscriptions I came up with the following workaround (until Flow has actions for Key Vault or Managed Service Identity).

Current working Workaround/Bodge;

  • I created an Azure Function that can be passed Key Vault URIs for credential and subscription information
    • typically this is the Application ID, Application Secret, Azure Subscription. These are retrieved from Azure Key Vault using Managed Service Identity
    • returns to the Flow the parameters listed above
  • Flow calls another Azure Function to perform required tasks
    • that Azure Function can be leveraged for an AAD App in any Subscription as credentials are passed to it

RG VM Flow Integration 640px.png

Example Scenario (as shown above);

  1. Microsoft Flow triggered using a Flow Button in the mobile application to report on Azure Virtual Machines
  2. Flow calls Azure Function (Get-Creds) to get credentials associated with the Flow for the environment being reported on
  3. Managed Service Identity used from Azure Function to obtain credentials from Azure Key Vault
    • Application ID, Application Secret and Azure Subscription returned to Flow
  4. Flow calls Azure Function (Get-VM-Status) that authenticates to Azure AD based on the credentials and subscription passed to it
  5. Azure Resource Group(s) and VMs are queried by the Function App, with the details returned to Flow
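To make steps 2 and 3 concrete, here is a rough sketch of what a ‘Get-Creds’ style Function could look like as an Azure Functions v1 PowerShell HTTP trigger. The parameter names and Key Vault secret URIs are hypothetical placeholders; the MSI endpoint and headers are those injected into a Function App when Managed Service Identity is enabled.

# Functions v1 PowerShell HTTP trigger: $req and $res are supplied by the runtime
$requestBody = Get-Content $req -Raw | ConvertFrom-Json

# Get an access token for Key Vault via the App Service MSI endpoint
$tokenURI = "$($env:MSI_ENDPOINT)?resource=https://vault.azure.net&api-version=2017-09-01"
$token = (Invoke-RestMethod -Method Get -Uri $tokenURI -Headers @{ Secret = $env:MSI_SECRET }).access_token

# Retrieve each secret the Flow asked for; the appIDURI/appSecretURI/subscriptionURI
# parameter names are hypothetical, e.g. https://<vault>.vault.azure.net/secrets/<name>
$headers = @{ Authorization = "Bearer $token" }
$appID        = (Invoke-RestMethod -Uri "$($requestBody.appIDURI)?api-version=2016-10-01" -Headers $headers).value
$appSecret    = (Invoke-RestMethod -Uri "$($requestBody.appSecretURI)?api-version=2016-10-01" -Headers $headers).value
$subscription = (Invoke-RestMethod -Uri "$($requestBody.subscriptionURI)?api-version=2016-10-01" -Headers $headers).value

# Return the three values to Flow
Out-File -Encoding ascii -FilePath $res -InputObject (@{ appID = $appID; appSecret = $appSecret; subscriptionID = $subscription } | ConvertTo-Json)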

Concerns/thoughts;

  1. Passing credentials between integration elements isn’t the best idea
    • obfuscation is the best that can be done for now
    • having the information stored in three different secrets means all information isn’t sent in one call
      • but three web requests are required to get the necessary creds
  2. A certificate for AAD App Authentication would reduce the Key Vault calls to one
    • would this be considered better or worse?
  3. At least the credentials aren’t at rest anywhere other than in the Key Vault.

Summary

We’ve come a long way in a year. Previously we just had Application Settings in Azure Functions, and we were obfuscating credentials stored there using encryption techniques. Now, with Managed Service Identity and Azure Key Vault, we have Functions sorted. Leveraging modular Azure Functions to perform actions not possible in Flow though still seems like a gap. How are you approaching such integration?

 

Global Azure Bootcamp 2018 – Creating the Internet of YOUR Things

Today is the 6th Global Azure Bootcamp and I presented at the Sydney Microsoft Office on Creating the Internet of YOUR Things.

In my session I gave an overview on where IoT is going and some of the amazing things we can look forward to (maybe). I then covered a number of IoT devices that you can buy now that can enrich your life.

I then moved on to building IoT devices and leveraging Azure, the focus of my presentation: how to get started quickly with devices, integration and automation. I provided a working example based off my previous posts Integrating Azure IoT Devices with MongooseOS MQTT and PowerShell; Building a Teenager Notification Service using Azure IoT, an Azure Function, Microsoft Flow, Mongoose OS and a Micro Controller; and Adding a Display to the Teenager Notification Service Azure IoT Device.

I provided enough information and hopefully inspiration to get you started.

Here is my presentation.

How to quickly copy Azure Functions between Azure Tenants and implement ‘Run From Zip’

As mentioned in this post yesterday, I needed to copy a bunch of Azure WebApps and Functions from one Tenant to another. As with anything cloud based, things move fast, and some of the methods I found were too onerous and more complex than they needed to be. There is of course the Backup option as well for Azure Functions, but that requires a storage account associated with the Function App Plan. My Functions didn’t need storage, the plan tier they were on meant a storage account wasn’t a prerequisite, and I didn’t have the desire to add one just to back up and then migrate.

Overview

In this post I show my method to quickly copy Azure Functions from one Azure Tenant to another. My approach is;

  • In the Source Tenant from the Azure Functions App
    • Using Kudu take a backup of the wwwroot folder (that will contain one or more Functions)
  • In the Target Tenant
    • Create an Azure Function App
    • Using Kudu load the wwwroot archive into the new Azure Function App
    • Configure Azure Function Run From Zip

Backing up the Azure Functions in the Source Tenant

Using the Azure Portal in the Source Tenant, go to your Function App => Application Settings and select Advanced Tools. Select Debug Console – PowerShell and navigate to the site folder. Next to wwwroot, select the download icon to obtain an archive of your functions.
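As an aside, the same archive can be pulled down without the browser using Kudu’s zip API; a sketch, assuming your Kudu deployment credentials (the user, password and app name below are placeholders);

$user = '<deploymentUser>'
$pass = '<deploymentPassword>'
$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$($user):$($pass)"))
# Kudu's zip API returns the folder as a zip archive
Invoke-RestMethod -Uri "https://<functionappname>.scm.azurewebsites.net/api/zip/site/wwwroot/" `
    -Headers @{ Authorization = "Basic $auth" } -OutFile .\wwwroot.zip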

Download WWWRoot Folder 2.PNG

Copying the Azure Functions to the Target Tenant

In the Target Tenant first create a New Azure Function App. I did this as I wanted to change the naming, the plan and a few other configuration items. Then using the Azure Portal go to your new Function App, Application Settings and select Advanced Tools.

Function Advanced Tools

Create a folder under D:\home\data named SitePackages.

Create Site Packages Folder

Drag and drop your wwwroot.zip file into the SitePackages Folder.

Drag Drop wwwroot

In the same folder select the + icon to create a file named siteversion.txt

Site Packages

Inside the file, enter the name of your archive file, e.g. wwwroot.zip, and select Save.

Siteversion.txt.png

Back in your new Function App select Application Settings

Application Settings

Under Application Settings add a new setting named WEBSITE_USE_ZIP with a value of ‘1’.

Website Use Zip.PNG
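If you’d rather script that setting than click through the portal, something like this with the AzureRM module should do it (the resource group and app names are placeholders);

$rg  = '<resourcegroupname>'
$app = '<functionappname>'
$site = Get-AzureRmWebApp -ResourceGroupName $rg -Name $app
# carry the existing settings across, then add the new one
$settings = @{}
foreach ($s in $site.SiteConfig.AppSettings) { $settings[$s.Name] = $s.Value }
$settings['WEBSITE_USE_ZIP'] = '1'
Set-AzureRmWebApp -ResourceGroupName $rg -Name $app -AppSettings $settings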

Refresh your Function App and you’ll notice it is now Read Only as it is running from Zip. All the Functions that were in the Zip are displayed.

Functions Migrated.PNG

Summary

This is a quick and easy method to get your functions copied from one Tenant to another. Keep in mind that if your functions use Application Settings, Key Vaults, Managed Service Identity and similar options, you’ll need to add those settings, certificates and credentials in the target environment.

How to quickly copy an Azure Web App between Azure Tenants using ‘Zip Push Deploy’

In the last couple of weeks I’ve had to copy a bunch of Azure WebApps and Functions from one Azure Tenant to another. I hadn’t had to do this for a while and went looking for the quickest and easiest way to accomplish it. As with anything cloud based, things move fast. Some of the methods I found were too onerous and more complex than they needed to be. There is of course the Backup option as well. However for WebApps that is only available if you are on a Standard or above tier Plan. Mine weren’t and I didn’t have the desire to uplift to get that feature.

Overview

In this post I show my method to quickly copy an Azure WebApp from one Azure Tenant to another. I cover copying Azure Functions in another post. My approach is;

  • In the Source Tenant from the WebApp
    • Download the Automation Scripts for the WebApp
    • Using Kudu take a backup of the wwwroot folder
  • In the Target Tenant
    • Create a new Resource from a Template
    • Import the Deployment Automation Scripts from above
    • Modify for any changes, Resource Group, Location etc
    • Use Zip Push Deploy to upload the wwwroot archive and deploy it

Backing up the WebApp in the Source Tenant

Open your WebApp in the Azure Portal. Select Automation Script

WebApp Deployment Script

Download the Automation Script

Save Deployment Script

Select Advanced Tools

Kudu Adv Tools

Select the Site Folder then on the right menu of wwwroot select the download icon and save the backup of the WebApp.

Download WWWRoot Folder 3.png

Expand the Deployment Script archive file from the first step above. The contents will look like those below.

Expand the Deploy Script Archive.PNG

Deploy the WebApp to another Tenant

In the Azure Portal select Create a Resource from the top of the menu list on the left hand side. Type Template in the search box and select Template Deployment then select Create. Select Build your own template in the editor. Select Load File and select the parameters.json file. Then select Load File again and select the template.json file. Select Save.

Load Parameters then Template JSON Files

Make any changes to naming, and provide an existing or new Resource Group for the WebApp. Select Purchase.

New Template Deployment - Change Parameters
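If you prefer PowerShell over the portal for this step, the same two files can be deployed with the AzureRM module; a sketch (the resource group name and location are placeholders);

New-AzureRmResourceGroup -Name '<targetRGname>' -Location '<region>'
New-AzureRmResourceGroupDeployment -ResourceGroupName '<targetRGname>' `
    -TemplateFile .\template.json -TemplateParameterFile .\parameters.json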

The WebApp will be created. Once completed select it from the Resource Group you specified and select Advanced Tools. From the Tools menu select Zip Push Deploy.

Tools Zip Push Deploy

Drag and drop the Zip file with the archive of the wwwroot folder you created earlier.

Drop WebApp ZipFile Export via Kudu

The zip will be processed and the WebApp deployed.

Deployed WebApp
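The same deployment can also be scripted against Kudu’s zipdeploy API if you’d rather not drag and drop; a sketch using the app’s deployment credentials (the user, password and app name are placeholders);

$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes('<deploymentUser>:<deploymentPassword>'))
Invoke-RestMethod -Uri "https://<webappname>.scm.azurewebsites.net/api/zipdeploy" `
    -Headers @{ Authorization = "Basic $auth" } `
    -Method POST -InFile .\wwwroot.zip -ContentType 'multipart/form-data'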

Selecting the App in the new Tenant we can see it is deployed and running.

App Running.PNG

Hitting the App URL we can see that it is being served.

Deployed App.PNG

This WebApp is the Microsoft Identity Manager User Object Report that I detailed in this post here.

Summary

In less than 10 minutes the WebApp is copied. No modifying JSON files, no long command lines, no FTP clients. Pretty simple. In the next post I’ll detail how I copied Azure Functions using a similar process.

Keep in mind that if your WebApp uses Application Settings, Key Vaults, Managed Service Identity and similar options, you’ll need to add those settings and certificates/credentials in the target environment.

Automating the creation of Azure IoT Hubs and the registration of IoT Devices with PowerShell and VS Code

The creation of an Azure IoT Hub is quick and simple, either through the Azure Portal or using PowerShell. But what can get more time-consuming is the registration of IoT Devices with the IoT Hub and generation of SAS Tokens for them for authentication.

In my experiments with micro-controllers and their integration with Azure IoT Services I often find I keep having to manually do tasks that should have just been automated. So I did. In this post I’ll cover using PowerShell to;

  • create an Azure IoT Hub
  • register an Azure IoT Device
  • generate a SAS Token for the IoT Device to use for authentication to an Azure IoT Hub from a Mongoose OS enabled ESP8266 micro controller

IoT Integration

Prerequisites

In order to fully test this, ideally you will have a micro-controller. I’m using an ESP8266 based micro-controller like this one. If you want to test this out without physical hardware, you could generate your own DeviceID (any text string) and use the AzureIoT Library detailed further on to send MQTT messages.

You will also require an Azure Subscription. I detail using a Free Tier Azure IoT Hub, which is limited to 8000 messages per day. And instead of PowerShell/PowerShell ISE, get and use Visual Studio Code.

Finally you will need the AzureRM and AzureIoT PowerShell modules. With WMF 5.x you can get them from the PowerShell Gallery with;

Install-Module AzureRM
Install-Module AzureIoT

Create an Azure IoT Hub

The script below will create a Free Tier Azure IoT Hub. Change the location (line 15) for which Azure Region you will use (the commands on the lines above will list what regions are available), the Resource Group Name that will be created to hold it (line 18) and the name of the IoT Hub (line 23) and let it rip.
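The embedded script isn’t reproduced here, but a minimal sketch of what it does with the AzureRM modules looks like this (the region, Resource Group and IoT Hub names are placeholders);

Login-AzureRmAccount

# list the Azure Regions that support IoT Hub
((Get-AzureRmResourceProvider -ProviderNamespace Microsoft.Devices).ResourceTypes |
    Where-Object { $_.ResourceTypeName -eq 'IotHubs' }).Locations

$location      = '<region>'
$resourceGroup = '<resourcegroupname>'
New-AzureRmResourceGroup -Name $resourceGroup -Location $location

# F1 = Free Tier; one per subscription, limited to 8000 messages per day
New-AzureRmIotHub -ResourceGroupName $resourceGroup -Name '<iothubname>' -SkuName F1 -Units 1 -Location $location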

From your micro-controller we will need the DeviceID. I’m using the ID generated by the device which I obtained from the Device Configuration => Expert View of my Mongoose OS enabled ESP8266.

Device Config.PNG

Register the IoT Device with our Azure IoT Hub

Using the AzureIoT PowerShell module we can automate the creation/registration of the IoT Device. Update the script below for the name of your IoT Hub and the Resource Group that contains it that you created earlier (lines 7 and 11). Update line 21 for the DeviceID of your new IoT Device. I’m using the AzureIoT module to do this. With WMF 5.x you can install it quickly from the gallery with Install-Module AzureIoT.

Looking at our IoTHub in the Azure Portal we can see the newly registered IoT Device.

DeviceCreated.png

Generate an IoT Device SAS Token

The final step is to create a SAS Token for our IoT Device to use to connect to the Azure IoT Hub. Historically you would use the IoT Device Explorer to do that. Alternatively, you can use the code samples to implement SAS Device Token generation via an Azure Function App; examples exist for JavaScript and C#. However, as of mid-January 2018 you can do it directly from VS Code or Azure Cloud Shell using the Azure CLI and the IoT Extension. I’m using this method here as it is the quickest and simplest way of generating the Device SAS Token.

The command to generate a token that would work for all Devices on an IoT Hub is

az iot hub generate-sas-token --hub-name <hubname>

Here I show executing it via the Azure Cloud Shell after installing the IoT Extensions as detailed here. To open the Bash Cloud Shell, select the >_ icon next to the notification bell in the top right menu.

Generate IOT Device SAS Token.PNG

As we have done everything else via PowerShell and VS Code, we can also do it easily from VS Code. Install the Azure CLI Tools (v0.4.0 or later) in VS Code as detailed here. Then from within VS Code press Ctrl + Shift + P to open the Command Palette and enter Azure: Sign In. Sign in to Azure. Then press Ctrl + Shift + P again and enter Azure: Open Bash in Cloud Shell to open a Bash Azure CLI Shell. You can check whether you have the Azure CLI IoT Extension (if you’ve previously used the Azure CLI for IoT operations) by typing;

az extension show --name azure-cli-iot-ext

and install it if you don’t with;

az extension add --name azure-cli-iot-ext

Then run the same command from VS Code to generate the SAS Token

az iot hub generate-sas-token --hub-name <hubname>

VSCode Generate SAS Token.PNG

NOTE: That token can then be used for any Device registered with that IoT Hub. Best practice is to have a token per device. To do that type

az iot hub generate-sas-token --hub-name <hubname> --device-id <deviceid>

Generate SAS Token VS Code Per Device.PNG

By default you will get a token valid for 1 hour. Use the --duration switch (a value in seconds) to specify the lifetime of the token you require for your environment, e.g. --duration 86400 for 24 hours.

We can now take the SAS Token and put it into our MQTT Config on our Mongoose OS IoT Device. Update the Device Configuration using Expert View and Save.

Mongoose SAS Config.PNG

We can then test our IoT Device sending updates to our Azure IoT Hub. Update Init.js using the telemetry sample code from Mongoose.

load('api_config.js');
load('api_mqtt.js');
load('api_sys.js');
load('api_timer.js');

let topic = 'devices/' + Cfg.get('device.id') + '/messages/events/';

Timer.set(1000, true /* repeat */, function() {
  let msg = JSON.stringify({ ram: Sys.free_ram() });
  let ok = MQTT.pub(topic, msg, 1);
  print(ok, topic, '->', msg);
}, null);

We can then see the telemetry being sent to our Azure IoT Hub using MQTT. In the Device Logs, after the datestamp and before devices/, if you see a 0 instead of a 1 (as shown below) then your connection information or SAS Token is not correct.

Mongoose IOT Events.png

On the Azure IoT side we can then check the metrics and see the incoming telemetry using the counter Telemetry Metrics Sent as shown below.

Telemetry Metrics Sent.PNG

If you don’t have an IoT Device you can simulate one using PowerShell. The following example shows sending a message to our IoT Hub (using variables from previous scripts).

$deviceParams = @{
 iotConnString = $IoTConnectionString
 deviceId = $deviceID
}
$deviceKeys = Get-IoTDeviceKey @deviceParams 
# Get Device 
$device = Get-IoTDeviceClient -iotHubUri $IOTHubDeviceURI -deviceId $deviceID -deviceKey $deviceKeys.DevicePrimaryKey

# Send Message
$deviceMessageParams = @{
 deviceClient = $device
 messageString = "Azure IOT Hub"
}
Send-IoTDeviceMessage @deviceMessageParams

Summary

Using PowerShell we have quickly been able to;

  • Create an Azure IoT Hub
  • Register an IoT Device
  • Generate the SAS Token for the IoT Device to authenticate to our IoT Hub with
  • Configure our IoT Device to send telemetry to our Azure IoT Hub and verify integration/connectivity

We are now ready to implement logic onto our IoT Device for whatever it is you are looking to achieve.

Integration of Microsoft Identity Manager with Azure Platform-as-a-Service Services

Overview

This isn’t an out of the box solution. This is a bespoke solution that takes a number of elements and puts them together in a unique way. I’m not expecting anyone to implement this specific solution (but you’re more than welcome to) but to take inspiration from it to implement solutions relevant to your environment(s). This post supports a presentation I did to The MIM Team User Group on 14 June 2017.

This post describes a solution that;

  • Leverages an Azure WebApp (NodeJS) to present a simple website. That site can be integrated easily in the FIM/MIM Portal
  • The NodeJS website leverages an Azure Function App to get a list of users from the FIM/MIM Synchronization Server and allows the user to use typeahead functionality to find the user they want to generate a FIM/MIM object report on
  • On selection of a user, a request will be sent to another Azure Function App to generate and return the report to the user in a new browser window

This is shown graphically below.

 

Report Request UI

The NodeJS WebApp is integrated into the FIM/MIM Portal. Bootstrap Typeahead is used to find the user to generate a report on. The Typeahead userlist is fulfilled by an Azure Function that queries the MIM Sync Metaverse. The Generate Report button fires off a call to FIM/MIM via another Azure Function into the MIM Sync and MIM Service to generate the report.

The returned report opens in a new tab in the user’s browser. The report contains details of the FIM/MIM connectors the user is represented on.

The values of all attributes for the user’s hologram from the Metaverse are displayed, along with the MA each value came from and the last modified date.

Finally, the report shows the metadata from the MIM Service MA Connector Space and the MIM Service.

Prerequisites

These are numerous, but I’ve previously posted about them. You will need;

I encourage you to digest those posts to understand how to configure the prerequisites for this solution.

Additional Solution Requirements

To bring all the individual components together, there are a few additional tasks to enable this solution.

  • Enable CORS on your Azure Function App Configuration (see details further below)
  • If you want to display User Object Photos as part of the report, you will likely need to synchronize them into FIM/MIM from an authoritative source (e.g. Office 365/Exchange Online). Check out this post, and see the additional details further below
  • In order to embed the NodeJS WebApp into the FIM/MIM Portal, this post provides the details. Change the target URL from the PowerBI URL to your NodeJS site
  • Object Report Request WebApp (see below for sample site)

Azure Functions Cross Origin Resource Sharing (CORS)

You will need to configure CORS to allow the NodeJS WebApp to access the Azure Functions (from both local and Azure). Reflect your port number if it is different from 3000, and use the DNS name for your Azure WebApp.
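For example, the allowed origins list would look something like this (substitute your own WebApp name);

http://localhost:3000
https://<yourwebappname>.azurewebsites.net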

Sample UI NodeJS HTML

Here is a sample HTML file for your NodeJS WebApp with the UI that provides the LoginID input, fulfilled by the NodeJS JavaScript file further below.

Sample UI NodeJS JavaScript

The following NodeJS JavaScript supports the HTML UI above. It populates the LoginID typeahead box and handles the Submit Report button to generate the report for the desired object(s). Yes, if you use the UI to (individually) select multiple different objects, each will be returned in its own output window.

As the HTML file above indicates you will need to obtain and make available as part of your NodeJS project the typeahead.bundle.js library.

Azure PowerShell Trigger Function App for AccountNames Lookup

The following Azure Function takes the call from the load of the NodeJS WebApp to populate the typeahead userlist.
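The function script itself is embedded in the original post; as a rough sketch, a Functions v1 PowerShell HTTP trigger that remotes to the MIM Sync Server and returns the AccountNames as JSON could look like this (the server name, credential storage and the Metaverse query itself are placeholders);

# $res is supplied by the Azure Functions (v1) PowerShell runtime
$password = ConvertTo-SecureString $env:MIMSyncPassword -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ($env:MIMSyncUser, $password)

$accountNames = Invoke-Command -ComputerName '<mimsyncserver>' -Credential $cred -ScriptBlock {
    # query the Metaverse here (e.g. via the Lithnet MIIS Automation PowerShell module)
    # and return the list of AccountName values for the typeahead
}

Out-File -Encoding ascii -FilePath $res -InputObject ($accountNames | ConvertTo-Json)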

Azure PowerShell Trigger Function App for User Object Report

Similar in structure to the Username List Lookup Azure Function above, but in the ScriptBlock you embed the Report Generation Script that is detailed here. Modify for what you want to report on.

Photos in the Report

If you want to display images in your report, you will need to determine if the user has an image during the MV metadata report generation part of the script. Add the following lines (updating for the name of your image attribute; mine is named EXOPhoto) after the Try {} Catch {} in the section that starts $obj = @() ; foreach ($attr in $attributes.Keys)

# Display the Objects Photo rather than Base64 string
if ($attr.equals("EXOPhoto")){
    $objectphoto = "<img src=$([char]0x22)data:image/jpeg;base64,$($attributes.$attr.Values.Valuestring)$([char]0x22)>"
    $val = "System.Byte[]"
}

Then in the output of the HTML report at the end of the report generation insert the $objectphoto variable into the HTML stream.

# Output MIM Service Object Data
 $MIMServiceObjOut = $MIMServiceObjectMetaData | Sort-Object -Property Attribute | ConvertTo-Html -Fragment
 $htmlreport = ConvertTo-HTML -Body "$htmlcss<h1>Microsoft Identity Manager User Object Report</h1><h2>Query</h2>$sourcequery</br><b><center>$objectphoto</br>NOTE: Only attributes with values are displayed.</center></b><h2>Connector(s) Summary</h2>$connectorsummary<h2>MetaVerse Data</h2>$objectmetadata <h2>MIM Service CS Object Data</h2>$MIMServiceCSobjectmetadata <h2>MIM Service Object Data</h2>$MIMServiceObjOut" -Title "MIM Object Report"

As you can see above I’ve also injected the CSS ($htmlcss) into the output stream at the beginning of the Body section.  Somewhere in your script block you will need to define your CSS values. e.g.

 # StyleSheet for nice pretty output
 $htmlcss = "<style>
    h1, h2, th { text-align: center; }
    table { margin: auto; font-family: Segoe UI; box-shadow: 10px 10px 5px #888; border: thin ridge grey; }
    th { background: #0046c3; color: #fff; max-width: 400px; padding: 5px 10px; }
    td { font-size: 11px; padding: 5px 20px; color: #000; }
    tr { background: #b8d1f3; }
    tr:nth-child(even) { background: #dae5f4; }
    tr:nth-child(odd) { background: #b8d1f3; }
 </style>"

Summary

An interesting solution integrating Azure PaaS Services with Microsoft Identity Manager via PowerShell and the extremely versatile Lithnet FIM/MIM PowerShell Modules.

Please share your implementations enhancing your FIM/MIM Solution.

How to build and deploy an Azure NodeJS WebApp using Visual Studio Code

Introduction

This week I had the need to build a small web application with a reasonably simple front end that will later be integrated inside a Portal. The web application isn’t going to be high use and didn’t necessitate deployment of infrastructure (VMs). I’d messed with NodeJS a while back in this post, where I configured a UI for Microsoft Identity Manager and Azure based functions.

In the back of my mind I knew I didn’t want to go for a full Visual Studio Project Solution for this, and with the recent updates to Visual Studio Code I figured it must be possible to do it using it. There wasn’t much around on doing it, so I dived in and worked it out for myself. Here I share the end-to-end process to make it easy for you to get started.

Overview

What you will need on your development workstation before you start are the following components. Download and install them on your dev machine.

You will also need an Azure Subscription to where you will publish your NodeJS site.

This post details setting up Visual Studio Code to build a shell NodeJS site and deploy it to Azure using a local GIT Repository. Let’s get started.

Visual Studio Code Extensions

A really smart and handy extension for VS Code is Azure Tools for VS Code. Released a few months ago (January 2017), this extension allows you to quickly create a Web App (Resource Group, App Service, Application Service Plan etc) from within VS Code. With VS Code on your development machine from the prerequisites above, click on the Extensions icon (bottom left) in the VS Code menu and type Azure Tools. Click the green Install button.

Azure Tools for VS Code

Creating the NodeJS Site in VS Code

I had a couple of attempts at doing this before I found a quick, neat and repeatable method of getting started. In order to get the Web App deployed and accessible correctly in Azure I found it easiest to use the Sample Azure NodeJS Hello World example from here. Download that sample and extract the contents to a new folder on your workstation. I created a new path on mine named …\NodeJS\nodejssite and dropped the sample in there so it looked like below.

After creating the folder structure and putting the sample in it, whilst in the sub-directory type:

code .

That will startup Visual Studio Code in the newly created folder with the starter sample.

Install Express for NodeJS

To that base sample site we’ll install Express. From the Terminal tab in the lower pane type:

npm install -g express-generator

Express App

With Express now on our machine, let’s add the Express App to our new NodeJS site. Type express in the Terminal window.

express

Accept that the directory is not empty

This will create the folder structure for Express.

Now, to get all the files and modules for our site configured for our app, run npm install.

Now type npm start in the terminal window to start our new site.

The NodeJS site will start. Open a Web Browser and go to http://localhost:3000 and you should see the Express empty site.

Navigate to views => index.jade and update the text like I have below.

Refresh your browser window and you should see the text updated.

In the terminal window press Ctrl + C to stop NodeJS.

Test Deploy to Azure

Now let’s do a test deploy of our shell site as an Azure WebApp.

Press Ctrl + Shift + P or from the View menu select Command Palette.

Type Azure: Login 

This will generate a code and give you a link to open in your browser and login

Paste in the code from the clipboard and select continue

Then login with your account for the Tenant to which you want to deploy the WebApp. You’ll then be authorized.

From the Command Palette type azure sub and choose Azure: List Azure Subscriptions and choose the subscription where you will create and deploy the WebApp

Now from the Command Palette type Azure Create a Web App (Simple).

Give the WebApp a name. This will become the WebApp Name and the basis for all the associated WebApp components. Use Create a Web App (Advanced) if you want to be more specific about the names of the App Resources etc.

If you watch the bottom VS Code Status bar you will see the Azure Tools extension create the new Resource Group, Web App and Web App Plan.

Login to the Azure Portal, select the new Web App.

Select Deployment Options and then Local Git Repository. Select OK.

Select Deployment credentials and provide a username and password. You’ll need this shortly to publish your site.

Click Overview. Copy the Git clone url.

Back in VS Code, select the GIT icon (under the magnifying glass) and from the top choose Initialize Repository.

Then in the terminal window type git remote add azure <git clone url> obtained from the step above.

Type Initial Commit as the message and click the tick icon in the Source Control menu bar.

Select the … (more actions) menu and select Publish.

Select azure as the remote target we setup earlier.
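If you prefer the integrated terminal to the Source Control UI, the equivalent of the commit and publish steps is roughly;

git add -A
git commit -m "Initial Commit"
git push azure master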

You’ll be prompted to authenticate. Use the account you created above in Deployment Credentials.

Back in the Azure Portal under the Web App under Deployment Options you will see the initial commit.

Click on Overview and you should see that it is running. Click on URL and the site will open in a new tab in your browser.

Updating our WebApp

Now, let’s make a change to our WebApp.

Back in VS Code, click on the files and folders icon in the top left corner, navigate to views => index.jade and update the title. Hit Ctrl + S (or select Save from the File menu). In the Terminal below, type npm start to start our NodeJS site locally.

Check out the update locally. In a browser navigate to the local NodeJS site on localhost:3000. You’ll see the changed page.

Select the Git icon on the left menu, give the update some text, e.g. ‘updated page text’, and select the tick from the top menu.

Select the … (more actions) menu and choose Push to publish the changes to our Azure WebApp.

Go back to your browser which was on the Azure WebApp URL and reload. Our change has been pushed and is reflected in the WebApp.

Summary

Very quickly and easily using Visual Studio Code (with NodeJS and Git Desktop installed locally) we have;

  • Created an Azure WebApp
  • Created a base NodeJS site
  • Got a local NodeJS site we can develop
  • Published it to Azure

Now go create something awesome.