Using Azure Cognitive Services to Empower the IT Service Desk

Tonight (29 August 2018) I presented the following presentation on using Azure Cognitive Services to Empower the IT Service Desk / Build Business Applications to the Sydney Azure User Group.

I walked through how to leverage the following Azure Cognitive API Services;

  • Speech to Text
  • Text to Speech
  • Language Understanding Intelligent Service (LUIS)
  • Text Language Translation

For each I showed a working example and demo.

I then walked through my Voice Assistant for Microsoft Identity Manager and how I integrated three of those services along with Azure IoT, Azure Serverless and Azure PaaS Services.

GitPitch Presents: github/darrenjrobinson/SydAzureUG-CognitiveServices (the Markdown Presentation Service on Git).

If you weren’t present I hope this presentation can still give you a start to integrating and leveraging Azure Services.

Find me on Twitter @darrenjrobinson

Exporting IoT Device Information from Azure IoT Hub(s) using PowerShell

Introduction

I have a number of Azure IoT Hubs, each with a number of devices configured on them. I wanted to export the details for each IoT Device. This can’t be done via the Azure Portal (May 2018) so I looked to leverage the AzureRM.IotHub module’s New-AzureRmIotHubExportDevices cmdlet.

Now the documentation for New-AzureRmIotHubExportDevices is a little light on. Whenever I ran New-AzureRmIotHubExportDevices I kept getting the error ‘Operation returned an invalid status code “InternalServerError”’.

After many attempts (over weeks) I was finally able to export my IoT devices using PowerShell. The key was to generate the SAS Storage Token for the Container, rather than creating a blob file to export to and generating a SAS Token for that file. Simply specify the Storage Container to export to.

Overview

My sample script below uses the latest (as of May 2018) version of the AzureRM.IotHub Module (v3.1.3). It;

  • enumerates all Resource Groups in an Azure Subscription, looks for IoT Hubs and puts them into a collection
  • then iterates through each IoT Hub and creates an associated Storage Account (if one doesn’t exist)
  • exports the IoT Devices associated with the IoT Hub to Azure Storage
  • downloads the IoT Devices blob file, opens it and displays the IoT Device Names and Status via PowerShell console output

Export IoT Devices
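If you don’t have the full script handy, here is a minimal sketch of the core calls. The Resource Group, IoT Hub, Storage Account and Container names are placeholders to substitute for your environment, and devices.txt is the blob name the IoT Hub export writes by default.

# Connect and select the subscription
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionName "MySubscription"

# Storage context and a container-level SAS token (the key to avoiding the InternalServerError)
$saKey = (Get-AzureRmStorageAccountKey -ResourceGroupName "MyRG" -Name "myiotstorage")[0].Value
$context = New-AzureStorageContext -StorageAccountName "myiotstorage" -StorageAccountKey $saKey
New-AzureStorageContainer -Name "iotexport" -Context $context -ErrorAction SilentlyContinue
$sasToken = New-AzureStorageContainerSASToken -Name "iotexport" -Permission rwd -Context $context
$containerUri = "https://myiotstorage.blob.core.windows.net/iotexport" + $sasToken

# Export the IoT Devices to the container
New-AzureRmIotHubExportDevices -ResourceGroupName "MyRG" -Name "MyIoTHub" -ExportBlobContainerUri $containerUri -ExcludeKeys
Start-Sleep -Seconds 15 # wait for the export job (a real script should poll for completion)

# Download the export and display each IoT Device Name and Status (one JSON object per line)
Get-AzureStorageBlobContent -Container "iotexport" -Blob "devices.txt" -Context $context -Destination "C:\Temp\devices.txt" -Force
Get-Content "C:\Temp\devices.txt" | ForEach-Object {
    $device = $_ | ConvertFrom-Json
    Write-Host "$($device.id) : $($device.status)"
}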

To use the script you will just need to;

  • change your Subscription Name in Line 4
  • the location where you want to download the blob files to in Line 31
  • if you want to display additional info on each device, or do something else with it, change Line 71 accordingly

The exported file can be found using Azure Storage Explorer as shown below.

Output Blob File

And the script outputs the status to the PowerShell console as shown below.

Exported IoT Devices

The exported object contains all the details for each IoT Device as shown below in the IoT Device PSObject.

IoT Device PSObject

Summary

It is obvious once you work out how the cmdlet works. Hopefully this working example will save someone else a few hours of head scratching.

Global Azure Bootcamp 2018 – Creating the Internet of YOUR Things

Today is the 6th Global Azure Bootcamp and I presented at the Sydney Microsoft Office on Creating the Internet of YOUR Things.

In my session I gave an overview on where IoT is going and some of the amazing things we can look forward to (maybe). I then covered a number of IoT devices that you can buy now that can enrich your life.

I then moved on to building IoT devices and leveraging Azure, the focus of my presentation: how to get started quickly with devices, integration and automation. I provided a working example based off my previous posts Integrating Azure IoT Devices with MongooseOS MQTT and PowerShell, Building a Teenager Notification Service using Azure IoT, an Azure Function, Microsoft Flow, Mongoose OS and a Micro Controller, and Adding a Display to the Teenager Notification Service Azure IoT Device.

I provided enough information and hopefully inspiration to get you started.

Here is my presentation.

Is it you, or is it me? When Cloud Services aren’t always on. “Failed to load external resource” while authenticating to Azure

The weekend just gone (24-25 March 2018) I was nearing the end of a personal project I’d been building around Internet of Things devices integrating with Azure. There were a few loose ends that needed a little tidying up and I’d planned to knock those off on Saturday morning. I opened my laptop, hit portal.azure.com in a browser and got redirected for authentication, at which point I had a blank webpage as shown below.

Error Blank Window for signin after redirect

Figured it was a little weird, but jumped to PowerShell thinking I’d quickly use that to do what I needed.

Error Signing into AzureRM via PowerShell

Huh? That’s not good. I had Teams open so I checked to see if that was working. I switched Identity for a different Team and …

Error Signing into Teams

Mmmm, same error as PowerShell. What the hell was going on? I still had sessions open and functioning that I’d been successfully using on Friday.

Going through my thought process I figured it best to check the Service Status on Azure. That showed all green. Maybe if I pinged Azure Support? So I did, via Twitter.

Azure Support Twitter

They were prompt in replying. After more investigation I was starting to think the problem was on my end.

Azure Support Twitter Responses

So I started troubleshooting: removing cookies, restarting my browser, trying a different browser, changing the default browser and forcing a logout on all Azure accounts via https://login.microsoftonline.com/logout.srf, all to no avail. Same error messages (or, in the case of the web pages, nothing).

What happens if I try to access the URL of the script that the authentication pages are trying to load? The output is below. That doesn’t look good. I was out of time by now with other more pressing appointments, so I went to the pub with friends.

AADCDN Can’t Access

Fast forward to Sunday morning and I wanted to finish off my latest extra-curricular project.

Getting back to my laptop the page above was still the primary window, as I hadn’t restarted or anything. I’d just locked my laptop when I left on Saturday. What would happen if I reloaded that page? Woah, that looks better. Page loads. Bad query which is expected.

AADCDN Working

I switched back to PowerShell, ran Login-AzureRmAccount and, what do you know, it worked. I switched over to Teams, again switched my Identity (for another Team) and it worked. I hit portal.azure.com in my browser and, what do you know, it worked too.

My Conclusion – Go to the pub and come back tomorrow

My laptop hadn’t had any relevant changes other than me deleting a few cookies, after which the issue was still present.

I left my laptop to its own devices for 20 hours and then all of a sudden everything was working. If this happens again and I can’t access https://secure.aadcdn.microsoft-online-p.com and I’m getting the “Failed to load external resource” error when authenticating to Azure, I’ll just go to the pub and come back the next day, and all should be good. And it isn’t me, it was you 🙂


Quickly creating and using an Azure Key Vault with PowerShell


Introduction

A couple of weeks back I was messing around with the Azure Key Vault, looking to centralise a bunch of credentials for my ever-growing list of Azure Functions that are automating numerous tasks. What I found was that getting an Azure Key Vault set up, and getting credentials in and out, was a little more cumbersome than I thought it should be. At that same point this tweet appeared in my Twitter timeline from a retweet. I’m not too sure why, but maybe because I’ve been migrating to VSCode myself, I checked out Axel’s project.

Axel Agazoth tweet

Axel’s PowerShell Module simplifies creating and integrating with the Azure Key Vault. After messing with it and suggesting a couple of enhancements that Axel graciously entertained, I’m creating vaults, adding and removing credentials in the simplified way I’d wanted.

This quickstart guide to using this module will get you started too.

Create an Azure Key Vault

This is one of the beauties of Axel’s module. If the Resource Group and/or Storage Account you want associated with your Key Vault doesn’t exist, it creates them.

Update the following script for the location (line 8) and the name (line 10) that will be given to your Storage Account, Resource Group and Vault. Modify if you want to use different names for each.

Done, Key Vault created.

Create Azure KeyVault
Azure Key Vault Created

Connect to the Azure Key Vault

This script assumes you’re now in a new session and wanting to connect to the Key Vault. Again, a simplified version whereby the SG, RG and KV names are all the same.  Update for your location and Key Vault name.

Connected.

Connect to Azure Key Vault

Add a Certificate to the Azure Key Vault

To add a certificate to our new Key Vault use the command below. It will prompt you for your certificate password and add the cert to the key vault.

Add Cert to Vault

Certificate added to Key Vault.

Cert Added to Vault

Retrieve a Certificate from the Azure Key Vault

To retrieve a certificate from the Key Vault is just as simple.

$VaultCert = Get-AzureCertificate -Name "AADAppCert" -ResourceGroupName $name -StorageAccountName $name -VaultName $name
Retrieve a Cert

Add Credentials to the Azure Key Vault

Adding username/password or clientID/clientSecret to the Key Vault is just as easy.

# Store credentials into the Azure Key Vault
Set-AzureCredential -UserName "serviceAccount" -Password ($pwd = Read-Host -AsSecureString) -VaultName $name -StorageAccountName $name -Verbose

Credentials added to vault

Add Creds to Key Vault
Creds Added to Vault

Retrieve Credentials from the Azure Key Vault

Retrieving credentials is just as easy.

# Get credentials from the Azure Key Vault
$AzVaultCreds = Get-AzureCredential -UserName "serviceAccount" -VaultName $name -StorageAccountName $name -Verbose

Credentials retrieved.

Retrieve Account Creds

Remove Credentials from the Azure Key Vault

Removing credentials is also a simple cmdlet.

# Remove credentials from the Azure Key Vault
Remove-AzureCredential -UserName "serviceAccount" -VaultName $name -StorageAccountName $name -Verbose

Credentials removed.

Remove Credentials from Key Vault

Summary

Hopefully this gets you started quickly with the Azure Key Vault. Credit to Axel for creating the module. It’s now part of my toolkit that I’m using a lot.

Integration of Microsoft Identity Manager with Azure Platform-as-a-Service Services

Overview

This isn’t an out of the box solution. This is a bespoke solution that takes a number of elements and puts them together in a unique way. I’m not expecting anyone to implement this specific solution (but you’re more than welcome to) but to take inspiration from it to implement solutions relevant to your environment(s). This post supports a presentation I did to The MIM Team User Group on 14 June 2017.

This post describes a solution that;

  • Leverages an Azure WebApp (NodeJS) to present a simple website. That site can be integrated easily in the FIM/MIM Portal
  • The NodeJS website leverages an Azure Function App to get a list of users from the FIM/MIM Synchronization Server and allows the user to use typeahead functionality to find the user they want to generate a FIM/MIM object report on
  • On selection of a user, a request will be sent to another Azure Function App to generate and return the report to the user in a new browser window

This is shown graphically below.


Report Request UI

The NodeJS WebApp is integrated into the FIM/MIM Portal. Bootstrap Typeahead is used to find the user to generate a report on. The Typeahead userlist is fulfilled by an Azure Function that queries the MIM Sync Metaverse. The Generate Report button fires off a call to FIM/MIM via another Azure Function into the MIM Sync and MIM Service to generate the report.

The returned report opens in a new tab in the user’s browser. The report contains details of the FIM/MIM connectors the user is represented on.

The values of all attributes for the user’s hologram from the Metaverse are displayed, along with the MA the value came from and the last modified date.

Finally, the report shows the metadata from the MIM Service MA Connector Space and the MIM Service.

Prerequisites

These are numerous, but I’ve previously posted about them. You will need;

I encourage you to digest those posts to understand how to configure the prerequisites for this solution.

Additional Solution Requirements

To bring all the individual components together, there are a few additional tasks to enable this solution.

  • Enable CORS on your Azure Function App Configuration (see details further below)
  • If you want to display User Object Photos as part of the report, you will likely need to synchronize them into FIM/MIM from an authoritative source (e.g. Office365/Exchange Online). Check out this post, and additional details further below
  • In order to embed the NodeJS WebApp into the FIM/MIM Portal, this post provides the details. Change the target URL from the PowerBI URL to your NodeJS site
  • Object Report Request WebApp (see below for sample site)

Azure Functions Cross Origin Resource Sharing (CORS)

You will need to configure CORS to allow the NodeJS WebApp to access the Azure Functions (from both local and Azure). Reflect your port number if it is different from 3000, and use the DNS name for your Azure WebApp.

Sample UI NodeJS HTML

Here is a sample HTML file for your NodeJS WebApp, with the UI providing input for the LoginID, fulfilled by the NodeJS JavaScript file further below.

Sample UI NodeJS JavaScript

The following NodeJS JavaScript supports the HTML UI above. It populates the LoginID typeahead box and handles the Submit Report button to fulfil the report for the desired object(s). Yes, if you use the UI to select (individually) multiple different objects, all will be returned in their own output windows.

As the HTML file above indicates you will need to obtain and make available as part of your NodeJS project the typeahead.bundle.js library.

Azure PowerShell Trigger Function App for AccountNames Lookup

The following Azure Function takes the call from the load of the NodeJS WebApp to populate the typeahead userlist.

Azure PowerShell Trigger Function App for User Object Report

Similar in structure to the Username List Lookup Azure Function above, but in the ScriptBlock you embed the Report Generation Script that is detailed here. Modify for what you want to report on.

Photos in the Report

If you want to display images in your report, you will need to determine if the user has an image during the MV metadata report generation part of the script. Add the following lines (updating for the name of your Image attribute; mine is named EXOPhoto) after the Try {} Catch {} in this section $obj = @() ; foreach ($attr in $attributes.Keys)

 # Display the Objects Photo rather than Base64 string
 if ($attr.equals("EXOPhoto")){
     $objectphoto = "<img src=$([char]0x22)data:image/jpeg;base64,$($attributes.$attr.Values.Valuestring)$([char]0x22)>"
     $val = "System.Byte[]"
 }

Then in the output of the HTML report at the end of the report generation insert the $objectphoto variable into the HTML stream.

# Output MIM Service Object Data
 $MIMServiceObjOut = $MIMServiceObjectMetaData | Sort-Object -Property Attribute | ConvertTo-Html -Fragment
 $htmlreport = ConvertTo-HTML -Body "$htmlcss<h1>Microsoft Identity Manager User Object Report</h1><h2>Query</h2>$sourcequery</br><b><center>$objectphoto</br>NOTE: Only attributes with values are displayed.</center></b><h2>Connector(s) Summary</h2>$connectorsummary<h2>MetaVerse Data</h2>$objectmetadata <h2>MIM Service CS Object Data</h2>$MIMServiceCSobjectmetadata <h2>MIM Service Object Data</h2>$MIMServiceObjOut" -Title "MIM Object Report"

As you can see above I’ve also injected the CSS ($htmlcss) into the output stream at the beginning of the Body section.  Somewhere in your script block you will need to define your CSS values. e.g.

 # StyleSheet for nice pretty output
 $htmlcss = "<style>
    h1, h2, th { text-align: center; }
    table { margin: auto; font-family: Segoe UI; box-shadow: 10px 10px 5px #888; border: thin ridge grey; }
    th { background: #0046c3; color: #fff; max-width: 400px; padding: 5px 10px; }
    td { font-size: 11px; padding: 5px 20px; color: #000; }
    tr { background: #b8d1f3; }
    tr:nth-child(even) { background: #dae5f4; }
    tr:nth-child(odd) { background: #b8d1f3; }
 </style>"

Summary

An interesting solution integrating Azure PaaS Services with Microsoft Identity Manager via PowerShell and the extremely versatile Lithnet FIM/MIM PowerShell Modules.

Please share your implementations enhancing your FIM/MIM Solution.

How to build and deploy an Azure NodeJS WebApp using Visual Studio Code

Introduction

This week I had the need to build a small web application with a reasonably simple front end that will later be integrated inside a Portal. The web application isn’t going to be high use and didn’t necessitate deployment of infrastructure (VM’s). I’d messed with NodeJS a while back in this post where I configured a UI for Microsoft Identity Manager and Azure based functions.

In the back of my mind I knew I didn’t want to have to go for a full Visual Studio Project Solution for this, and with the recent updates to Visual Studio Code I figured it must be possible to do it using it. There wasn’t much around on doing it, so I dived in and worked it out for myself. Here I share the end-to-end process to make it easy for you to get started.

Overview

What you will need on your development workstation before you start are the following components. Download and install them on your dev machine.

You will also need an Azure Subscription to where you will publish your NodeJS site.

This post details setting up Visual Studio Code to build a shell NodeJS site and deploy it to Azure using a local GIT Repository. Let’s get started.

Visual Studio Code Extensions

A really smart and handy extension for VS Code is Azure Tools for VS Code. Released a few months ago (January 2017), this extension allows you to quickly create a Web App (Resource Group, App Service, Application Service Plan etc) from within VS Code. With VSCode on your development machine from the prerequisites above, click on the Extensions Icon (bottom left) in the VSCode menu and type Azure Tools. Click the green Install button.

Azure Tools for VS Code

Creating the NodeJS Site in VS Code

I had a couple of attempts at doing this before I found a quick, neat and repeatable method of getting started. In order to get the Web App deployed and accessible correctly in Azure I found it easiest to use the Sample Azure NodeJS Hello World example from here. Download that sample and extract the contents to a new folder on your workstation. I created a new path on mine named …\NodeJS\nodejssite and dropped the sample in there so it looked like below.

After creating the folder structure and putting the sample in it, whilst in the sub-directory type:

code .

That will start up Visual Studio Code in the newly created folder with the starter sample.

Install Express for NodeJS

To that base sample site we’ll install Express. From the Terminal tab in the lower pane type:

npm install -g express-generator

Express App

With Express now on our machine, let’s add the Express App to our new NodeJS site. Type express in the Terminal window.

express

Accept that the directory is not empty

This will create the folder structure for Express.

Now to get all the files and modules for our site configured for our app run npm install

Now type npm start in the terminal window to start our new site.

The NodeJS site will start. Open a Web Browser and go to http://localhost:3000 and you should see the Express empty site.

Navigate to views => index.jade. Update the text like I have below.

Refresh your browser window and you should see the text updated.

In the terminal window press Ctrl + C to stop NodeJS.

Test Deploy to Azure

Now let’s do a test deploy of our shell site as an Azure WebApp.

Press Ctrl + Shift + P or from the View menu select Command Palette.

Type Azure: Login 

This will generate a code and give you a link to open in your browser and login

Paste in the code from the clipboard and select continue

Then login with your account for the Tenant where you want to deploy the WebApp to. You’ll then be authorized.

From the Command Palette type azure sub and choose Azure: List Azure Subscriptions and choose the subscription where you will create and deploy the WebApp

Now from the Command Palette type Azure Create a Web App (Simple).

Give the WebApp a name. This will become the WebApp Name, and the basis for all the associated WebApp components. Use Create a Web App (Advanced) if you want to be more specific about the names of the App Resources etc.

If you watch the bottom VS Code Status bar you will see the Azure Tools extension create the new Resource Group, Web App and Web App Plan.

Login to the Azure Portal, select the new Web App.

Select Deployment Options and then Local Git Repository. Select OK.

Select Deployment credentials and provide a username and password. You’ll need this shortly to publish your site.

Click Overview. Copy the Git clone url.

Back in VS Code, select the GIT icon (under the magnifying glass) and from the top choose Initialize Repository.

Then in the terminal window type git remote add azure <git clone url> obtained from the step above.

Type Initial Commit as the message and click the tick icon in the Source Control menu bar.

Select the … (more actions) menu and select Publish

Select azure as the remote target we setup earlier.

You’ll be prompted to authenticate. Use the account you created above in Deployment Credentials.
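If you prefer the terminal over the Source Control UI, the equivalent Git commands look something like this (the azure remote name matches what we added above):

git add .
git commit -m "Initial Commit"
git push azure master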

Back in the Azure Portal under the Web App under Deployment Options you will see the initial commit.

Click on Overview and you should see that it is running. Click on URL and the site will open in a new tab in your browser.

Updating our WebApp

Now, let’s make a change to our WebApp.

Back in VS Code, click on the files and folder icon in the top left corner, navigate to views => index.jade and update the title. Hit Ctrl + S (or select Save from the File menu). In the Terminal below type npm start to start our NodeJS site locally.

Check out the update locally. In a browser navigate to the local NodeJS site on localhost:3000. You’ll see the changed page.

Select the Git icon on the left menu, give the update some text eg. ‘updated page text’ and select the tick from the top menu.

Select the … (more actions) menu and choose Push to publish the changes to our Azure WebApp.

Go back to your browser which was on the Azure WebApp URL and reload. Our change has been pushed and is reflected in the WebApp.

Summary

Very quickly and easily using Visual Studio Code (with NodeJS and Git Desktop installed locally) we have;

  • Created an Azure WebApp
  • Created a base NodeJS site
  • Got a local NodeJS site we can develop on
  • Published it to Azure

Now go create something awesome.

How to configure a Graphical PowerShell Dev/Admin/Support User Interface for Azure/Office365/Microsoft Identity Manager

During the development of an identity management solution I find myself with multiple PowerShell/RDP sessions connected to multiple environments using different credentials often to obtain trivial data/information. It is easy to trip yourself up as well with remote powershell sessions to differing environments. If only there was a simple UI that could front-end a set of PowerShell modules and make those simple queries quick and painless. Likewise to allow support staff to execute a canned set of queries without providing them elevated permissions.

I figured someone would have already solved this problem, and after some searching with the right keywords I found the powershell-command-executor-ui from bitsofinfo. Looking into it, he had solved a lot of the issues with building a UI front-end for PowerShell with the powershell-command-executor and the stateful-process-command-proxy. That solution provided the framework for what I was thinking. The ability to provide a UI for PowerShell using PowerShell modules, including remote PowerShell, was exactly what I was after. And it was built on NodeJS and AngularJS, so simple enough for some customization.

Introduction

In this blog post I’ll detail how I’ve leveraged the projects listed above for integration with;

Initially I had a vision of serving up the UI from an Azure WebApp. NodeJS on Azure WebApps is supported, however with all the solution dependencies I just couldn’t get it working.

My fallback was to then look to serve up the UI from a Windows Server 2016 Nano Server. I learnt from my efforts that a number of the PowerShell modules I was looking to provide a UI for have .NET Framework dependencies. Nano Server does not have full .NET Framework support; Microsoft state that to add it would mean the server would no longer be Nano.

For now I’ve deployed an Azure Windows Server 2016 Server secured by an Azure NSG to only allow my machine to access it. More on security later.

Overview

Simply put, the details in Github for the powershell-command-executor provide the architecture and integration. What I will detail is the modifications I’ve made to utilize the more recent AzureADPreview PowerShell Module over the MSOL PowerShell Module. I also updated the dependencies of the solution to the latest versions and hooked it into Microsoft Identity Manager. I also made a few changes to allow different credentials to be used for Azure and Microsoft Identity Manager.

Getting Started

I highly recommend you start with your implementation on a local development workstation/development virtual machine. When you have a working version you’re happy with you can then look at other ways of presenting and securing it.

NodeJS

NodeJS is the webserver for this solution. Download NodeJS for your Windows host here. I’m using the 64-bit version, but have also implemented the solution on 32-bit. Install NodeJS on your local development workstation/development virtual machine.

You can accept all the defaults.

Following the installation of NodeJS, download the powershell-command-executor-ui from GitHub. Select Clone or Download, then Download ZIP, and save it to your machine.

Right click the download when it has finished and select Extract All. Select Browse and create a folder at the root of C:\ named nodejs. Extract powershell-command-executor-ui.

Locate the c:\nodejs\powershell-command-executor-ui-master\package.json file.

Using an editor such as Notepad++ update the package.json file ……

…… so that it looks like the following. This will utilise the latest versions of the dependencies for the solution.

From an elevated (Administrator) command prompt in the c:\nodejs\powershell-command-executor-ui-master directory run “c:\Program Files\nodejs\npm” install. This will read the package.json file you edited and download the dependencies for the solution.

You can see in the screenshot below NodeJS has downloaded all the items in package.json including the powershell-command-executor and stateful-process-command-proxy.

When you now list the directories under C:\nodejs\powershell-command-executor-ui-master\node_modules you will see those packages and all their dependencies.

We can now test that we have a working PowerShell UI NodeJS website. From an elevated command prompt whilst still in the c:\nodejs\powershell-command-executor-ui-master directory run “c:\Program Files\nodejs\node.exe” bin\www

Open a browser on the same host and go to “http://localhost:3000”. You should see the default UI.

Configuration and Customization

Now it is time to configure and customize the PowerShell UI for our needs.

The files we are going to edit are:

  • C:\nodejs\powershell-command-executor-ui-master\routes\index.js
    • Update Paths to the encrypted credentials files used to connect to Azure, MIM. We’ll create the encrypted credentials files soon.
  • C:\nodejs\powershell-command-executor-ui-master\public\console.html
    • Update for your customizations for CSS etc.
  • C:\nodejs\powershell-command-executor-ui-master\node_modules\powershell-command-executor\O365Utils.js
    • Update for PowerShell Modules to Import
    • Update for Commands to make available in the UI

We also need to get a couple of PowerShell Modules installed on the host so they are available to the site. The two I’m using I’ve mentioned earlier. With WMF5 installed, using PowerShell we can simply install them as per the commands below.

Install-Module AzureADPreview
Install-Module LithnetRMA

In order to connect to our Microsoft Identity Manager Synchronization Server we are going to need to enable Remote Powershell on our Microsoft Identity Manager Synchronization Server. This post I wrote here details all the setup tasks to make that work. Test that you can connect via RPS to your MIM Sync Server before updating the scripts below.

Likewise for the Microsoft Identity Manager Service Server. Make sure after installing the LithnetRMA Powershell Module you can connect to the MIM Service using something similar to:

# Import LithnetRMA PS Module
import-module lithnetrma

# MIM AD User Admin
$username = "mimadmin@mim.mydomain.com"
# Password 
$password = "Secr3tSq1rr3l!" | convertto-securestring -AsPlainText -Force
# PS Creds
$credentials = New-Object System.Management.Automation.PSCredential $Username,$password

# Connect to the FIM service instance
# Will require an inbound rule for TCP 5725 (or your MIM Service Server Port) in your Resource Group Network Security Group Config
Set-ResourceManagementClient -BaseAddress http://mymimportalserver:5725 -Credentials $credentials


\routes\index.js

This file details the encrypted credentials the site uses. You will need to generate the encrypted credentials for your environment. You can do this using the powershell-credentials-encryption-tools. Download that script to your workstation and unzip it. Open the credentialEncryptor.ps1 script using an Administrator PowerShell ISE session.

I’ve changed the index.js to accept two sets of credentials. This is because your Azure Admin Credentials are going to be different from your MIM Administrator Credentials (both in name and password). The username for my Azure account looks something like myname@mycompany.com whereas for MIM it is Domainname\Username.

Provide an account name for your Azure environment and the associated password.

The tool will create the encrypted credential files.

Rename the encrypted.credentials file to whatever makes sense for your environment. I’ve renamed it creds1.encrypted.credentials.

Now we re-run the script to create another set of encrypted credentials. This time for Microsoft Identity Manager. Once created, rename the encrypted.credentials file to something that makes sense in your environment. I’ve renamed the second set to creds2.encrypted.credentials.

We now need to copy the following files to your UI Website C:\nodejs\powershell-command-executor-ui-master directory:

  • creds1.encrypted.credentials
  • creds2.encrypted.credentials
  • decryptUtil.ps1
  • secret.key

Navigate back to routes\index.js and open the file in an editor such as Notepad++.

Update the index.js file for the path to your credentials files. We also need to add in the additional credentials file.

The changes to the file are the paths to the files we just copied above, along with the additional var PATH_TO_ENCRYPTED_RPSCREDENTIALS_FILE for the second set of credentials used for Microsoft Identity Manager.

var PATH_TO_DECRYPT_UTILS_SCRIPT = "C:\\nodejs\\powershell-command-executor-ui-master\\decryptUtil.ps1";
var PATH_TO_ENCRYPTED_CREDENTIALS_FILE = "C:\\nodejs\\powershell-command-executor-ui-master\\creds1.encrypted.credentials";
var PATH_TO_ENCRYPTED_RPSCREDENTIALS_FILE = "C:\\nodejs\\powershell-command-executor-ui-master\\creds2.encrypted.credentials";
var PATH_TO_SECRET_KEY = "C:\\nodejs\\powershell-command-executor-ui-master\\secret.key";


Also update initCommands to pass through the additional credentials file:


initCommands: o365Utils.getO365PSInitCommands(
 PATH_TO_DECRYPT_UTILS_SCRIPT,
 PATH_TO_ENCRYPTED_CREDENTIALS_FILE,
 PATH_TO_ENCRYPTED_RPSCREDENTIALS_FILE,
 PATH_TO_SECRET_KEY,
 10000,30000,3600000),

Here is the full index.js file for reference.


public/console.html

The public/console.html file is for formatting and associated UI components. The key things I’ve updated are the Bootstrap and AngularJS versions. Those are contained in the top of the html document. A summary is below.

<script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.6.1/angular.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/angular.js/1.6.1/angular-resource.min.js"></script>
<script src="javascripts/ui-bootstrap-tpls-2.4.0.min.js"></script>
<script src="javascripts/console.js"></script>
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap-theme.min.css">

You will also need to download the updated Bootstrap UI (ui-bootstrap-tpls-2.4.0.min.js). I’m using v2.4.0 which you can download from here. Copy it to the javascripts directory.

I’ve also updated the table types, buttons, colours, header, logo etc in the appropriate locations (CSS, Tables, Div’s etc). Here is my full file for reference. You’ll need to update for your colours, branding etc.

powershell-command-executor\O365Utils.js

Finally the O365Utils.js file. This contains the commands that will be displayed along with their options, as well as the connection information for your Microsoft Identity Manager environment.

You will need to change:

  • Line 52 for the address of your MIM Sync Server
  • Line 55 for the addresses of your MIM Service Server
  • Line 141 on-wards for what commands and parameters for those commands you want to make available in the UI

Here is an example with a couple of AzureAD commands, a MIM Sync and a MIM Service command.

Show me my PowerShell UI Website

Now that we have everything configured, let’s start the site and browse to it. If you haven’t stopped the NodeJS site from earlier, go to the command window and press Ctrl + C a couple of times. Run “c:\Program Files\nodejs\node.exe” bin\www again from the C:\nodejs\powershell-command-executor-ui-master directory, unless you have restarted the host and now have NodeJS in your environment path.

In a browser on the same host go to http://localhost:3000 again and you should see the site as it is below.

Branding and styling come from console.html, menu options from O365Utils.js, and when you select a command and execute it, data comes from the associated service …….

… you can see results. From the screenshot below a Get-AzureADUser command for the associated search string executed in milliseconds.


Summary

The powershell-command-executor-ui from bitsofinfo is a very extensible and powerful NodeJS website as a front-end to PowerShell.

With a few tweaks and updates the look and feel can be easily changed along with the addition of any powershell commands that you wish to have a UI for.

As it sits though keep in mind you have a UI with hard-coded credentials that can do whatever commands you expose.

Personally I am running one for my use only and I have it hosted in Azure in its own Resource Group, with an NSG allowing outgoing traffic to Azure and my MIM environment. Incoming traffic is only allowed from my personal management workstation’s IP address. I also needed to allow port 3000 into the server on the NSG, as well as on the firewall on the host. I did that quickly using the command below.

# Enable the WebPort NodeJS is using on the firewall 
netsh advfirewall firewall add rule name="NodeJS WebPort 3000" dir=in action=allow protocol=TCP localport=3000

Follow Darren on Twitter @darrenjrobinson

Using Azure Functions with the Lithnet MIIS Automation Powershell Module to query your Microsoft Identity Manager Metaverse

This is the 2nd blog, continuing on from this post, which is an introduction to using Azure Functions with the Lithnet FIM/MIM PowerShell Modules. If you haven’t read that one please do so to get up to speed before this one, as it has more detail around the setup.

Overview

This post details similar functionality to the first post but with integration to the FIM/MIM Synchronisation Server and the FIM/MIM Metaverse rather than the FIM/MIM Service.

The solution is based around an Azure Function that;

  • takes an HTTP WebRequest that contains a payload with the ObjectType, AttributeName and AttributeValue to search for in the Metaverse
  • The Azure Function uses Remote Powershell to call the Lithnet MIIS Automation Powershell Module installed on the FIM/MIM Sync Server
  • The Lithnet Powershell Module takes the query from the Azure Function, executes the query and returns the result to the Azure Function and the requesting client
  • Note: My MIM Infrastructure is all located in Azure so there are configuration steps in this solution to allow access into my Azure environment. If your FIM/MIM infrastructure is elsewhere you’ll need to transpose the appropriate firewall rules for your architecture

Let’s get started.

Prerequisites

The prerequisites for this solution are;

  • An Azure Tenant
  • FIM/MIM Sync Server (as per the diagram above) with data in your Metaverse from a connected directory service (such as Active Directory)
  • I’ll also be using Ryan Newington’s awesome Lithnet MIIS Automation PowerShell Module (from here) for Microsoft FIM/MIM. A fantastic contribution to the FIM/MIM community
    • You’ll need to download and install it on your FIM/MIM Synchronisation Server. This differs from the Lithnet Module from the first post in this series as this one is specific to the Metaverse not the FIM/MIM Service.

Enable Powershell Remoting on the FIM/MIM Sync Server

On the FIM/MIM Sync Server where we will be sending requests from the Function App we need to enable Powershell Remoting. This is so we can leverage the Lithnet MIIS Automation Powershell module (that is a prerequisite to be installed on your FIM/MIM Sync Server).

On the FIM/MIM Synchronisation Server open Powershell (as Administrator) and execute the command  Enable-PSRemoting -Force 

Test from another server in your network that you can access the MIM Sync Server. I did this from my MIM Service Server.
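A quick way to test, assuming a hypothetical server name of mymimsyncserver, is:

# Confirm WinRM is answering on the MIM Sync Server
Test-WSMan -ComputerName mymimsyncserver

# Or establish a remote session and run a trivial command
Invoke-Command -ComputerName mymimsyncserver -ScriptBlock { $env:COMPUTERNAME }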

PSRemote Inbound Security Rule (Azure NSG)

Using PowerShell Remoting means we need an inbound rule into the Azure Network where my MIM Sync Server is located, to allow connections from Azure Functions to my MIM Sync Server. Create an Inbound Rule in your Azure Network Security Group for TCP Port 5986 as per the rule below.
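If you prefer to script the rule rather than click through the Portal, a sketch using the AzureRM cmdlets is below; the NSG and Resource Group names are placeholders for your environment.

$nsg = Get-AzureRmNetworkSecurityGroup -Name "MyNSG" -ResourceGroupName "MyRG"
Add-AzureRmNetworkSecurityRuleConfig -NetworkSecurityGroup $nsg -Name "PSRemote-HTTPS" `
    -Description "WinRM HTTPS inbound for Azure Functions" -Access Allow -Protocol Tcp `
    -Direction Inbound -Priority 1010 -SourceAddressPrefix Internet -SourcePortRange "*" `
    -DestinationAddressPrefix "*" -DestinationPortRange 5986
Set-AzureRmNetworkSecurityGroup -NetworkSecurityGroup $nsg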

Create a Self Signed Cert on the FIM/MIM Sync Server

To secure the connection using Remote Powershell we will secure the HTTPS connection with a certificate. This is because the Azure Function is not a member of the domain where your FIM/MIM Sync Server is located. In this example I’m using a self-signed certificate.

In Powershell (as Administrator) on your FIM/MIM Sync Server run the following command where the DNSName is the DNS name of your FIM/MIM Sync that will resolve from Azure Functions to your FIM/MIM Sync server.

New-SelfSignedCertificate -DnsName mymimsyncserver.westus.cloudapp.azure.com -CertStoreLocation Cert:\LocalMachine\My

Create a Remote Powershell HTTPS Listener

Copy the thumbprint from the self-signed certificate above and use it along with the DNS name of your FIM/MIM Sync Server to run the following command in an Administrator command prompt on your FIM/MIM Sync Server.

winrm create winrm/config/Listener?Address=*+Transport=HTTPS @{Hostname="mymimsyncserver.westus.cloudapp.azure.com";CertificateThumbprint="536E41D6089F35ABCDEFD8C52BE754EFF0B279B"}

Allow Powershell Remote (HTTPS) through your firewall on your FIM/MIM Sync Server

In an Administrator command prompt run the following command to create a new inbound firewall rule for the Remote Powershell session from your Azure Function.

netsh advfirewall firewall add rule name="WinRM-HTTPS" dir=in localport=5986 protocol=TCP action=allow

Check that the new firewall rule was created successfully.

Create your HTTP Request Function

Create a new HTTP Trigger Function, choosing PowerShell as the language. More detailed steps to do this are in the first post in this series here.

Search FIM/MIM Metaverse Function App Script

Here is the base script to get you started. This differs a little from the first blog post example in that I’ve secured the username and password for connection to my MIM Sync Server. Details on how to do that are also linked to in the first blog post.

Also in this example I’m running Remote Powershell to execute the command on the FIM/MIM Sync Server as that is where the Lithnet MIIS Automation Powershell Module is installed and needs to run.

The following script;

  • Takes an HTTP request with Object Type, AttributeName, AttributeValue
  • It uses a Script Block to take the input variables from the HTTP request and perform a PowerShell Remote command (in this example Get-MVObject)
  • Returns the object to the output
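The full script is embedded in the original post; below is a minimal sketch of the shape of such a function using the classic (experimental) PowerShell Functions model, where $req and $res are the input and output binding files. The server name and the credential handling via App Settings are illustrative assumptions to adapt for your environment.

# Get the request body and extract the query parameters
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
$objectType = $requestBody.ObjectType
$attributeName = $requestBody.AttributeName
$attributeValue = $requestBody.AttributeValue

# Credentials for the MIM Sync Server (store these securely, e.g. in Function App Settings)
$username = $env:MIMSyncUser
$password = $env:MIMSyncPassword | ConvertTo-SecureString -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($username, $password)

# Remote PowerShell over HTTPS to the MIM Sync Server (self-signed cert, so skip the checks)
$sessionOptions = New-PSSessionOption -SkipCACheck -SkipCNCheck -SkipRevocationCheck
$session = New-PSSession -ComputerName "mymimsyncserver.westus.cloudapp.azure.com" -Credential $credential -UseSSL -SessionOption $sessionOptions

# Execute the Metaverse query where the Lithnet MIIS Automation module is installed
$result = Invoke-Command -Session $session -ScriptBlock {
    param($objectType, $attributeName, $attributeValue)
    Import-Module LithnetMIISAutomation
    Get-MVObject -ObjectType $objectType -Attribute $attributeName -Value $attributeValue
} -ArgumentList $objectType, $attributeName, $attributeValue
Remove-PSSession $session

# Return the object to the requesting client
Out-File -Encoding Ascii -FilePath $res -InputObject ($result | ConvertTo-Json -Depth 4)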

Save the function once you’ve added the script (and updated it for your credentials, target FIM/MIM Sync Server etc).

Bring up the Test dialog and give the script some input values in the Request Body that will result in a successful query result from your Metaverse. Select Run. If you’ve done everything correctly you’ll see an object returned from the Metaverse.

Test the Function App

Execute the Azure Function from an HTTP Trigger

Now let’s try it remotely. Here is a quick PowerShell query to the Azure Function using Invoke-RestMethod with the same input to the Azure Function. And huzzah, a returned object.
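The embedded example isn’t reproduced here, but the call is along these lines, with a hypothetical Function URL and key:

$functionUrl = "https://myfunctionapp.azurewebsites.net/api/SearchMetaverse?code=yourFunctionKey"
$body = @{
    ObjectType     = "person"
    AttributeName  = "accountName"
    AttributeValue = "darrenjrobinson"
} | ConvertTo-Json

# POST the query to the Azure Function and display the returned Metaverse object
$mvObject = Invoke-RestMethod -Uri $functionUrl -Method Post -Body $body -ContentType "application/json"
$mvObject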

Summary

This concept provides a framework for a plethora of possibilities, all through a combination of Azure Functions and the Lithnet MIIS Automation PS Module. The Lithnet MIIS PS Module provides all the functionality you get from being on the MIM Sync Server, but now you can retrieve information or trigger functions remotely.

Follow Darren on Twitter @darrenjrobinson


Leveraging the Microsoft Graph API with PowerShell and OAuth 2.0

Background

Microsoft Graph is the evolution of the APIs into Microsoft Cloud Services. For me, not being a developer, a key difference is interacting with the Graph API using OAuth 2.0 via PowerShell. Through a number of my previous posts I’ve interacted with the Graph API using client libraries such as the Microsoft.IdentityModel.Clients.ActiveDirectory library. This post details using PowerShell to talk directly to the Graph API, managing Authentication and Authorization using OAuth 2.0 and an Azure WebApp.

Leveraging the Graph API opens up access to the continually evolving Azure services as shown in the graphic below. Source graph.microsoft.io

Microsoft Graph Dev Stack

Getting started with the Graph API, PowerShell and OAuth 2.0

The key difference between using a client library and going direct is you need to register and configure an Azure WebApp. It is super simple to do. Jump on over to the Office 365 App Registration Tool here. Sign in with an account associated with the Azure Tenant you are going to interact with. Depending on what you’re doing you’ll need to select the appropriate access.

Here’s the settings I selected for access to user profiles. Give the WebApp a name. This is the name that you’ll see in the OAuth Authorization step later on. I’ve highlighted the other key settings. Don’t worry about the SignIn and RedirectURL’s other than configuring HTTPS as we’ll be using PowerShell to access the WebApp.

WebApp

Once you’ve registered the WebApp successfully you’ll get a Client ID and Client Secret. RECORD/WRITE THESE DOWN NOW as you won’t get access to your Client Secret again.

ClientID

If you want to change what your WebApp has access to after creating it you can access it via the Classic Azure Portal. Select your Active Directory => Select Applications => Select the WebApp you created earlier  => Select Configure => (scroll to the bottom) Select Add Application. Depending on what you have in your subscription you can add access to your services as shown below.

WebAppPermissions

Authenticating & Authorizing

In order to access the Graph API we need to get our Authorization Code. We only need to do this the first time.

This little script (modify with your Client ID and your Client Secret obtained earlier when we registered our WebApp) will prompt you to authenticate.
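The script is embedded in the original post; if you need a starting point, a rough sketch of the idea is below. It builds the AAD (v1 endpoint) authorize URL and opens it in your browser. The Client ID, redirect URI and the way you capture the returned code are placeholders to adjust for your WebApp registration.

$clientId = "your-client-id"
$redirectUri = [uri]::EscapeDataString("https://localhost")
$resource = [uri]::EscapeDataString("https://graph.microsoft.com")

# Build the authorization URL and open it in the default browser
$authUrl = "https://login.microsoftonline.com/common/oauth2/authorize" +
    "?response_type=code&client_id=$clientId&redirect_uri=$redirectUri" +
    "&resource=$resource&prompt=admin_consent"
Start-Process $authUrl

# After signing in and authorizing, copy the 'code' parameter from the URL you are redirected to
$authCode = Read-Host "Paste the authorization code"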

Authenticate using the account associated with the WebApp you registered in the previous step.

AuthCode

You’ll be requested to authorize your application.

AuthZ

You will then have your AuthCode.

AuthCode-Scope

One thing to note above is admin_consent. This comes from the URL passed to get the Authorization Code: &prompt=admin_consent gives Admin Consent to all entities configured on the WebApp, rather than access for just a single user.

Accessing the Graph API with OAuth 2.0 Access Token

Using your ClientID, ClientSecret and AuthCode you can now get your access token. I tripped up here and was getting Invoke-RestMethod : {"error":"invalid_client","error_description":"AADSTS70002: Error validating credentials. AADSTS50012: Invalid client secret is provided." Tracing back my steps and looking at my “Client Secret” I noticed the special characters in it that I hadn’t URL Encoded. After doing that I was successfully able to get my access token.

Looking at our AuthZ we can see that the Scope is what was selected when registering the WebApp.

Token

Now using the Access Token I can query the Graph API. Here is getting my AAD Object.

me
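That query is a simple GET with the Access Token in the Authorization header; a minimal sketch:

# Query the Graph API for the signed-in user's object
$me = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/me" `
    -Headers @{ Authorization = "Bearer $accessToken" } -Method Get
$me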


Summary

In my case I can now access all users via the API. Here is what’s available. Using the Access Token and modifying the Invoke-RestMethod URI and Method (including -Body if you are doing a Post/Patch action), you are ready to rock and roll, all via PowerShell.

Your Access Token is valid for an hour. Before then you will need to refresh it. Just run the $Authorization = Invoke-RestMethod ….. line again. 
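The token response also includes a Refresh Token, so as an alternative you can get a new Access Token without re-authenticating; a sketch using the same v1 token endpoint and the variables from the earlier token request:

# Use the Refresh Token to obtain a new Access Token
$body = "grant_type=refresh_token&client_id=$clientId&client_secret=$clientSecret" +
    "&refresh_token=$($Authorization.refresh_token)&resource=$resource"
$Authorization = Invoke-RestMethod -Uri $tokenUrl -Method Post -Body $body `
    -ContentType "application/x-www-form-urlencoded"
$accessToken = $Authorization.access_token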

Follow Darren on Twitter @darrenjrobinson