Geographically Visualizing your workforce using Microsoft Identity Manager, xMatters and Power BI


Introduction

In the last couple of weeks I’ve posted about visualizing relationships of data from Microsoft Identity Manager using Power BI. Earlier this week I posted about building a Management Agent for Microsoft Identity Manager to integrate with xMatters.

In this post I combine data from those two posts to visualise the geographic office locations for an organisation, along with summary data about each location (how many employees are located there, and which departments).

Prerequisites

You’ll need an Azure AD and Office 365 subscription to allow you to create a Power BI Application. To create a Power BI Application see Registering a Power BI Application in this post here.

You’ll also need the Power BI PowerShell Module. I’m using 2.0.0.9 available from the PowerShell Gallery here and of course the Lithnet MIIS PowerShell Module available from here.
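If you don’t already have these modules, something like the following will get them in place (a sketch; PowerBIPS comes from the PowerShell Gallery, while the Lithnet MIIS Automation module is installed on the MIM Sync Server via the installer from the link above):

# Power BI PowerShell Module (PowerBIPS) - version per this post
Install-Module -Name PowerBIPS -RequiredVersion 2.0.0.9

# Lithnet MIIS Automation module - installed on the MIM Sync Server
Import-Module LithnetMIISAutomation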

Overview

Using our registered Power BI Application we’ll create a Dataset consisting of two tables. One for the xMatters Sites (whose geographic co-ordinates we also get from the xMatters Management Agent), and the other for our xMatters Users, whose officeLocation maps to an xMatters Site.

I create a relationship between the two tables on the xMattersSites displayName (which is the location name) and the xMattersUsers officeLocation. We can then create a nice visual using data from both tables.

Create the Dataset (two tables with relationship)

Initially I tried to create the dataset with a relationship as I’ve previously shown here. However that didn’t work, so after some trial and error with the Power BI API Explorer I got the result I wanted. I’ll provide you with the raw JSON format for creating a New Dataset, two Tables (xMattersSites and xMattersUsers) and a relationship between them (where xMattersSites\displayName joins with xMattersUsers\officeLocation) as per my xMatters Management Agent detailed here.

Start by authenticating to the Power BI API Explorer with an account in the environment where you created your Power BI Application and navigate to the Create Dataset section here.

Create Dataset

Update this JSON formatted object that details the Dataset, Tables and Relationships for your environment.
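As a guide, here is a hedged sketch of the shape of that JSON object (the column names, data types and the geo column are assumptions based on the tables described above; the relationships block follows the Power BI REST API schema). I’ve wrapped it in a PowerShell here-string so it can also be re-used from a script:

$datasetJson = @"
{
  "name": "xMatters",
  "tables": [
    {
      "name": "xMattersSites",
      "columns": [
        { "name": "displayName", "dataType": "String" },
        { "name": "geo", "dataType": "String" }
      ]
    },
    {
      "name": "xMattersUsers",
      "columns": [
        { "name": "accountName", "dataType": "String" },
        { "name": "displayName", "dataType": "String" },
        { "name": "department", "dataType": "String" },
        { "name": "officeLocation", "dataType": "String" }
      ]
    }
  ],
  "relationships": [
    {
      "name": "SiteUsers",
      "fromTable": "xMattersUsers",
      "fromColumn": "officeLocation",
      "toTable": "xMattersSites",
      "toColumn": "displayName",
      "crossFilteringBehavior": "bothDirections"
    }
  ]
}
"@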

Paste your validated JSON object into the Body section of the API Explorer and select Call Resource.

Dataset Body

If your JSON object is formatted correctly you’ll get a 201 response and your DataSet and Tables with the Relationship will be created.

Create Success

Switching over to Power BI you’ll see the xMatters Dataset in the bottom left, and the two tables on the right-hand side with their columns.

xMatters DataSet PBI.PNG

Load xMatters User Data into Power BI

Now that we have somewhere to put the data, let’s populate the dataset. I’m using the Lithnet MIIS Automation PowerShell Module (detailed in the prerequisites) to query the Metaverse and return all users. I then refine the list down to those that are Active (based on my employeeActive Boolean attribute) and finally to only those users that are connected on the xMatters Management Agent (see lines 14 & 18).

The script will drop any existing values from the xMatters Users table then upload what we have retrieved from the Metaverse (and refined).
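The original script was embedded as a gist; here is a hedged sketch of the pattern, assuming an $authtoken from Get-PBIAuthToken. The employeeActive attribute, the xMatters MA name and the MV attribute access pattern are environment-specific assumptions; adjust the query cmdlet parameters for your version of the Lithnet module.

Import-Module LithnetMIISAutomation

# query the Metaverse for all person objects that are Active
$queries = @()
$queries += New-MVQuery -Attribute employeeActive -Operator Equals -Value $true
$activeUsers = Get-MVObject -ObjectType person -Queries $queries

# refine to users with a connector on the xMatters Management Agent
$xmUsers = $activeUsers | Where-Object { $_.CSMVLinks.ManagementAgentName -contains "xMatters" }

# build a row object per user for the xMattersUsers table
$rows = foreach ($user in $xmUsers) {
    [pscustomobject]@{
        accountName    = $user.Attributes["accountName"].Values.ValueString
        displayName    = $user.Attributes["displayName"].Values.ValueString
        department     = $user.Attributes["department"].Values.ValueString
        officeLocation = $user.Attributes["officeLocation"].Values.ValueString
    }
}

# drop existing rows then upload the refreshed set
Clear-PBITableRows -authToken $authtoken -dataSetName "xMatters" -tableName "xMattersUsers"
$rows | Add-PBITableRows -authToken $authtoken -dataSetName "xMatters" -tableName "xMattersUsers" -batchSize 1000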

Upload Users.PNG

Load xMatters Site Data into Power BI

Again I’m using the Lithnet MIIS Automation PowerShell Module to query the Metaverse and return all xMatters Sites.

The script will drop any existing values from the xMatters Sites table then upload what we have retrieved from the Metaverse.
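Again the script was a gist; the Sites load follows the same pattern against my custom Site ObjectClass (a hedged sketch - the geo attribute name is an assumption from my MA):

$sites = Get-MVObject -ObjectType site

$rows = foreach ($site in $sites) {
    [pscustomobject]@{
        displayName = $site.Attributes["displayName"].Values.ValueString
        geo         = $site.Attributes["geo"].Values.ValueString
    }
}

Clear-PBITableRows -authToken $authtoken -dataSetName "xMatters" -tableName "xMattersSites"
$rows | Add-PBITableRows -authToken $authtoken -dataSetName "xMatters" -tableName "xMattersSites"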

Upload Sites.PNG

Creating the Power BI Visual

Now that we have data we can build the visual. I’m using the ArcGIS Maps for Power BI visual, which is available in the default set of visuals. Then by selecting displayName and geo the map will automagically show all xMatters Sites at their respective co-ordinates.

xMatters Sites to Map

We can then add a Card Visual, choose officeLocation and configure the visual for Count of officeLocation, and we’ll get a count of the employees at that location. As we can see below, with the Sydney location selected from the map, the card updates to tell me there are 665 Employees at that officeLocation.

Count of Employees at Selected Location

Pretty quickly we can also expand out other data points, like departments at a location, employees etc., as shown below (I’ve obfuscated the departments and a number of the other office locations).

Summary.PNG

Conclusion

We haven’t generated any new data. We’ve taken information we already have in Microsoft Identity Manager from connected systems and quickly visualized it via Power BI. Providing this to the business, with the ability for consumers of the information to export it from the visual, can be pretty powerful.

Building a FIM/MIM Management Agent for xMatters

Introduction

A couple of weeks ago one of my customers had a requirement to provision and manage identities into xMatters. The xMatters API Documentation looked straight-forward and I figured it would be pretty quick to knock up a PowerShell Management Agent.

The identification of users (People) in xMatters was indeed pretty quick. I was quickly able to enumerate all users (that had initially been seeded independently of FIM/MIM) and join them to corresponding users in the Metaverse.

It was as I started digging deeper that the relationships between Sites (Locations) and Email/Mobile (Devices) attributes became apparent. This post details how I approached it, along with a base xMatters MA that should get you started if you need to do something similar.

Overview

A key concept to keep in mind is that at the simplest level there are 3 key Object Types in xMatters;

  • People
    • User Objects along with basic naming attributes
  • Device
    • Each contact medium is a device. Email Address, Mobile Phone, Home Phone, Text Phone (SMS) etc.
  • Site
    • Location of the entity (person)

Associated with each is an id which can be either dynamically created on provisioning (by xMatters) or specified. For People there is also targetName which is the equivalent of UID/sAMAccountName. When using the API (for people) you can use either their ID or their targetName. For all other entities you need to use the ID.

For each entity, as you’d expect, there are different API URIs. They are;

  • Base URI https://customer.hosted.xmatters.com
  • People URI https://customer.hosted.xmatters.com/api/xm/1/people
  • Devices URI https://customer.hosted.xmatters.com/api/xm/1/devices
  • Sites URI (legacy API) https://customer.hosted.xmatters.com/reapi/2015-04-01/sites

Finally to retrieve devices for a person use;

  • Devices associated with a person https://customer.hosted.xmatters.com/api/xm/1/people/{ID}/devices
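To get a feel for the API before building the MA, here is a quick hedged PowerShell sketch ("customer" and jsmith are placeholders; the data property reflects the xMatters API’s paginated responses):

$cred = Get-Credential   # an xMatters REST API user
$baseUri = "https://customer.hosted.xmatters.com"

# People can be retrieved by ID or targetName
$person = Invoke-RestMethod -Uri "$baseUri/api/xm/1/people/jsmith" -Credential $cred

# Devices must be retrieved via the person's ID
$devices = Invoke-RestMethod -Uri "$baseUri/api/xm/1/people/$($person.id)/devices" -Credential $cred
$devices.data | Select-Object deviceType, name, id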

Other key points to consider that I uncovered are;

  • if you are updating a Device (e.g. someone’s Email Address or Phone Number) don’t specify the owner attribute (as you do when you create the Device). xMatters considers that you are trying to change the owner and won’t allow it.
  • to update a Device you need to know the ID of the Device. I catered for this on my Import by bringing through People and Device IDs.
  • When creating/updating a user’s location you need to specify the Site ID and Site Name. I brought these through as a separate ObjectClass into FIM/MIM and query the MV for them when Exporting
  • In my initial testing the API returned a number of different errors: 400 (Bad Request), 409 (Conflict, when trying to Add a Device that already exists) and 404 (Not Found), along with API timeouts. You need to account for these and process them appropriately
  • On success of an Update, Create or Delete the API returns the full object that you performed the operation on. You need to capture this and let MIM know that a full object being returned indicates Success, not an error
  • xMatters expects phone numbers to be in E.164 format (e.g. +61 400 123 456). I catered for this on an import on another Management Agent
  • xMatters time zones are in Country/Region format. For Australia these are as follows (and correct, it doesn’t accept Australia/Canberra for the ACT). This mapping is expressed as a PowerShell hashtable after this list;
    • "NSW" = "Australia/Sydney"
      "VIC" = "Australia/Melbourne"
      "QLD" = "Australia/Brisbane"
      "ACT" = "Australia/Sydney"
      "WA"  = "Australia/Perth"
      "TAS" = "Australia/Hobart"
      "NT"  = "Australia/Darwin"
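Here is that mapping as a PowerShell hashtable, handy in an import or export flow (the $user.state source attribute is hypothetical):

$timezones = @{
    "NSW" = "Australia/Sydney"
    "VIC" = "Australia/Melbourne"
    "QLD" = "Australia/Brisbane"
    "ACT" = "Australia/Sydney"
    "WA"  = "Australia/Perth"
    "TAS" = "Australia/Hobart"
    "NT"  = "Australia/Darwin"
}

$timezone = $timezones[$user.state]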

xMatters PowerShell Management Agent

With all that introduction, here is a base xMatters PowerShell MA (implemented using the Granfeldt PowerShell MA) to get you started. You’ll need to tailor it for your environment, trigger Provisioning, Deletes and Flow Rules, and handle the xMatters API responses for your integration.

Schema Script

I’ve created two Object Classes: User and Site. User incorporates User Devices; Site holds the locations (Sites) from xMatters.
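The schema script itself was embedded as a gist; as a guide, here is a minimal sketch in the Granfeldt PSMA schema format for the two classes (the attributes beyond the anchors are assumptions - extend for your needs):

# User object class - the anchor is the xMatters person id
$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-id|String" -Value "x"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "User"
$obj | Add-Member -Type NoteProperty -Name "targetName|String" -Value "x"
$obj | Add-Member -Type NoteProperty -Name "firstName|String" -Value "x"
$obj | Add-Member -Type NoteProperty -Name "lastName|String" -Value "x"
$obj | Add-Member -Type NoteProperty -Name "site|String" -Value "x"
$obj | Add-Member -Type NoteProperty -Name "emailAddress|String" -Value "x"
$obj | Add-Member -Type NoteProperty -Name "emailID|String" -Value "x"
$obj | Add-Member -Type NoteProperty -Name "mobileNumber|String" -Value "x"
$obj | Add-Member -Type NoteProperty -Name "mobileID|String" -Value "x"
$obj

# Site object class - the anchor is the xMatters site id
$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-id|String" -Value "x"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "Site"
$obj | Add-Member -Type NoteProperty -Name "displayName|String" -Value "x"
$obj | Add-Member -Type NoteProperty -Name "geo|String" -Value "x"
$obj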

Import Script

Credentials for the Import script to connect to xMatters are flowed in from the Management Agent Username and Password attributes. This isn’t using Paged Imports. If you have a large number of users you may want to consider that. After retrieving all of the People entities each is queried to obtain their Devices. I’m only bringing through SMS and Email Devices. You’ll need to modify for additional Devices.

Ensure that you flow into the Metaverse (onto custom attributes) the IDs associated with your Devices (e.g. MobileID and EmailID). That will allow you to use the ID when updating those attributes.

For Sites, I created a custom ObjectClass (Site) in the MV and used the SiteID as the objectID and the Site Name as the displayName (as shown below).

Attribute Flows.png

Export Script

This is where it gets a little more complicated. As PowerShell is not good at surfacing web request errors, we have to deal with the return from each API call, determine whether we were successful or not, and then let FIM/MIM know so it can report that via the UI.

The Export script below deals with Adding, Deleting and Updating users. Update line 31 for your API URI for xMatters.
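The full Export script was a gist; the error-handling pattern at its core looks something like this hedged sketch ($baseUri, $cred and $person are illustrative, and the error mapping follows the list of API behaviours above):

try {
    # xMatters returns the full object on success - treat that as success, not an error
    $result = Invoke-RestMethod -Uri "$baseUri/api/xm/1/people" -Method Post -Credential $cred `
        -Body ($person | ConvertTo-Json) -ContentType "application/json"
    $exportStatus = "success"
}
catch {
    $statusCode = $_.Exception.Response.StatusCode.value__
    switch ($statusCode) {
        400 { $exportStatus = "Bad Request - check the payload" }
        404 { $exportStatus = "Not Found - check the person/device/site ID" }
        409 { $exportStatus = "Conflict - the Device probably already exists" }
        default { $exportStatus = "Unexpected error or API timeout: $_" }
    }
    # surface $exportStatus back to the Sync Engine so FIM/MIM reports the result via the UI
}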

Summary

The detail above will get you started and give you a working Management Agent to import Users and Sites. You’ll need to do the usual steps (Set, Workflow, Sync Rule and MPR) to trigger Provisioning on the MA along with how you handle deletes.

Graphically Visualizing Identity Hierarchy and Relationships

Almost 15 years ago Microsoft released Microsoft Identity Integration Server (MIIS) 2003. Microsoft also released a couple of Resource Toolkits for MIIS to assist customers and IT Integrators implement the product, as up to that time its predecessor (Microsoft Metadirectory Services) was only available as part of a Microsoft Consulting engagement.

At the same time Microsoft provided a Beta product – Microsoft PolyArchy Server. For someone whose brain is wired in a highly visual way, this was a wow moment. PolyArchy Server took a dataset from the Synchronisation Server and wrapped a small IIS website around it to expose intersecting relationships between data. When you selected a datapoint the visual would flip to the new context and display a list of entities associated with that relationship.

Microsoft proposed to deliver PolyArchy Server in calendar year 2006. However the product never made it to market. The concept of visualizing identity data was seeded in my brain though, and is something I’ve always surfaced in one method or another as part of many Identity Management projects.

In this post I’ll detail how I’ve recently used Power BI to visualize relationship data from Microsoft Identity Manager.  The graphic below is an example (with node labels turned off) that represents Managers by Department by State.

Managers by Dept by State - Graphical.png

Using filters in the same report allows whoever is viewing the report to refine the visual based on State and Dept. By selecting a State from the map the visual will dynamically update to show that state only. Selecting a department only will show that department in each state.

Managers by Dept by State - Filtered.png

Hovering over the nodes will display the detail. I’ve turned off the node labels so as not to expose the source of my dataset.

Managers by Dept by State - NSW Detail.png

Getting MIM MV User MetaData into Power BI

My recent post here details the necessary steps to get started publishing data directly in a Power BI Dataset using PowerShell. Follow the details listed there to register a Power BI Application.

Creating the DataSet

With that done, the script below will create a DataSet in Power BI. My dataset is obviously specific to the environment I developed it in. You probably won’t have some of the attributes, so you will need to update accordingly. The script is designed to run on the MIM Sync Server, and the MIM Sync Server will need to be able to connect to Azure and Power BI.

Publish data to the DataSet

Now that we have a Power BI DataSet (Table) we need to extract the data from the MIM MV and push it into the table. Using the Lithnet MIIS Automation PowerShell Module makes this extremely simple. Using the table schema created above I retrieve the values for each Active User, build a PowerShell Object and use the Power BI PowerShell Module to push the data to Power BI.

Creating the Power BI Visualization

The visualisation I’m using is the Journey Chart by MAQ Software which is available in the Power BI Store (free).

Journey Visual.PNG

With the Journey Visualization selected and dropped in we just have to select the attributes we want to visualize and the order of the relationships. The screenshot below shows the data sorted by State => managerName => accountName with Measure Data being accountName.

Visual Config.PNG

Conclusion

We never got PolyArchy Server from Microsoft, but we can quickly visualize basic relationship data from MIM with Power BI.

Automate the update of the data into Power BI, embed the Power BI Reports into your MIM Portal and provide access to the appropriate personnel.


A modern way to track FIM/MIM Attribute Value History utilizing Power BI

Introduction

Microsoft Identity Manager is fantastic for keeping data consistent between connected systems. Often, though, you want to know what a previous value of an attribute was. FIM/MIM however can only tell you the current value, which Management Agent it was received on, and when.

In the past, where I’ve had to provide a solution to either make sure an attribute has a unique value forever (e.g. email addresses or loginIDs that must never be reused) or just to keep attribute value history, I’ve used two different approaches;

  • Store previous values in an SQL Table and have an SQL MA that flows out the values
  • Store historical values in a Multi-Valued attribute on the user object in the Metaverse

Both are valid approaches but often fall down when you want to quickly get a report on that metadata.

Recently we had a similar request: to be able to know when Employees’ EndDates were updated in HR. This is specifically useful for contractors who have their contracts extended. Instead of stuffing the info into a Multi-Valued attribute or an SQL DB, this time I used Power BI. This provides the benefit of being able to quickly develop a graphical report and embed it in the FIM/MIM Portal.

Such a report looks like the screenshot below.

Power BI Report

Using the filters on the right hand side of the report you can find a user (by EmployeeID or DisplayName), select them and see attribute value history details for that user in the main part of the report. As per the screenshot below Andrew’s EndDate was originally the 8th of December (as received on the 5th of November), but was changed to the 24th of November on the 13th of November.

End Date History

In this Post I describe how I quickly built the solution.

Overview

The process to do this involves;

  • creating a Power BI Application
  • creating a Power BI Dataset
  • creating a script to retrieve the data from the MV and inject it into the Power BI Dataset
  • creating a Power BI Report for the data
  • embedding the Report in the MIM Portal

Registering a Power BI Application

Head over to Power BI for Developers and Register an Application for Power BI. Login to Power BI with an account for the tenant you’ll be reporting data for. Give your Application a name and choose Native Application. Set the Redirect URL to https://localhost

CreatePBIApp

Choose the permissions for your Application. As we’ll be writing data into Power BI you’ll need a minimum of Read and Write all Datasets. Select Register App.

Create PBI App Permissions

Record your Client ID for your Application. We’ll need this to connect to Power BI.

Register the App

We need to authenticate to Power BI the first time using a UI to provide Authorization for our Application. In order to do that we need to add another Reply URL to our application. Head to the Apps Dev Portal, select your application and Edit the Application Manifest. Add an additional Reply URL for https://login.live.com/oauth20_desktop.srf as shown below.

Add Reply URL for AuthZ

The following PowerShell commands will then allow us to authenticate utilizing the Power BI PowerShell module. If you don’t have the Power BI PowerShell Module installed, un-comment Install-Module PowerBIPS -RequiredVersion 1.2.0.9 -Force to install it.

Update the Client ID for the App you registered in the previous steps.

# Install-Module PowerBIPS -RequiredVersion 1.2.0.9 -Force
Import-Module PowerBIPS -RequiredVersion 1.2.0.9

# PowerBI App
$clientID = "4036df76-4de6-43cb-afe6-1234567890"

$authtoken = Get-PBIAuthToken -ClientId $clientID

Sign in with an account for the Tenant where you created the Power BI App.

Interactive Login for Dataset Creation

Accept the permissions you chose when registering the Power BI App.

Authorize PowerBI App

Creating the Power BI Dataset

Now we will create the Power BI Table (Dataset) that we will use when we insert the records.

My table is named Employee and the DataSet EmployeeEndDateReport. I’m keeping the table slim, with just enough info for our purpose: the date added to the dataset, the employee’s AccountName, DisplayName, Active state, EndDate and EndDateReceived. The following script will create the Dataset.
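That script was embedded as a gist; as a guide, here is a hedged sketch using the PowerBIPS module (column names per the description above, data types assumed):

$table = @{
    name = "Employee"
    columns = @(
        @{ name = "Date"; dataType = "DateTime" },
        @{ name = "AccountName"; dataType = "String" },
        @{ name = "DisplayName"; dataType = "String" },
        @{ name = "Active"; dataType = "Boolean" },
        @{ name = "EndDate"; dataType = "DateTime" },
        @{ name = "EndDateReceived"; dataType = "DateTime" }
    )
}

$dataSet = @{ name = "EmployeeEndDateReport"; tables = @($table) }
New-PBIDataSet -authToken $authtoken -dataSet $dataSet -Verbose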

Populating the Dataset

With our table created, let’s populate the table with employees that have an EndDate. As this is the first time we run it, we set a watermark date to add people from. I’ve gone with the previous year. I then query the MV for Employees with an EndDate within the last 365 days, build a PowerShell Object with the columns from our table and insert them into Power BI. I also set a watermark of the last time we had an EndDate received from the MA and output that to the watermark file. This is so that next time we can quickly get only users that have an EndDate received since the last time we ran the process.

NOTE: for full automation you’ll need to change line 6 to your secure method of choice for providing credentials to scripts.

Create a Power BI Report

Now in Power BI select your Data Set and design your report. Here is a sample one that I’ve put together. I simply selected the columns from the dataset and updated the look and feel. I then added each of AccountName, DisplayName and Active as a Filter so that I have various ways of finding whoever I’m looking for.

Power BI Report.png

Once you have run the process for a while and values have changed for the attribute you are keeping history for, selecting a user with changed values will show their history.

End Date History

Summary

To complete the solution you’ll want to automate the script that queries the MV for changes (probably after each run from the MA that provides the attribute you are recording history for), and you’ll want to embed the report in the MIM Portal. In this post here I detail how to do that step by step.


A quick start guide for Deploying and Configuring Node-RED as an Azure WebApp


Introduction

I’ve been experimenting and messing around with IoT devices for well over 10 years. Back then it wasn’t called IoT, and it was very much a build it and write it yourself approach.

Fast forward to 2017 and you can buy a microprocessor for a couple of dollars that includes WiFi. Environmental sensors are available for another couple of dollars and we can start to publish environmental telemetry without having to build circuitry and develop code. And rather than having to design and deploy a database to store the telemetry (as I was doing 10+ years ago) we can send it to SaaS/PaaS services and build dashboards very quickly.

This post provides a quick start guide to those last few points: visualising data from IoT devices using Azure Platform-as-a-Service services. Here is a rudimentary environment dashboard I put together very quickly.

NodeRedDashboardUI.PNG

Overview

Having played with numerous services recently for my already API-integrated IoT devices, I knew I wanted a solution to visualise the data, but I didn’t want to deploy dedicated infrastructure and I wanted to keep the number of moving parts to a minimum. I looked at getting my devices to publish their telemetry via MQTT, which is a great solution (for scale or rapidly changing data), but when you are only dealing with a handful of sensors and data that isn’t highly dynamic it is overkill. I wanted to simply poll the devices as required, obtain the current readings and visualise them. Think temperature, pressure, humidity.

Through my research I liked the look of Node-RED for its quick and simplistic approach to obtaining data and manipulating it for presentation. Node-RED relies on NodeJS, which I figured I could deploy as an Azure WebApp (similar to what I did here). Sure enough I could. However not long after I got it working I discovered this project: a Node-RED enabled NodeJS Web App you can deploy straight from Github. Awesome work Juan Manuel Servera.

Prerequisites

The quickest way to start then is to use Juan’s Azure WebApp wrapper for Node-RED. Follow his instructions to get that deployed to your Azure Subscription. Once deployed you can navigate to your Node-RED WebApp and you should see something similar to the image below.

DefaultNode-Red.PNG

The first thing you need to do is secure the app. From your WebApp Application Settings in the Azure Portal, use Kudu to navigate to the WebApp files. Under your wwwroot/WebApp directory you will find the settings.js file. Select the file and select the Edit (pencil) icon.

Node-Red-Settings

Comment in the adminAuth section around lines 93-100. To generate the encrypted password on a local install of NodeJS I ran the following command and copied the hash. Change ‘whatisagoodpassword’ for your desired password.

node -e "console.log(require('bcryptjs').hashSync(process.argv[1], 8));" whatisagoodpassword

Select Save, then Restart your application.

Node-Red-Password

On loading your WebApp again you will be prompted to login. Login and let’s get started.

Login

Configuring Node-RED

Now it is time to pull in some data and visualise it. I’m not going to go into too much detail as what you want to do is probably quite different to what I’m doing.

At its simplest though, I want to trigger on a timer a call to my sensor APIs to return the values and display them as either text, a graph or a gauge. Below graphically shows the configuration for the dashboard shown above.

ConfigureNodeRED.PNG

For each entity on the dashboard there is an input. For me this is a trigger every 15 minutes. That looks like this.

Get-15mins

Next is the API to get the data. The API I’m calling is an open GET with the API key in the URL so looks like this.

HTTPRequest

With the JSON response from the API I retrieve the temperature value and return it as msg for use in the UI.

Function

I then have the Gauge for Temperature. I’ve set the minimum and maximum values and gone with the defaults for the colours.

Guage

I’m also outputting debug info during setup for the raw response from the Function ….

DebugParsedOutput

….. and from the parsed function.

DebugHTTPRequest

These appear in the Debug pane on the right hand side.

DebugWindow

After each configuration change simply select Deploy and then switch over to your Node-RED WebApp. That will look like the URI for your WebApp with /ui on the end, e.g. http://<yourwebapp>.azurewebsites.net/ui

Conclusion

Thanks to Azure PaaS services and the ability to use a graphical IoT tool like Node-RED we can quickly deploy a solution to visualise IoT data without having to deploy any backend infrastructure. The only hardware is the IoT sensors, everything else is serverless.

Awarded Microsoft MVP for Enterprise Mobility – Identity and Access Management

This week I was awarded Microsoft Most Valuable Professional for Enterprise Mobility for my work in the area of Identity and Access Management.

This is quite an honor and something I had never considered as for the majority of my professional career I’ve been employed by integrators that require all my work to be considered intellectual property that must not be shared publicly. I’ve always appreciated what others have shared within my specialist domain and always felt a little guilty that I wasn’t able to reciprocate.

Two years ago (to the day of receiving the MVP award) I joined the Australian-based integrator Kloud Solutions. The culture within Kloud is quite different to my previous employers: supporting our technical communities is encouraged. After finding my feet at Kloud and getting my work-life balance back after coming out of a Global Research & Development role, I started contemplating sharing some of my knowledge and experiences with Identity and Access Management, especially as I was undertaking interesting projects that were integrating traditional identity management systems with Cloud services.

That was when I started blogging and presenting in a couple of forums, as well as starting to leverage a number of open source tools and services. Around mid-year I was nominated for MVP and it was at that time I appreciated the time and effort required to qualify. In Australia we are well endowed with IDM MVPs and I’m very honored and humbled to join Bob Bradley, Carol Wapshere and Ryan Newington.

I’m still amazed that after 20+ years of experience with Identity and Access Management, the projects I’m doing still incorporate new technologies and services and are still interesting. Integration of Cloud Services, Artificial Intelligence and Machine Learning with identity solutions are all exciting areas that have an identity management aspect to them.

A big thank you to Kloud for the support to give back to the IDAM community, fellow IDAM MVP’s for entertaining my sometimes bespoke thoughts on integrating their tools and my family for the many nights I’ve sat in front of a computer writing up the latest post or presentation.

Validate Your Authoritative Sources – Creating a Fuse for FIM/MIM Import and Sync run cycles


Introduction

The Microsoft Identity Manager Synchronisation Engine has been around for close to 20 years and is highly functional and very reliable.

The Achilles heel though for any IDAM Sync Engine will always be an authoritative source and the information it provides to the Sync Engine.

I’m seeing more and more SaaS services being used as the Authoritative Source for identity management systems. Think Success Factors and Workday. Connecting across the internet to these services, the rate of change within organisations, and the common human factor of en-masse changes all mean it is even more important to validate your import feeds before processing them through your Sync Engine’s business logic.

Overview

This post details the foundation of a little logic that will call an Import from the Authoritative Source and analyse what is returned, evaluate it against the previous Import to understand the number of objects expected, and determine if it is within an acceptable tolerance (I’m using 1% of total managed objects). If it doesn’t check out, don’t run a Sync, and send an email to someone to check it out and make a rational human decision (and maybe a manual sync). If the Import is valid then run the Sync.

Simply put;

  • Query the Authoritative Management Agent and get the last Import Run
    • set variables for the total number of objects processed; Adds, Renames, Updates and Deletes (and Total)
  • Run an Import cycle only
    • set variables for the total number of objects processed; Adds, Renames, Updates and Deletes (and Total)
    • Evaluate each of the Staging Adds, Renames, Updates and Deletes and see if the number of changes is less than the tolerance (1%)
      • if yes proceed with a Sync Run
      • if no send a notification and don’t run the Sync

Enhancements

This will need to be tailored for each environment. What is normal for the number of changes expected in your environment? You may see a lot of Updates and the global 1% tolerance I have here doesn’t work for that. So you may want a tolerance per Adds, Renames, Updates and Deletes.

Implementation

Where I’ve used this, I’ve saved the PowerShell script below into the same directory as the other scripts that automate the MIM Sync Run Sequence. I’ve updated the previous automation script, removed the Authoritative Delta Import / Delta Sync, Full Import / Delta Sync etc., and called this Auth Fuse Script instead.

The Script

As always this uses the awesome LithnetMIISAutomation PowerShell Module from Ryan. A skeleton of the fuse logic is sketched after this list. Update;

  • lines 5-11 for your Auth Source.
  • lines 20-24 for SMTP Server and Notification settings
  • $smtpBody lines for what you want the notification emails to say
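The full script is the gist above; the skeleton of the fuse logic looks something like this hedged sketch (the MA name, counts and addresses are placeholders - how you read the staging statistics from the Import run will depend on your environment):

Import-Module LithnetMIISAutomation

$maName = "HR Authoritative MA"   # hypothetical MA name
$tolerance = 0.01                 # 1% of total managed objects
$totalManaged = 50000             # assumed: total taken from the previous Import run

# Run an Import cycle only
Start-ManagementAgent -MA $maName -RunProfileName "Delta Import"

# assumed: staging counts gathered from the run details of the Import above
$stagingAdds = 0; $stagingRenames = 0; $stagingUpdates = 0; $stagingDeletes = 0
$changes = $stagingAdds + $stagingRenames + $stagingUpdates + $stagingDeletes

if ($changes -le ($totalManaged * $tolerance)) {
    # within tolerance - proceed with the Sync
    Start-ManagementAgent -MA $maName -RunProfileName "Delta Sync"
}
else {
    # fuse tripped - notify a human and don't run the Sync
    Send-MailMessage -SmtpServer "smtp.yourorg.com" -From "mimsync@yourorg.com" `
        -To "idamteam@yourorg.com" -Subject "MIM Import fuse tripped for $maName" `
        -Body "$changes staged changes exceeds the $($tolerance * 100)% tolerance. Sync was not run."
}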

Summary

A simple piece of logic to check and validate your imports can save hours/days of work.

If your Auth Source doesn’t provide a full dataset and you haven’t checked your import before processing you could be deleting a LOT of accounts.

If HR changed the Org Structure and didn’t inform you or take into account IDAM Business Logic you could be about to process a lot of AD Account Moves. If it involved redundancies and they haven’t informed anyone yet, you could be exposing that information to an entire organisation. CHECK AND VALIDATE YOUR AUTHORITATIVE SOURCE IMPORTS !!

Getting started developing Custom Actions for the Google Assistant (Home)

Introduction

Whilst I was in the USA recently I bought myself a Google Home. My home already had Hue Lights, Chromecast on a couple of TVs, and I’m a big user of Spotify (Premium). It was very quick to get it up and running and doing simple tasks, but I started thinking about what custom things I could get it to do. Could I get it to call a custom/private API to get some information and let me know the result? The answer is yes, and that is what this post will cover. I have a few temperature sensors that have a RestAPI that can be used to query the temperature. I got it working pretty quickly (one evening), not having messed with the Google App Engine for many years. But there were a number of steps required, and as this functionality will form the basis for more complex functionality, I’ve documented the step-by-step process I used.

But here is the result: a command, a query and a custom response.

Prerequisites

  • First up you are going to need to have a Google Account. Seeing as you are probably reading this because you want to do something similar, you will have a Google Home and thereby a Google account, so you’re already covered on that point. The account associated with your Google Home is the one you NEED to use with the additional services, as it is what will tie everything together
  • You will need to register for an api.ai account. Sign in with the Google Account you have linked to your Google Home
  • You will need to also register for a Google Cloud account. Sign in with the Google Account you have linked to your Google Home
  • Download and install the Google Cloud CLI. Even though I did this on a Windows machine, I actually installed the CLI in Ubuntu via the Windows Subsystem for Linux. If you want that option, here is how to get started with the Windows Subsystem for Linux
    • change into your homedir eg cd ~/ and then run curl https://sdk.cloud.google.com | bash

Getting Started

Now it is time to start our Project. In your gCloud CLI window create a project directory. I named mine insidetemp as I will be calling an IOT Temperature sensor to get the temperature inside one of my beer brewing sheds.

inside temp.PNG

We will now create a small JavaScript script that will call a RestAPI to retrieve the current temperature from an IoT Temperature Sensor. The script below does that. Modify it for whatever unauthenticated API call you want to make. You’ll need to change lines 6, 7 and 24 for building the URL you will call to get data. Line 8 contains the Function name. Mine is called insideTemp. If you name yours differently, then everywhere I use ‘insideTemp’ (note the case) substitute your function name.

With your modified script (I’ve changed the API key so it won’t work as is), in your application directory create a new file named index.js and paste in your script, e.g. in Linux run nano index.js

After pasting in your script, use Ctrl + O to save as index.js and Ctrl + X to quit.

custom action function

Run gcloud auth login and go through the authentication process using the same Google Account you used for api.ai and Google Cloud. The CLI will give you a URL you need to paste into your browser to authorize access to Google Cloud for your Google Account. Once completed, get the code generated from the authorization and paste that back into the CLI.

gcloud authentication

Now we will create a Google Cloud Storage Bucket for our script. I named mine the same as the Function in the script (line 8) but all in lowercase (as lower case is required).
gsutil mb gs://insidetemp

GCloud Storage Bucket.PNG

You should then be able to browse to your storage. My URL is below; update it for your project name.

https://console.cloud.google.com/storage/browser/insidetemp?project=insidetemp

Google Cloud Storage

We will now upload our script and create the Function that we will call from Google Home. Change the command below for your project and script function name.

gcloud beta functions deploy insideTemp --stage-bucket insidetemp --trigger-http

Upload and create function

Once uploaded and the Function has been created you will get a HTTP Trigger URL. Copy this as you’ll need it shortly. If you need to update or change the script do that locally and run the gcloud beta functions deploy command again.

Function Created

Validating the Function

Using a browser you should be able to browse to your Function. The URL should look something like this. Change for your project name. https://console.cloud.google.com/functions/details/us-central1/insideTemp?project=insidetemp&tab=general&duration=PT1H

If the Storage account is different (because you already had one) then navigate to https://console.cloud.google.com/functions and select your project from the menu in the top left next to Google Cloud Platform.

Select the Testing menu then click on the Test the function button. If your script is all good it will execute and get the value back from the API. You can see mine returned 21 degrees from my IoT Temp Sensor.

Test the Function

Wiring up our Project to Google Home

Now that we have an HTTP Function that can retrieve info from a RestAPI let’s have that as an action in Google Home. Head on over to the Actions API here;

https://console.cloud.google.com/apis/api/actions.googleapis.com/overview

At the top of the middle pane click enable if it isn’t already enabled.

Enable Google Actions API

Now head over to the api.ai Agents Console https://console.api.ai/api-client/#/agents and select Create Agent.

Create Agent

Give it a name, choose your timezone and language and select Save.

New Agent Details

Select Create Intent

Create Intent

Provide a couple of phrases you will speak to Google Home. Don’t worry about any other settings for now. Select Save.

Create Intent Detail

From the Left Menu select Fulfillment.

Fulfillment

Enable Webhook and paste in the URL that you got after uploading the Function Script via the gCloud cli. Select Save from the bottom of the page.

Fulfillment Webhook

Back in Intents, scroll to the bottom and click Fulfillment and enable Use webhook. Select Save

Intent Webhook

From the left menu select Integrations 

Integrations

Enable Actions on Google

Actions on Google

Open Settings and select  Update Draft.

Actions Trigger

then select Test 

Actions Test

Update: I did also remove the Default Welcome Intent and use one of my Intent phrases. 

Test will now be active. Select Close.

Actions Test

Head back to api.ai and select Intents. Under Response choose the Actions on Google menu and enable it. Select Save.

Intent Actions on Google

Testing the Custom Action via the API Console

Within the api.ai console, in the right-hand pane in the Try it now box, type in your Intent. One of mine is ‘what is the inside temp’. This tests our integration and successfully returns the result from querying the API via our Function.

Test Success

If you select Show JSON from the bottom right you can see the processing that went on. We can see that the Agent got the query, used the Webhook to go to Fulfillment to use the Function to call the API to get our information and provide the response.

Success JSON.PNG

Testing the Custom Action via Google Home

Go to Actions on Google https://console.actions.google.com/u/0/ and select Add/Import Project. You should see a project named NewAgent. Select it

New Actions Project

Then Select Import Project

Import Project

Step through to provide info for the Agent, including images etc., as if you are going to publicly publish this function. Select Save.

App Info

Finally select Test Draft. DO NOT SELECT SUBMIT DRAFT FOR REVIEW.

Test Draft

As we are in ‘test’ mode we follow the Ok Google wake command with Talk to <your project name>. So for my project it is Ok Google, Talk to Inside Temperature.

And we are done.

Summary

Very quickly we’ve created a custom command that queries our own API to get data and have the response spoken to us. This can then be expanded to do any number of different tasks as long as you have something to query or update and the desire to do it from your Google Home.

Enabling and using Managed Service Identity to access an Azure Key Vault with Azure PowerShell Functions

Introduction

At the end of last week (14 Sept 2017) Microsoft announced a new Azure Active Directory feature – Managed Service Identity. Managed Service Identity helps solve the chicken-and-egg bootstrap problem of needing credentials to connect to the Azure Key Vault to retrieve credentials. Previously, when working with Virtual Machines, Web Apps and Azure Functions, that meant having to implement methods to obfuscate the credentials that were stored within them. I touched on one method that I’ve used a lot in this post here, whereby I encrypt the credential and store it in the Application Settings, but it still required a keyfile to allow reversing of the encryption as part of the automation process. Thankfully those days are finally behind us.

I strongly recommend you read the Managed Service Identity announcement to understand more about what MSI is.

This post details using Managed Service Identity in PowerShell Azure Function Apps.

Enabling Managed Service Identity on your Azure Function App

In the Azure Portal navigate to your Azure Function Web App. Select it, then from the main pane select the Platform Features tab, then select Managed service identity.

Enable Managed Service Identity

Turn the toggle switch to On for Register with Azure Active Directory, then select Save.

Enable Managed Service Identity

Back in Platform Features under General Settings select Application Settings. 

Azure Function App Settings

Under Application Settings you will see a subset of the environment variables/settings for your Function App. In my environment I don’t see the Managed Service Identity variables there. So lets keep digging.

Azure Function App Settings

Under Platform Features select Console.

Azure Function App Console

When the Console loads, type Set. Scroll down and you should see MSI_ENDPOINT and MSI_SECRET.

NOTE: These variables weren’t immediately available in my environment. The next morning they were present. So I’m assuming there is a back-end process that populates them once you have enabled Managed Service Identity, and that it takes more than a couple of hours.

MSI Variables

Creating a New Azure Function App that uses Managed Service Identity

We will now create a new PowerShell Function App that will use Managed Service Identity to retrieve credentials from an Azure Key Vault.

From your Azure Function App, next to Functions select the + to create a New Function. I’m using a HttpTrigger PowerShell Function. Give it a name and select Create.

New Azure Function

Put the following lines into the top of your function and select Save and Run.

# MSI Variables via Function Application Settings Variables
# Endpoint and Password
$endpoint = $env:MSI_ENDPOINT
$endpoint
$secret = $env:MSI_SECRET
$secret

You will see in the output the values of these two variables.

Managed Service Identity Variables

Key Vault

Now that we know we have Managed Service Identity all ready to go, we need to allow our Function App to access our Key Vault. If you don’t have a Key Vault already then read this post where I detail how to quickly get started with the Key Vault.

Go to your Key Vault and select Access Policies from the left menu list.

Azure Key Vault Access Policy

Select Add new, select Principal, locate your Function App and click Select.

Azure Key Vault Access Policy

As my vault contains multiple credential types, I enabled the policy for Get for all types. Select Ok. Then select Save.

Azure Key Vault Access Policy

We now have our Function App enabled to access the Key Vault.

Azure Key Vault Access Policy

Finally in your Key Vault, select a secret you want to retrieve via your Function App and copy out the Secret Identifier from the Properties.

Azure Key Vault Secret Identifier URI

Function App Script

Here is my sample PowerShell Function App script that will connect to the Key Vault and retrieve credentials. Line 12 should be the only line you need to update, for the Key Vault Secret that you want to retrieve. Ensure you still have the API version at the end (which isn’t in the URI you copy from the Key Vault): /?api-version=2015-06-01
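If you don’t have the embedded script handy, here is a hedged sketch of the same pattern, using the MSI_ENDPOINT and MSI_SECRET variables shown earlier (the vault and secret names are placeholders - substitute your own Secret Identifier):

# the line to update - your Key Vault Secret Identifier plus the API version
$vaultSecretURI = "https://yourvault.vault.azure.net/secrets/yoursecret/?api-version=2015-06-01"

# get an access token for Key Vault from the local MSI endpoint
$tokenUri = "$($env:MSI_ENDPOINT)?resource=https://vault.azure.net&api-version=2017-09-01"
$token = Invoke-RestMethod -Method Get -Uri $tokenUri -Headers @{ Secret = $env:MSI_SECRET }

# retrieve the secret using the bearer token
$secret = Invoke-RestMethod -Method Get -Uri $vaultSecretURI -Headers @{ Authorization = "Bearer $($token.access_token)" }
$secret.value   # the credential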

When run, if you have everything correct, the output will look like the below.

KeyVault Creds Output

Summary

We now have the basis of a script that we can use in our Azure Functions to allow us to use the Managed Service Identity function to connect to an Azure Key Vault and retrieve credentials. We’ve limited the access to the Key Vault to the Azure Function App to only GET the credential. The only piece of information we had to put in our Function App was the URI for the credential we want to retrieve. Brilliant.

Display Microsoft Identity Manager Sync Engine Statistics in the MIM Portal

Introduction

In the Microsoft / Forefront Identity Manager Synchronization Service Manager, under Tools, we have a Statistics Report. This gives a breakdown of each of the Management Agents and the Connectors on each MA.

I had a recent requirement to expose this information for a customer but I didn’t want them to have to connect to the Synchronization Server (and be given the permissions to allow them to). So I looked into another way of providing a subset of this information in the MIM Portal itself.  This post details that solution.

MIM / FIM Synchronization Server Management Agent & Metaverse Statistics

Overview

I approached this in a similar way to the User Object Report I recently developed. The approach is;

  • Azure PowerShell Function App that uses Remote PowerShell to connect to the MIM Sync Server and leverages the Lithnet MIIS Automation PowerShell Module to enumerate all Management Agents and build the information required for the report
  • A NodeJS WebApp calls the Azure PowerShell Function App onload to generate the report and display it
  • The NodeJS WebApp is embedded in the MIM Portal as a new Nav Bar Resource and Page

The graphic below details the basic logical integration.

MVStatsReportOverview

Prerequisites

The prerequisites to perform this I’ve covered in other posts. In concept, as described above, it is similar to the User Object report, which has the same prerequisites; I detailed those pretty thoroughly here. That post is the required reading to get you ready to implement this.

Azure PowerShell Function App

Below is the raw script from my Function App that connects to the MIM Sync Server and retrieves the Management Agent Statistics for the report.
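That script was embedded as a gist; its skeleton is roughly this hedged sketch (the server name and credential handling are placeholders, and the per-MA properties you report on will vary):

# Remote PowerShell from the Function App to the MIM Sync Server
$mimSyncServer = "mimsync.yourorg.com"   # hypothetical
$password = ConvertTo-SecureString $env:MIMSyncPwd -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ($env:MIMSyncUser, $password)

$stats = Invoke-Command -ComputerName $mimSyncServer -Credential $cred -ScriptBlock {
    Import-Module LithnetMIISAutomation
    # enumerate the Management Agents and return a summary object per MA
    foreach ($ma in Get-ManagementAgent) {
        [pscustomobject]@{
            Name = $ma.Name
            # add the connector/statistics properties for your report here
        }
    }
}

# return the report data to the caller (the NodeJS WebApp) as JSON
$stats | ConvertTo-Json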

NodeJS Web App

The NodeJS Web App is the app that gets embedded in the MIM Portal, calls the Azure Function to retrieve the data, and then displays it. To get started you’ll want a base NodeJS WebApp. This post will get you started: Implementing a NodeJS WebApp using Visual Studio Code.

The only extension I’m using on top of what is listed there is JQuery. So once you have NodeJS up and running in your VSCode Terminal type npm install jquery and then npm install.

I’ve kept it simple and contained it all in a single HTML file using JQuery.

In your NodeJS project you will need to reference your report.html file. It should look like this (assuming you name your report report.html):

var express = require('express');
var router = express.Router();
/* GET - Report page */
router.get('/', function(req, res, next) {
   res.sendFile('report.html', { root:'./public'});
});

module.exports = router;

The Embedded Report

This is what my report looks like embedded in the MIM Portal.

Microsoft Identity Manager Statistics Report

Summary

Integration of FIM / MIM with Azure Platform as a Service Services opens a world of functionality including the ability to expose information that was previously only obtainable by the FIM / MIM Administrator.