Getting started developing Custom Actions for the Google Assistant (Home)

Introduction

Whilst I was in the USA recently I bought myself a Google Home. My home already had Hue lights, Chromecasts on a couple of TVs, and I'm a big user of Spotify (Premium). It was very quick to get it up and running and doing simple tasks, but I started thinking about what custom things I could get it to do. Could I get it to call a custom/private API to get some information and tell me the result? The answer is yes, and that is what this post will cover. I have a few temperature sensors with a REST API that can be used to query the temperature. I got it working pretty quickly (one evening) considering I hadn't messed with Google App Engine for many years. But there were a number of steps required, and as this functionality will form the basis for more complex functionality, I've documented the step-by-step process I used.

Here is the result: a command, a query and a custom response.

Prerequisites

  • First up you are going to need a Google Account. Seeing as you are probably reading this because you want to do something similar, you will have a Google Home and thereby a Google Account, so you're already covered on that point. The account associated with your Google Home is the one you NEED to use with the additional services, as it is what ties everything together.
  • You will need to register for an api.ai account. Sign in with the Google Account you have linked to your Google Home.
  • You will need to also register for a Google Cloud account. Sign in with the Google Account you have linked to your Google Home.
  • Download and install the Google Cloud CLI. Even though I did this on a Windows machine, I actually installed the CLI in Ubuntu via the Windows Subsystem for Linux. If you want that option, here is how to get started with the Windows Subsystem for Linux.
    • Change into your home directory, e.g. cd ~/, and then run curl https://sdk.cloud.google.com | bash

Getting Started

Now it is time to start our project. In your gcloud CLI window create a project directory. I named mine insidetemp as I will be calling an IoT temperature sensor to get the temperature inside one of my beer brewing sheds.


We will now create a small JavaScript script that will call a REST API to retrieve the current temperature from an IoT temperature sensor. The script below does that. Modify it for whatever unauthenticated API call you want to make. You'll need to change lines 6, 7 and 24 to build the URL you will call to get data. Line 8 contains the function name. Mine is called insideTemp. If you name yours differently, then everywhere I use 'insideTemp' (note the casing) substitute your function name.

With your modified script (I've changed the API key so it won't work as-is), create a new file named index.js in your application directory and paste in your script, e.g. in Linux run nano index.js

After pasting in your script, use Ctrl + O to save as index.js and Ctrl + X to quit.

custom action function

Run gcloud auth login and go through the authentication process using the same Google Account you used for api.ai and Google Cloud. The CLI will give you a URL to paste into your browser, where you authorize access to Google Cloud for your Google Account. Once completed, copy the code generated by the authorization and paste it back into the CLI.

gcloud authentication

Now we will create a Google Cloud Storage Bucket for our script. I named mine the same as the Function in the script (line 8) but all in lowercase (lower case is required).
gsutil mb gs://insidetemp

GCloud Storage Bucket

You should then be able to browse to your storage. My URL is below; update it for your project name.

https://console.cloud.google.com/storage/browser/insidetemp?project=insidetemp

Google Cloud Storage

We will now upload our script and create the Function that we will call from Google Home. Change the command below for your project and script function name.

gcloud beta functions deploy insideTemp --stage-bucket insidetemp --trigger-http

Upload and create function

Once uploaded and the Function has been created you will get an HTTP Trigger URL. Copy this as you'll need it shortly. If you need to update or change the script, do that locally and run the gcloud beta functions deploy command again.

Function Created

Validating the Function

Using a browser you should be able to browse to your Function. The URL should look something like this (change it for your project name): https://console.cloud.google.com/functions/details/us-central1/insideTemp?project=insidetemp&tab=general&duration=PT1H

If the Storage account is different (because you already had one) then navigate to https://console.cloud.google.com/functions and select your project from the menu in the top left next to Google Cloud Platform.

Select the Testing menu then click on the Test the function button. If your script is all good it will execute and get the value back from the API. You can see mine returned 21 degrees from my IoT temp sensor.

Test the Function
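You can also exercise the HTTP trigger directly. A quick check from PowerShell, assuming a trigger URL of the usual Cloud Functions form (the URL below is illustrative; use the one from your own deploy output):

# Call the deployed Function's HTTP trigger directly.
# The URL below is illustrative - use the trigger URL gcloud returned for your deploy.
$triggerUrl = 'https://us-central1-insidetemp.cloudfunctions.net/insideTemp'
Invoke-RestMethod -Method Post -Uri $triggerUrl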

Wiring up our Project to Google Home

Now that we have an HTTP Function that can retrieve info from a REST API, let's wire it up as an action in Google Home. Head on over to the Actions API here:

https://console.cloud.google.com/apis/api/actions.googleapis.com/overview

At the top of the middle pane click enable if it isn’t already enabled.

Enable Google Actions API

Now head over to the api.ai Agents Console https://console.api.ai/api-client/#/agents and select Create Agent.

Create Agent

Give it a name, choose your timezone and language and select Save.

New Agent Details

Select Create Intent

Create Intent

Provide a couple of phrases you will speak to Google Home. Don’t worry about any other settings for now. Select Save.

Create Intent Detail

From the Left Menu select Fulfillment.

Fulfillment

Enable Webhook and paste in the URL that you got after uploading the Function script via the gcloud CLI. Select Save at the bottom of the page.

Fulfillment Webhook

Back in Intents, scroll to the bottom, click Fulfillment and enable Use webhook. Select Save.

Intent Webhook

From the left menu select Integrations 

Integrations

Enable Actions on Google

Actions on Google

Open Settings and select Update Draft.

Actions Trigger

Then select Test.

Actions Test

Update: I also removed the Default Welcome Intent and used one of my Intent phrases.

Test will now be active. Select Close.

Actions Test

Head back to api.ai and select Intents. Under Response choose the Actions on Google menu and enable it. Select Save.

Intent Actions on Google

Testing the Custom Action via the API Console

Within the api.ai console, in the right-hand pane in the Try it now box, type in your Intent. One of mine is what is the inside temp. This tests our integration and successfully returns the result from querying the API via our Function.

Test Success

If you select Show JSON from the bottom right you can see the processing that went on. We can see that the Agent got the query, used the Webhook to go to Fulfillment, used the Function to call the API to get our information, and provided the response.

Success JSON

Testing the Custom Action via Google Home

Go to Actions on Google https://console.actions.google.com/u/0/ and select Add/Import Project. You should see a project named NewAgent. Select it.

New Actions Project

Then select Import Project.

Import Project

Step through providing info for the Agent, including images etc., as if you were going to publicly publish this function. Select Save.

App Info

Finally select Test Draft. DO NOT SELECT SUBMIT DRAFT FOR REVIEW.

Test Draft

As we are in 'test' mode we follow the Ok Google wake command with Talk to <your agent name>. So for my project it is: Ok Google, talk to Inside Temperature.

And we are done.

Summary

Very quickly we've created a custom command that queries our own API to get data and has the response spoken to us. This can be expanded to perform any number of different tasks, as long as you have something to query or update and the desire to do it from your Google Home.

Enabling and using Managed Service Identity to access an Azure Key Vault with Azure PowerShell Functions

Introduction

At the end of last week (14 Sept 2017) Microsoft announced a new Azure Active Directory feature: Managed Service Identity. Managed Service Identity helps solve the chicken-and-egg bootstrap problem of needing credentials to connect to the Azure Key Vault to retrieve credentials. Previously, when using Virtual Machines, Web Apps and Azure Functions, that meant implementing methods to obfuscate the credentials stored within them. I touched on one method that I've used a lot in this post here, whereby I encrypt the credential and store it in the Application Settings, but it still required a keyfile to allow reversing of the encryption as part of the automation process. Thankfully those days are finally behind us.

I strongly recommend you read the Managed Service Identity announcement to understand more about what MSI is.

This post details using Managed Service Identity in PowerShell Azure Function Apps.

Enabling Managed Service Identity on your Azure Function App

In the Azure Portal navigate to your Azure Function Web App. Select it, then from the main pane select the Platform Features tab and select Managed service identity.

Enable Managed Service Identity

Turn the toggle switch to On for Register with Azure Active Directory, then select Save.

Enable Managed Service Identity

Back in Platform Features under General Settings select Application Settings. 

Azure Function App Settings

Under Application Settings you will see a subset of the environment variables/settings for your Function App. In my environment I didn't see the Managed Service Identity variables there. So let's keep digging.

Azure Function App Settings

Under Platform Features select Console.

Azure Function App Console

When the Console loads, type Set. Scroll down and you should see MSI_ENDPOINT and MSI_SECRET.

NOTE: These variables weren't immediately available in my environment. The next morning they were present. So I'm assuming there is a back-end process that populates them once you have enabled Managed Service Identity, and that it takes more than a couple of hours.

MSI Variables

Creating a New Azure Function App that uses Managed Service Identity

We will now create a new PowerShell Function App that will use Managed Service Identity to retrieve credentials from an Azure Key Vault.

From your Azure Function App, next to Functions select the + to create a new Function. I'm using an HttpTrigger PowerShell Function. Give it a name and select Create.

New Azure Function

Put the following lines into the top of your function and select Save and Run.

# MSI Variables via Function Application Settings Variables
# Endpoint and Password
$endpoint = $env:MSI_ENDPOINT
$endpoint
$secret = $env:MSI_SECRET
$secret

You will see in the output the values of these two variables.

Managed Service Identity Variables

Key Vault

Now that we know we have Managed Service Identity all ready to go, we need to allow our Function App to access our Key Vault. If you don’t have a Key Vault already then read this post where I detail how to quickly get started with the Key Vault.

Go to your Key Vault and select Access Policies from the left menu list.

Azure Key Vault Access Policy

Select Add new, select Principal, locate your Function App and click Select.

Azure Key Vault Access Policy

As my vault contains multiple credential types, I enabled the policy for Get for all types. Select Ok. Then select Save.

Azure Key Vault Access Policy

We now have our Function App enabled to access the Key Vault.

Azure Key Vault Access Policy

Finally in your Key Vault, select a secret you want to retrieve via your Function App and copy out the Secret Identifier from the Properties.

Azure Key Vault Secret Identifier URI

Function App Script

Here is my sample PowerShell Function App script that connects to the Key Vault and retrieves credentials. Line 12 should be the only line you need to update, for the Key Vault Secret that you want to retrieve. Ensure you keep the API version on the end (it isn't in the URI you copy from the Key Vault): /?api-version=2015-06-01
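The core of it looks like the sketch below, assuming the App Service MSI token endpoint documented by Microsoft; the secret URI is a hypothetical placeholder for your own Secret Identifier.

# Minimal sketch: get a Key Vault access token from the local MSI endpoint,
# then call the Key Vault REST API to retrieve the secret.
$apiVersion   = "2017-09-01"
$resourceURI  = "https://vault.azure.net"
$tokenAuthURI = $env:MSI_ENDPOINT + "?resource=$resourceURI&api-version=$apiVersion"
$tokenResponse = Invoke-RestMethod -Method Get -Headers @{"Secret" = "$env:MSI_SECRET"} -Uri $tokenAuthURI
$accessToken = $tokenResponse.access_token

# Hypothetical Secret Identifier - replace with your own, keeping the api-version on the end
$secretURI = "https://myvault.vault.azure.net/secrets/AzureCredential/?api-version=2015-06-01"
$secret = Invoke-RestMethod -Method Get -Uri $secretURI -Headers @{ Authorization = "Bearer $accessToken" }
$secret.value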

When run, if you have everything correct, the output will look like the below.

KeyVault Creds Output

Summary

We now have the basis of a script that we can use in our Azure Functions to allow us to use Managed Service Identity to connect to an Azure Key Vault and retrieve credentials. We've limited the Function App's access to the Key Vault to only GET the credential. The only piece of information we had to put in our Function App was the URI for the credential we want to retrieve. Brilliant.

Display Microsoft Identity Manager Sync Engine Statistics in the MIM Portal

Introduction

In the Microsoft / Forefront Identity Manager Synchronization Service Manager, under Tools we have a Statistics Report. This gives a breakdown of each of the Management Agents and the Connectors on each MA.

I had a recent requirement to expose this information for a customer, but I didn't want them to have to connect to the Synchronization Server (and be given the permissions to allow them to), so I looked into another way of providing a subset of this information in the MIM Portal itself. This post details that solution.

MIM / FIM Synchronization Server Management Agent & Metaverse Statistics

Overview

I approached this in a similar way to the User Object Report I recently developed. The approach is:

  • An Azure PowerShell Function App uses Remote PowerShell to connect to the MIM Sync Server and leverages the Lithnet MIIS Automation PowerShell Module to enumerate all Management Agents and build the report data required
  • A NodeJS WebApp calls the Azure PowerShell Function App on load to generate the report and display it
  • The NodeJS WebApp is embedded in the MIM Portal as a new Nav Bar Resource and Page

The graphic below details the basic logical integration.

MVStatsReportOverview

Prerequisites

I've covered the prerequisites in other posts. In concept, as described above, this is similar to the User Object Report, which has the same prerequisites; I detailed those pretty thoroughly here. That post is the required reading to get you ready to implement this.

Azure PowerShell Function App

Below is the raw script from my Function App that connects to the MIM Sync Server and retrieves the Management Agent Statistics for the report.
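In outline it does something like the sketch below. This is a minimal sketch only, assuming the Lithnet MIIS Automation module is installed on the MIM Sync Server; the MIMSYNCSERVER/MIMSYNCUSER/MIMSYNCPWD Application Settings names are hypothetical placeholders of mine.

# Minimal sketch only - not the full script.
# MIMSYNCSERVER / MIMSYNCUSER / MIMSYNCPWD are hypothetical Application Settings
# holding the Remote PowerShell connection details.
$password = ConvertTo-SecureString $env:MIMSYNCPWD -AsPlainText -Force
$creds = New-Object System.Management.Automation.PSCredential ($env:MIMSYNCUSER, $password)

# Remote into the MIM Sync Server and enumerate the Management Agents
$session = New-PSSession -ComputerName $env:MIMSYNCSERVER -Credential $creds
$maStats = Invoke-Command -Session $session -ScriptBlock {
    Import-Module LithnetMiisAutomation
    Get-ManagementAgent | Select-Object -Property Name
}
Remove-PSSession $session

# Return the report data to the caller (v1 PowerShell Functions write to $res)
Out-File -Encoding Ascii -FilePath $res -InputObject ($maStats | ConvertTo-Json)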

NodeJS Web App

The NodeJS Web App is the app that gets embedded in the MIM Portal; it calls the Azure Function to retrieve the data and then displays it. To get started you'll want a base NodeJS WebApp. This post will get you started: Implementing a NodeJS WebApp using Visual Studio Code.

The only extension I'm using on top of what is listed there is jQuery. So once you have NodeJS up and running, in your VSCode Terminal type npm install jquery and then npm install.

I’ve kept it simple and contained all in a single HTML file using JQuery.

In your NodeJS project you will need to reference your report.html file. It should look like this (assuming you name your report report.html):

var express = require('express');
var router = express.Router();
/* GET - Report page */
router.get('/', function(req, res, next) {
   res.sendFile('report.html', { root:'./public'});
});

module.exports = router;

The Embedded Report

This is what my report looks like embedded in the MIM Portal.

Microsoft Identity Manager Statistics Report

Summary

Integration of FIM / MIM with Azure Platform as a Service Services opens a world of functionality including the ability to expose information that was previously only obtainable by the FIM / MIM Administrator.

Quickly creating and using an Azure Key Vault with PowerShell


Introduction

A couple of weeks back I was messing around with the Azure Key Vault, looking to centralise a bunch of credentials for my ever-growing list of Azure Functions that are automating numerous tasks. What I found was that getting an Azure Key Vault set up and getting credentials in and out was a little more cumbersome than I thought it should be. Around the same time this tweet appeared in my Twitter timeline via a retweet. I'm not too sure why, but maybe because I've been migrating to VSCode myself, I checked out Axel's project.

Axel Agazoth tweet

Axel's PowerShell Module simplifies creating and integrating with the Azure Key Vault. After messing with it and suggesting a couple of enhancements that Axel graciously entertained, I'm now creating vaults and adding and removing credentials in the simplified way I'd wanted.

This quickstart guide to using this module will get you started too.

Create an Azure Key Vault

This is one of the beauties of Axel's module. If the Resource Group and/or Storage Account you want associated with your Key Vault doesn't exist, it creates them.

Update the following script for the location (line 8) and the name (line 10) that will be given to your Storage Account, Resource Group and Vault. Modify it if you want to use different names for each.
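For context, here's roughly what the module automates, expressed with plain AzureRM cmdlets. This is my sketch, not Axel's script; the name and location values are placeholders.

# Rough AzureRM equivalent of what the module automates - placeholders throughout.
$location = 'eastus'        # the line 8 value
$name     = 'mykeyvault01'  # the line 10 value, used for the Storage Account, RG and Vault

Login-AzureRmAccount
New-AzureRmResourceGroup -Name $name -Location $location
New-AzureRmStorageAccount -ResourceGroupName $name -Name $name -Location $location -SkuName Standard_LRS
New-AzureRmKeyVault -VaultName $name -ResourceGroupName $name -Location $location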

Done, Key Vault created.

Create Azure KeyVault
Azure Key Vault Created

Connect to the Azure Key Vault

This script assumes you're now in a new session and want to connect to the Key Vault. Again, this is a simplified version whereby the Storage Account, Resource Group and Key Vault names are all the same. Update it for your location and Key Vault name.

Connected.

Connect to Azure Key Vault

Add a Certificate to the Azure Key Vault

To add a certificate to our new Key Vault use the command below. It will prompt you for your certificate password and add the cert to the key vault.
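My command uses Axel's module; if you'd rather do the same import with the stock AzureRM cmdlet, a sketch (the certificate name and path are placeholders of mine):

# Stock AzureRM alternative for importing a PFX into the vault.
$certPwd = Read-Host -AsSecureString -Prompt 'Certificate password'
Import-AzureKeyVaultCertificate -VaultName $name -Name 'AADAppCert' -FilePath 'C:\Certs\AADAppCert.pfx' -Password $certPwd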

Add Cert to Vault

Certificate added to Key Vault.

Cert Added to Vault

Retrieve a Certificate from the Azure Key Vault

Retrieving a certificate from the Key Vault is just as simple.

$VaultCert = Get-AzureCertificate -Name "AADAppCert" -ResourceGroupName $name -StorageAccountName $name -VaultName $name
Retrieve a Cert

Add Credentials to the Azure Key Vault

Adding username/password or clientID/clientSecret to the Key Vault is just as easy.

# Store credentials into the Azure Key Vault
Set-AzureCredential -UserName "serviceAccount" -Password ($pwd = Read-Host -AsSecureString) -VaultName $name -StorageAccountName $name -Verbose

Credentials added to vault

Add Creds to Key Vault
Creds Added to Vault

Retrieve Credentials from the Azure Key Vault

Retrieving credentials is just as easy.

# Get credentials from the Azure Key Vault
$AzVaultCreds = Get-AzureCredential -UserName "serviceAccount" -VaultName $name -StorageAccountName $name -Verbose

Credentials retrieved.

Retrieve Account Creds

Remove Credentials from the Azure Key Vault

Removing credentials is also a simple cmdlet.

# Remove credentials from the Azure Key Vault
Remove-AzureCredential -UserName "serviceAccount" -VaultName $name -StorageAccountName $name -Verbose

Credentials removed.

Remove Credentials from Key Vault

Summary

Hopefully this gets you started quickly with the Azure Key Vault. Credit to Axel for creating the module. It’s now part of my toolkit that I’m using a lot.

Configuring Remote PowerShell to a Remote Active Directory Forest for FIM/MIM GalSync

Introduction

Windows Remote Management (aka Remote PowerShell) is a wonderful thing when it works straight out of the box, which it does when you're in the same domain. Getting it working across Forests, though, can feel like jumping through hoop after hoop, and sometimes like the hoops are on fire. Configuring GALSync ([Exchange] Global Address List Synchronisation) with FIM/MIM always means working across AD Forests. The graphic below shows the simplest relationship. If there are firewalls in between then you'll have additional hoops to jump through.

GALSync

This article is the most definitive I've found about what is required, but it isn't easily found even when you know it exists. In the last few months I've had to set up GALSync with FIM/MIM a number of times, and I can see I'll be needing to do it again in the future. So here is my consolidated version of the process, using PowerShell to make the configuration changes. If nothing else it'll help me find it quickly next time I need to do it.

This post assumes you have the other prerequisites sorted. They are pretty clear in the linked article above, such as a one-way cross-forest trust and connectivity on the necessary ports if there are firewalls between FIM/MIM and the Exchange CAS Server and Domain Controllers in the remote environment.

Configuring Remote PowerShell for FIM/MIM GALSync

My tip is to start from the MIM Sync Server.

  1. Get the details for the Service Account that you have/will specify on your GALSync Active Directory Management Agent that connects to the Remote Forest.
  2. Have that account be given (temporarily) Remote Desktop permissions to the Remote Exchange CAS Server that you will be configuring the Active Directory Management Agent to connect to.  Or use another Admin account that has permissions to Remote Desktop into the CAS Server, then …
  3. … start a Remote Terminal Services Session to the Exchange CAS Server in the Remote Forest

On the Exchange CAS Server (non SSL WinRM)

  • WinRM must have Kerberos authentication enabled
    • Kerberos requires TCP and UDP port 88 to be open from the FIM/MIM server to ALL Domain Controllers in the target Forest. Run the following two commands in an elevated (Administrator) PowerShell ISE/Shell session to enable Kerberos and allow unencrypted traffic:
      • set-item wsman:\localhost\service\auth\Kerberos -value true
      • set-item wsman:\localhost\service\AllowUnencrypted -value true 

4. Then on the MIM Sync Server perform the following …

On the MIM Sync Server (non SSL WinRM)

  • WinRM on the MIM Sync Server must also have Kerberos authentication enabled. Run the following commands in an elevated (Administrator) PowerShell ISE/Shell session. The first enables Kerberos:
    • set-item wsman:\localhost\client\auth\Kerberos -value true
  • Add the Exchange Server to the list of trusted hosts on the FIM Server:
    • set-item wsman:\localhost\client\trustedhosts -value ExchangeCASFQDN
  • Allow unencrypted traffic:
    • set-item wsman:\localhost\client\AllowUnencrypted -value true

Verification (from the MIM Sync Server)

  1. Using PowerShell ISE select File => New Remote Powershell Tab
  2. Enter the ExchangeCASFQDN for the Computer field
  3. Enter the Service Account that you have specified on your GALSync Active Directory Management Agent that connects to the Remote Forest for User name, in the format NetBIOSDOMAINName\Username
  4. If you have done everything correctly you will get a remote PowerShell command prompt on the Exchange CAS host.
  5. To confirm you have all your other Exchange dependencies correct (and that your AD MA Service Account has the necessary permissions in Exchange), run the following script line by line. If you have configured Remote PowerShell correctly and met all the prerequisites you should have a remote session into Exchange.
Set-ExecutionPolicy RemoteSigned
$Creds = Get-Credential
# NBDomain\ADMAServiceAccountUser
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri http://.customer.com/PowerShell/ -Credential $Creds -Authentication kerberos
Import-PSSession $Session
# Get a list of Exchange Servers
Get-ExchangeServer
# Get a list of Mailboxes
Get-Mailbox
# Get a list of Mail Users
Get-MailUser

# Close and remove the session 
Remove-PSSession $Session

Cleanup

Remove Remote Desktop permissions from the Active Directory Management Agent Service Account if you enabled it to configure the Exchange CAS Server.

Receive Push Notifications from Microsoft/Forefront Identity Manager on your Mobile/Tablet/Computer

Background

Recently in a FIM/MIM environment a daily automated process was executing, but the task it was performing was dependent on an upstream process that generates a feed, and the schedule for that feed had changed (without notice to me). Needless to say FIM/MIM wasn't getting the information it needed to process. This got me thinking about notifications.

If you're anything like me you probably have numerous email accounts, and your subconscious has all but programmed itself to ignore "new email" notifications. Push Notifications, however, I typically do notice. Whilst in the example above I did have some error handling in place if the process completely failed (it is a development environment), I didn't have anything for partial failures. Anyway, it got me thinking that I'd like to receive a notification if something that should happen didn't.

Overview

This post details using push notifications to advise when expected events don't transpire. In this particular example, I have an Azure Function App that connects once a day to an FTP Server and retrieves a series of exports and puts them on my FIM/MIM Synchronisation Server. The push notification service I am using is Push Bullet. Free Push Bullet accounts (without a Pro subscription) are limited to 500 pushes per month. That should be more than enough; if I've got errors in excess of 500 per month I've got much bigger problems.

Getting Started

First up you will need to sign up for Push Bullet. It is very straightforward if you have a Facebook or Google account. As you probably want multiple people to receive the notifications, it would pay to set up a shared Google Account that your team can use to connect their devices with. Once you have an account, head to your new Account Settings page and create an Access Token. Record it for use in the scripts below.

Connecting to the API

Test that you can access the Push Bullet API using your Access Token and PowerShell. Update the following script for your Access Token and execute it. You should see information returned associated with your new Push Bullet account.
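A minimal version of that test looks like this (the token below is obviously a placeholder):

# Minimal connectivity test - substitute your own Access Token.
$accessToken = 'o.xxxxxxxxxxxxxxxxxxxx'
$apiURI = 'https://api.pushbullet.com/'
$header = @{ 'Access-Token' = $accessToken }

# Returns the account details associated with the token
Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI + 'v2/users/me')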

Next you will want to install the Push Bullet App on the device(s) you want to get the notification(s) on. I installed it on my Apple iPhone and also installed the Chrome Browser extension.

Using PowerShell we can then query the devices connected to the account. In the same PowerShell session you tested the API with above, run this API call:

$devices = Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI +"v2/devices")
$devices

This will return your registered devices.

Apple iOS iPhone Push Notification registered devices

If we want a notification to target a particular device we need to provide the Iden value associated with that device. If we don't specify a target, the push notification will hit all devices. In my example above, with two devices registered, my iPhone was device two. So I could get the target Iden with:

$iphoneIden = $devices.devices[1].iden

Push Bullet allows for different notification types (Note, Link and File). Note is the one I'll be using. More info on the other types here.

Sending Test Notification

To perform a notification test, update the following script for your Access Token. I've omitted the Device Identifier to send the message to all devices. I also had to log out of the iOS Push Bullet App and log in again to get the notifications to show.
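A minimal version of the test push, reusing the $header and $apiURI variables from the earlier snippet (the title and body are whatever you like):

# Send a Note-type push to all devices on the account.
# To target a single device, add device_iden = $iphoneIden to the body.
$body = @{
    type  = 'note'
    title = 'FIM/MIM Notification Test'
    body  = 'If you can read this, push notifications are working.'
} | ConvertTo-Json

Invoke-RestMethod -Method Post -Headers $header -Uri ($apiURI + 'v2/pushes') -Body $body -ContentType 'application/json'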

Success. I received the notification on my iPhone and also in my Chrome browser.

Apple iOS iPhone Push Notification from FIM/MIM Identity Manager

Implementation

Getting back to my requirement of being notified when a job didn't find what it expected: I updated my PowerShell Function App (based off this blog post here) to evaluate what it processed and, if it didn't find what was expected, send me a notification. I already had some error handling in my implementation based off that blog post, but it was based on full failure, not partial (which is what I was experiencing, whereby only one part of the process wasn't returning data).

NOTE: I also had to add the ServerCertificateValidationCallback line into my Function App script before calling the API POST to send the notification, as I was getting the dreaded PowerShell Invoke-RestMethod / Invoke-WebRequest error below when sending the notification via the Function App. I didn't get that error on my dev workstation, which is a bit weird.

Invoke-WebRequest : The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure 
channel.

If you also receive the error above (or you will be sending Push Notifications via Azure Function Apps) insert this line before your invoke-restmethod call.

 [System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}

Summary

Essentially this was my first foray into enabling anything for Push Notifications, and this post is food for thought on what can easily be enabled within FIM/MIM to give timely visibility to automated scheduled functions when they don't perform as expected. It was incredibly simple to set up and get working. I see myself enabling more FIM/MIM functions with Push Notifications in the future.

Creating an AzureAD WebApp using PowerShell to leverage Certificate Based Authentication

Introduction

Previously I've posted about using PowerShell to access the Microsoft AzureAD/Graph API in a number of different ways; two such examples are listed below. The first uses a username and password for authentication, whilst the second uses a registered application and therefore a ClientID and Client Secret.

As time has gone on I have numerous WebApps doing all sorts of automation. However they all rely on accounts with a username and password, or a ClientID and Secret, where the passwords and secrets expire. Granted, the secrets have a couple of years of life and are better than passwords, which, depending on the environment, roll every 30-45 days.

However using Certificates would allow for a script that is part of an automated process to run for much longer than the key lifetime available for WebApps and definitely longer than passwords. Obviously there is security around the certificate to be considered so do keep that in mind.

Overview

This post is going to detail a couple of simple but versatile scripts:

  1. Using PowerShell we will:
    1.  Configure AzureAD
      1. Create a Self Signed 10yr Certificate
      2. Create an AzureAD WebApp and assign the Certificate to it
      3. Apply permissions to the WebApp (this is manual via the Azure Portal)
      4. Record the key parameters for use in the second script
    2. Connect to AzureAD using our Certificate and new WebApp

Creating the AzureAD WebApp, Self Signed Certificate and Assigning Application Permissions

The script below does everything required. Run it line by line, or in small chunks as you step through the process. You will need the AzureRM and Azure AD Utils PowerShell Modules installed on the machine you run this script on. A condensed sketch of the key steps follows the list below.

Change:

  • Lines 3 & 4 if you want a certificate with a time-frame other than 10yrs
  • Line 5 for the password you want associated with the certificate for exporting/importing the private key
  • Line 6 for the certificate subject name and the location it'll be stored
  • Line 8 for a valid location to export it to
  • Line 11 for the same path as provided in Line 8
  • Lines 24 & 25 for an account to automatically connect to AAD with
  • Line 31 for the name of your WebApp
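As promised, a condensed sketch of the key steps; this is not the full script, and the display name, URIs, paths and passwords here are placeholders of mine.

# Condensed sketch only - all names, paths and passwords are placeholders.
$certStart   = Get-Date
$certEnd     = $certStart.AddYears(10)
$certPwd     = ConvertTo-SecureString -String 'YourCertPassword' -Force -AsPlainText
$certSubject = 'CN=AzureADWebAppCert'

# Create the self-signed certificate and export it with its private key
$cert = New-SelfSignedCertificate -CertStoreLocation 'Cert:\CurrentUser\My' -Subject $certSubject -NotAfter $certEnd
$cert | Export-PfxCertificate -FilePath 'C:\temp\AzureADWebAppCert.pfx' -Password $certPwd

# Create the AzureAD WebApp with the certificate as its credential
$keyValue = [System.Convert]::ToBase64String($cert.GetRawCertData())
$app = New-AzureRmADApplication -DisplayName 'MyAutomationWebApp' -IdentifierUris 'https://myautomationwebapp' -CertValue $keyValue -StartDate $certStart -EndDate $certEnd
New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId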

Before running line 37, log in to the Azure Portal and assign permissions to the WebApp, e.g. AzureAD Directory permissions. When you then run line 37 it will trigger a GUI for AuthN and AuthZ. Sign in as an Admin and accept the OAuth2 Permission Authorizations for whatever you have requested on the WebApp.

Windows Azure Active Directory Permissions

e.g Graph API Read/Write Permissions

AzureAD WebApp GraphAPI permissions

Connecting to AzureAD using our Certificate and new WebApp

Update lines 3, 4, 6 and 7 as you step through lines 40-43 of the configuration script above, which copy the key configuration settings to the clipboard.

The following script then gets our certificate out of the local store and takes the Tenant and WebApp parameters and passes them to Connect-AzureAD in Line 15 which will connect you to AAD and allow you to run AzureAD cmdlets.

If you wish to go direct to the GraphAPI, lines 20 and 23 show leveraging the AzureADUtils Module to connect to AzureAD via the GraphAPI.
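The certificate-based connection itself boils down to something like this; the tenant ID, application ID and certificate subject are placeholders for the values the first script put on the clipboard.

# Placeholders - paste in the values recorded from the configuration script.
$tenantId    = '<your tenant ID GUID>'
$appId       = '<the WebApp application ID>'
$certSubject = 'CN=AzureADWebAppCert'

# Get the certificate from the local store and connect to AzureAD with it
$cert = Get-ChildItem 'Cert:\CurrentUser\My' | Where-Object { $_.Subject -eq $certSubject }
Connect-AzureAD -TenantId $tenantId -ApplicationId $appId -CertificateThumbprint $cert.Thumbprint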

Notes on creating your Self-Signed Certificate in PowerShell

I'm using the PowerShell New-SelfSignedCertificate cmdlet to create the self-signed certificate. If you get the error shown below when you run New-SelfSignedCertificate, make sure you have Windows Management Framework 5.1. If you don't have Visual Studio or the Windows 8.1/10 SDK, get the Windows 8.1 SDK from here and install just the base SDK as shown further below.

New-SelfSignedCertificate Error
Windows SDK Install for makecert.exe
Windows Software Development Kit

Once the install is complete copy C:\Program Files (x86)\Windows Kits\8.1\bin\x86\makecert.exe to C:\windows\system32

Summary

The two scripts above show how, using PowerShell, we can quickly create a self-signed certificate, create an Azure AD WebApp and grant it some permissions. Then, using a small PowerShell script, we can connect and query AAD/Graph API using our certificate and not be concerned about passwords or keys expiring for 10 years (in this example; it can be any timeframe you wish).

Synchronizing Passwords from Active Directory to the IBM/Lotus Domino Identity Vault using Microsoft Identity Manager – Part 3

Introduction

As the title suggests this is Part 3, the final part in a three-part post on configuring FIM/MIM to synchronise users' passwords from AD to the Domino ID Vault via PCNS and FIM/MIM.
Part 1 here detailed the creation of a PowerShell Management Agent to join users from Domino to the MIM Sync Metaverse.
Part 2 here detailed the creation and configuration of the Domino Agents that receive password changes via the PS MA into the ID Vault.

This post wraps it all up with the details on calling the Domino Agents on password sync events (from PCNS via MIM).

Prerequisites

You will need the IBM Notes client installed and configured on your MIM Sync Server in order to put a document in the database we created in Part 2 and start the agent to process the document(s).

Overview

Essentially this is the process:

  • Password changed for a user (either by an admin, by the user via their domain-joined workstation, password reset, or any other password change mechanism)
  • Password change is captured by the AD PCNS filter installed and configured on each (writeable) Domain Controller
  • The DC, using the PCNS config in the domain, locates the MIM Sync Server to send the password change to
  • The MIM Sync Server has the associated AD Domain configured as a Password Sync Source
  • Our new PowerShell ID Vault Notes MA is configured as a Password Target
  • MIM Sync passes the password change event for MIM-joined users to the ID Vault Password Change MA, which initiates the Password.ps1 script (below)
  • The Password.ps1 script creates a document (containing the details for the password change) in the ID Vault Password Sync Database we created in Part 2 of this series and then tells the MIMPwdTrigger Agent to start
  • The MIMPwdTrigger Agent picks up the document and passes it to the MIMPasswordSync Agent, which sends the password change to the ID Vault

Domino PowerShell Management Agent Password.ps1 Script

Put this Password.ps1 script in the same location you put the Schema, Import and Export scripts earlier.
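In outline it does something like the sketch below. This is a sketch only: the server, database filename and Notes client password are placeholders, the parameter list follows the general shape the Granfeldt PSMA passes to password scripts (check the PSMA documentation for the exact set for your version), and it uses the Notes COM classes registered by the Notes client install.

# Sketch only - the Granfeldt PSMA invokes the password script with details like these.
param ($Username, $Password, $OldPassword, $Action, $UnlockAccount, $ForceChangeAtLogon, $ValidatePassword)

# Placeholders - your Domino server, the Part 2 database filename and the Notes client password
$notesServer = 'xxxNotes1/xxxxx-Aus'
$notesDbFile = 'IDVaultPWDSync.nsf'

# Notes COM classes come from the locally installed IBM Notes client
$session = New-Object -ComObject Lotus.NotesSession
$session.Initialize('NotesClientPassword')
$db = $session.GetDatabase($notesServer, $notesDbFile)

# Create the password change document the trigger agent processes
$doc = $db.CreateDocument()
[void]$doc.ReplaceItemValue('server', $notesServer)
[void]$doc.ReplaceItemValue('username', $Username)
[void]$doc.ReplaceItemValue('password', $Password)
[void]$doc.Save($true, $false)

# Start the trigger agent created in Part 2 on the server
$agent = $db.GetAgent('MIMPwdTrigger')
[void]$agent.RunOnServer()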

Testing Password Sync End to End (Active Directory to the ID Vault)

The following screenshots show me tracing a password change through the logs as it makes its way from the AD Domain Controller, to MIM Sync, to the MA, to the MA password script, to the Notes DB as a document triggered for processing by the Notes Agent, and finally to the user updated in the ID Vault.

First the password change event is initiated to the MIM Sync Service by the Domain Controller that captured the password change.

PCNS provides all the details for the password change.

The MIM Sync Server determines where to send the change which includes our PS Notes MA.

Our PS Notes MA logged the process.

Notes MA Log

=============================================================
Display Name: Jane XXX/xxx/xxxxx-Aus
Action: Set
Old pwd:
New pwd: Password123456
Unlock: False
Force change: False
Validate: False
Database: System.__ComObject

As did the Notes Agent as it processed the change.

Notes Agent Log

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:22 PM: Reseting password ...
MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:22 PM: Server: xxxNotes1/xxxxx-Aus User:Jane xxx/xxx/xxxxx-Aus
MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:23 PM: Return value: true
MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:23 PM: Removed User ID Vault change document from 'xxxNotes1/xxxxx-Aus'

And finally we see the change reflected in the ID Vault. Looking at the timestamps along the way, we see that it all happened in approximately 2 seconds.

Summary

This three-part blog post has shown how to get passwords from Active Directory, to the MIM Sync connected source, across to IBM Domino, and into the ID Vault using the Granfeldt PowerShell Management Agent and some configuration with a database in Domino and two Domino Agents.

What have you synchronised passwords to using FIM/MIM?

UPDATED: Identifying Active Directory Users with Pwned Passwords using Microsoft/Forefront Identity Manager

Earlier this week I posted this blog post showing a working example of using a custom Pwned Password FIM/MIM Management Agent to flag a boolean attribute in the MIM Service to indicate whether a user's password is in the pwned password dataset or not. If you haven't read that post this one won't make a lot of sense, so read that first and then come back.

That solution, when receiving a new password for a user (via Microsoft Password Change Notification Service), checked it against the Have I Been Pwned API. The disclaimer at the start of that blog post detailed why this is a bad idea for production credentials. The intent was to show a working example of what could be achieved.

This update post shows a working solution that you can implement internal to a network: essentially taking the Pwned Password datasets available here, loading them into a local network SQL Server, and then querying that from the FIM/MIM Pwned Password Management Agent rather than calling the external public API.

Creating an SQL Server Database for the Pwned Passwords

On my SQL Server using SQL Server Management Studio I right-clicked on Databases and chose New Database. I gave it the name PwnedPasswords and told it where I wanted my DB and Logs to go to.

Then in a Query window in SQL Server Management Studio I used the following script to create a table (dbo.pwnedPasswords).

USE PwnedPasswords;
CREATE TABLE dbo.pwnedPasswords
( password_id int IDENTITY(1,1) NOT NULL,
  passwords varchar(max) NOT NULL,
  CONSTRAINT passwords_pk PRIMARY KEY (password_id)
);

Again using a query window in SQL Server Management Studio I used the following script to create an index for the passwords.

USE [PwnedPasswords]
GO
SET ANSI_PADDING ON
GO
CREATE UNIQUE NONCLUSTERED INDEX [PasswordIndex] ON [dbo].[pwnedPasswords]( [password_id] ASC) INCLUDE ( [passwords]) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
GO

The last thing I did on the DB was take the MIM Sync Server Active Directory service account (which was already in the SQL Server Logins) and give it Reader access to my new PwnedPasswords database. I gave this account access as I'm using Integrated Authentication for login to SQL and, as the MA is initiated by the MIM Sync service account, that is the account that needs the access.

Getting the Pwned Password Datasets into the new Database

I'm far from a DBA; I'm an identity guy. So using the tools I'm most familiar with (PowerShell), I created a simple script to open the password dump files as a stream (as Get-Content wasn't going to handle the file sizes), read in the lines, convert the format, and insert the rows into SQL. I performed the inserts in batches of 1000, running it locally on the SQL Server.

In order to get the content from the dump file, add another column and get it into a format to quickly insert into the SQL DB, I used the Out-DataTable function available from here.

The script could probably be improved, as I only spent about 20-30 minutes on it. It opens and closes a connection to the SQL DB each time it inserts 1000 rows. That could be moved outside the Insert2DB function, and maybe the batch size increased. Either way it is a starting point, and I used it to successfully write millions of rows into the DB.
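For reference, the shape of it was along these lines. This is a simplified sketch: the file path and connection string are placeholders, and the Out-DataTable plumbing is trimmed out.

# Simplified sketch of the loader - placeholders for the path and connection string.
$connString = 'Server=localhost;Database=PwnedPasswords;Integrated Security=True'

function Insert2DB($rows) {
    # Opens/closes a connection per 1000-row batch, as noted above
    $conn = New-Object System.Data.SqlClient.SqlConnection($connString)
    $conn.Open()
    foreach ($r in $rows) {
        $cmd = $conn.CreateCommand()
        $cmd.CommandText = 'INSERT INTO dbo.pwnedPasswords (passwords) VALUES (@pwd)'
        [void]$cmd.Parameters.AddWithValue('@pwd', $r)
        [void]$cmd.ExecuteNonQuery()
    }
    $conn.Close()
}

# Stream the dump file rather than Get-Content - the files are too big to slurp
$reader = New-Object System.IO.StreamReader('C:\Temp\pwned-passwords-1.0.txt')
$batch = New-Object 'System.Collections.Generic.List[string]'
while ($null -ne ($line = $reader.ReadLine())) {
    $batch.Add($line.Trim())
    if ($batch.Count -ge 1000) { Insert2DB $batch; $batch.Clear() }
}
if ($batch.Count -gt 0) { Insert2DB $batch }
$reader.Close()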

Updated FIM/MIM Pwned Passwords Management Agent Password.ps1 script

This is then the only other change to the solution. The Password.ps1 script, rather than querying the Pwned Passwords API, queries the SQL DB and sets the pwned boolean flag accordingly.
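Conceptually the lookup is: SHA-1 the incoming password the way the version 1 dataset stores it (uppercase hex, unsalted), then query the table. A sketch only; the server name is a placeholder, and $Password comes from the Management Agent's password script context.

# Conceptual lookup - hash the password like the dataset does, then query SQL.
$sha1  = [System.Security.Cryptography.SHA1]::Create()
$bytes = $sha1.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($Password))
$hash  = -join ($bytes | ForEach-Object { $_.ToString('X2') })

$conn = New-Object System.Data.SqlClient.SqlConnection('Server=sqlserver01;Database=PwnedPasswords;Integrated Security=True')
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = 'SELECT COUNT(*) FROM dbo.pwnedPasswords WHERE passwords = @hash'
[void]$cmd.Parameters.AddWithValue('@hash', $hash)
$pwned = ([int]$cmd.ExecuteScalar()) -gt 0   # $true flags the account as pwned
$conn.Close()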

Summary

This enhancement shows a working concept that will be more appealing to Security Officers within corporate organisations, if you have an appetite to know what your potential exposure is based on your Active Directory users' passwords.


Synchronizing Passwords from Active Directory to the IBM/Lotus Domino Identity Vault using Microsoft Identity Manager – Part 2

Introduction

As the title suggests this is Part 2 of a three-part post on configuring FIM/MIM to synchronise users' passwords from AD to the Domino ID Vault via PCNS and FIM/MIM.
Part 1 here detailed the creation of a PowerShell Management Agent to join users from Domino to the MIM Sync Metaverse.

This post details the creation and configuration of the Domino Agents that receive password changes via the PS MA into the ID Vault.

Part 3 here details calling the Domino Agents on password sync events (from PCNS via MIM).

Creating a New Domino Application

As mentioned above and in Part 1, we need to create Domino Agents to process password change events into the ID Vault. Domino Agents are required because Domino security will not allow password change events (called using the resetUserPassword method) to be run remotely. The resetUserPassword method is only supported using the RunOnServer method.

In order to create a Domino Agent we need to install and run the IBM Domino Designer.

With that installed we can start on our first Domino Agent. We will create two Agents. The first will perform the execution of the resetUserPassword method. The second will be the trigger that retrieves the details of the user to change the password for and passes them to the first agent to execute.

In IBM Domino Designer select File => New => Application

Specify the Server to create the new Application on (and subsequently where it will run) and give the Application a name. I used ID Vault PWD Sync.

Create the MIM Password Sync Domino Agent

With the New Application created we can navigate to Code => Agents and select New Agent

Give the Agent a name. I named this one MIMPasswordSync. Make sure the type is Java.

With the Agent created we need to give it the script that will perform the password changes. Double-click on the agent, then in the Agent Contents double-click on JavaAgent.java and paste in the script (from GitHub, further below). The only change you may need to make is the location you want the logging to go to. You will need to create that path if it doesn't exist.

Selecting the Agent tab in the main pane, locate the Agent Properties and configure them as per the screenshot below.

Select the Security tab in the Properties pane and set the Runtime security level to 3. If the options are blanked out and you can't select them, close the agent and re-open it; you will then be able to configure this option.

Create the MIM Password Trigger Domino Agent

Create the MIMPwdTrigger Domino Agent just as you did the MIMPasswordSync Agent. Name it MIMPwdTrigger and make sure the type is also Java. Double-click on the Agent and then double-click on JavaAgent.java in the Agent Contents. Use the following script. Note that it calls the MIMPasswordSync Agent, so if you called yours something different you will need to change it in this script (line 12).

Select the MIMPwdTrigger Agent in the main pane and look at the Properties. Set Runtime to On event and After documents are created or modified.

Select the Security tab in the Properties pane and set the Runtime security level to 3. If the options are blanked out and you can't select them, close the agent and re-open it; you will then be able to configure this option.

Configuring the ID Vault Password Reset Authority

In order for our MIMPasswordSync Agent to actually change users' passwords in the ID Vault, we need to configure the ID Vault to allow the account that creates and signs the Agents, and the Server that the Agents run on, to be Trusted Password Reset Authorities.

Using the IBM Domino Administrator select the Administration menu item and then Configuration. Expand Security from the left hand pane and select ID Vaults.

Having selected the ID Vault you will be changing passwords in from the main pane, on the right-hand menu pane expand ID Vaults and double-click on Password Reset Authority.

From the Password reset authority by organisation box, select the OrgU/Org you will be syncing passwords to. If you have many, you will need to complete this step for each one. You will need to do one OrgU/Org at a time if they have different certifiers.

From the Available users, groups and servers box, select the Server that you run the Agents on and select Add. Repeat, selecting the user that created the Agents and will sign them. Then select the user you just added in the Password reset authority by organisation box and select the Password reset agent authority check box. That will put the red @ symbol on the user, which identifies it as a Password Reset Agent Authority.

Select Next/Configure, locate the certifier ID for that OrgU/Org, provide the password and complete the process. Repeat for each OrgU/Org.

Signing the Agents

Back using the Domino Designer double-click Agents in the left menu pane. Select each Agent and then click Sign.

Creating a Form to test the Agents

Now we will create a form to allow us to create a document in the DB easily and test that our agents work. In Domino Designer, right-click on Forms and select New Form.

Give the Form a name and an alias. It doesn’t matter what you call it. We’re just using it to test the agents.

Double-click on your new Form. Click in the empty pane, then from the Create menu select Field. Name it server. Repeat for another field, naming it username.

Repeat for the third text field, but name it password and select the Type Password.

Finally from the Create menu select Hotspot => Button. Name it Submit and then select it. In the Properties of the button for Run select Client. For Formula enter the formula below.

@Command([FileSave]);
@Command([FileCloseWindow]);
@Command([ToolsRunMacro];"MIMPwdTrigger")

Testing the Agents

In the Domino Designer right-click on the form and select Preview in Notes.

The format for the fields is:

  • server: Server/Org
  • user: Joe Smith/OrgU/Org
  • password: P@SSw0rd

Enter valid input for your environment and click on the Submit button. If you have everything correct, the document you just created will be processed by the Trigger Agent and then the MIM Password Sync Agent.

If there is an error you will likely have a document still in the IDVault PWD Sync database as shown below. Check the document to make sure you got the details for the user and server correct.

Also check the log file, C:\PWDSync\AgentLog.txt by default as per the script path. When working correctly you will see an entry as per below. If it wasn't successful, the error message should point you to where you have gone wrong: more than likely different names for the Agents, an incorrect format or name for the user and/or server, or the Trusted Password Authority not set for the account the agent was signed with for the OrgU/Org containing the user you are trying to change a password for.

MIMPasswordSync|mimpasswordsync: 08/08/2017 02:08:24 PM: Reseting password ...
MIMPasswordSync|mimpasswordsync: 08/08/2017 02:08:24 PM: Server: XXXNotes1 User:Jane Doe/OrgU/Org-Aus
MIMPasswordSync|mimpasswordsync: 08/08/2017 02:08:26 PM: Return value: true
MIMPasswordSync|mimpasswordsync: 08/08/2017 02:08:26 PM: Removed User ID Vault change document from 'XXXNotes1'

Summary

Now that we have our Agents built and working we need to be able to call them from our MIM Sync Server. That will be covered in the third and final post in this series.