Provisioning Hybrid Exchange/Exchange Online Mailboxes with Microsoft Identity Manager

Introduction

Working for Kloud, all our projects involve Cloud services, and all our customers have varying and unique requirements. Recently one of our customers embarked on their migration from On-Premise Exchange to Exchange Online. Nothing groundbreaking there, however they had a number of unique requirements, including management of Litigation Hold, that needed to be integrated with their existing Microsoft Identity Manager implementation (which currently provisions new users to their Exchange 2013 environment). They also required that management of the Exchange environment still be possible via the Exchange Management Console against a local Exchange server. This post details how I integrated the environments using MIM.

Overview

In order to integrate the Provisioning and Lifecycle management of Exchange Online Mailboxes in a Hybrid Exchange topology with Microsoft Identity Manager I created a custom PowerShell Management Agent, simply because it provides the flexibility I needed.

Provisioning is based on the following process;

  1. MIM Creates new user in Active Directory (no changes to existing MIM provisioning process)
  2. Azure Active Directory Connect synchronises the user to Azure Active Directory
  3. The Exchange Online MIM Management Agent sees the corresponding AAD account for the new user
  4. MIM Declarative Rules trigger the creation of a new Remote Mailbox for the AD/AAD user against the local Exchange 2013 On Premise Server. This allows the EMC to be used to manage mailboxes On Premise even though the mailbox resides in Office365/Exchange Online
  5. AADC/Exchange synchronises the information as part of the Hybrid Exchange topology
  6. MIM sees the EXO Mailbox configuration for the new user and enables Litigation Hold against the EXO Mailbox (if required)

The following diagram graphically depicts this process.

EXO IDM Provisioning Solution.png

Exchange Online PowerShell MA

As always I’m using my favourite PowerShell Management Agent, the Granfeldt PS MA, now available on Github here.

Schema Script

The Schema script configures the schema required for current and future EXO management requirements. The Schema is based on a single Object Class “MailUser” but pulls the information from a combination of Azure AD User and Exchange Online Mailbox object classes for an associated account. Azure AD User objects are prefixed by ‘AAD’. Non AAD prefixed attributes are EXO Mailbox attributes.
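The schema script itself is attached to the original post. To give a sense of the shape a Granfeldt PS MA schema script takes, here is a minimal sketch; the attribute names (beyond the AAD prefix convention described above) are illustrative assumptions rather than the exact schema used in this solution.

# Minimal Granfeldt PS MA schema.ps1 sketch - the attribute names below are illustrative assumptions
$obj = New-Object -TypeName PSCustomObject
$obj | Add-Member -MemberType NoteProperty -Name 'Anchor-Id|String' -Value 1
$obj | Add-Member -MemberType NoteProperty -Name 'objectClass|String' -Value 'MailUser'
# Azure AD User attributes carry the 'AAD' prefix
$obj | Add-Member -MemberType NoteProperty -Name 'AADUserPrincipalName|String' -Value 'user@customer.com'
$obj | Add-Member -MemberType NoteProperty -Name 'AADAccountEnabled|Boolean' -Value $true
# Un-prefixed attributes are Exchange Online Mailbox attributes
$obj | Add-Member -MemberType NoteProperty -Name 'LitigationHoldEnabled|Boolean' -Value $false
$obj | Add-Member -MemberType NoteProperty -Name 'RecipientTypeDetails|String' -Value 'UserMailbox'
$obj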

Import Script

The Import script connects to both Azure AD and Exchange Online to retrieve Azure AD User accounts and, if present, the associated mailbox for each user.

It retrieves all Member AAD User Accounts and puts them into a Hash Table. Connectivity to AAD is via the AzureADPreview PowerShell module. It also retrieves all Mailboxes and puts them into a Hash Table. It then processes all the mailboxes first, including the associated AAD User account (utilising a join via userPrincipalName).

After processing all the mailboxes, the remainder of the AAD accounts (those without mailboxes) are processed.
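The full import script is attached to the post. The core retrieval and hash table join it performs could be sketched as follows; cmdlet usage is simplified, $aadCreds is assumed to hold credentials, and $mailboxes is assumed to have come from Get-Mailbox in an Exchange Online Remote PowerShell session established earlier in the script.

# Minimal sketch of the AAD User / EXO Mailbox retrieval and join (simplified; see assumptions above)
Import-Module AzureADPreview
Connect-AzureAD -Credential $aadCreds | Out-Null

# All Member AAD User accounts into a hash table keyed on userPrincipalName
$aadUsers = @{}
Get-AzureADUser -All $true | Where-Object { $_.UserType -eq 'Member' } | ForEach-Object {
    $aadUsers[$_.UserPrincipalName] = $_
}

# Process mailboxes first, joining to the corresponding AAD User via userPrincipalName
$results = @()
foreach ($mbx in $mailboxes) {
    $aadUser = $aadUsers[$mbx.UserPrincipalName]
    $results += [pscustomobject]@{
        UserPrincipalName     = $mbx.UserPrincipalName
        LitigationHoldEnabled = $mbx.LitigationHoldEnabled
        AADAccountEnabled     = $aadUser.AccountEnabled
    }
    $aadUsers.Remove($mbx.UserPrincipalName)
}
# $aadUsers now contains the remaining AAD accounts without mailboxes, processed afterwards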

Export Script

The Export script performs the necessary integration against the On-Premise Exchange 2013 Server for Provisioning, and against Exchange Online for the rest of the management. Both utilise Remote PowerShell. It also leverages the Lithnet MIIS Automation PowerShell Module to query the Metaverse to validate current object statuses.
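The export script isn't reproduced here, but its two key operations look broadly like the sketch below. The session variables, alias, UPN and routing domain are placeholders, and the Remote PowerShell sessions are assumed to have been established earlier in the script.

# Minimal sketch of the two export operations (session variables, names and domains are placeholders)
# 1. Provision a Remote Mailbox against the local Exchange 2013 server so the EMC can still manage it On Premise
Invoke-Command -Session $onPremExchangeSession -ScriptBlock {
    param($alias, $routingAddress)
    Enable-RemoteMailbox -Identity $alias -RemoteRoutingAddress $routingAddress
} -ArgumentList $alias, "$alias@customer.mail.onmicrosoft.com"

# 2. Once the Exchange Online mailbox exists, enable Litigation Hold (if required)
Invoke-Command -Session $exoSession -ScriptBlock {
    param($upn)
    Set-Mailbox -Identity $upn -LitigationHoldEnabled $true
} -ArgumentList $userPrincipalName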

Wiring it all up

The scripts above will allow you to integrate a FIM/MIM implementation with AAD/EXO for management of users' EXO Mailboxes. You'll need connectivity from the MIM Sync Server to AAD/O365 in order to manage them. Everything else I wired up using a few Sets, Workflows, Sync Rules and MPRs.

A quick start guide for Deploying and Configuring Node-RED as an Azure WebApp

 

Introduction

I’ve been experimenting and messing around with IoT devices for well over 10 years. Back then it wasn’t called IoT, and it was very much a build it and write it yourself approach.

Fast forward to 2017 and you can buy a microprocessor for a couple of dollars that includes WiFi. Environmental sensors are available for another couple of dollars and we can start to publish environmental telemetry without having to build circuitry and develop code. And rather than having to design and deploy a database to store the telemetry (as I was doing 10+ years ago) we can send it to SaaS/PaaS services and build dashboards very quickly.

This post provides a quick start guide to those last few points: visualising data from IoT devices using Azure Platform-as-a-Service services. Here is a rudimentary environment dashboard I put together very quickly.

NodeRedDashboardUI.PNG

Overview

Having played with numerous services recently for my already API-integrated IoT devices, I knew I wanted a solution to visualise the data, but I didn't want to deploy dedicated infrastructure and I wanted to keep the number of moving parts to a minimum. I looked at getting my devices to publish their telemetry via MQTT, which is a great solution (for scale or rapidly changing data), but when you are only dealing with a handful of sensors and data that isn't highly dynamic it is overkill. I wanted to simply poll the devices as required, obtain the current readings and visualise them. Think temperature, pressure, humidity.

Through my research I liked the look of Node-RED for its quick and simplistic approach to obtaining data and manipulating it for presentation. Node-RED relies on NodeJS, which I figured I could deploy as an Azure WebApp (similar to what I did here). Sure enough I could. However not long after I got it working I discovered this project: a Node-RED enabled NodeJS Web App you can deploy straight from Github. Awesome work Juan Manuel Servera.

Prerequisites

The quickest way to start then is to use Juan’s Azure WebApp wrapper for Node-RED. Follow his instructions to get that deployed to your Azure Subscription. Once deployed you can navigate to your Node-RED WebApp and you should see something similar to the image below.

DefaultNode-Red.PNG

The first thing you need to do is secure the app. From your WebApp Application Settings in the Azure Portal, use Kudu to navigate to the WebApp files. Under your wwwroot/WebApp directory you will find the settings.js file. Select the file and select the Edit (pencil) icon.

Node-Red-Settings

Uncomment the adminAuth section (around lines 93-100). To generate the encrypted password, on a local install of NodeJS I ran the following command and copied the resulting hash. Change 'whatisagoodpassword' to your desired password.

node -e "console.log(require('bcryptjs').hashSync(process.argv[1], 8));" whatisagoodpassword

Select Save, then Restart your application.

Node-Red-Password

On loading your WebApp again you will be prompted to login. Login and let's get started.

Login

Configuring Node-RED

Now it is time to pull in some data and visualise it. I'm not going to go into too much detail, as what you want to do is probably quite different to what I'm doing.

At its simplest though, I want a timer to trigger a call to my sensor APIs to return the values, and display them as either text, a graph or a gauge. The image below shows the configuration for the dashboard shown above.

ConfigureNodeRED.PNG

For each entity on the dashboard there is an input. For me this is a trigger every 15 minutes. That looks like this.

Get-15mins

Next is the API to get the data. The API I’m calling is an open GET with the API key in the URL so looks like this.

HTTPRequest

With the JSON response from the API I retrieve the temperature value and return it as msg for use in the UI.

Function

I then have the Gauge for Temperature. I’ve set the minimum and maximum values and gone with the defaults for the colours.

Guage

I’m also outputting debug info during setup for the raw response from the Function ….

DebugParsedOutput

….. and from the parsed function.

DebugHTTPRequest

These appear in the Debug pane on the right hand side.

DebugWindow

After each configuration change simply select Deploy and then switch over to your Node-RED WebApp. The dashboard URL is your WebApp URI with /ui on the end, e.g. http://.azurewebsites.net/ui

Conclusion

Thanks to Azure PaaS services and the ability to use a graphical IoT tool like Node-RED we can quickly deploy a solution to visualise IoT data without having to deploy any backend infrastructure. The only hardware is the IoT sensors, everything else is serverless.

Enabling and using Managed Service Identity to access an Azure Key Vault with Azure PowerShell Functions

Introduction

At the end of last week (14 Sept 2017) Microsoft announced a new Azure Active Directory feature – Managed Service Identity. Managed Service Identity helps solve the chicken-and-egg bootstrap problem of needing credentials to connect to the Azure Key Vault to retrieve credentials. Previously, when using Virtual Machines, Web Apps and Azure Functions, that meant having to implement methods to obfuscate the credentials that were stored within them. I touched on one method that I've used a lot in this post here, whereby I encrypt the credential and store it in the Application Settings, but it still required a keyfile to allow reversing of the encryption as part of the automation process. Thankfully those days are finally behind us.

I strongly recommend you read the Managed Service Identity announcement to understand more about what MSI is.

This post details using Managed Service Identity in PowerShell Azure Function Apps.

Enabling Managed Service Identity on your Azure Function App

In the Azure Portal navigate to your Azure Function Web App. Select it and then from the main-pane select the Platform Features tab then select Managed service identity.

Enable Managed Service Identity

Turn the toggle switch to On for Register with Azure Active Directory, then select Save.

Enable Managed Service Identity

Back in Platform Features under General Settings select Application Settings. 

Azure Function App Settings

Under Application Settings you will see a subset of the environment variables/settings for your Function App. In my environment I don't see the Managed Service Identity variables there. So let's keep digging.

Azure Function App Settings

Under Platform Features select Console.

Azure Function App Console

When the Console loads, type Set. Scroll down and you should see MSI_ENDPOINT and MSI_SECRET.

NOTE: These variables weren't immediately available in my environment. The next morning they were present. So I'm assuming there is a back-end process that populates them once you have enabled Managed Service Identity, and that it takes more than a couple of hours.

MSI Variables

Creating a New Azure Function App that uses Managed Service Identity

We will now create a new PowerShell Function App that will use Managed Service Identity to retrieve credentials from an Azure Key Vault.

From your Azure Function App, next to Functions select the + to create a New Function. I’m using a HttpTrigger PowerShell Function. Give it a name and select Create.

New Azure Function

Put the following lines into the top of your function and select Save and Run.

# MSI Variables via Function Application Settings Variables
# Endpoint and Password
$endpoint = $env:MSI_ENDPOINT
$endpoint
$secret = $env:MSI_SECRET
$secret

You will see in the output the values of these two variables.

Managed Service Identity Variables

Key Vault

Now that we know we have Managed Service Identity all ready to go, we need to allow our Function App to access our Key Vault. If you don’t have a Key Vault already then read this post where I detail how to quickly get started with the Key Vault.

Go to your Key Vault and select Access Policies from the left menu list.

Azure Key Vault Access Policy

Select Add new, Select Principal and locate your Function App and click Select.

Azure Key Vault Access Policy

As my vault contains multiple credential types, I enabled the policy for Get for all types. Select Ok. Then select Save.

Azure Key Vault Access Policy

We now have our Function App enabled to access the Key Vault.

Azure Key Vault Access Policy

Finally in your Key Vault, select a secret you want to retrieve via your Function App and copy out the Secret Identifier from the Properties.

Azure Key Vault Secret Identifier URI

Function App Script

Here is my sample PowerShell Function App script that will connect to the Key Vault and retrieve credentials. Line 12 should be the only line you need to update, with the Secret Identifier of the Key Vault Secret you want to retrieve. Ensure you still have the API version at the end (which isn't in the URI you copy from the Key Vault): /?api-version=2015-06-01
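The full script is embedded in the post. As a guide, the pattern it follows looks broadly like this sketch; the vault and secret names in the Secret Identifier URI are placeholders.

# Minimal sketch - use the MSI endpoint to get a token, then retrieve a Key Vault secret
$apiVersion   = "2017-09-01"
$resourceURI  = "https://vault.azure.net"
$tokenAuthURI = $env:MSI_ENDPOINT + "?resource=$resourceURI&api-version=$apiVersion"
$tokenResponse = Invoke-RestMethod -Method Get -Headers @{ "Secret" = $env:MSI_SECRET } -Uri $tokenAuthURI
$accessToken = $tokenResponse.access_token

# Placeholder Secret Identifier - substitute your own and keep the api-version on the end
$secretURI = "https://myvault.vault.azure.net/secrets/mycredential/?api-version=2015-06-01"
$credential = Invoke-RestMethod -Method Get -Uri $secretURI -Headers @{ Authorization = "Bearer $accessToken" }
$credential.value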

When run, if you have everything correct, the output will look like below.

KeyVault Creds Output

Summary

We now have the basis of a script that we can use in our Azure Functions to allow us to use the Managed Service Identity function to connect to an Azure Key Vault and retrieve credentials. We’ve limited the access to the Key Vault to the Azure Function App to only GET the credential. The only piece of information we had to put in our Function App was the URI for the credential we want to retrieve. Brilliant.

Display Microsoft Identity Manager Sync Engine Statistics in the MIM Portal

Introduction

In the Microsoft / Forefront Identity Manager Synchronization Service Manager, under Tools, we have a Statistics Report. This gives a breakdown of each of the Management Agents and the Connectors on each MA.

I had a recent requirement to expose this information for a customer but I didn’t want them to have to connect to the Synchronization Server (and be given the permissions to allow them to). So I looked into another way of providing a subset of this information in the MIM Portal itself.  This post details that solution.

MIM / FIM Synchronization Server Management Agent & Metaverse Statistics

Overview

I approached this in a similar way to the User Object Report I recently developed. The approach is;

  • Azure PowerShell Function App that uses Remote PowerShell to connect to the MIM Sync Server and leverage the Lithnet MIIS Automation PowerShell Module to enumerate all Management Agents and build a report on the information required in the report
  • A NodeJS WebApp calls the Azure PowerShell Function App onload to generate the report and display it
  • The NodeJS WebApp is embedded in the MIM Portal as a new Nav Bar Resource and Page

The graphic below details the basic logical integration.

MVStatsReportOverview

Prerequisites

The prerequisites to perform this I've covered in other posts. In concept, as described above, it is similar to the User Object Report, which has the same prerequisites; I did a pretty good job of detailing those here. That post is the required reading to get you ready to implement this.

Azure PowerShell Function App

Below is the raw script from my Function App that connects to the MIM Sync Server and retrieves the Management Agent Statistics for the report.
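The script is embedded in the post; the skeleton below is only a minimal sketch of the approach. The Application Setting names are placeholders, the $res output binding follows the classic (v1) PowerShell Function App pattern, and the Lithnet MIIS Automation cmdlet and properties shown are assumptions about where the statistics are gathered.

# Minimal sketch - an HTTP-triggered PowerShell Function App gathering MA statistics via Remote PowerShell
$username = $env:MIMSyncServerUser                   # placeholder Application Setting names
$password = ConvertTo-SecureString $env:MIMSyncServerPassword -AsPlainText -Force
$creds    = New-Object System.Management.Automation.PSCredential ($username, $password)

$session = New-PSSession -ComputerName $env:MIMSyncServer -Credential $creds -Authentication Kerberos
$maStats = Invoke-Command -Session $session -ScriptBlock {
    Import-Module LithnetMiisAutomation
    # Enumerate each Management Agent; per-MA connector statistics would be gathered here
    Get-ManagementAgent | Select-Object Name
}
Remove-PSSession $session

# Return the data to the calling NodeJS WebApp as JSON via the Function's HTTP output binding
Out-File -Encoding ascii -FilePath $res -InputObject ($maStats | ConvertTo-Json)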

NodeJS Web App

The NodeJS Web App is the app that gets embedded in the MIM Portal; it calls the Azure Function to retrieve the data and then displays it. To get started you'll want to start with a base NodeJS WebApp. This post will get you started: Implementing a NodeJS WebApp using Visual Studio Code.

The only extension I’m using on top of what is listed there is JQuery. So once you have NodeJS up and running in your VSCode Terminal type npm install jquery and then npm install.

I’ve kept it simple and contained all in a single HTML file using JQuery.

In your NodeJS project you will need to reference your report.html file. It should look like this (assuming you name your report report.html).

var express = require('express');
var router = express.Router();
/* GET - Report page */
router.get('/', function(req, res, next) {
   res.sendFile('report.html', { root:'./public'});
});

module.exports = router;

The Embedded Report

This is what my report looks like embedded in the MIM Portal.

Microsoft Identity Manager Statistics Report

Summary

Integration of FIM / MIM with Azure Platform as a Service Services opens a world of functionality including the ability to expose information that was previously only obtainable by the FIM / MIM Administrator.

Configuring Remote PowerShell to a Remote Active Directory Forest for FIM/MIM GalSync

Introduction

Windows Remote Management (aka Remote PowerShell) is a wonderful thing; it works straight out of the box when you're in the same domain. Getting it working across Forests though can feel like jumping through hoop after hoop, and sometimes like the hoops are on fire. When configuring GALSync ([Exchange] Global Address List Synchronisation) with FIM/MIM this always means working across AD Forests. The graphic below shows the simplest relationship. If there are firewall(s) in between then you'll have additional hoops to jump through.

GALSync

This article here is the most definitive I've found about what is required, but it isn't easily found even when you know it exists. In the last few months I've had to set up GALSync with FIM/MIM a number of times, and I can see that I'll need to do it again in the future. So here is my consolidated version of the process, using PowerShell to make the configuration changes. If nothing else it'll help me find it quickly next time I need to do it.

This post assumes you have the other prerequisites all sorted. They are pretty clear in the linked article above, such as a One-way Cross Forest Trust, and connectivity on the necessary ports if there are firewalls in between FIM/MIM and the Exchange CAS Server and Domain Controllers in the remote environment.

Configuring Remote PowerShell for FIM/MIM GALSync

My tip is to start from the MIM Sync Server.

  1. Get the details for the Service Account that you have/will specify on your GALSync Active Directory Management Agent that connects to the Remote Forest.
  2. Have that account be given (temporarily) Remote Desktop permissions to the Remote Exchange CAS Server that you will be configuring the Active Directory Management Agent to connect to.  Or use another Admin account that has permissions to Remote Desktop into the CAS Server, then …
  3. … start a Remote Terminal Services Session to the Exchange CAS Server in the Remote Forest

On the Exchange CAS Server (non SSL WinRM)

  • WinRM must have Kerberos authentication enabled
    • Kerberos requires TCP and UDP port 88 to be opened from the FIM/MIM server to ALL Domain Controllers in the target Forest. Run the following two commands in an elevated (Administrator) PowerShell ISE/Shell session to enable Kerberos authentication and allow unencrypted traffic
      • set-item wsman:\localhost\service\auth\Kerberos -value true
      • set-item wsman:\localhost\service\AllowUnencrypted -value true 

4. then on the MIM Sync Server perform the following …

On the MIM Sync Server (non SSL WinRM)

  • WinRM on the MIM Sync Server must have Kerberos authentication enabled also. Run the following commands in an elevated (Administrator) Powershell ISE/Shell session. The first is to enable Kerberos.
    • set-item wsman:\localhost\client\auth\Kerberos -value true
  • Add the Exchange Server to the list of trusted hosts on the FIM Server
    • Set-item wsman:\localhost\client\trustedhosts -value ExchangeCASFQDN
  • Allow unencrypted traffic
    • set-item wsman:\localhost\client\AllowUnencrypted -value true 

Verification (from the MIM Sync Server)

  1. Using PowerShell ISE select File => New Remote Powershell Tab
  2. enter the ExchangeCASFQDN for the Computer field
  3. enter the Service Account that you have specified on your GALSync Active Directory Management Agent that connects to the Remote Forest for the User name, in the format NetBIOSDOMAINName\Username
  4. If you have done everything correctly you will get a remote PowerShell command prompt on the Exchange CAS host.
  5. To confirm you have all your other Exchange dependencies correct (and your AD MA Service Account has the necessary permissions in Exchange) run the following script line-by-line. If you have configured Remote PowerShell correctly and have met all the prerequisites you should have a remote session into Exchange.
Set-ExecutionPolicy RemoteSigned
$Creds = Get-Credential
# NBDomain\ADMAServiceAccountUser
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri http://.customer.com/PowerShell/ -Credential $Creds -Authentication kerberos
Import-PSSession $Session
# Get a list of Exchange Servers
Get-ExchangeServer
# Get a list of Mailboxes
Get-Mailbox
# Get a list of Mail Users
Get-MailUser

# Close and remove the session 
Remove-PSSession $Session

Cleanup

Remove Remote Desktop permissions from the Active Directory Management Agent Service Account if you enabled it to configure the Exchange CAS Server.

Receive Push Notifications from Microsoft/Forefront Identity Manager on your Mobile/Tablet/Computer

Background

Recently in a FIM/MIM environment a daily automated process was executing but the task it was performing was dependent on an upstream process that generates a feed, and the schedule for that feed had changed (without notice to me). Needless to say FIM/MIM wasn’t getting the information it needed to process. This got me thinking about notifications.

If you’re anything like me you probably have numerous email accounts and your subconscious has all but programmed itself to ignore “new email” notifications. However Push Notifications I typically do notice. Whilst in the example above I did have some error handling in place if the process completely failed (it is a development environment), I didn’t have anything for partial failures. Anyway it did get me thinking that I’d like to receive a notification if something that should happen didn’t.

Overview

This post details using push notifications to advise when expected events don't transpire. In this particular example, I have an Azure Function App that connects once a day to an FTP Server, retrieves a series of exports and puts them on my FIM/MIM Synchronisation Server. The Push Notification service I am using is Push Bullet. Free Push Bullet accounts (without a Pro subscription) are limited to 500 pushes per month. That should be more than enough; if I've got errors in excess of 500 per month I've got much bigger problems.

Getting Started

First up you will need to sign up for Push Bullet. It is very straightforward if you have a Facebook or Google account. As you're probably wanting multiple people to receive the notifications, it would pay to set up a shared Google Account that your team can use to connect their devices with. Once you have an account, head to your new Account Settings page and create an Access Token. Record it for use in the scripts below.

Connecting to the API

Test that you can access the Push Bullet API using your Access Token and PowerShell. Update the following script with your Access Token in line 3 and execute it. You should see information returned associated with your new Push Bullet account.
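The original script is embedded in the post; a minimal sketch of the same idea is below. The Access Token value is a placeholder.

# Minimal sketch - authenticate to the Push Bullet API and return the account details
$accessToken = "<YourPushBulletAccessToken>"     # placeholder - use the token created earlier
$apiURI = "https://api.pushbullet.com/"
$header = @{ 'Access-Token' = $accessToken }

$me = Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI + "v2/users/me")
$me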

Next you will want to install the Push Bullet App on the device(s) you want to get the notification(s) on. I installed it on my Apple iPhone and also installed the Chrome Browser extension.

Using PowerShell we can then query to get the devices connected to the account. In the same PowerShell session you tested the API with above run this API call

$devices = Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI +"v2/devices")
$devices

This will return your registered devices.

Apple iOS iPhone Push Notification registered devices

If we want a notification to target a particular device we need to provide the Iden value associated with that device. If we don't specify a target, the push notification will go to all devices. In my example above, with two devices registered, my iPhone was the second device. So I could get the target Iden with;

$iphoneIden = $devices.devices[1].iden

Push Bullet allows for different notification types (Note, Link and File). Note is the one that I'll be using. More info on the other types here.

Sending Test Notification

To perform a notification test, update the following script with your Access Token (line 3). I've omitted the Device Identifier so the message goes to all devices. I also had to log out of the iOS Push Bullet App and log in again to get the notifications to show.
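Again the full script is in the embedded gist; the sketch below shows the shape of the call, reusing the $header and $apiURI values from the connectivity test above. The title and body are placeholders.

# Minimal sketch - send a 'note' push to all devices registered to the account
$notification = @{
    type  = "note"
    title = "FIM/MIM Notification Test"
    body  = "This is a test notification from the MIM environment"
    # device_iden = $iphoneIden    # uncomment to target a single device
} | ConvertTo-Json

Invoke-RestMethod -Method Post -Uri ($apiURI + "v2/pushes") -Headers $header -ContentType "application/json" -Body $notification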

Success. I received the notification on my iPhone and also in my Chrome browser.

Apple iOS iPhone Push Notification from FIM/MIM Identity Manager

Implementation

Getting back to my requirement of being notified when a job didn't find what it expected: I updated my PowerShell Function App (based off this blog post here) to evaluate what it processed and, if it didn't find what was expected, send me a notification. I already had some error handling in my implementation based off that blog post, but it was based on full failure, not partial failure (which is what I was experiencing, whereby only one part of the process wasn't returning data).

NOTE: I also had to add the ServerCertificateValidationCallback line into my Function App script before calling the API POST to send the notification, as I was getting the dreaded PowerShell Invoke-RestMethod / Invoke-WebRequest error below when sending the notification via the Function App. I didn't get that error on my dev workstation, which is a bit weird.

Invoke-WebRequest : The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel.

If you also receive the error above (or you will be sending Push Notifications via Azure Function Apps) insert this line before your invoke-restmethod call.

 [System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}

Summary

Essentially this is my first foray into enabling anything for Push Notifications and this post is food for thought on what can be easily enabled within FIM/MIM to give timely visibility to automated scheduled functions when they don’t perform as expected. It was incredibly simple to set up and get working. I see myself enabling more FIM/MIM functions with Push Notifications in the future.

Creating an AzureAD WebApp using PowerShell to leverage Certificate Based Authentication

Introduction

Previously I’ve posted about using PowerShell to access the Microsoft AzureAD/Graph API in a number of different ways. Two such examples I’ve listed below. The first uses a Username and Password method for Authentication, whilst the second uses a registered application and therefore ClientID and Client Secret.

As time has gone on I have numerous WebApps doing all sorts of automation. However they all rely on accounts with a username and password, or a ClientID and Secret, where the passwords and secrets expire. Granted, the secrets have a couple of years of life and are better than passwords, which depending on the environment roll every 30-45 days.

However using Certificates would allow for a script that is part of an automated process to run for much longer than the key lifetime available for WebApps and definitely longer than passwords. Obviously there is security around the certificate to be considered so do keep that in mind.

Overview

This post is going to detail a couple of simple but versatile scripts;

  1. Using PowerShell we will;
    1.  Configure AzureAD
      1. Create a Self Signed 10yr Certificate
      2. Create an AzureAD WebApp and assign the Certificate to it
      3. Apply permissions to the WebApp (this is manual via the Azure Portal)
      4. Record the key parameters for use in the second script
    2. Connect to AzureAD using our Certificate and new WebApp

Creating the AzureAD WebApp, Self Signed Certificate and Assigning Application Permissions

The script below does everything required. Run it line by line, or in small chunks as you step through the process. You will need the AzureRM and Azure AD Utils Powershell Modules installed on the machine you run this script on.

Change;

  • Lines 3 & 4 if you want a certificate with a time-frame other than 10yrs
  • Line 5 for the password you want associated with the certificate for exporting/importing the private key
  • Line 6 for the certificate subject name and location it’ll be stored
  • Line 8 for a valid location to export it to
  • Line 11 for the same path as provided in Line 8
  • Lines 24 & 25 for an account to automatically connect to AAD with
  • Line 31 for the name of your WebApp

Before running line 37, login to the Azure Portal and assign permissions to the WebApp, e.g. AzureAD Directory Permissions. When you then run line 37 it will trigger a GUI for AuthN and AuthZ to be presented. Sign in as an Admin and accept the oAuth2 Permission Authorizations for whatever you have requested on the WebApp.
Permissions1.PNG

e.g Graph API Read/Write Permissions

Permissions2.PNG

Connecting to AzureAD using our Certificate and new WebApp

Update lines 3, 4, 6 and 7 as you step through lines 40-43 of the configuration script above, which copy the key configuration settings to the clipboard.

The following script then gets our certificate out of the local store, takes the Tenant and WebApp parameters and passes them to Connect-AzureAD in line 15, which will connect you to AAD and allow you to run AzureAD cmdlets.
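That script is also embedded in the post; the essential call looks broadly like the sketch below. The tenant ID, application ID and certificate subject are placeholders.

# Minimal sketch - connect to Azure AD using the WebApp (Service Principal) and certificate
Import-Module AzureAD
$tenantId = "<YourAADTenantId>"              # placeholder
$appId    = "<YourWebAppApplicationId>"      # placeholder
$cert = Get-ChildItem Cert:\CurrentUser\My | Where-Object { $_.Subject -eq "CN=AzureADWebAppCert" }

Connect-AzureAD -TenantId $tenantId -ApplicationId $appId -CertificateThumbprint $cert.Thumbprint
Get-AzureADUser -Top 5     # quick test that the connection works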

If you wish to go direct to the GraphAPI, lines 20 and 23 show leveraging the AzureADUtils Module to connect to AzureAD via the GraphAPI.

Notes on creating your Self-Signed Certificate in PowerShell

I'm using the PowerShell New-SelfSignedCertificate cmdlet to create the self-signed certificate. If, when you run New-SelfSignedCertificate, you get the error shown below, make sure you have Windows Management Framework 5.1, and if you don't have Visual Studio or the Windows 8.1/10 SDK, get the Windows 8.1 SDK from here and just install the base SDK as shown further below.

No Makecert.PNG

Win8.1 SDK.PNG

Once the install is complete copy C:\Program Files (x86)\Windows Kits\8.1\bin\x86\makecert.exe to C:\windows\system32
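For reference, a self-signed certificate of the kind the script creates can be generated with something like the sketch below. The subject, password and export path are placeholders, the 10 year validity matches the example in this post, and parameter availability for New-SelfSignedCertificate varies with the OS/WMF version as noted above.

# Minimal sketch - create a 10yr self-signed certificate and export the private key
$certSubject  = "CN=AzureADWebAppCert"                                          # placeholder subject
$certPassword = ConvertTo-SecureString "ChangeMePlease!" -AsPlainText -Force    # placeholder password
$notAfter     = (Get-Date).AddYears(10)

$cert = New-SelfSignedCertificate -Subject $certSubject -CertStoreLocation "Cert:\CurrentUser\My" -KeyExportPolicy Exportable -NotAfter $notAfter
Export-PfxCertificate -Cert $cert -FilePath "C:\Temp\AzureADWebAppCert.pfx" -Password $certPassword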

Summary

The two scripts above show how using PowerShell we can quickly create a Self Signed Certificate, Create an Azure AD WebApp and grant it some permissions. Then using a small PowerShell script we can connect and query AAD/GraphAPI using our certificate and not be concerned about passwords or keys expiring for 10yrs (in this example which can be any timeframe you wish).

UPDATED: Identifying Active Directory Users with Pwned Passwords using Microsoft/Forefront Identity Manager

Earlier this week I posted this blog post that showed a working example of using a custom Pwned Password FIM/MIM Management Agent to flag a boolean attribute in the MIM Service to indicate whether a user's password is in the pwned password dataset or not. If you haven't read that post this won't make a lot of sense, so read that then come back.

That solution, when receiving a new password for a user (via the Microsoft Password Change Notification Service), checked it against the Have I Been Pwned API. The disclaimer at the start of that blog post detailed why this is a bad idea for production credentials. The intent was to show a working example of what could be achieved.

This update post shows a working solution that you can implement internal to a network. Essentially it takes the Pwned Password Datasets available here, loads them into a local network SQL Server, and then queries that from the FIM/MIM Pwned Password Management Agent rather than calling the external public API.

Creating an SQL Server Database for the Pwned Passwords

On my SQL Server using SQL Server Management Studio I right-clicked on Databases and chose New Database. I gave it the name PwnedPasswords and told it where I wanted my DB and Logs to go to.

Then in a Query window in SQL Server Management Studio I used the following script to create a table (dbo.pwnedPasswords).

use PwnedPasswords;
CREATE TABLE dbo.pwnedPasswords
( password_id int IDENTITY(1,1) NOT NULL,
  passwords varchar(max) NOT NULL,
  CONSTRAINT passwords_pk PRIMARY KEY (password_id)
);

Again using a query window in SQL Server Management Studio I used the following script to create an index for the passwords.

USE [PwnedPasswords]
GO

SET ANSI_PADDING ON
GO

CREATE UNIQUE NONCLUSTERED INDEX [PasswordIndex] ON [dbo].[pwnedPasswords]
( [password_id] ASC )
INCLUDE ( [passwords] )
WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
GO

The last thing I did on the DB was to take the MIM Sync Server Active Directory Service Account (that was already in the SQL Server Logins) and give that account Reader Access to my new PwnedPasswords Database. I gave this account access as I’m using Integrated Authentication for login to SQL and as the MA is initiated by the MIM Sync Service Account, that is the account that needs the access.

Getting the Pwned Password Datasets into the new Database

I'm far from a DBA; I'm an identity guy. So using the tools I'm most familiar with (PowerShell) I created a simple script to open the password dump files as a stream (as Get-Content wasn't going to handle the file sizes), read in the lines, convert the format and insert the rows into SQL. I performed the inserts in batches of 1000, and I ran it locally on the SQL Server.

In order to get the content from the dump file, add another column and get it in a format quickly to insert into the SQL DB I used the Out-DataTable function available from here.

The script could probably be improved, as I only spent about 20-30 minutes on it. It is opening and closing a connection to the SQL DB each time it inserts 1000 rows; that could be moved outside the Insert2DB function and the batch size could be increased. Either way it is a starting point, and I used it to write millions of rows into the DB successfully.
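The script itself is embedded in the original post. As a rough sketch of the approach (not the exact script, and without the Out-DataTable optimisation), a streamed, batched insert could look something like this; the file path and connection string are placeholders.

# Minimal sketch - stream a pwned password dump file and insert rows into SQL in batches of 1000
$connectionString = "Server=localhost;Database=PwnedPasswords;Integrated Security=True"   # placeholder
$dumpFile = "D:\Dumps\pwned-passwords-1.0.txt"                                             # placeholder

$reader = New-Object System.IO.StreamReader($dumpFile)
$batch  = New-Object System.Collections.Generic.List[string]

function Insert-Batch($rows) {
    $conn = New-Object System.Data.SqlClient.SqlConnection($connectionString)
    $conn.Open()
    foreach ($row in $rows) {
        $cmd = $conn.CreateCommand()
        $cmd.CommandText = "INSERT INTO dbo.pwnedPasswords (passwords) VALUES (@pwd)"
        [void]$cmd.Parameters.AddWithValue("@pwd", $row)
        [void]$cmd.ExecuteNonQuery()
    }
    $conn.Close()
}

while (($line = $reader.ReadLine()) -ne $null) {
    $batch.Add($line.Trim())
    if ($batch.Count -ge 1000) { Insert-Batch $batch; $batch.Clear() }
}
if ($batch.Count -gt 0) { Insert-Batch $batch }   # insert any remaining rows
$reader.Close()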

Updated FIM/MIM Pwned Passwords Management Agent Password.ps1 script

This is the only other change to the solution. The Password.ps1 script, rather than querying the Have I Been Pwned API, queries the SQL DB and sets the pwned boolean flag accordingly.
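The updated Password.ps1 is in the embedded gist. Conceptually the lookup portion amounts to something like the following sketch; the dump files contain SHA-1 hashes, so the new password is hashed before querying, and the connection string and variable names are placeholder assumptions.

# Minimal sketch - hash the new password and check it against the local PwnedPasswords database
$connectionString = "Server=SQLSERVER;Database=PwnedPasswords;Integrated Security=True"   # placeholder

$sha1 = [System.Security.Cryptography.SHA1]::Create()
$hashBytes = $sha1.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($newPassword))
$passwordHash = ([System.BitConverter]::ToString($hashBytes)) -replace '-', ''

$conn = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = "SELECT COUNT(*) FROM dbo.pwnedPasswords WHERE passwords = @hash"
[void]$cmd.Parameters.AddWithValue("@hash", $passwordHash)
$pwned = ([int]$cmd.ExecuteScalar() -gt 0)
$conn.Close()

# $pwned is then written to the boolean attribute in the MIM Service (via the Lithnet RMA module)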

Summary

This enhancement shows a working concept that will be more appealing to Security Officers within corporate organisations if you have an appetite to know what your potential exposure is based on your Active Directory users' passwords.

 

Identifying Active Directory Users with Pwned Passwords using Microsoft/Forefront Identity Manager

Update: An element of this solution details checking passwords online (using the Have I Been Pwned API). Troy explains succinctly in his blog-post announcing the pwned passwords list why this is a bad idea. If you are looking to implement the concept I detail in this post then WE STRONGLY recommend using a local copy of the pwned password list.
THIS POST HERE details using a local SQL Database to hold the Pwned Passwords Datasets and the change to the Management Agent to query the SQL DB instead of the HIBP API.    

Introduction

Last week (3 Aug 2017) Troy Hunt released a sizeable list of Pwned Passwords. 320 Million in fact. I encourage you strongly to have a read about the details here.

Troy also extended his HaveIBeenPwned API to include the ability to query as to whether a password has been pwned and is likely to be used in a brute force attack.

Microsoft provide a premium license feature in Azure Active Directory (Azure Active Directory Identity Protection) whereby leaked credential sets are checked and Admins alerted via reports. But what if you aren’t licensed for the Azure AD Premium Features, or you want something a little more customised and you have Microsoft/Forefront Identity Manager? That is what this post covers.

Overview

The following diagram looks a little more complicated than what it really is. The essence though is that password changes can come from a multitude of different scenarios. Using Microsoft’s Password Change Notification Service (PCNS) we can capture password changes and send them to Microsoft Identity Manager so that we can synchronise the password to other systems, or for this use case we can lookup to see if the users new password is on the pwned password list.

This post will cover creating the Pwned Password FIM/MIM Management Agent and flagging a boolean attribute in the MIM Service to indicate whether a user's password is on the pwned password list or not.

Pwned Password Overview
Password change and sync architecture

Prerequisites

There are a few components to this solution depicted above. You will need;

  • FIM/MIM Synchronisation Server
    • with an Active Directory Management Agent configured (most likely you will have a Projection Rule on this MA to get your users into the Metaverse)
    • not shown in the diagram above you will also need the MIM MA configured to sync users from the Metaverse to the MIM Service
  • FIM/MIM Service and Portal Server (can be on the same server as above)
  • Microsoft Password Change Notification Service (PCNS). This MS PFE PCNS implementation document covers it quite well and you will need;
    • the PCNS AD Schema Extension installed
    • the PCNS AD Password Filters installed on all your (writeable) Domain Controllers
    • PCNS configured to send password changes to your FIM/MIM Sync Server
  • Granfeldt PowerShell Management Agent (that we will use to check users passwords against the Have I Been Pwned pwned password API)
  • Lithnet Resource Management PowerShell Module
    • download it from here and install it on your FIM/MIM Server as the Pwned Password MA will use this module to populate the Pwned Password Status for users in the MIM Service
  • Windows Management Framework (PowerShell) 5.x

Getting Started with the Granfeldt PowerShell Management Agent

If you don’t already have it go get it from here. Søren’s documentation is pretty good but does assume you have a working knowledge of FIM/MIM and this blog post is no different.

Four items of note for this solution;

  • You must have an Export.ps1 file. Even though we’re not doing exports on this MA, the PS MA configuration requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it. It just needs to be present
  • The credentials you give the MA to run as are the credentials for the account that has permissions to the On Premise Active Directory where we will be importing users from, to join to our Metaverse so we can pass password changes to this Management Agent
  • The same account as above will also need to have permissions in the MIM Service as we will be using the account to update the new attribute we are going to create
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use dir /x to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\PwnedPWD

With the Granfeldt PowerShell Management Agent downloaded from Codeplex and installed on your FIM/MIM Server we can create our Pwned Password Management Agent.

Creating the Pwned PowerShell Management Agent

On your FIM/MIM Sync Server create a new sub-directory under your Extensions Directory, e.g. PwnedPWD in C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Extensions, then create a sub-directory under PwnedPWD named Debug, e.g. C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Extensions\PwnedPWD\Debug

Copy the following scripts (schema.ps1, import.ps1, export.ps1, password.ps1) and put them into the C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Extensions\PwnedPWD directory

Schema.ps1

The following schema.ps1 script sets up the object class (user) and a handful of attributes from Active Directory that will be useful for logic that we may implement in the future based on users' password status.

Import.ps1

The import.ps1 script connects to Active Directory to import our AD users into the Pwned Password Management Agent so we can join to the Metaverse object already present for users on the Active Directory Management Agent. The user needs to be joined to the Metaverse on our new MA so they are addressable as a target for PCNS.

Export.ps1

As detailed earlier, we aren’t using an Export script in this solution.

Password.ps1

The Password script receives password changes as they occur from Active Directory, looks up the Have I Been Pwned API to see if the new password is present on the list or not, and sets a boolean attribute for the pwned password status in the MIM Service.
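The actual password.ps1 is in the embedded gist. As a heavily simplified sketch of the two steps it performs, it would look something like the following; the API endpoint form reflects the original (since superseded) HIBP pwned password API, the parameter names are simplified, and the MIM Service address and attribute name are placeholder assumptions.

# Minimal sketch - check the new password against the HIBP API, then set the boolean attribute in the MIM Service
# $username and $newPassword stand in for the values the Granfeldt PS MA supplies to the password script at runtime
try {
    # The original v2 pwnedpassword endpoint returned HTTP 200 if the password was found, 404 if not
    Invoke-RestMethod -Method Get -Uri ("https://haveibeenpwned.com/api/v2/pwnedpassword/" + [uri]::EscapeDataString($newPassword)) | Out-Null
    $pwned = $true
} catch {
    # Any non-200 response is treated as not pwned in this simplified sketch
    $pwned = $false
}

Import-Module LithnetRMA
Set-ResourceManagementClient -BaseAddress "http://mimservice:5725"      # placeholder MIM Service address

$user = Get-Resource -ObjectType Person -AttributeName AccountName -AttributeValue $username
$user.PwnedPassword = $pwned                                            # the boolean attribute created later in this post
Save-Resource $user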

On your FIM/MIM Sync Server from the Synchronisation Manager select Create Management Agent from the right hand side pane.  Select PowerShell from the list of Management Agents. Select Next.

create pwned passwords management agent

Give your MA a Name and a Description. Select Next. 

create pwned passwords management agent

Provide the 8.3 style path to your Schema.ps1 script copied in the steps earlier. Provide an AD sAMAccountName and Password that also has permissions to the MIM Service as detailed in the Prerequisites. Select Next.

connectivity pwned passwords management agent

Provide the paths to the Import.ps1, Export.ps1 and Password.ps1 scripts copied in earlier. Select Next.

global parameters pwned passwords management agent

Select Next.

partitions and hierarchies pwned passwords management agent

Select the user checkbox. Select Next.

object types pwned passwords management agent

Select all the attributes in the list. Select Next.

attributes pwned passwords management agent

Select Next.

anchors pwned passwords management agent

Select Next.

connector filter pwned passwords management agent

Create a Join Rule for your environment. eg.  sAMAccountName => person:Accountname  Select Next.

join and projection rule pwned passwords management agent

Create an import flow rule for user:pwdLastSet => person:pwdLastSet. Select Next.

attribute flow pwned passwords management agent

Select Next.

deprovisioning pwned passwords management agent

Ensure that Enabled password management is selected, then select Finish.

Enabled password management pwned passwords management agent

With the Pwned Password MA created and configured we need to create at least a Stage (Full Import) and Full Sync Run Profiles and execute them to bring in the users from AD and join them to the Metaverse.

This should be something you’re already familiar with.

Run Profiles Pwned Passwords Management Agent
Run Profile pwned passwords MA

When running the Synchronisation we get the joins we expect. In my environment pwdLastSet was configured to sync to the MIM Service, hence the Outbound Sync to the MIM Service MA.

Sync and join pwned passwords MA
MIM Synchronisation pwned passwords MA

MIM Service Configuration

In the MIM Service we will create a custom boolean attribute that will hold the pwned status of the user's password.

Schema

Connect to your MIM Portal Server with Administrator privileges and select Schema Management from the right hand side menu.

create attribute pwned password

Select All Attributes then select New

create attribute pwned password

Provide an attribute name (System name) and a Display Name with a Data Type of Boolean. Provide a Description and select Finish

create attribute pwned password

Select Submit

create attribute pwned password

Search for User in Resource Types then select the User checkbox from the search results and select Binding then select New.

MIM Schema create binding for pwned password attribute

In the Resource Type box type User then click the validate field button (the one with the green tick). In the Attribute Type box type Pwned Password then click the validate field button (the one with the green tick). Select Finish

create binding for pwned password attribute

Select  Submit

create binding for pwned password attribute

Configure the Active Directory MA to send passwords to the Pwned Passwords MA

On your existing Active Directory Management Agent select Properties. Select Configure Directory Partitions then under Password Synchronization enable the checkbox Enable this partition as a password synchronization source. Select Targets and select your newly created Pwned Password MA. Select Ok then Ok again.

Pwned Password MA

Testing the End to End Pwned Password Check

Now you should have configured;

  • PCNS including installation of the Active Directory filters
  • The existing Active Directory Management Agent as a Password Source
  • The existing Active Directory Management Agent to send password change events to the Pwned Password MA

Select a user in Active Directory Users and Computers, right click the user and select Reset Password.

Active Directory Users and Computers, right click the user and select Reset Password

I first provided a password I know is on the pwned list, Password1

password I know is on the pwned list
Password Changed

With PCNS Logging enabled on the MIM Sync Server I can see the password event come through.

PCNS Logging enabled on the MIM Sync Server

Checking the Pwned Password MA debug log we can see the debug logging for the user we changed the password for, and that when it was checked against Have I Been Pwned the password was flagged as pwned.

Note: If you implement the solution in a production environment obviously remove the password from being logged. 

Pwned Password MA debug log

In the MIM Portal, search for and locate the user we just changed the password for.

MIM Portal search for and locate the user

Select the user. Scroll to the bottom and select Advanced View. Select the Extended Attributes tab. Scroll down and we can see the Pwned Password shows as checked.

Pwned Password shows as checked

Now repeating the process with a password that isn’t in the Pwned Password list. After changing the password in Active Directory Users and Computers the password went through its sync path. The log shows the password isn’t in the list.

log shows the password isn't in the pwned password list

And the MIM Portal shows the Boolean value for Pwned Password is now not selected.

Pwned Password is now not selected

Summary

Using PCNS and FIM/MIM we can check whether our Active Directory users are using passwords that aren’t in the Pwned Password list.

What we can then do, if their password is in the Pwned Password list, is a number of things depending on the security policy and even the type of user. You'll notice that I've included additional attributes in the MA that we can flow through the Metaverse and into the MIM Service that may help with some of those decisions (such as adminCount, which indicates if the user is an Administrator).

Potentially for Admin users we could create a workflow in the MIM Service that forces their account to change password on next logon. For other users we could create a workflow that sends them a notification letting them know that they should change their password.

Either way, we now have visibility of the state of users passwords. Big thanks to Troy for adding Pwned Passwords to his Have I Been Pwned API.

 

Reiterating: An element of this solution details checking passwords online (using the Have I Been Pwned API). Troy explains succinctly in his blog-post announcing the pwned passwords list why this is a bad idea. If you are looking to implement the concept I detail in this post then WE STRONGLY recommend using a local copy of the pwned password list.  

Why and how I rebuilt my home network with Ubiquiti UniFi Networking

Remember the good old days of working from home, or checking your email/doing research for whatever you were working on, and you had to plug in the phone line to the modem and dial up your ISP or employer to access the internet? The upgrade to ISDN and having quick dial-on-demand access? Then the consumerization of WiFi and DSL and having always-on connectivity to the internet from home.

Now in 2017, with the ubiquity of WiFi and typical house renovations and extensions, you end up with a myriad of devices providing connectivity for entertainment, home automation and everything in-between. That is where my house was at (until a month ago). Add to that two high school students leveraging the interwebs for study, social and gaming, and the heterogeneous, organically grown network environment no longer holds up.

Something needed to change, but what? During my research I stumbled across others investigating the same predicament. I didn’t want another stop-gap band-aid solution. I wanted enterprise grade services that were reliable and affordable. Inspired by Troy Hunt’s posts here and here and some similar conversations with colleagues I sat in front of the TV and YouTube for a day over a recent long weekend and devised a plan that would work for my house and my family.

What did I want ?

With my consultant hat on I defined my requirements;

  1. Centralised administration
    • i.e. not having to log into 3 different Wi-Fi access points to change configuration, find connected devices etc
  2. Maximise throughput across Wi-Fi and ethernet to the internet
    • 802.11ac for Wi-Fi
    • Full 5G coverage
    • Gbit ethernet for connected wired devices
  3. Enterprise Grade VPN for integration with IaaS Services
  4. Security integration (video monitoring). Tactical first, ubiquitous coverage later
  5. Reporting and Auditing
    • Who’s connecting
    • What devices are doing what

Ubiquiti UniFi ticked all the boxes and based on positive local reviews I jumped in.

The Plan

What replaces what?

  1. Telstra cable broadband all-in-one modem, router, firewall, access point
    • The modem function stays enabled as it is HFC but will be put into Bridge Mode with the other functions (firewall and routing) to now be provided by the Ubiquiti Edge Router
  2. Billion switch and access point
    • At the other end of the house from the incoming Telstra connection I had a Billion all-in-one device. That was decommissioned and replaced with a Ubiquiti in-wall 802.11ac unit
  3. Cisco 8 port Gb Switch
    • This was replaced with the Ubiquiti 8 port managed switch (which includes 4 POE switch ports)
  4. Apple Airport Expresses (x2)
    • These were located in two dead zones wired in via ethernet but also acting as access points. They provided coverage for the outdoor entertaining areas, the pool, the downstairs lounge and the brewhouse. They were replaced with Ubiquiti 802.11ac Lite WiFi access points

In addition I also purchased the;

  1. Ubiquiti CloudKey to control everything and store the configuration
  2. I added an additional Ubiquiti in-wall 802.11ac unit in my son's bedroom. This gives maximum Wi-Fi coverage for him and his sister, but also an ethernet outlet to connect to his docking station for his MacBook and online gaming
  3. UniFi Video G3 IR HD Camera. Only one to start with to test a few scenarios out. If it works out it will be supplemented with additional units
  4. I had two Apple TVs: one wired and one on Wi-Fi only. These were the 3rd Gen units and whilst they worked they really were Apple-centric. So streaming music to all units (including the Airport Expresses) reliably meant using iTunes on a computer. I replaced these with Chromecast units (a combo of Audio and Video)
  5. Google Home to play nicely with the Chromecast and introduce a home automation assistant to the house through my existing Hue lights.

Putting it all together

Having invested the time in research, I’d watched a number of videos/tutorials from Chris and Willie. Check out this quick start guide from Chris and Willie’s tutorials here.

I took note of the configuration I had on my existing Telstra unit as I had some static address leases configured and a couple of ports enabled on the firewall. I installed the Ubiquiti Discovery Tool Chrome Extension on the laptop.

Essentially the process went like this:

  • Telstra Cable Broadband Ethernet connection to UniFi Security Gateway WAN Port
  • LAN Port of the UniFi Security Gateway connected to UniFi Switch
  • CloudKey connected to a UniFi POE Switch Port
  • Laptop connected via Ethernet to a UniFi Switch Port
  • UniFi AP-AC-Lite POE injector LAN Port connected to a UniFi Switch Port. UniFi AP-AC-Lite POE injector POE port connected to a UniFi AP-AC-Lite
  • UniFi Switch POE Port connected to the UniFi AC AP InWall unit
  • Via WiFi I connected to the existing Telstra modem configuration page and changed WAN port to Bridged Mode
  • I then turned off WiFi on the Telstra Modem using the button on the front
  • Powered on all the UniFi equipment and waited 5 mins

That all looked pretty similar to this

Unifi Setup.png

  • In the browser on the Ethernet connected laptop I went to 192.168.1.1 and could see the UniFi Gateway had got an address and the configuration looked good
  • I started the Device Discovery Tool Chrome Extension, clicked on the UniFi Family button in the top right and let it find the devices on the network. It found all the network devices except the CloudKey. After clicking Find CloudKey it found the CloudKey.

I then followed this quick start guide from Chris. It was pretty straight forward and I followed my nose once I’d got through the first few steps. Pretty much everything had an update which I performed. I re-used the existing WiFi name I had previously to make reconfiguration easier to start with. But I did use a different subnet. As everything was pretty much DHCP enabled this didn’t cause any major probs. Just a couple of manual updates for the devices with static addresses.

Having just gotten off a 17hr flight from Dallas to Sydney, and having been up for 14 hours before that, I forced myself back into the local timezone by performing the above. I had it all up and running within a couple of hours. Over the next few days I familiarised myself with the equipment and the configs before physically locating all the components in their final resting spots.

Did I get what I wanted?

In fact it was the start of school holidays as I set everything up. As part of the initial configuration I enabled Deep Packet Inspection (DPI). The very next day looking into some of the features I noticed the following in the Statistics. Yes, my son had updated his Playstation games and had enjoyed a solid multi-hour gaming session with his mates online.

Playstation.jpg

A couple of days later I noticed his sibling was keeping occupied with YouTube and iTunes whilst keeping in touch with her friends via Instagram and Snapchat. Combined with the Clients view, I now had visibility of what was connected and what each device was doing on the home network. Even better, with Cloud Access enabled I can do this at any time from anywhere.

Social.PNG

WiFi Network Coverage

This was one of the big things I wanted to fix. The footprint of the property where I require full coverage is just over 350m2 (3767 sq ft). And ideally that coverage should be 5G. Using the Map functionality I uploaded the house/property plan and placed the units where I installed them and configured the map for the appropriate dimensions. I started out with all the Access Points configured with full power and Auto channel. The 2G coverage map looks like this. That’s some pretty good coverage.

Coverage2G.PNG

The 5G coverage map looks like this. There is a small spot where coverage isn't anticipated to be full, but I haven't encountered any issues with connectivity there, possibly because the devices used in that area aren't using 5G (either because they can't or they have dropped back to 2G). I'll keep an eye on it, but another AP in the pantry to cover the kitchen with full 5G may be a future option.

Coverage5G.PNG

Configuration Updates

There was a series of firmware updates for all the devices about a week after the initial setup. After updating the Cloud Key, USG and Switch I spotted the "managed rolling upgrade" option and used that for the Access Points. Nice and simple. The Group Config option is also very nice, allowing the selection of multiple devices and making the same config change to all of them.

Camera Setup

The camera setup was so simple I was second guessing myself as to whether I’d done it correctly. I’d purchased the G3 HD Camera but held off on the NVR as I wanted to have a play and make sure I was happy with it first as I’ve had many unfavourable experiences with video cameras in the past. In doing my pre-purchase research I’d identified that Ubiquiti provide the NVR software for free for Windows and Linux.

I had an old laptop doing nothing, so I put Ubuntu on it, installed the NVR software and bam, it discovered my camera. Integration with my Ubiquiti account means I can also access the camera and recordings from anywhere. I have the camera permanently fixed and configured to record on motion. It just works. Nighttime IR motion recording also works well.

Video.PNG

The Timeline feature is very nice. Quickly catch up on the guard dog's irregular patrols of his back yard 🙂

Timeline

VPN Setup

This is one piece I haven't finished exploring and getting to where I want it yet. I've attempted to create a site-to-site VPN from home to Azure, which works but doesn't appear to hold the connection. More testing and configuration required.

Summary

Three weeks on and I'm extremely happy with the equipment and its performance. I took the opportunity to also check each fly-lead attached to each device in the house, and I biffed anything that wasn't rated or was less than Cat 5E.

With school holidays coming to an end, and the network being performance tested daily with YouTube, Netflix, Sony Entertainment Network, FaceTime, Skype for Business etc, it hasn’t missed a beat. In fact I haven’t heard a peep about poor internet access, lagging online game performance, long ping times etc which I’m sure are common phrases other parents of millennial teenagers would know.