Enabling and Scripting Azure Virtual Machine Just-In-Time Access

Last week (19 July 2017) one of Azure Security Center's latest features moved from Private Preview to Public Preview: Azure Just in Time Virtual Machine Access.

What is Just in time Virtual Machine access?

Essentially, JIT VM Access is a wrapper that automates an Azure Network Security Group rule set to allow access to an Azure VM (or VMs) for a limited time period, on a defined set of network ports, restricted to a source IP address or network.

Personally, I'd done something a little similar earlier in the year by automating the update of an NSG inbound rule to allow RDP only from my current public IP address. Details on that are here, but that approach is now essentially redundant.

Enabling Just in time VM Access

In the Azure Portal Select the Security Center icon.

In the central pane you will find an option to Enable Just in time VM Access. Select that link.

In the right hand pane you will then see a link for Try Just in time VM Access. Select that.

If you have not previously enabled the Security Center you will need to select a Pricing Tier. The Free Tier does not include the JIT VM Access, but you should get an option for a 60 day trial for the Standard Tier that does.

With everything enabled you can select Recommended to see a list of VMs that JIT VM Access can be enabled for.

I’ve selected one of mine from the list and then selected Enable JIT on 1 VM.

In the Enable JIT VM Config you can add and remove ports as required, and set the maximum timeframe for access. The Per-Request option for source IP will create the rule for the requester and their current IP. Select OK.

With the rule configured you can now Request Access.

When requesting access we can tailor it based on what is in the rule: select the ports we want from within the policy, choose IP Range or Current IP, and reduce the timeframe if required. Then select Open Ports.

For the VM we can now see that JIT VM Access has been requested and is currently active.

Looking at the Network Security Group that is associated with the VM we can see the rules that JIT VM Access has put in place. We can also see that the rules are against my current IP Address.

Automating JIT VM Access Requests via PowerShell

Now that we have Just in Time VM Access configured for our VM, the reality is I just want to invoke the access request via PowerShell, start up my VM (mine are normally stopped unless in use) and use the resource.

The script below is a simplified version of my previous script to automate NSG rules, detailed here. It assumes you enabled JIT VM Access as per the manual process above, that your VM would normally be in an off state, and that you're about to enable access, start it up and connect.

You will need the AzureRM and the new Azure-Security-Center PowerShell modules. If you are running PowerShell 5.1 or later you can install them by un-commenting lines 3 and 5.

Update lines 13, 15 and 19 for your Resource Group name, Virtual Machine name and the link to your RDP file. Update line 21 for the number of hours to request access (in line with your policy).

Line 28 uses the new Invoke-ASCJITAccess cmdlet from the Azure-Security-Center PowerShell module to request access.

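For reference, here is a condensed sketch of the flow the script follows, rather than the full script. Treat the Invoke-ASCJITAccess parameter names as assumptions (check Get-Help for the version of the Azure-Security-Center module you install); the resource group, VM name and RDP file path are placeholders.

# Install the modules if you don't already have them (PowerShell 5.1 or later)
# Install-Module -Name AzureRM
# Install-Module -Name Azure-Security-Center

Login-AzureRmAccount

$resourceGroup = 'MyResourceGroup'   # your Resource Group name
$vmName        = 'MyVM'              # your Virtual Machine name
$rdpFile       = 'C:\RDP\MyVM.rdp'   # link to your RDP file
$hours         = 3                   # hours of access, within your JIT policy maximum
$myIP          = Invoke-RestMethod -Uri 'https://api.ipify.org'   # current public IP

# Request Just in Time access for the VM (parameter names are an assumption - verify with
# Get-Help Invoke-ASCJITAccess for your module version)
Invoke-ASCJITAccess -ResourceGroupName $resourceGroup -VM $vmName -Port 3389 -Hours $hours -AddressPrefix "$myIP/32"

# Start the VM and connect once the access request is active
Start-AzureRmVM -ResourceGroupName $resourceGroup -Name $vmName
& mstsc.exe $rdpFile
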
Summary

This simplifies the management of NSG rules for access to VMs and reduces the exposure of VMs to brute-force attacks. It also simplifies access to the bunch of VMs I only have running on an ad hoc basis.

Looking into the Azure-Security-Center PowerShell module, there are also cmdlets to manage the JIT policies themselves.

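To see what the module exposes, including the policy management cmdlets, you can simply list its commands:

# List everything the Azure-Security-Center module provides
Get-Command -Module Azure-Security-Center
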
Resolving Microsoft Identity Manager “sync-rule-validation-parsing-error” error

A couple of weeks back I inherited a Microsoft Identity Manager development environment that wasn't quite complete. When I performed a sync on a user object I got the following error: sync-rule-validation-parsing-error.

Looking into the error for further details, Details and Stack Trace were both greyed out as shown below.

I looked at the object being exported on the MA and the pending export details and found slightly different information. The error was CS to MV to CS synchronization failed 0x8023055a.

Still not a lot to go on. So I looked in the Application Event Log and nothing. Anything in the System Event Log? No, nothing.

So my attention turned to the Export Synchronization Rule. Here is a partial screenshot of the Export Sync Rule. The object (user) in question had been flagged as inactive and the intent appeared to be clearing a number of attributes. Sending "" (a crude empty/null) to an attribute isn't very elegant.

I changed each to use the null function, so for export, null() will flow to each of the attributes. I tried the export again and the same error resulted.

Running short on ideas I created a brand new Export Synchronization Rule and replicated the configuration except for the attributes being exported. Then I added one attribute into the rule at a time, tested the export and repeated until I could replicate the error.

I was able to replicate the error once I hit the terminalServer attribute.
*Note: the screenshot below is prior to changing over to flow null() instead of “”.

Sending null() to the terminalServer Active Directory attribute was causing the error. It was at this point I actually just removed that flow rule and continued with other tasks.

Coming back to this later and thinking it through, I understand the error. When dealing with Terminal Services you normally manage four attributes that are part of the userParameters attribute. The four attributes that define a user's Terminal Services profile are:

  • allowLogon
  • terminalServicesHomeDirectory
  • terminalServicesProfilePath
  • terminalServicesHomeDrive

For a user that has a fully configured set of Terminal Services attributes, sending null() to the terminalServer attribute isn’t going to work.

So, posting this as I couldn’t find any reference to sync-rule-validation-parsing-error or CS to MV to CS synchronization failed 0x8023055a elsewhere and chances are I’ll come across it again, and it’ll probably help someone else too.

Why and how I rebuilt my home network with Ubiquiti UniFi Networking

Remember the good old days of working from home, or checking your email/doing research for whatever you were working on, when you had to plug the phone line into the modem and dial up your ISP or employer to access the internet? The upgrade to ISDN and having quick dial-on-demand access? Then the consumerization of WiFi and DSL and having always-on connectivity to the internet from home.

Now in 2017, with the ubiquity of WiFi and typical house renovations and extensions, you end up with a myriad of devices providing connectivity for entertainment, home automation and everything in between. That is where my house was at (until a month ago). Add two high school students leveraging the interwebs for study, socialising and gaming, and the heterogeneous, organically grown network environment no longer holds up.

Something needed to change, but what? During my research I stumbled across others investigating the same predicament. I didn’t want another stop-gap band-aid solution. I wanted enterprise grade services that were reliable and affordable. Inspired by Troy Hunt’s posts here and here and some similar conversations with colleagues I sat in front of the TV and YouTube for a day over a recent long weekend and devised a plan that would work for my house and my family.

What did I want ?

With my consultant hat on I defined my requirements:

  1. Centralised administration
    • i.e. not having to log into 3 different Wi-Fi access points to change configuration, find connected devices etc
  2. Maximise throughput across Wi-Fi and ethernet to the internet
    • 802.11ac for Wi-Fi
    • Full 5GHz coverage
    • Gbit ethernet for connected wired devices
  3. Enterprise Grade VPN for integration with IaaS Services
  4. Security integration (video monitoring). Tactical first, ubiquitous coverage later
  5. Reporting and Auditing
    • Who’s connecting
    • What devices are doing what

Ubiquiti UniFi ticked all the boxes and based on positive local reviews I jumped in.

The Plan

What replaces what?

  1. Telstra cable broadband all-in-one modem, router, firewall, access point
    • The modem function stays enabled as it is HFC but will be put into Bridge Mode with the other functions (firewall and routing) to now be provided by the Ubiquiti Edge Router
  2. Billion switch and access point
    • At the other end of the house from the incoming Telstra connection I had a Billion all-in-one device. That was decommissioned and replaced with a Ubiquiti in-wall 802.11ac unit
  3. Cisco 8 port Gb Switch
    • This was replaced with the Ubiquiti 8 port managed switch (which includes 4 POE switch ports)
  4. Apple Airport Expresses (x2)
    • These were located in two dead zones wired in via ethernet but also acting as access points. They provided coverage for the outdoor entertaining areas, the pool, the downstairs lounge and the brewhouse. They were replaced with Ubiquiti 802.11ac Lite WiFi access points

In addition I also purchased the following:

  1. Ubiquiti CloudKey to control everything and store the configuration
  2. I added an additional Ubiquiti in-wall 802.11ac unit in my son's bedroom. This gives maximum Wi-Fi coverage for him and his sister, but also provides an ethernet outlet to connect to his docking station for his MacBook and online gaming
  3. UniFi Video G3 IR HD Camera. Only one to start with to test a few scenarios out. If it works out it will be supplemented with additional units
  4. I had two Apple TVs. One wired and one on Wi-Fi only. These were the 3rd Gen units and whilst they worked they really were Apple-centric, so streaming music to all units (including the Airport Expresses) reliably meant using iTunes on a computer. I replaced these with Chromecast units (a combo of Audio and Video)
  5. Google Home to play nicely with the Chromecast and introduce a home automation assistant to the house through my existing Hue lights.

Putting it all together

Having invested the time in research, I’d watched a number of videos/tutorials from Chris and Willie. Check out this quick start guide from Chris and Willie’s tutorials here.

I took note of the configuration I had on my existing Telstra unit as I had some static address leases configured and a couple of ports enabled on the firewall. I installed the Ubiquiti Discovery Tool Chrome Extension on the laptop.

Essentially the process went like this:

  • Telstra Cable Broadband Ethernet connection to UniFi Security Gateway WAN Port
  • LAN Port of the UniFi Security Gateway connected to UniFi Switch
  • CloudKey connected to a UniFi POE Switch Port
  • Laptop connected via Ethernet to a UniFi Switch Port
  • UniFi AP-AC-Lite POE injector LAN Port connected to a UniFi Switch Port. UniFi AP-AC-Lite POE injector POE port connected to a UniFi AP-AC-Lite
  • UniFi Switch POE Port connected to the UniFi AC AP InWall unit
  • Via WiFi I connected to the existing Telstra modem configuration page and changed WAN port to Bridged Mode
  • I then turned off WiFi on the Telstra Modem using the button on the front
  • Powered on all the UniFi equipment and waited 5 mins

That all looked pretty similar to this

  • In the browser on the Ethernet connected laptop I went to 192.168.1.1 and could see the UniFi Gateway had got an address and the configuration looked good
  • I started the Device Discovery Tool Chrome Extension, clicked on the UniFi Family button in the top right and let it find the devices on the network. It found all the network devices except the CloudKey. After clicking Find CloudKey it found the CloudKey.

I then followed this quick start guide from Chris. It was pretty straightforward and I followed my nose once I'd got through the first few steps. Pretty much everything had an update, which I performed. I re-used the existing WiFi name I had previously to make reconfiguration easier to start with, but I did use a different subnet. As everything was pretty much DHCP enabled this didn't cause any major problems; just a couple of manual updates for the devices with static addresses.

Having just gotten off a 17hr flight from Dallas to Sydney, and having been up for 14 hours before that, I forced myself back into the local timezone by performing the above. I had it all up and running within a couple of hours. Over the next few days I familiarised myself with the equipment and the configuration before physically locating all the components in their final resting spots.

Did I get what I wanted?

In fact it was the start of school holidays as I set everything up. As part of the initial configuration I enabled Deep Packet Inspection (DPI). The very next day looking into some of the features I noticed the following in the Statistics. Yes, my son had updated his Playstation games and had enjoyed a solid multi-hour gaming session with his mates online.

A couple of days later I noticed his sibling was keeping occupied with YouTube and iTunes whilst keeping in touch with her friends via Instagram and Snapchat. Combined with the Clients view I now had visibility of what was connected and what was doing what on the home network. Even better, with Cloud Access enabled I can do this at any time from anywhere.

WiFi Network Coverage

This was one of the big things I wanted to fix. The footprint of the property where I require full coverage is just over 350m² (3,767 sq ft), and ideally that coverage should be 5GHz. Using the Map functionality I uploaded the house/property plan, placed the units where I installed them and configured the map for the appropriate dimensions. I started out with all the Access Points configured with full power and Auto channel. The 2.4GHz coverage map looks like this. That's some pretty good coverage.

The 5GHz coverage map looks like this. There is a small spot where coverage isn't anticipated to be full, but I haven't encountered any connectivity issues, possibly because the devices used there either can't use 5GHz or have dropped back to 2.4GHz. I'll keep an eye on it; another AP in the pantry to cover the kitchen with full 5GHz may be a future option.

Configuration Updates

There was a series of firmware updates for all the devices about a week after the initial setup. After updating the Cloud Key, USG and Switch I spotted the "managed rolling upgrade" option and used that for the Access Points. Nice and simple. The Group Config option is also very nice, allowing the selection of multiple devices and making the same config change to all of them.

Camera Setup

The camera setup was so simple I was second guessing myself as to whether I’d done it correctly. I’d purchased the G3 HD Camera but held off on the NVR as I wanted to have a play and make sure I was happy with it first as I’ve had many unfavourable experiences with video cameras in the past. In doing my pre-purchase research I’d identified that Ubiquiti provide the NVR software for free for Windows and Linux.

I had an old laptop doing nothing, so I put Ubuntu on it, installed the NVR software and bam, it discovered my camera. Integration with my Ubiquiti account means I can also access the camera and recordings from anywhere. I have the camera permanently fixed and configured to record on motion. It just works. Nighttime IR motion recording also works well.

The Timeline feature is very nice. Quickly catch up on the guard dog's irregular patrols of his back yard 🙂

VPN Setup

This is one piece I haven't finished exploring and getting to where I want yet. I've attempted to create a site-to-site VPN from home to Azure, which works but doesn't appear to hold the connection. More testing and configuration is required.

Summary

Three weeks on and I'm extremely happy with the equipment and its performance. I took the opportunity to also check each fly-lead attached to each device in the house and biffed anything that wasn't rated or was less than Cat 5e.

With school holidays coming to an end, and the network being performance tested daily with YouTube, Netflix, Sony Entertainment Network, FaceTime, Skype for Business etc, it hasn’t missed a beat. In fact I haven’t heard a peep about poor internet access, lagging online game performance, long ping times etc which I’m sure are common phrases other parents of millennial teenagers would know.


Getting started with Azure Cloud Shell

A few weeks back I noticed that I now had the option for the Azure Cloud Shell in the Azure Portal.

What is Azure Cloud Shell?

Essentially, rather than having the Azure CLI installed on your local workstation, you can now initiate it from the Portal, and it automatically has 5 GB of storage assigned to it (initiated as part of the setup). So you can now create, manage and delete Azure resources using a centrally hosted CLI session. Each time you start your shell, your home drive will mount and your profile, scripts and whatever else you've stored in it will be available to you. Nice. Let's do it.

Getting Started

Login to the Azure Portal and click on the Cloud Shell icon.

As this is the first time you’ve accessed it, you will not have any storage associated with your Azure Cloud Shell. You will be prompted for storage information.

Azure Files must reside in the same region as the machine being mounted to. Cloud Shell machines currently (July 2017) exist in the below regions:

  • Americas: East US, South Central US, West US
  • Europe: North Europe, West Europe
  • Asia Pacific: India Central, Southeast Asia

I hit the Advanced Settings to specify creation of a new Resource Group, Storage Account and File Share.

The UI doesn't check the configuration settings for uniqueness until they are written, so you might need a couple of attempts at naming your storage account. As you can see below, it isn't surprising that my attempt to use azcloudshell as a "Storage Account Name" was already taken.

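If you'd rather not guess, one option (an aside, using the AzureRM.Storage module from a normal PowerShell session rather than the Cloud Shell itself) is to pre-check whether a storage account name is available:

# Check storage account name availability before the Cloud Shell wizard tries to create it
Get-AzureRmStorageAccountNameAvailability -Name 'azcloudshell'
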
Providing unique values for these options

.. let the initial creation go through just nicely. I now had a home drive created for my profile and any files I create and store for my sessions.

As for the commands you can use with the Azure CLI, have a look here for the full list you can use to create, manage and delete your Azure resources.

Personally I’m currently doing a lot with Azure Functions. A list of the full range of Azure Functions CLI commands is available here.

The next thing I looked to do was to put my scripts etc into the clouddrive. I just navigated to the new StorageAccount that I created as part of this and uploaded via the browser.

Below you can see the file I uploaded on the right which appears in the directory in the middle pane.

Using the Azure CLI I changed directories and could see my uploaded file.

And that is pretty much it. Continue as you would with the CLI, but just now with it all centrally stored. Sweet.

Getting started with Ubuntu on Windows (Windows Subsystem for Linux)

This week I was building a Linux server (Ubuntu 14) in Azure. I'd deployed my new Ubuntu server and went to connect to it. But I was on a brand new laptop with no SSH tools installed. Damn. As I was about to go and get my usual favourite Windows SSH tools, I remembered a session from Build 2017 and Microsoft starting to talk more loudly about the Windows Subsystem for Linux. Yes, Ubuntu on Windows, with SUSE and Fedora coming soon. TechCrunch story here.

Now it is still listed as Beta, but the changes appear to be coming pretty fast. I figured it should have more than enough for what I needed, and I could hopefully avoid having to install other 3rd party tools and maybe even finally say goodbye to Cygwin. So I dove in, and here is my quick-start guide to get you started.

Prerequisite

Your computer must be running (at a minimum) a 64-bit version of Windows 10 Anniversary Update, OS Build 14393.

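A quick way to confirm your build and architecture from PowerShell before you start:

# Windows 10 build number - needs to be 14393 or later
(Get-CimInstance Win32_OperatingSystem).BuildNumber
# Confirm you are on a 64-bit OS
[Environment]::Is64BitOperatingSystem
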
Installing Windows Subsystem for Linux

To configure your Windows 10 machine to accept WSL go to Windows => Settings and select Update & Security.

Select For developers and enable Developer Mode.

Agree to the warning.

Now open Turn Windows Features on or off and select the checkbox for Windows Subsystem for Linux 

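If you prefer, the same feature can be enabled from an elevated PowerShell prompt instead of the GUI:

# Equivalent to ticking the Windows Subsystem for Linux checkbox
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux
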
Restart your workstation

After the restart, from an elevated command prompt type bash to start a Bash shell. As it is the first time, you will be prompted to install Ubuntu.

Following installation you will be prompted to create a Linux User. This is purely for the Linux environment so does not have anything to do with your Windows Login and Password.

Using SSH from WSL

Now that I have a bash shell on my Windows laptop, let's use SSH to connect to my new Ubuntu server.

And I’m in. Happy days.

An Identity Consultants Summary of the recent Cloud Identity Summit 2017

I’ve just returned from Chicago and the Cloud Identity Summit that was held at the Sheraton Grand Chicago. It was my first CIS conference and reminded me a lot of the now defunct Quest Experts Conference and The Burton Group Conference, both in terms of the content and scale. It definitely had a more intimate feel than the massive Microsoft Ignite category of event which attracts 25k+ attendees. 1400 attendees at CIS was a record for this event, but it still meant you got the 1:1 time with vendors and speakers which is fantastic.

Just like the Quest "The Experts Conference" (TEC) and The Burton Group Conference, if you pick your sessions based on the synopsis and the speaker, the sessions can be highly technical (400+ level) and worthy of the 30-hour journey to get to the conference. I focused on my particular subject of Identity, so this summary is biased towards that track.

A summary of my takeaways that I’ll briefly detail in the post are:

  • ID Pro
  • Strong Authentication / Goodbye Passwords
  • PAM and IGA
  • SCIM 2.0
  • FIDO 2.0

And before I forget, CIS is dead; long live CIS, now known as Identiverse, which will be in Boston on 24-27 June 2018. Ping Identity have renamed the conference moving forward.

ID Pro

Ian Glazer, in his keynote on Tuesday, announced what has been missing from the IDAM community: a professional organisation to represent it. Named ID Pro, with all the details available here, it is a professional organisation for IDAM exponents. Join now here for US$150. Supported by the Kantara Initiative, this organisation already has the support and backing of the industry.

Strong Authentication / Goodbye Passwords

There were numerous sessions around this topic. And it was fantastic to see that the ecosystem to support the holy grail future of no passwords, but strong authentication, is now present. Alex Simons summed it up nicely in his keynote on Wednesday by setting the goal of 1 billion logins (without passwords) by 2018, launching the hashtag to go with it: #1Billionby2018. Check out the FIDO 2.0 summary further below.

Privileged Account Management and Identity Governance & Access

Privileged Account Management and Identity Governance & Access are better together. We knew this anyway and I've been approaching it this way with my solutions. It was therefore refreshing to be entertained by Kelly Grizzle in his session on PAM meeting IGA through their mutual friend SCIM. In essence, SailPoint have been working heavily on their IGA offering, but with the help of SCIM now at 2.0 they've been working with PAM vendors such as CyberArk to provide the integration and visibility the two need. Kelly's entertaining and informative presentation can be found here.

SCIM 2.0

Mentioned above in the PAM and IGA summary, SCIM 2.0 is now ready for prime time. Whilst SCIM has been around for some time it hasn’t seen widespread adoption in my circles. But that’s about to change. Microsoft have been using it as a primary integration method with Azure AD with the likes of Facebook for Work. Microsoft also have a SCIM MA for Microsoft Identity Manager. I’ll be experimenting with it in the near future.

FIDO 2.0

FIDO first came onto my radar about 4 years ago. It is in a lot of the workflows we do every day (if you have a modern operating system – Windows 8+ – and biometrics on your device or a FIDO-compliant token). With FIDO 2.0, U2F v1.1 and UAF v1.1 now complete, the foundation and enabling services for Strong Authentication are ready to go.

Summary of the Summary

I’ve tried hard to not make this too wordy, but the takeaway is this. Identity is the foundation of who you are and what you do. With all the other disruption in the IT industry around cloud and mobility, identity is always the enabler. Get it right and you can make life easier for your users, more visible for your information security officers and auditable for your compliance requirements. Just keep up, as it’s moving very fast.

Integration of Microsoft Identity Manager with Azure Platform-as-a-Service Services

Overview

This isn’t an out of the box solution. This is a bespoke solution that takes a number of elements and puts them together in a unique way. I’m not expecting anyone to implement this specific solution (but you’re more than welcome to) but to take inspiration from it to implement solutions relevant to your environment(s). This post supports a presentation I did to The MIM Team User Group on 14 June 2017.

This post describes a solution that:

  • Leverages an Azure WebApp (NodeJS) to present a simple website. That site can be integrated easily in the FIM/MIM Portal
  • The NodeJS website leverages an Azure Function App to get a list of users from the FIM/MIM Synchronization Server and allows the user to use typeahead functionality to find the user they want to generate a FIM/MIM object report on
  • On selection of a user, a request will be sent to another Azure Function App to generate and return the report to the user in a new browser window

This is shown graphically below.

Report Request UI

The NodeJS WebApp is integrated into the FIM/MIM Portal. Bootstrap Typeahead is used to find the user to generate a report on. The Typeahead userlist is fulfilled by an Azure Function into the MIM Sync Metaverse. The Generate Report button fires off a call to FIM/MIM via another Azure Function into the MIM Sync and MIM Service to generate the report.

The returned report opens in a new tab in the user's browser. The report contains details of the FIM/MIM connectors the user is represented on.

The values of all attributes for the user's hologram from the Metaverse are displayed, along with the MA the value came from and the last modified date.

Finally, there is the metadata report from the MIM Service MA Connector Space and the MIM Service.

Prerequisites

These are numerous, but I’ve previously posted about them. You will need;

I encourage you to digest those posts to understand how to configure the prerequisites for this solution.

Additional Solution Requirements

To bring all the individual components together, there are a few additional tasks to enable this solution.

  • Enable CORS on your Azure Function App Configuration (see details further below)
  • If you want to display User Object Photos as part of the report, you will likely need to synchronize them into FIM/MIM from an authoritative source (e.g. Office 365/Exchange Online). Check out this post and the additional details further below
  • In order to embed the NodeJS WebApp into the FIM/MIM Portal, this post provides the details. Change the target URL from the PowerBI URL to your NodeJS site
  • Object Report Request WebApp (see below for sample site)

Azure Functions Cross Origin Resource Sharing (CORS)

You will need to configure CORS to allow the NodeJS WebApp to access the Azure Functions (from both local and Azure). Reflect your port number if it is different from 3000, and use the DNS name for your Azure WebApp.

Sample UI NodeJS HTML

Here is a sample HTML file for your NodeJS WebApp, with the UI to provide input for the LoginID, fulfilled by the NodeJS JavaScript file further below.

Sample UI NodeJS JavaScript

The following NodeJS JavaScript supports the HTML UI above. It populates the LoginID typeahead box and wires up the Submit Report button to fulfil the report for the desired object(s). Yes, if you use the UI to select (individually) multiple different objects, all will be returned in their own output windows.

As the HTML file above indicates you will need to obtain and make available as part of your NodeJS project the typeahead.bundle.js library.

Azure PowerShell Trigger Function App for AccountNames Lookup

The following Azure Function takes the call made on load of the NodeJS WebApp and populates the typeahead userlist.

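The original function was embedded in the post; as a rough guide, a minimal sketch of its structure is below. It assumes the classic (v1) experimental PowerShell model for Azure Functions, where the request body arrives in the file referenced by $req and the response is written to the file referenced by $res, and it uses a hypothetical remoting call to a MIM Sync server named mimsyncserver, where you would run your actual Metaverse query (for example via the Lithnet MIIS Automation module).

# Read the raw request sent by the NodeJS WebApp on load (not needed for the full list, shown for completeness)
$requestBody = Get-Content $req -Raw

# Query the MIM Sync server for account names. The scriptblock below is a placeholder that
# returns a static sample list - replace it with your Metaverse query.
$accountNames = Invoke-Command -ComputerName 'mimsyncserver' -ScriptBlock {
    @('drobinson', 'jsmith', 'kjones')
}

# Return the list as JSON for the Typeahead control
$accountNames | ConvertTo-Json | Out-File -Encoding Ascii -FilePath $res
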
Azure PowerShell Trigger Function App for User Object Report

Similar in structure to the Username List Lookup Azure Function above, but in the ScriptBlock you embed the Report Generation Script that is detailed here. Modify for what you want to report on.

Photos in the Report

If you want to display images in your report, you will need to determine if the user has an image during the MV metadata report generation part of the script. Add the following lines (updating for the name of your Image attribute; mine is named EXOPhoto) after the Try {} Catch {} in this section $obj = @() ; foreach ($attr in $attributes.Keys)

 # Display the Object's Photo rather than the Base64 string
 if ($attr.equals("EXOPhoto")){
     # Build an inline <img> tag from the Base64 photo data
     $objectphoto = "<img src=$([char]0x22)data:image/jpeg;base64,$($attributes.$attr.Values.Valuestring)$([char]0x22)>"
     # Show the type name in the attribute value column instead of the raw Base64 string
     $val = "System.Byte[]"
 }

Then in the output of the HTML report at the end of the report generation insert the $objectphoto variable into the HTML stream.

# Output MIM Service Object Data
 $MIMServiceObjOut = $MIMServiceObjectMetaData | Sort-Object -Property Attribute | ConvertTo-Html -Fragment
 $htmlreport = ConvertTo-HTML -Body "$htmlcss<h1>Microsoft Identity Manager User Object Report</h1><h2>Query</h2>$sourcequery</br><b><center>$objectphoto</br>NOTE: Only attributes with values are displayed.</center></b><h2>Connector(s) Summary</h2>$connectorsummary<h2>MetaVerse Data</h2>$objectmetadata <h2>MIM Service CS Object Data</h2>$MIMServiceCSobjectmetadata <h2>MIM Service Object Data</h2>$MIMServiceObjOut" -Title "MIM Object Report"

As you can see above I’ve also injected the CSS ($htmlcss) into the output stream at the beginning of the Body section.  Somewhere in your script block you will need to define your CSS values. e.g.

 # StyleSheet for nice pretty output
 $htmlcss = "<style>
    h1, h2, th { text-align: center; }
    table { margin: auto; font-family: Segoe UI; box-shadow: 10px 10px 5px #888; border: thin ridge grey; }
    th { background: #0046c3; color: #fff; max-width: 400px; padding: 5px 10px; }
    td { font-size: 11px; padding: 5px 20px; color: #000; }
    tr { background: #b8d1f3; }
    tr:nth-child(even) { background: #dae5f4; }
    tr:nth-child(odd) { background: #b8d1f3; }
 </style>"

Summary

An interesting solution integrating Azure PaaS Services with Microsoft Identity Manager via PowerShell and the extremely versatile Lithnet FIM/MIM PowerShell Modules.

Please share your implementations enhancing your FIM/MIM Solution.

How to build and deploy an Azure NodeJS WebApp using Visual Studio Code

Introduction

This week I had the need to build a small web application with a reasonably simple front end that will later be integrated inside a Portal. The web application isn't going to be high use and didn't necessitate deployment of infrastructure (VMs). I'd messed with NodeJS a while back in this post, where I configured a UI for Microsoft Identity Manager and Azure-based functions.

In the back of my mind I knew I didn't want to have to go for a full Visual Studio Project Solution for this, and with the recent updates to Visual Studio Code I figured it must be possible to do it using it. There wasn't much around on doing it, so I dived in and worked it out for myself. Here I share the end-to-end process to make it easy for you to get started.

Overview

What you will need on your development workstation before you start are the following components. Download and install them on your dev machine.

You will also need an Azure Subscription to where you will publish your NodeJS site.

This post details setting up Visual Studio Code to build a shell NodeJS site and deploy it to Azure using a local GIT Repository. Let’s get started.

Visual Studio Code Extensions

A really smart and handy extension for VS Code is Azure Tools for VS Code. Released a few months ago (January 2017), this extension allows you to quickly create a Web App (Resource Group, App Service, Application Service Plan etc) from within VS Code. With VS Code on your development machine (from the prerequisites above), click on the Extensions icon (bottom left) in the VS Code menu and type Azure Tools. Click the green Install button.

Azure Tools for VS Code

Creating the NodeJS Site in VS Code

I had a couple of attempts at doing this before I found a quick, neat and repeatable method of getting started. In order to get the Web App deployed and accessible correctly in Azure I found it easiest to use the Sample Azure NodeJS Hello World example from here. Download that sample and extract the contents to a new folder on your workstation. I created a new path on mine named …\NodeJS\nodejssite and dropped the sample in there so it looked like below.

After creating the folder structure and putting the sample in it, whilst in the sub-directory type:

code .

That will startup Visual Studio Code in the newly created folder with the starter sample.

Install Express for NodeJS

To that base sample site we’ll install Express. From the Terminal tab in the lower pane type:

npm install -g express-generator

Express App

With Express now on our machine, let's add the Express App to our new NodeJS site. Type express in the Terminal window.

express

Accept that the directory is not empty

This will create the folder structure for Express.

Now, to get all the files and modules for our site configured for our app, run npm install

Now type npm start in the terminal window to start our new site.

The NodeJS site will start. Open a Web Browser and go to http://localhost:3000 and you should see the Express empty site.

Navigate to views => index.jade and update the text like I have below.

Refresh your browser window and you should see the text updated.

In the terminal window press Ctrl + C to stop NodeJS.

Test Deploy to Azure

Now let’s do a test deploy of our shell site as an Azure WebApp.

Press Ctrl + Shift + P or, from the View menu, select Command Palette.

Type Azure: Login 

This will generate a code and give you a link to open in your browser and log in.

Paste in the code from the clipboard and select continue

Then login with your account for the Tenant where you want to deploy the WebApp to. You'll then be authorized.

From the Command Palette type azure sub and choose Azure: List Azure Subscriptions and choose the subscription where you will create and deploy the WebApp

Now from the Command Palette type Azure Create a Web App (Simple).

Give the WebApp a name. This will become the WebApp Name, and the basis for all the associated WebApp components. Use Create a Web App (Advanced) if you want to be more specific about the name of the App Resources etc.

If you watch the bottom VS Code Status bar you will see the Azure Tools extension create the new Resource Group, Web App and Web App Plan.

Login to the Azure Portal, select the new Web App.

Select Deployment Options and then Local Git Repository. Select OK.

Select Deployment credentials and provide a username and password. You’ll need this shortly to publish your site.

Click Overview. Copy the Git clone url.

Back in VS Code, select the GIT icon (under the magnifying glass) and from the top choose Initialize Repository.

Then in the terminal window type git remote add azure <git clone url> obtained from the step above.

Type Initial Commit as the message and click the tick icon in the Source Control menu bar.

Select the … (more actions) menu and select Publish.

Select azure as the remote target we set up earlier.

You’ll be prompted to authenticate. Use the account you created above in Deployment Credentials.

Back in the Azure Portal under the Web App under Deployment Options you will see the initial commit.

Click on Overview and you should see that it is running. Click on URL and the site will open in a new tab in your browser.

Updating our WebApp

Now, let’s make a change to our WebApp.

Back in VS Code, click on the files and folder icon in the top left corner, navigate to views => index.jade and update the title. Hit Ctrl + S (or select Save from the File menu). In the Terminal below type npm start to start our NodeJS site locally.

Check out the update locally. In a browser navigate to the local NodeJS site on localhost:3000. You’ll see the changed page.

Select the Git icon on the left menu, give the update some text e.g. 'updated page text' and select the tick from the top menu.

Select the … (more actions) menu and choose Push to publish the changes to our Azure WebApp.

Go back to your browser, which was on the Azure WebApp URL, and reload. Our change has been pushed and is reflected in the WebApp.

Summary

Very quickly and easily, using Visual Studio Code (with NodeJS and Git Desktop installed locally), we have:

  • Created an Azure WebApp
  • Created a base NodeJS site
  • Got a local NodeJS site we can develop
  • Published it to Azure

Now go create something awesome.

How to access Microsoft Identity Manager Hybrid Report data using PowerShell, Graph API and oAuth2

Hybrid Reporting is a great little feature of Microsoft Identity Manager. A small agent installed on the MIM Sync Server will send reporting data to Azure for MIM SSPR and MIM Group activities. See how to install and configure it here.

But what if you want to get the reporting data without going to the Azure Portal and looking at the Audit Reports? Enter the Azure AD Reports and Events REST API, which is currently in preview. It took me a couple of cracks at getting this working, because the documentation is a little vague, especially when accessing it via PowerShell and OAuth2. So I've written it up and hope it helps anyone else looking to go down this route.

Gotchas

Accessing the Reports via the API has a couple of caveats that I had to work through:

  • Having the correct permissions to access the report data. Pretty much everything you read tells you that you need to be a Global Admin. Once I had my OAuth tokens I messed around a little and was able to get the following back from the API when purposely using an identity that didn't have the right permissions. The key piece is "Api request is not from global admin or security admin or security reader role". I authorized the WebApp using an account that is in the Security Reader role, and can successfully access the report data.

  • Reading the documentation here on MSDN, I incorrectly assumed each category was the report name. Only when I called "https://graph.windows.net//$metadata?api-version=beta" and looked at the list of reports did I notice each report name was plural. The three that I wanted to access (and report on) are obviously the MIM Hybrid Reports:
"Name":  "mimSsgmGroupActivityEvents",
"Name":  "mimSsprActivityEvents",
"Name":  "mimSsprRegistrationActivityEvents",

Here is the full list of Reports available as of 24 May 2017.

{
    "Name":  "b2cAuthenticationCountSummary",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cMfaRequestCount",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cMfaRequestEvent",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cAuthenticationEvent",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cAuthenticationCount",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cMfaRequestCountSummary",
    "LicenseRequired":  "False"
}
{
    "Name":  "tenantUserCount",
    "LicenseRequired":  "False"
}
{
    "Name":  "applicationUsageDetailEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "applicationUsageSummaryEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "b2cUserJourneySummaryEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cUserJourneyEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "cloudAppDiscoveryEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "mimSsgmGroupActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "ssgmGroupActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "mimSsprActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "ssprActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "mimSsprRegistrationActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "ssprRegistrationActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "threatenedCredentials",
    "LicenseRequired":  "False"
}
{
    "Name":  "compromisedCredentials",
    "LicenseRequired":  "False"
}
{
    "Name":  "auditEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "accountProvisioningEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "signInsFromUnknownSourcesEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "signInsFromIPAddressesWithSuspiciousActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "signInsFromMultipleGeographiesEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "signInsFromPossiblyInfectedDevicesEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "irregularSignInActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "allUsersWithAnomalousSignInActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "signInsAfterMultipleFailuresEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "applicationUsageSummary",
    "LicenseRequired":  "True"
}
{
    "Name":  "userActivitySummary",
    "LicenseRequired":  "False"
}
{
    "Name":  "groupActivitySummary",
    "LicenseRequired":  "True"
}

How to Access the Reporting API using PowerShell

What you need to do is:

  • Register a WebApp
    • Assign a Reply URL of https://localhost
    • Assign it the Read Directory data permission
  • Get an OAuth2 authorization code using an account that is either a Global Admin or in the Security Admin or Security Reader Azure roles
  • Use your Bearer and Refresh tokens to query for the reports you’re interested in

Register your WebApp

In the Azure Portal create a new Web app/API app and assign it https://localhost as the Reply URL. Record the Application ID for use in the PowerShell script.

Assign the Read Directory data permission as shown below

Obtain a key from the Keys option on your new Web App.  Record it for use in the PowerShell script.

Generate an Authentication Code, get a Bearer and Refresh Token

Update the following script, changing Lines 5 & 6 for the ApplicationID/ClientId and Client Secret for the WebApp you created above.

Run the script and you will be prompted to authenticate. Use an account in the tenant where you created the Web App that is a Global Admin or in the Security Admin or Security Reader Azure Roles. You will need to change the location where you want the refresh.token stored (line 18).

If you've done everything correctly you have authenticated and got an AuthCode, which was then used to get your Authorization Tokens. The value of the $Authorization variable should look similar to this:

Now you can use the Refresh token to generate new Authorization Tokens when they time out, simply by calling the Get-NewTokens function included in the script above.

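The script itself isn't reproduced here, but the refresh-token exchange a Get-NewTokens style function performs against the Azure AD (v1) token endpoint looks roughly like this. The ClientId, Client Secret and refresh.token location are the values from your WebApp and script above; the path used here is just a placeholder.

function Get-NewTokensSketch {
    # Read the refresh token saved by the initial authentication (placeholder path)
    $refreshToken = Get-Content 'C:\tokens\refresh.token' -Raw

    $body = @{
        grant_type    = 'refresh_token'
        refresh_token = $refreshToken
        client_id     = $ClientId
        client_secret = $ClientSecret
        resource      = 'https://graph.windows.net'
    }

    # Exchange the refresh token for a new access token (and a new refresh token)
    $tokens = Invoke-RestMethod -Method Post -Uri 'https://login.microsoftonline.com/common/oauth2/token' -Body $body

    # Rebuild the Authorization header and keep the new refresh token for next time
    $script:Authorization = "Bearer $($tokens.access_token)"
    $tokens.refresh_token | Out-File 'C:\tokens\refresh.token' -Force
}
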
Querying the Reporting API

Now that you have the necessary prerequisites sorted you can query the Reporting API.

Here are a couple of simple queries to return some data to get you started. Update the script for the tenant name of your Azure AD. With the $Authorization values from the script above you can get data for the MIM Hybrid Reports, for example:

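As a starting point, here is a sketch of what those queries look like, assuming the preview endpoint format of https://graph.windows.net/<tenant>/reports/<reportName>?api-version=beta (the tenant name below is a placeholder):

$tenant  = 'customer.onmicrosoft.com'
$headers = @{ Authorization = $Authorization; 'Content-Type' = 'application/json' }

# MIM SSPR activity
$ssprEvents = Invoke-RestMethod -Method Get -Headers $headers -Uri "https://graph.windows.net/$tenant/reports/mimSsprActivityEvents?api-version=beta"
$ssprEvents.value | Format-Table

# MIM Group Management activity
$groupEvents = Invoke-RestMethod -Method Get -Headers $headers -Uri "https://graph.windows.net/$tenant/reports/mimSsgmGroupActivityEvents?api-version=beta"
$groupEvents.value | Format-Table
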
Synchronizing Exchange Online/Office 365 User Profile Photos with FIM/MIM

Introduction

This is Part Two in the two-part blog post on managing users' profile photos with Microsoft FIM/MIM. Part one here detailed managing users' Azure AD/Active Directory profile photos. This post delves deeper into photos, specifically around Office 365 and the reasons why you may want to manage these via FIM/MIM.

Background

User profile photos should be simple to manage. But in a rapidly moving hybrid cloud world it can be a lot more complex than it needs to be. The best summary I’ve found of this evolving moving target is from Paul Ryan here.

Using Paul's sound advice, we too are advising our customers to let users manage their profile photo (within corporate guidelines) via Exchange Online. However, as described in this article, photos managed in on-premises Active Directory are synchronized to Azure AD and on to other Office 365 services only once. And of course we want them to be consistent across AD DS, Azure AD, Exchange Online and all other Office 365 services.

This post details synchronizing user profile photos from Exchange Online to MIM for further synchronization to other systems. The approach uses a combination of Azure GraphAPI and Exchange Remote PowerShell to manage Exchange Online User Profile Photos.

The following graphic depicts what the end goal is:

Current State

  • Users historically had a photo in Active Directory. DirSync/ADSync/AzureADConnect then synchronized that to Azure AD (and once only into Office 365).
  • Users update their photo in Office365 (via Exchange Online and Outlook Web Access)
    • the photo is synchronized across Office365 Services

Desired State

  • An extension of the Current State is the requirement to be able to take the image uploaded by users in Exchange Online, and synchronize it back to the OnPremise AD, and any other relevant services that leverage a profile photo
  • Have AzureADConnect keep AzureAD consistent with the new photo obtained from Office365 that is synchronized to the OnPrem Active Directory
  • Sync the current photo to the MIM Portal

Synchronizing Office365 Profile Photos

Whilst Part-one dealt with the AzureAD side of profile photos as an extension to an existing AzureAD PowerShell Management Agent for FIM/MIM, I’ve separated out the Office365 side to streamline it and make it as efficient as possible. More on that later. As such I’ve created a new PowerShell Management Agent specifically for Office365 User Profile Photos.

I'm storing the Exchange Online photo in the MIM Metaverse as a binary object, just as I did for the AzureAD photo (but in a different attribute). I'm also storing a checksum of the photos (as I did for the AzureAD photo, but also in a different attribute) to make it easier to compare what is in Azure AD and Exchange Online, which is then used to determine if changes have been made (e.g. a user updated their profile photo).

Photo Checksum

For generating the hash of the profile photos I'm using Get-Hash from the PowerShell Community Extensions. Whilst PowerShell has Get-FileHash, I don't want to write the profile photos out to disk and read them back in just to get the checksum; that slows the process up by 25%. You can get the checksum using a number of different methods and algorithms. Just be consistent and use the same method across both profile photos, and you'll be comparing apples with apples and the comparison logic will work.

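For illustration only, here is the idea of hashing a photo in memory. This uses .NET directly rather than the PSCX Get-Hash cmdlet I use in the MA, and the byte array source is a placeholder:

# Placeholder source for the sketch - in the MA the byte[] comes straight from the photo download
$photoBytes = [System.IO.File]::ReadAllBytes('C:\temp\photo.jpg')

# Compute an MD5 checksum of the in-memory bytes (any algorithm is fine - just use it consistently)
$md5           = [System.Security.Cryptography.MD5]::Create()
$hashBytes     = $md5.ComputeHash($photoBytes)
$photoChecksum = [System.BitConverter]::ToString($hashBytes) -replace '-', ''
$photoChecksum
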
Some notes on Photos and Exchange Online (and MFA)

This is where things went off on a number of tangents. Initially I tried accessing the photos using Exchange Online Remote PowerShell.

CAVEAT 1: If your Office 365 tenant is enabled for Multi-Factor Authentication (which it should be) you will need to get the Exchange Online Remote PowerShell module as detailed here. Chances are you won't have full Office 365 Admin access though, so as long as the account you will be using is in the Recipient Management role you should be able to go to the Exchange Control Panel using a URL like https://outlook.office365.com/ecp/?realm=<tenantname>&wa=wsignin1.0, where tenantname is something like customer.com.au. From the Hybrid menu in the right-hand pane you will then be able to download the Microsoft.Online.CSE.PSModule.Client.application. I had to use Internet Explorer to download the file and get it installed successfully. Once installed, I used a few lines from this script here to load the function and start my RPS session from within PowerShell ISE during solution development.

CAVEAT 2: The EXO RPS MFA PS Function doesn’t allow you to pass it your account password. You can pass it the identity you want to use, but not the password. That makes scheduled process automation with it impossible.

CAVEAT 3: The RPS session exposes the Get-UserPhoto cmdlet, which is great. But the RPS session leverages the GraphAPI, and the RPS PS module doesn't refresh its tokens, so if the import takes longer than 60 minutes then using this method you're a bit stuffed.

CAVEAT 4: Using the Get-UserPhoto cmdlet detailed above, the syncing of photos is slow. As in, I was only getting ~4 profile photos per minute slow. This also goes back to the token refresh issue, as for pretty much any environment of the size I deal with, this is too slow and will time out.

CAVEAT 5: You can whitelist the IP Address (or subnet) of your host so MFA is not required using Contextual IP Addressing Whitelisting. At that point there isn’t really a need to use the MFA Enabled PREVIEW EXO RPS function anyway. That said I still needed to whitelist my MIM Sync Server(s) from MFA to allow integration into the Graph API. I configured just the single host. The whitelist takes CIDR format so that looks like /32 (eg. 11.2.33.4/32)

Performance Considerations

As I mentioned above,

  • using the Get-UserPhoto cmdlet was slow. ~4 per minute slow
  • using the GraphAPI into Exchange Online and looking at each user and determining if they had a photo then downloading it, was also slow. Slow because at this customer only ~50% of their users have a photo on their mailbox. As such I was only able to retrieve ~145 photos in 25 minutes. *Note: all timings listed above were during development and actually outputting the images to disk to verify functionality. 

Implemented Solution

After all my trial and error on this, here is my final approach and working solution:

  1. Use the Exchange Online Remote PowerShell (non-MFA version) to query and return a collection of all mailboxes with an image. *Note: add an exception for your MIM Sync host to the whitelisted hosts for MFA (if your Office 365 tenant is enabled for MFA) so the process can be automated
  2. Use the Graph API to obtain those photos (a rough sketch of both steps is shown after this list)
    • with this I was able to retrieve ~1100 profile photos in ~17* minutes (after ~2 minutes to query and get the list of mailboxes with a profile photo)

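Here is that rough sketch of the two steps, not my actual Import.ps1. The photo endpoint URI is an assumption (adjust it to whatever your WebApp is authorized against), and $accessToken is the bearer token obtained for that WebApp.

# Step 1: Exchange Online Remote PowerShell (non-MFA) - find the mailboxes that have a photo
$cred    = Get-Credential   # account holding the Recipient Management role
$session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri 'https://outlook.office365.com/powershell-liveid/' -Credential $cred -Authentication Basic -AllowRedirection
Import-PSSession $session -CommandName Get-Mailbox -AllowClobber | Out-Null

$mailboxesWithPhotos = Get-Mailbox -ResultSize Unlimited | Where-Object { $_.HasPicture } | Select-Object UserPrincipalName

# Step 2: pull each photo over REST using the WebApp's bearer token (endpoint URI is an assumption)
foreach ($mbx in $mailboxesWithPhotos) {
    $uri   = "https://outlook.office365.com/api/v2.0/users('$($mbx.UserPrincipalName)')/photo/`$value"
    $photo = Invoke-WebRequest -Uri $uri -Headers @{ Authorization = "Bearer $accessToken" } -UseBasicParsing
    # $photo.Content is the byte[] that gets stored as EXOPhoto and hashed for EXOPhotoChecksum
}

Remove-PSSession $session
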
Pre-requisites

There's a lot of info above, so let me summarize the prerequisites:

  • The Granfeldt PowerShell MA
  • Whitelist your FIM/MIM Sync Server from MFA (if your Office 365 environment is enabled for MFA)
  • Add the account you will run the MA as (which will in turn connect to EXO via RPS) to the Recipient Management role
  • Create a WebApp for the PS MA to use to access users' Profile Photos via the Graph API (fastest method)
  • PowerShell Community Extensions to generate the image checksum

Creating the WebApp to access Office365 User Profile Photos

Go to your Azure Portal and select the Azure Active Directory blade from the Resource Menu bar on the left. Then select App Registrations from the Manage section of the Azure Active Directory menu, and finally, from the top of the main pane, select "New Application Registration".

Give it a name and select Web app/API as the type of app. Make the sign-in URL https://localhost and then select Create.

Record the ApplicationID that you see in the Registered App Essentials window. You’ll need this soon.

Now select All Settings => Required Permissions. Select Read all users' basic profiles in addition to Sign in and read user profile. Select Save.

Under Required Permissions select Add and then select 1 Select an API, and select Office 365 Exchange Online then click Select.

Choose 2 Select Permissions and then select Read user profiles and Read all users’ basic profiles. Click Select.

Select Grant Permissions

From Settings select Keys, give your key a Description, choose a key lifetime and select Save. RECORD the key value. You’ll need this along with the WebApp ApplicationID/ClientID for the Import.ps1 script.

Using the information from your newly registered WebApp, we need to perform the first authentication (and authorization of the WebApp) to the Graph API. Take your ApplicationID, Key (Client Secret) and the account you will use on the Management Agent (and that you have assigned the Recipient Management role in Exchange Online), and run the script detailed in this post here. It will authenticate you to your new WebApp via the GraphAPI after asking you to provide the account you will use on the MA and authorizing the permissions you selected when registering the app. It will also create a refresh.token file which we will give to the MA to automate our connection. The Authorization dialog looks like this.

Creating the Management Agent

Now we can create our Management Agent using the Granfeldt PowerShell Management Agent. If you haven't created one before, check out a post like this one, which further down shows the creation of a Granfeldt PSMA. Don't forget to provide blank export.ps1 and password.ps1 files in the directory where you place the PSMA scripts.

PowerShell Management Agent Schema.ps1

PowerShell Management Agent Import.ps1

As detailed above the PSMA will leverage the WebApp to read users Exchange Profile Photos via the Graph API. The Import script also leverages Remote Powershell into Exchange Online (for reasons also detailed above). The account you run the Management Agent as will need to be added to the Recipient Management Role Group in order to use Remote PowerShell into Exchange Online and get the information required.

Take the Import.ps1 script below and update:

  • Update lines 11, 24 and 42 for the path to where you have put your PSMA. Mine is under the Extensions directory in a directory named EXOPhotos.
  • copy the refresh.token generated when authenticating and authorizing the WebApp earlier into the directory you specified in line 42 above.
  • Create a Debug directory under the directory you specified in lines 11,24 and 42 above so you can see what the MA is doing as you implement and debug it the first few times.
  • I’ve written the Import to use Paged Imports, so make sure you tick the Paged Imports checkbox on the configuration of the MA
  •  Update Lines 79 and 80 with your ApplicationID and Client Secret that you recorded when creating your WebApp

Running the Exchange User Profile Photos MA

Now that you have created the MA, you should have selected the EXOUser ObjectClass and the attributes defined in the schema. You should also create the EXOPhoto (as Binary) and EXOPhotoChecksum (as String) attributes in the Metaverse on the person ObjectType (assuming you are using the built-in person ObjectType).

Configure your flow rules to flow the EXOPhoto and EXOPhotoChecksum on the MA to their respective attributes in the MV.

Create a Stage Only run profile and run it. If you have done everything correctly you will see photos come into the Connector Space.

Looking at the Connector Space, I can see EXOPhoto and EXOPhotoChecksum have been imported.

After performing a Synchronization to get the data from the Connector Space into the Metaverse it is time to test the image that lands in the Metaverse. That is quick and easy via PowerShell and the Lithnet MIIS Automation PowerShell Module.

# Retrieve the Metaverse object for the user and write the EXOPhoto binary out to a .jpg file
$me = Get-MVObject -ObjectType person -Attribute accountName -Value "drobinson"
$me.Attributes.EXOPhoto.Values.ValueBinary
[System.Io.File]::WriteAllBytes("c:\temp\myOutlookphoto.jpg" ,$me.Attributes.EXOPhoto.Values.ValueBinary )

The file is output to the directory with the filename specified.

Opening the file reveals correctly my Profile Photo.

Summary

In Part one we got the AzureAD/Active Directory photo. In this post we got the Office365 photo.

Now that we have the images from Office 365, we need to synchronize any photo updates to Active Directory (and in turn, via AADConnect, to Azure AD). Keep in mind the image size limits for Active Directory, and that we retrieved the largest photo available from Office 365 when synchronizing the photo onwards. There are a number of PowerShell modules for photo manipulation that will allow you to resize accordingly.