Nested Virtual PowerShell Desktop Environments on Windows 10 & Windows Server 2019 in Azure – Part 1

PowerShell Desktop Virtual Environments

If you’ve been working with PowerShell for any length of time you know that its flexibility brings challenges when you’re juggling disparate PowerShell Modules and their version dependencies. This isn’t just a PowerShell thing; Python can trip you up in a similar manner.

Python however has Virtual Environment (virtualenv) capabilities that let you create an environment containing all the binaries required for the packages/libraries a Python project needs. I’ve found this very useful and have wondered why I couldn’t do the same for PowerShell Desktop (not Core). PowerShell Desktop, PowerShell Core?

PowerShell Desktop vs PowerShell Core

As of August 2016 there are two PowerShell versions;

  • PowerShell Desktop
    • PowerShell 5.1 that runs on Windows and on top of the full .NET Framework stack
  • PowerShell Core
    • PowerShell Core 6.x that is cross platform (Windows, MacOSX, Linux)
      • Doesn’t run on the full .NET Framework

If you are a Windows/Directory Services Admin the likelihood of many of the PowerShell Modules you use running on PowerShell Core is slim. That’s because a lot of those modules require the full .NET Framework, which isn’t available in PowerShell Core.

A Virtual PowerShell Desktop Env? Why is this only possible now?

In July this year Microsoft started providing Windows Container Images for the Insider releases (over and above Nano and Core OS builds). This was great, but meant you needed to be on the Insider Builds and were restricted to environments on physical hardware or VMs migrated to Azure, as there wasn’t an Azure Marketplace OS version (Windows 10 or Server 2019 Preview) that met the minimum host requirements for the Insider Container images.

We’ve had to wait until Build 1809 became available in the Azure Marketplace, which it did at the end of last week (w/e 18 November 2018). The Windows Container Version History shows that there was no 1803 full Windows Image. But that’s all bygones now, as 1809 is finally here.

PowerShell Desktop Virtual Environments through Nested Virtualization

At first glance the screenshot below just looks like any command window in a virtual machine. But look a little closer;

  • Remote Desktop Session to an Azure Windows 10 1809 Virtual Machine (host.region.cloudapp.azure.com)
  • Docker Run Windows 1809 PowerShell $psversiontable
    • PowerShell Desktop 5.1 via Docker inside a Virtual Machine in Azure
      • BOOM!!

PowerShell Desktop Virt Env Nested Virtualization.PNG

Ok, so that is a single Docker Container with a full Windows 10 1809 environment running inside a Windows 10 Virtual Machine. But that means we can also add more containers and have multiple isolated PowerShell environments. Something like ….

Nested Virtual PowerShell Desktop Env.png

Wait, what, how? – The Overview

The high-level process is;

  • Provision a Windows 10 Virtual Machine (Build 1809 or later).
    • I recommend deploying it in Azure, but you could use any other virtualization environment that supports Nested Virtualization
    • NOTE: As I write this Windows Server 2019 Build 1809 hasn’t hit the Azure Marketplace. When it does, as it has a common code-base it should work exactly the same.
  • Enable the OpenSSH Feature (I’ll be using this a little in this post but more in a future post)
  • Enable the Containers and Hyper-V Features
  • Install and configure Docker
  • Pull the Windows Build 1809 Container Image

Windows 10 Build 1809 Virtual Machine

I’m not going to give step-by-step details for deploying a Windows VM in Azure. If you’re looking to set up Virtual PowerShell Desktop Environments with Docker you should be able to deploy a Windows VM. That said, you need to choose a VM Size and Version that supports “Nested Virtualization”. The Azure RM Dv3 and Ev3 Series VMs do. If you get an error similar to this when running a Docker Image then change your VM Series to Dv3. I went with;

  • The Azure Marketplace has an image for Windows 10 Build 1809. Search for Windows 10 Pro, Version 1809
    • To run this VM as economically as practical I initially chose the following size and configuration
      • Standard D2_v3 (2 vCPUs, 8 GB memory)
      • HDD over SSD
      • Un-managed disks
    • Enable SSH and RDP in the NSG configuration
      • Initially we’ll need RDP to connect to the workstation
      • Moving forward we’ll be using SSH

OpenSSH Server

OpenSSH Client and Server have been available for Windows for a while, but Build 1809 has streamlined the install process considerably. The base install and setup are now just a couple of commands away. The commands below install the latest version of OpenSSH Server via PowerShell;

# Find OpenSSH Server
$openSSH = Get-WindowsCapability -Online | Where-Object Name -like 'OpenSSH*'

# Install OpenSSH Server
$sshServer = $openSSH | Select-Object name | Where-Object {$_.name -like "OpenSSH.Server*"}
$sshServer

Add-WindowsCapability -Online -Name $sshServer.Name

which when executed via VSCode looks like;

Install OpenSSH Server on Windows 10.PNG

By default the SSH Server service is configured for Manual startup. To configure it for Automatic Startup use the Set-Service cmdlet.

# Set SSH Server for Auto Startup
Get-Service sshd
Set-service sshd -StartupType Automatic

ssh Server Startup Automatic.PNG

Finally we need to increase the ClientAliveInterval setting in the sshd_config configuration file located in the %programdata%\ssh directory. I’ve made mine 3600 seconds (1 hour).

sshd ClientAliveInterval.PNG
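
If you’d rather script that change, a minimal sketch (assuming the default config location, and that the setting is either commented out or absent) looks like this;

# Set ClientAliveInterval to 3600 seconds in sshd_config
$sshdConfig = Join-Path $env:ProgramData 'ssh\sshd_config'
$config = Get-Content $sshdConfig
if ($config -match '^\s*#?\s*ClientAliveInterval') {
    # Replace the existing (possibly commented) setting
    $config = $config -replace '^\s*#?\s*ClientAliveInterval.*$', 'ClientAliveInterval 3600'
} else {
    # Append the setting if it isn't present at all
    $config += 'ClientAliveInterval 3600'
}
Set-Content -Path $sshdConfig -Value $config

# Restart the SSH Server service to pick up the change
Restart-Service sshd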

Windows Containers / Docker Dependencies

# Install Containers / Docker Dependencies
Enable-WindowsOptionalFeature -Online -FeatureName containers -All -NoRestart
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V -All -NoRestart
# Restart the computer to complete the feature installation
Restart-Computer

Install Docker

Head on over to Docker and login (or create an account if you don’t already have one). Get Docker CE for Windows. I’m running 18.06.1-ce-win73.

Download and Install Docker.PNG

As we want a full Windows environment for PowerShell (not PowerShell Core on Linux) select “Use Windows containers” when installing Docker.

Use Windows Containers.PNG

At the end of the Docker install there is a reboot required.

Docker Install Complete.PNG

Get the Windows 1809 Base Container Image

We’re almost there. We need to get the recently released full Windows Image that will be the basis for our containers that will allow us to run full PowerShell environments. Don’t be confused by the Nano and Core images that have been available for quite some time. This is the FULL WINDOWS Build 1809 IMAGE.

As future Windows updates increment the version, keep in mind that the image version you pull must be no newer than the host it will run on. Unlike the Insider Images, the released images are tagged with the Release Number rather than the Build Number. Looking at the repository we can see that the image name is 1809 whereas its Build Number is 10.0.17763.134.

Docker Windows Image Registry.PNG
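
To confirm what your host is running before you pull an image, you can check the registry; an 1809 host shows ReleaseId 1809 and CurrentBuild 17763.

# Check the Windows release and build of the container host
Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion' | Select-Object ReleaseId, CurrentBuild, UBR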

With the workstation restarted we SSH into it and pull the Windows 1809 Docker Image. I’ve given my Windows 10 VM a DNS name so I don’t need to figure out its IP Address each time I start it up. To access your new VM from a Windows command prompt (via DNS name or IP Address) use;

ssh username@IPAddressofWin10VM

Once we have a console on our Windows 10 VM we can pull the Windows 10 Docker Image.

docker pull mcr.microsoft.com/windows:1809

The image will be retrieved.

Pull Windows 1809 Base Image.PNG

After pulling the image it will be extracted. Depending on the spec of your VM this may take 10-20 minutes.

Extracting Windows Base Docker Image.PNG

After Extraction we have our base Container Image.

Completed Docker Image.PNG

In order to create a container from the command console via SSH we need to be elevated. I’ll cover that in the next post. So to validate we are able to create a container based on the full Windows 10 1809 image, RDP into the Windows 10 VM and open an elevated command prompt. Then type the command;

docker run -it mcr.microsoft.com/windows:1809 powershell $psversiontable

which will start a container using the Windows 10 1809 Image and run PowerShell with the command $PSVersionTable, which returns the version of PowerShell.

PowerShell Desktop Virt Env Nested Virtualization
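
Each docker run gives you a fresh, isolated environment, so keeping incompatible module sets apart is just a matter of starting separate named containers (the names below are hypothetical) from separate console sessions;

docker run -it --name psenv1 mcr.microsoft.com/windows:1809 powershell
docker run -it --name psenv2 mcr.microsoft.com/windows:1809 powershell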

Summary

As you can see from the screenshot above, we have Nested Virtualization in an Azure Resource Manager Windows 10 Virtual Machine running a Docker Windows 10 1809 Container Image that allows us to run PowerShell Desktop 5.1. BOOM!!

That’s it for the first post, where I introduced the concept of Full Windows Docker Images supporting PowerShell Desktop in Azure. Stay tuned for the next post that starts putting this new functionality to good use.

Got thoughts or feedback on this? Twitter || Blog

Retrieving SailPoint IdentityNow Certification Reports using PowerShell

This is the third and probably last post in the Certifications by API series. The first post detailed retrieving and searching campaigns, the second post detailed creating and starting campaigns. If you haven’t read those, check them out as they will give you the background for this one.

As with the previous two posts, this post assumes you are authenticated to IdentityNow as detailed in this post, and that you understand it details accessing Certifications using the non-versioned SailPoint IdentityNow APIs.

With that all said, this post details obtaining Certification Reports from Completed Campaigns. The process goes like this;

  • Search and return Completed Campaigns
  • Identify campaigns completed within a time period
  • Retrieve and export the CSV completion reports

Note: The first time you access the getReports API for a Campaign, the Reports are automatically generated. You can and should include additional logic to catch a failed report request, wait a minute for the report to be generated, and then retry the retrieval.

Reports API

As mentioned above the getReports API is used to get a list of reports from campaigns.

https://$($orgName).api.identitynow.com/cc/api/campaign/getReports

To get an individual report the report/get API call is used along with the ID of the report to retrieve. The base URI is;

https://$($orgName).api.identitynow.com/cc/api/report/get

Exporting Certification Campaign Completion Reports

The following PowerShell script will enumerate completed campaigns, compare the completion time to the current time and the time window to export (last 7 days in my example below) and export the CSV version of the reports to a directory on the host running the script.

Update:

  • Line 11 for the number of days previous to days date to retrieve reports on
  • Line 34 for the output CSV path. The exports are named based on the Description of the Campaign, so you’ll need to modify that if all your descriptions are the same
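
As a rough sketch of that flow (the completion-timestamp property, the report query parameters and the response shapes marked in the comments are assumptions), it looks something like this;

$reportDays = 7   # how many days back to export
$utime = [int][double]::Parse((Get-Date -UFormat %s))
$IDN.Headers.Remove("Content-Type")

# Get Completed Campaigns (same campaign/list API as the earlier posts)
$GetCampaignBaseURI = "https://$($orgName).api.identitynow.com/cc/api/campaign/list"
$completedCampaigns = Invoke-RestMethod -Method Get -Uri "$($GetCampaignBaseURI)?_dc=$($utime)&completedOnly=$true&start=0&limit=100" -WebSession $IDN

foreach ($campaign in $completedCampaigns.items){
    # 'deadline' (and its format) is an assumption for the completion timestamp; adjust for the real property
    if ((Get-Date $campaign.deadline) -gt (Get-Date).AddDays(-$reportDays)){
        # List the reports for the campaign (they are generated on the first call)
        $reports = Invoke-RestMethod -Method Get -Uri "https://$($orgName).api.identitynow.com/cc/api/campaign/getReports?_dc=$($utime)&campaignId=$($campaign.id)" -WebSession $IDN
        foreach ($report in $reports){
            # the 'id', 'format=csv' parameters and the 'name' property are assumptions
            $csv = Invoke-RestMethod -Method Get -Uri "https://$($orgName).api.identitynow.com/cc/api/report/get?_dc=$($utime)&id=$($report.id)&format=csv" -WebSession $IDN
            # output folder is an assumption; exports are named on the Campaign Description
            $csv | Out-File "C:\Reports\$($campaign.description) - $($report.name).csv"
        }
    }
}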

The output is the CSV File which can then be manipulated in PowerShell or Excel (via Data => Import CSV).

Certification Report in Excel.PNG

Summary

Using the Certifications API we can query for Completed Certification Campaigns and then retrieve their completion reports and export them to the file system. Likewise with a few simple changes you can also export the other reports.

PowerShell Progress Notifications that work in Visual Studio Code on Windows 10

Earlier this year I made the switch from PowerShell ISE to Visual Studio Code (VSCode) and everything has been going just swimmingly. Well, until I was modifying some old scripts that perform long-running processes, to which I’d previously added Write-Progress statements to provide feedback on how the script was going. Long story short, Write-Progress doesn’t work in VSCode. I’m definitely not the first to notice this as there is an open issue on GitHub for it, but seeing as it’s been open for 2 1/2 years I figured it probably wasn’t going to be resolved soon.

So I went looking for alternatives. I’ve found two that I’m happy with and will be using moving forward. One is a text-based console output progress bar and the other uses Windows 10/Windows Server Notification Centre. This post explores both and how I’m using them.

psInlineProgress

psInlineProgress is authored by Øyvind Kallstad and has been around for a few years now, so it requires a small tweak to get it to work with VSCode. Nothing major, just a change to the allowed PowerShell host (details further below). The progress bar itself is similar to the Write-Progress one in PowerShell. Here is an example;

inLineProgress Bar Notification

You can get it from here if you need a manual install, otherwise it’s quick to install via Install-Module

Install-Module -Name psInlineProgress

To allow the progress bar to display in VSCode update the psInlineProgress.psd1 file to change Line 36 as shown below. You should be able to find it in C:\Program Files\WindowsPowerShell\Modules\psInlineProgress\1.1

#PowerShellHostName = 'ConsoleHost'
PowerShellHostName = 'Visual Studio Code Host'

Outputting to psInlineProgress

Updating the psInlineProgress bar is simply a case of giving the Progress Bar the dialog text you want displayed, the progress as a percentage, and the progressed and un-progressed characters. In my example above I’m using the < symbol for the progress position, - for the un-progressed portion and . for the progressed. Sort of like a Pac-Man consuming dashes and outputting periods.

The PowerShell line looks like this, where $percentComplete is an integer between 0 and 100. E.g

Write-InlineProgress -Activity "Getting User Object $($obj.displayName)" -PercentComplete $percentComplete -ProgressCharacter '<' -ProgressFillCharacter '.' -ProgressFill '-'

Burnt Toast

Burnt Toast is a newer module written by Josh King that takes advantage of new features in Windows 10. The progress notification isn’t a progress bar, but a Toast Notification via the Notification Centre. Here is an example;

Burnt Toast PowerShell Notification

You can get it from here if you need a manual install, otherwise it’s quick to install via Install-Module

Install-Module -Name BurntToast

Outputting to Burnt Toast

Updating Burnt Toast is similar to psInlineProgress, whereby you pass the dialog text you want displayed, the progress completed as a fraction between 0 and 1, and a graphic. E.g

$ProgressBar = New-BTProgressBar -Status 'Getting User Objects' -Value $progressDisplay
New-BurntToastNotification -Text 'IdentityNow Source Import' -ProgressBar $ProgressBar -Silent -UniqueIdentifier 'Get Users' -AppLogo "C:\Users\DarrenJRobinson\Images\sailpoint.png"

As Burnt Toast uses Windows Notifications, after the toast notification disappears it can still be found in the Notification Centre.

Windows Notification Centre – Burned Toast

Other Burnt Toast Features

Burnt Toast can do much more, such as Alarms, Sounds and Reminders. See examples for those and more on Github here.

Working Example

Here is a working sample from a project where I’m processing thousands of identities. $idnObjects is a PowerShell Object Collection of all the identities.

The following script snippet comes from the processing loop that processes the identities. $i is incremented on each loop (e.g. $i++) and then used to calculate the percentage and fraction completed.

psInlineProgress and Burnt Toast are only updated on each whole % of progress (that is calculated in line 9).
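
A minimal sketch of that loop, reusing the two notification calls from above ($idnObjects and its displayName property are placeholders from my project), looks like this;

$i = 0
$lastPercent = -1

foreach ($obj in $idnObjects){
    $i++
    # whole-number percentage and 0-1 fraction of progress
    $percentComplete = [math]::Floor(($i / $idnObjects.Count) * 100)
    $progressDisplay = $percentComplete / 100

    # only update the notifications on each whole percent so they aren't flooded
    if ($percentComplete -gt $lastPercent){
        $lastPercent = $percentComplete
        Write-InlineProgress -Activity "Getting User Object $($obj.displayName)" -PercentComplete $percentComplete -ProgressCharacter '<' -ProgressFillCharacter '.' -ProgressFill '-'
        $ProgressBar = New-BTProgressBar -Status 'Getting User Objects' -Value $progressDisplay
        New-BurntToastNotification -Text 'IdentityNow Source Import' -ProgressBar $ProgressBar -Silent -UniqueIdentifier 'Get Users' -AppLogo "C:\Users\DarrenJRobinson\Images\sailpoint.png"
    }

    # ... process $obj here ...
}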

The output in VSCode therefore looks like the following. In reality you’d only have one or the other depending on what you were looking to achieve. I’ve done both together to show a side by side comparison.

Toast and Progress Bar.PNG

Summary

Goodbye Write-Progress and Hello Burnt Toast and psInlineProgress

Creating SailPoint IdentityNow Certification Campaigns using PowerShell

Create Sailpoint IdentityNow Certification Campaigns

This is the second post in the Certifications by API series. The last post detailed searching and retrieving campaigns. If you haven’t read that, check that out as it will give you the background for this one.

As per the last post, this one also assumes you are authenticated to IdentityNow as detailed in this post, and that you understand it details accessing Certifications using the non-versioned SailPoint IdentityNow APIs.

With that all said, this post details the creation of IdentityNow Certification Campaigns via the API using PowerShell. The Create Campaigns from IdentityNow Search process goes like this;

  • using the Search API, find the users connected to a Source (or a group of users based on other criteria)
  • iterate through them to identify their Role(s), Entitlement(s) and Source(s) and create a Manager Certification Campaign
  • Specify the period for the Campaign along with options such as notifications and revocation
  • Start the Campaign

Campaign Creation

As stated above the first task is to search and retrieve candidates for the campaign. This uses the Search function as I described in more detail in this post here.

If you have more than the searchLimit allowable via the API you will need to page the results over multiple queries. I’ll detail how to do that in a future post. In the query below I’m searching for users on the Source “Active Directory”.

$searchLimit = '2500'
# Search Identities URI
$searchURI = "https://$($orgName).api.identitynow.com/v2/search/identities?"
# Query for Source that Campaign is for
$query = '@accounts(Active Directory)'
# Search Accounts
$Accounts = Invoke-RestMethod -Method Get -Uri "$($searchURI)limit=$($searchLimit)&query=$($query)" -Headers @{Authorization = "Basic $($encodedAuth)" }
write-host -ForegroundColor Yellow "Search returned $($accounts.Count) account(s)"

With the users returned we need to iterate through each and look at each user’s Entitlements, Roles and Access Profiles. We are creating a Manager campaign for these users covering all of these (you can reduce the scope if required). The PowerShell snippet to do that looks like this.

User Roles Access Profiles and Entitlements

For inclusion in the campaign we need to build a collection each for the Roles, Entitlements and Access Profiles, by iterating through the list obtained from the users above.
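
A minimal sketch of that enumeration looks something like this (the access and type property names on the returned identity objects are assumptions; dump one of the $Accounts entries to JSON to confirm the real shape):

# Collections for the scope of the campaign
$campaignRoles = @()
$campaignEntitlements = @()
$campaignAccessProfiles = @()

foreach ($user in $Accounts){
    # NOTE: 'access' and 'type' are hypothetical property names used for illustration
    foreach ($item in $user.access){
        switch ($item.type) {
            'ROLE'           { if ($campaignRoles.id -notcontains $item.id) { $campaignRoles += $item } }
            'ENTITLEMENT'    { if ($campaignEntitlements.id -notcontains $item.id) { $campaignEntitlements += $item } }
            'ACCESS_PROFILE' { if ($campaignAccessProfiles.id -notcontains $item.id) { $campaignAccessProfiles += $item } }
        }
    }
}

write-host "$($campaignRoles.Count) Role(s), $($campaignEntitlements.Count) Entitlement(s), $($campaignAccessProfiles.Count) Access Profile(s)"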

The summary after enumeration below shows that the users for the source cover 1 Role, 9 Entitlements and 2 Access Profiles.

Summary of Roles Entitlements and Access Profiles

Now that we have most of the information defined for the scope of our campaign, we can specify the additional criteria such as duration, name, description, notification, and revocation. Each of those settings is self-explanatory from its configuration name. We then create the Campaign and Activate it. I put a short delay between creation and activation as I hit race conditions without it. You could lower the delay, but YMMV.

You can also add a Static Reviewer for the campaign as by default the owner will be the account you’re using to perform the creation. Add the following line into the configuration options and specify the ID for the identity.

$campaignOptions.Add("staticReviewerId", $reviewerUser.id )

The ID for an Identity can be obtained via Search. e.g.

# Get Campaign Reviewer in addition to the campaign creator
$usrQuery = '@accounts Rick.Sanchez'
$reviewerUser = Invoke-RestMethod -Method Get -Uri "$($searchURI)limit=$($searchLimit)&query=$($usrQuery)" -Headers @{Authorization = "Basic $($encodedAuth)" }

Putting it all together then looks like this in PowerShell.

The screenshot below shows the campaign being created.

Campaign Created

Looking at the Certifications section in the IdentityNow Portal we can see the newly created Campaign.

Campaign in Portal
Created Campaign shows in Portal

And we can see the two Managers requiring review.

Campaign in Portal 2.PNG
Campaign details in Portal

As the reviewer for Ronnie I can then go and start the review and see the Entitlements that I need to review for the campaign.

Entitlements Certification

Summary

Using PowerShell we can search IdentityNow and find accounts on a Source and create a Certification Campaign for them based on Roles, Entitlements and Access Profiles. We can then also activate the campaign. Happy orchestrating.

Accessing SailPoint IdentityNow Certification Campaigns using PowerShell

Sailpoint IdentityNow Certifications

This is the first post in a series covering SailPoint IdentityNow Certifications. Specifically listing and returning campaigns, creating campaigns and accessing campaign reports. This post will show Listing Active and Completed Campaigns, Searching for a specific Campaign and returning the full details for a Campaign.

The IdentityNow v1 APIs and v2 APIs don’t expose endpoints for IdentityNow Certification Campaigns, so access will be via the non-public, non-versioned Certification APIs. In order to access these APIs you will need to be appropriately authenticated. This post here details getting up to speed with that and is a prerequisite for performing the campaign functions I detail in this post.

Retrieving Certification Campaigns

Now that you’re authorized to IdentityNow we can look to retrieve Certification Campaigns. This can be achieved by calling the /campaign/list API.

https://$($orgName).api.identitynow.com/cc/api/campaign/list

Using PowerShell, all Active Certification Campaigns can be returned by making the following API call and configuration. Note the Content-Type header is removed, as the API will error if you specify a Content-Type.

To retrieve Completed campaigns change $completedOnly = $false to $completedOnly = $true

# List Campaign Base URI
$GetCampaignBaseURI = "https://$($orgName).api.identitynow.com/cc/api/campaign/list"
$IDN.Headers.Remove("Content-Type")
$utime = [int][double]::Parse((Get-Date -UFormat %s))
$completedOnly = $false
$campaigns = 100
# Get Active Campaigns
$existingCampaigns = Invoke-RestMethod -Method Get -Uri "$($GetCampaignBaseURI)?_dc=$($utime)&completedOnly=$($completedOnly)&start=0&limit=$($campaigns)" -WebSession $IDN

Iterating through each result and outputting a summary to the console is then possible as shown below.

List Active SailPoint IdentityNow Campaigns
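
For example, a quick summary per campaign (using the name and id properties that are also used later in this post);

foreach ($campaign in $existingCampaigns.items){
    # name and id are common to all campaigns; add other properties as required
    write-host -ForegroundColor Green "$($campaign.name) ($($campaign.id))"
}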

To retrieve an individual Campaign you need to know the ID of the Campaign. You can then retrieve it directly using the campaign/getCertifications API, e.g.

https://$($orgName).api.identitynow.com/cc/api/campaign/getCertifications?_dc=1542094205212&campaignId=2c9180856708ae38016709f4812345c3

Doing that via PowerShell looks like this;

$IDN.Headers.Remove("Content-Type")
$utime = [int][double]::Parse((Get-Date -UFormat %s))
$campaignID = "2c9180856708ae38016709f4812345c3"
$Certs = Invoke-RestMethod -Method Get -Uri "https://$($orgName).api.identitynow.com/cc/api/campaign/getCertifications?_dc=$($utime)&campaignId=$($campaignID)" -WebSession $IDN
$Certs.items

Searching for Certifications

The new Search Beta does not extend to Certifications. Retrieving a Certification Campaign via the Campaign ID is fine, if you know it (which you won’t).  So here is my workaround for this. Retrieve all Campaigns (Active OR Completed) as detailed above using PowerShell and then use the power of PowerShell (Where-Object) to search and find the Campaign you want.

$completedOnly = $true
$existingCampaigns = Invoke-RestMethod -method Get -uri "$($GetCampaignBaseURI)?_dc=$($utime)&completedOnly=$($completedOnly)&start=0&limit=$($campaigns)" -WebSession $IDN
$myCampaign = $existingCampaigns.items | Select-Object | Where-Object {$_.name -like "*Dec 2018 Campaign*"}

Searching and returning campaigns and then retrieving the full details for a campaign therefore looks like this in PowerShell.

$myCampaignFull = Invoke-RestMethod -Method Get -Uri "https://$($orgName).api.identitynow.com/cc/api/campaign/getCertifications?_dc=$($utime)&campaignId=$($myCampaign.id)" -WebSession $IDN
$myCampaignFull.items
Search SailPoint IdentityNow Campaigns and Retrieve Full Campaign

Summary

Using PowerShell we can get a list of all Completed and Active Certification Campaigns. We can then find the campaign we are looking for information on and retrieve all its details. In the upcoming posts I’ll show how to create a Certification Campaign and also how to retrieve Reports from completed campaigns.

An Azure PowerShell Trigger Function for MAC Address Vendor / Manufacturer Lookup

Recently I started working on another side IoT Project. As part of that I needed to identify the Vendor / Manufacturer of networking equipment. As you are probably aware each network device has a unique MAC Address. A MAC Address looks like this: 60:5b:b4:f9:63:05. The first 24 bits (6 hex characters) identify the vendor / manufacturer.

There are a number of online lookup tools to determine who the vendor is from the MAC address. And some like that one have an API to allow lookup too. If you are only looking up small volumes that is all good, but after that you get into subscription fee costs. I needed more than 1000 per day, but I also had a good idea of what the vendors were likely to be for a lot of my requests. So I rolled my own using an Azure Trigger Function.

Overview

The IEEE standards body maintains a list of the manufacturers assigned the 24 bit identifiers. A full list can be found here which is updated regularly. I downloaded this list and wrote a simple parser that created a PowerShell Object with the Hex, Base16 and Name of each Manufacturer.

I then extracted the manufacturers I expect to need to reference/lookup into a PSObject that is easily exportable and importable (Export-Clixml / Import-Clixml) and use that locally in my application. The full list is too large to keep locally, so I exported it (again using Export-Clixml) and implemented the lookup as an Azure Function that reads in the full list as a PSObject (~1.7 seconds for 25,000+ records). The function can then be queried with either the Hex or Base16 format as per the IEEE list and the vendor name is returned.

Converting the IEEE List to a PowerShell Object

This little script will download the latest version of the OUI list and convert it to a PowerShell Object. The resulting object looks like this:

vendor base16 hex
------ ------ ---
Apple, Inc. F0766F F0-76-6F
Apple, Inc. 40CBC0 40-CB-C0
Apple, Inc. 4098AD 40-98-AD

Update:

  • Line 4 for the local location to output the OUI List too
  • Line 39 for the PSObject file to create
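
If you want a rough idea of the approach, the parsing boils down to something like this (the OUI list URL and the file paths are assumptions; check the IEEE site for the current download location):

# Download the current IEEE OUI list (URL is an assumption)
$ouiUrl = 'http://standards-oui.ieee.org/oui/oui.txt'
$ouiFile = 'C:\Temp\oui.txt'
Invoke-WebRequest -Uri $ouiUrl -OutFile $ouiFile

# Parse the '(hex)' lines into vendor / base16 / hex objects
$vendors = foreach ($line in (Get-Content $ouiFile)) {
    if ($line -match '^\s*([0-9A-F]{2}-[0-9A-F]{2}-[0-9A-F]{2})\s+\(hex\)\s+(.+)$') {
        [PSCustomObject]@{
            vendor = $Matches[2].Trim()
            base16 = ($Matches[1] -replace '-', '')
            hex    = $Matches[1]
        }
    }
}

# Export for re-use locally or in an Azure Function
$vendors | Export-Clixml -Path 'C:\Temp\Vendors.xml'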

If you want to query the file locally using PowerShell you can like this:

$query="64-70-33"
$result = $vendors | Select-Object | Where-Object {$_.hex -like $query}
$result
which will output:

vendor base16 hex
------ ------ ---
Apple, Inc. 647033 64-70-33

If you want to extract all entries associated with a hardware vendor (e.g Apple) you can like this;

$apple = $vendors | Select-Object | Where-Object {$_.vendor -like "Apple*"}

and FYI, Apple have 671 registrations. Yes they make a LOT of equipment.

Azure Function

Here is the Azure Trigger PowerShell Function that takes a JSON object with a query containing the Base16 or Hex values for the 24bit Vendor Manufacturer and returns the Vendor / Manufacturer. e.g

{"query": "0A-00-27"}

Don’t forget to upload the Vendors.xml exported above to your Azure Function (you can drag and drop using Kudu) and update the path in Line 7.
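
A minimal sketch of the function body, assuming the classic (v1) experimental PowerShell Functions pattern with the default $req/$res bindings and a hypothetical Vendors.xml path, looks like this;

# run.ps1 - HTTP Trigger
# Read the incoming JSON request body
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
$query = $requestBody.query

# Load the full OUI list exported earlier (path is an assumption - update for your Function App)
$vendors = Import-Clixml -Path 'D:\home\site\wwwroot\MACVendorLookup\Vendors.xml'

# Match on either the hex (xx-xx-xx) or base16 (xxxxxx) format
$match = $vendors | Where-Object { $_.hex -eq $query -or $_.base16 -eq $query } | Select-Object -First 1

# Return the vendor name
Out-File -Encoding Ascii -FilePath $res -InputObject $match.vendor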

An example PowerShell script to query would be similar to the following. Update $queryURI with the URI to your Azure Function.

$queryURI = "https://FUNCTIONAPP.azurewebsites.net/api/AZUREFUNCTION?code=12345678/uiEx6kse6NujQG0b4OIcjx6B2wHLhZepBD/8Jy6fFawg=="
$query = "0A-00-27"
$body = @{"query" = $query} | ConvertTo-Json
$result = Invoke-RestMethod -Method Post -Uri $queryURI -Body $body
$result

The output will then return the manufacturer name, e.g.

Microsoft Corporation

To lookup all MAC addresses from your local Windows computer the following snippet will do that after updating $queryURI for your Azure Function.

# Query MAC Address
$queryURI = "https://FUNCTIONAPP.azurewebsites.net/api/AZUREFUNCTION?code=12345678/uiEx6kse6NujQG0b4OIcjx6B2wHLhZepBD/8Jy6fFawg=="
$netAdaptors = Get-NetAdapter

foreach ($adaptor in $netAdaptors){
    $mac = $adaptor.MacAddress
    $macV = $mac.Split("-")
    $macLookup = "$($macV[0])$($macV[1])$($macV[2])"
    $body = @{"query" = $macLookup} | ConvertTo-Json
    $result = Invoke-RestMethod -Method Post -Uri $queryURI -Body $body -Headers @{"content-type" = "application/text"}
    Write-Host -ForegroundColor Blue $result
}

Summary

With the power of PowerShell it is quick to take a large amount of information and transform it into a usable collection that can then also be quickly exported and re-imported. It is also quickly searchable, and thanks to Azure Functions supporting PowerShell it’s simple to stand up the collection and query it as required programmatically.

 

Adding Delta Sync Support to the Microsoft Identity Manager PowerShell Management Agent for Workday HR

Recently I posted a sample Microsoft Identity Manager Management Agent for Workday HR. Subsequently I also posted about some updates I made to the WorkdayAPI PowerShell Module to add the ability to specify the time period to return changes for. This post details updating my sample Workday Management Agent to support Delta Synchronisation.

WorkdayAPI PowerShell Module

First up you will need the updated WorkdayAPI PowerShell Module that provides the Get-WorkdayWorkerAdv cmdlet, which can take a time period to return information for. Get the updated WorkdayAPI PowerShell Module from here.

Update the PowerShell Module on the MIM Sync Server. The module by default will be in the  C:\Program Files\WindowsPowerShell\Modules\WorkdayApi folder.

You will need to unblock the new files.

Get-ChildItem 'C:\Program Files\WindowsPowerShell\Modules\WorkdayApi' | Unblock-File
Get-ChildItem 'C:\Program Files\WindowsPowerShell\Modules\WorkdayApi\scripts' | Unblock-File

Updated Schema

In the updated Management Agent I’m also bringing into MIM additional attributes from the other enhancements I made to the PowerShell Module for HireDate, StartDate, EndDate, Supplier and WorkdayActive. The updates to the Schema.ps1 are shown below.

$obj | Add-Member -Type NoteProperty -Name "HireDate|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "StartDate|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "EndDate|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Supplier|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkdayActive|Boolean" -Value $True

The full updated Schema Script is below;

With the Schema Script updated, refresh the Management Agent Schema.

Update Schema

You can then select the new attributes in the Workday MA under Select Attributes.

Select New Attributes.PNG

Then select Ok.

Attributes Selected.PNG

Updated Import Script

The Import Script has a number of changes to handle creating and updating a WaterMark File that is used to store the date stamp of the last run. Also updated in the Import Script is the change to use the Get-WorkdayWorkerAdv cmdlet over the Get-WorkdayWorker cmdlet so that a time period can be specified, and to retrieve the additional attributes we just added to the schema.

Update:

  • Line 11 for the path and name of the Watermark File you wish to use
  • Line 31 for the URI of your Workday Tenant
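
A minimal sketch of the watermark handling, assuming the Granfeldt PSMA convention of an $OperationType parameter and a hypothetical watermark path, looks like this;

$watermarkFile = 'C:\PSMA\Workday\watermark.txt'   # path is an assumption
$runStart = (Get-Date).ToString('o')

$deltaFrom = $null
if ($OperationType -eq 'Delta' -and (Test-Path $watermarkFile)) {
    # Only retrieve workers changed since the last successful run
    $deltaFrom = Get-Content $watermarkFile
}

# ... call Get-WorkdayWorkerAdv here, passing $deltaFrom when it has a value
#     (the exact parameter name for the time period is an assumption)

# Persist this run's start time as the watermark for the next delta cycle
$runStart | Out-File -FilePath $watermarkFile -Force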

Executing the Management Agent using a Delta Import Delta Sync Run Profile

After creating a Delta Import Delta Sync Run Profile we can now run a Delta Sync. The following graphic is from after seeding the WaterMark file (with the last run time in a format like 2018-10-29T22:09:08.3628953+00:00). Without the WaterMark file present, the MA performs a Full Import by default as it has no watermark to base the import time period on.

The changed records in Workday HR are then identified and those records obtained, imported and synchronised via the Management Agent.

Delta Sync.PNG

Summary

Using Delta Synchronisation functionality from Workday HR allows for much quicker synchronisation from Workday HR to Microsoft Identity Manager.

Updated: Azure AD B2B Guest Invitations Microsoft Identity Manager Management Agent

In August I posted this that detailed Automating Azure AD B2B Guest Invitations using Microsoft Identity Manager. More recently Microsoft updated the Microsoft Graph to include additional information about Azure AD B2B Guest users and I wrote this that creates HTML Reports based off these new attributes.

That information is also handy when managing the lifecycle of Azure AD B2B Users. As we do that using Microsoft Identity Manager, I’ve updated my Azure AD B2B Guest Invitation Management Agent for these attributes so they can be used in the lifecycle logic.

Updated Schema

I’ve updated the Schema script to include three new attributes (creationType, userState and userStateChangedOn), shown below in an extract from the Microsoft Graph.

odata.type : Microsoft.DirectoryServices.User
objectType : User
objectId : 38154c4c-a539-4920-a656-b5f8413768b5
deletionTimestamp : 
accountEnabled : True
creationType : Invitation
displayName : Rick Sanchez
givenName : Rick
mail : Rick.Sanchez@customer.com.au
mailNickname : Rick.Sanchez_customer.com.au#EXT#
otherMails : {Rick.Sanchez@customer.com.au}
proxyAddresses : {SMTP:Rick.Sanchez@customer.com.au}
refreshTokensValidFromDateTime : 2018-08-26T02:05:36Z
showInAddressList : False
surname : Sanchez
userPrincipalName : Rick.Sanchez_customer.com.au#EXT#@corporationone.onmicrosoft.com
userState : PendingAcceptance
userStateChangedOn : 2018-08-26T02:05:36Z
userType : Guest

Each is a String attribute and I’ve named them;

  • B2BCreationType
  • B2BUpdatedDateTime
  • B2BExternalUserState

Here is the full updated Schema.ps1 Script.

Updated Import Script

The Import Script requires the following changes to bring in the B2B User State attributes.

 # B2B External User State for B2B Users from other AAD's 
if ($user.creationType) {$obj.Add("B2BCreationType", $user.creationType)} 
[string]$B2BUpdatedDateTime = $null 
if ($user.userStateChangedOn) {$B2BUpdatedDateTime = get-date($user.userStateChangedOn); $obj.Add("B2BUpdatedDateTime", $B2BUpdatedDateTime)} 
if ($user.userState) {$obj.Add("B2BExternalUserState", $user.userState)}

The full script with these additions is below. As per this post, make the following updates;

  • Change line 10 for your file path
  • Change line 24 for the version of an AzureAD or AzureADPreview PowerShell Module that you have installed on the MIM Sync Server so that the AuthN Helper Lib can be used. Note that if you are using a recent version you will also need to change the AuthN calls, as the modules have changed. See this post here for details.
  • Change line 27 for your tenant name
  • Change line 47/48 for a sync watermark file
  • The Import script also contains an attribute from the MA Schema named AADGuestUser that is a boolean attribute. I create the corresponding attribute in the MetaVerse and MIM Service Schemas for the Person/User objectClasses. This is used to determine when a Guest has been successfully created so their naming attributes can then be updated (using a second synchronisation rule).

Updating the Management Agent

With the updated Schema.ps1 and Import.ps1 scripts in place on the Synchronisation Server, using the Microsoft Identity Manager Synchronisation Service Manager right-click on the B2B Invitation PSMA and select Refresh Schema.

Refresh B2B MA Schema.PNG

Select the Properties of the MA and choose Select Attributes.  Select the new Attributes.

Select New Attributes

Select Ok.

Select New Attributes - Selected

With the Schema updated and the Attributes selected a Stage/Full Sync can be performed. We now see the External User State, User Creation Type and External User Updated DateTime.

Updates with B2B External State.PNG

Summary

With a change to the Schema and Import B2B Invitation PSMA scripts we can now leverage the new B2B Attributes from the Microsoft Graph for use in our lifecycle management logic.

Creating SailPoint IdentityNow Source Configuration Backups and HTML Reports with PowerShell

In this post from earlier in the week I detailed leveraging the SailPoint IdentityNow APIs to retrieve IdentityNow Sources and their configuration. This post takes that a little further, backing up the configuration and also creating a friendly HTML Report with each Source’s Configuration and Schema. The dynamically created HTML Report covers all Sources in an IdentityNow Tenant Org and looks like the image below.

Sample Report.PNG

After selecting a Source you can then expand a report section for the Source Details and another for the Schema. Each Source, Source Details and Source Schema section is a collapsible DIV toggled by its link. A snippet of a Source Details output for a Generic Source looks like the image below.

Source Details.PNG

A snippet of a Source Schema output for a Generic Source looks like the image below.

Schema Example.PNG

The Script

This script assumes you are able to access the IdentityNow APIs as detailed in this post here. You will need to use that process to access the Sources APIs and have the necessary JWT Access Token to execute these API requests.

The report features an image. Here is the one I created. Download it and put it in the root of the output folder where the reports will be created.

SailPoint IdentityNow 240px.png

Make the following updates to the script:

  • Line 10 for the path to the Image file you saved from above
  • Line 17 for the base output path (sub directories are created for the date/time of each execution) for the Report and Configuration Backups
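
The backup portion of the script boils down to something like this minimal sketch (the output path is an assumption, and $orgName / $IDN come from the authentication process in the related post):

# Dated output folder for this execution
$outputPath = "C:\IDN\Backups\$(Get-Date -Format 'yyyyMMdd-HHmmss')"
New-Item -ItemType Directory -Path $outputPath | Out-Null

# Get all Sources for the Org
$IDNSources = Invoke-RestMethod -Method Get -Uri "https://$($orgName).api.identitynow.com/cc/api/source/list" -WebSession $IDN

foreach ($idnSource in $IDNSources) {
    # Source details
    $detail = Invoke-RestMethod -Method Get -Uri "https://$($orgName).api.identitynow.com/cc/api/source/get/$($idnSource.id)" -WebSession $IDN
    $detail | Export-Clixml -Path "$outputPath\$($idnSource.name)-Details.xml"

    # Source account schema
    $schema = Invoke-RestMethod -Method Get -Uri "https://$($orgName).api.identitynow.com/cc/api/source/getAccountSchema/$($idnSource.id)" -WebSession $IDN
    $schema | Export-Clixml -Path "$outputPath\$($idnSource.name)-Schema.xml"
}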

The Output

Following execution of the script a sub-directory is created under your directory path, and in it you will find the HTML Report along with two files for each Source: an XML export of the Source Details and one of the Source Schema. If you need to inspect a configuration that has been exported, you can use Import-Clixml -Path "path to the exported xml file" to import it back into PowerShell and inspect it.

Example Output Files.PNG

Summary

Put the execution of this script on a schedule whilst you are in the development/configuration phase of your IdentityNow deployment and you will get automated configuration reports and backups that can be reviewed if you need to roll-back or just see what changes have been made over time.

Managing SailPoint IdentityNow Sources via the API with PowerShell

Back again with another post in my series detailing accessing SailPoint IdentityNow via the API using the unpublished and undocumented APIs. Previous posts detail;

This post also assumes you are able to access the IdentityNow APIs as detailed in this post here. You will need to use that process to access the Sources APIs. You will also need to update the Headers for “Content-Type” for the Get API calls and again for the Post API call. Add this line to your script to allow the query and return of Source Details

$Global:IDN.Headers.Remove("Content-Type")

This post details:

  • Getting a List of Sources
  • Getting the Details of a Source
  • Getting the Schema of a Source
  • Updating the Details of a Source

Getting a List of Sources

https://$($orgName).api.identitynow.com/cc/api/source/list

The API call shown above will return all Sources configured in the queried IdentityNow Tenant. For each Source a limited set of configuration information is returned. Below is an example for a Delimited File Source Type.

id : 36666
version : 2
name : Privileged Access Management
description : Cyberark PAM
owner :
lastUpdated : 2018-10-05T00:39:33Z
scriptName : delimitedfile
definitionName : Delimited File
appCount : 0
userCount : 0
sourceConnected : False
sourceConnectorName : Delimited File
supportsEntitlementAggregation : true
externalId : 2c918086663fbbc0016641aa51041603
icon : https://files.accessiq.sailpoint.com/modules/builds/static-assets/perpetual/identitynow/icons/2.0/source/
health : @{hostname=564c355e916f; lastSeen=1538699972555; org=orgName; healthy=True; lastChanged=1538699972555;
isAuthoritative=false; id=36777; type=C:173-delimited-file; status=SOURCE_STATE_UNCHECKED_SOURCE_NO_ACCOUNTS;
since=1494370939}
sourceType : DELIMITED_FILE
useForAuthentication : False
useForAccounts : False
useForProvisioning : False
useForPasswordManagement : False
iqServiceDownloadUrl : https://files.accessiq.sailpoint.com/integrations/iqservice/IQService.zip

PowerShell Example

The following will return the list of Sources in an IdentityNow Tenant where $orgName is the Organisation Name for your IdentityNow Tenant.

$IDNSources = Invoke-RestMethod -Method Get -Uri "https://$($orgName).api.identitynow.com/cc/api/source/list" -WebSession $IDN
write-host -ForegroundColor Green "$($IDNSources.Count) Sources found"

Getting the Details of a Source

https://$($orgName).api.identitynow.com/cc/api/source/get/$($sourceID)

The API call shown above will return all the details for the specified Source. Below is an example of the full details for the same Delimited File Source above.

id : 36666
version : 3
name : Privileged Access Management
description : Cyberark PAM 
owner : @{id=1084412; name=IDN Admin}
lastUpdated : 2018-10-22T21:29:12Z
scriptName : delimitedfile
definitionName : Delimited File
appCount : 0
userCount : 0
sourceConnected : False
sourceConnectorName : Delimited File
supportsEntitlementAggregation : true
externalId : 2c918086663fbbc0016641aa51041603
icon : https://files.accessiq.sailpoint.com/modules/builds/static-assets/perpetual/identitynow/icons/2.0/source/
health : @{hostname=564c355e916f; lastSeen=1538699972555; org=orgName; healthy=True; lastChanged=1538699972555;
isAuthoritative=false; id=36777; type=C:173-delimited-file; status=SOURCE_STATE_UNCHECKED_SOURCE_NO_ACCOUNTS;
since=1547135418}
sourceType : DELIMITED_FILE
useForAuthentication : False
useForAccounts : False
useForProvisioning : False
useForPasswordManagement : False
iqServiceDownloadUrl : https://files.accessiq.sailpoint.com/integrations/iqservice/IQService.zip
entitlementsCount : 0
accountsCount : 0
connector_featuresString : DIRECT_PERMISSIONS, NO_RANDOM_ACCESS, DISCOVER_SCHEMA
hasValidAccountProfile : False
correlationConfig : @{attributeAssignments=; id=; name=}
sourceConfigFrom : Mantis Config: Cloud Connector
isAuthoritative : False
accessProfilesCount : 0
connector_delimiter : ,
connector_commentCharacter : #
connector_numberOfLinesToSkip :
connector_filterString :
cloudDisplayName : Privileged Access Management
cloudExternalId : 36777
cloudOriginalApplicationType : Delimited File
deleteThresholdPercentage : 10
file : /var/lib/identityiq_workspace/f8001b46-4fab-4e0b-ad15-18f53dc1507c-accounts.csv
filetransport : local
filterEmptyRecords : True
formPath :
group.columnNames : {id, name, displayName, created...}
group.delimiter : ,
group.file : /var/lib/identityiq_workspace/156524b6-9513-404c-8063-40275edfa575-groups.csv
group.filetransport : local
group.filterEmptyRecords : True
group.hasHeader : True
group.host : local
group.indexColumn : id
group.mergeColumns : {entitlements, groups, permissions}
group.mergeRows : True
group.partitionMode : disabled
hasHeader : True
host : local
indexColumn : id
managerCorrelationFilter :
mergeColumns : {groups}
mergeRows : True
partitionMode : disabled
templateApplication : DelimitedFile Template

PowerShell Example

The following will return the details for all sources in an IdentityNow Tenant where $orgName is the Organisation Name for your IdentityNow Tenant and IDNSources is the collection of Sources returned from the List Sources API call above.

# Source Details Base URI (matches the API endpoint above)
$sourceDetailsURI = "https://$($orgName).api.identitynow.com/cc/api/source/get"

foreach ($idnSource in $IDNSources){
    # Get Source Details
    $sourceInfo = Invoke-RestMethod -Method Get -Uri "$($sourceDetailsURI)/$($idnSource.id)" -WebSession $IDN
    $sourceInfo
}

Getting the Schema of a Source

https://$($orgName).api.identitynow.com/cc/api/source/getAccountSchema/$($sourceID)

The API call shown above will return the Schema for the specified Source. Below is an example of the Schema for the same Delimited File Source above.

attributes : {@{description=The unique ID for the account; displayAttribute=False; entitlement=False; identityAttribute=True; managed=False;
    minable=False; multi=False; name=id; type=string}, @{description=The name of the account - typical username etc;
    displayAttribute=True; entitlement=False; identityAttribute=False; managed=False; minable=False; multi=False; name=name;
    type=string}, @{description=The first or given name of the user associated with the account; displayAttribute=False;
    entitlement=False; identityAttribute=False; managed=False; minable=False; multi=False; name=givenName; type=string},
    @{description=The last, family name, or surname of the user associated with the account; displayAttribute=False; entitlement=False;
    identityAttribute=False; managed=False; minable=False; multi=False; name=familyName; type=string}...}
displayAttribute : name
groupAttribute : groups
identityAttribute : id
nativeObjectType : User
objectType : account

PowerShell Example

The following will return the schema for all sources in an IdentityNow Tenant where $orgName is the Organisation Name for your IdentityNow Tenant and IDNSources is the collection of Sources returned from the List Sources API call above.

# Source Account Schema Base URI (matches the API endpoint above)
$sourceSchemaURI = "https://$($orgName).api.identitynow.com/cc/api/source/getAccountSchema"

foreach ($idnSource in $IDNSources){
   # Get Source Schema Details
   $sourceSchema = Invoke-RestMethod -Method Get -Uri "$($sourceSchemaURI)/$($idnSource.id)" -WebSession $IDN
   $sourceSchema
}

Updating the Details for a Source

https://$($orgName).api.identitynow.com/cc/api/source/update/$($sourceID)

The API endpoint above is called to update the Details for a Source. In conjunction with calling the API endpoint a Body needs to be provided to update the Source Details. Below is an example of updating the Owner and the Description of a Source.

Notes:

  • The Content-Type needs to be updated to “application/x-www-form-urlencoded; charset=UTF-8”
  • You only need to specify the attributes you wish to change and append them to each other with the separator ‘&’
  • The body with the update(s) needs to be URLEncoded. PowerShell Invoke-RestMethod handles that

$sourceUpdateURI = "https://$($orgName).api.identitynow.com/cc/api/source/update"
$Global:IDN.Headers.Add('Content-Type', "application/x-www-form-urlencoded; charset=UTF-8")
$sourceID = $idnSource.id
$sourceDescription = "CyberArk"
$sourceOwnerID = "1089912"
$sourceDetailsBody = "description=$($sourceDescription)&ownerId=$($sourceOwnerID)"
$updateSource = Invoke-RestMethod -Method Post -Uri "$($sourceUpdateURI)/$($sourceID)" -Body $sourceDetailsBody -WebSession $Global:IDN

If you assign the result of the POST request to a variable, you get the updated object returned following a successful update. A snippet of the response is below. The version increments with each update.

id : 36666
version : 7
name : Privileged Access Management
description : CyberArk
owner : @{id=1089123; name=Bob Smith}
lastUpdated : 2018-10-23T01:15:26Z
scriptName : delimitedfile
definitionName : Delimited File

Summary

In this post I showed using PowerShell to access the Sources APIs to List Sources, Get full details for a Source, Get the Schema of a Source and Update the Details for a Source. In my next post I’ll show generating HTML Reports for the configuration of Sources.

Here is the snippet of the calls as listed in this post. As per the introduction it assumes you are authenticated and re-using your WebSession.