Retrieving SailPoint IdentityNow Certification Reports using PowerShell

This is the third and probably last post in the Certifications by API series. The first post detailed retrieving and searching campaigns, the second post detailed creating and starting campaigns. If you haven’t read those, check them out as they will give you the background for this one.

As with the previous two posts, this one assumes you are already authenticated to IdentityNow as detailed in this post, and that you understand we are accessing Certifications using the non-versioned SailPoint IdentityNow APIs.

With that all said, this post details obtaining Certification Reports from Completed Campaigns. The process goes like this;

  • Search and return Completed Campaigns
  • Identify campaigns completed within a time period
  • Retrieve and export the CSV completion reports

Note: The first time you access the getReports API for a Campaign the Reports are automatically generated. You should include additional logic to catch the report request failing, wait a minute or so for the report to be generated, and then retry the retrieval.

Reports API

As mentioned above the getReports API is used to get a list of reports from campaigns.

https://$($orgName).api.identitynow.com/cc/api/campaign/getReports

To get an individual report the report/get API call is used along with the ID of the report to retrieve. The base URI is;

https://$($orgName).api.identitynow.com/cc/api/report/get
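A minimal sketch of calling both endpoints (using the $IDN web session from the earlier posts) is below. The query string parameters (campaignId on getReports, and the report ID and format on report/get) are assumptions based on the pattern of the other non-versioned calls in this series, so verify them against your tenant.

# Sketch only: parameter names on getReports and report/get are assumptions - verify against your tenant
$utime = [int][double]::Parse((Get-Date -UFormat %s))
$campaignID = "2c9180856708ae38016709f4812345c3"

# List the reports for a completed campaign (the reports are generated on first access)
$reportsURI = "https://$($orgName).api.identitynow.com/cc/api/campaign/getReports"
$reports = Invoke-RestMethod -Method Get -Uri "$($reportsURI)?_dc=$($utime)&campaignId=$($campaignID)" -WebSession $IDN

# Retrieve an individual report in CSV format using its ID and write it out
$reportURI = "https://$($orgName).api.identitynow.com/cc/api/report/get"
$csvReport = Invoke-RestMethod -Method Get -Uri "$($reportURI)/$($reports[0].id)?format=csv" -WebSession $IDN
$csvReport | Out-File -FilePath "C:\Reports\CampaignCompletionReport.csv"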

Exporting Certification Campaign Completion Reports

The following PowerShell script will enumerate completed campaigns, compare the completion time to the current time and the time window to export (last 7 days in my example below) and export the CSV version of the reports to a directory on the host running the script.

Update:

  • Line 11 for the number of days previous to today's date to retrieve reports for
  • Line 34 for the output CSV path. The exports are named based on the Description of the Campaign, so you’ll need to modify that if all your descriptions are the same

The output is the CSV file, which can then be manipulated in PowerShell or Excel (via Data => Import CSV).

Certification Report in Excel

Summary

Using the Certifications API we can query for Completed Certification Campaigns, retrieve their completion reports and export them to the file system. Likewise, with a few simple changes you can also export the other report types.

Azure VM Docker CreateContainer Error (0xc0370102)

A real quick post as I’ve just managed to figure out what was causing this error and why my Docker container wasn’t being created.

I’d just deployed a brand new Windows 10 VM in Azure Resource Manager and installed Docker. But when I attempted to create a new container I was getting the error below in both the Docker console and in the Application Event Log.

---snippet start---
Handler for POST /v1.38/containers/6f6e2c37142a63db7a19e2533a97b8affad52b735d7cfcbbd2b69ce87ef4c36c/start
returned error: container 6f6e2c37142a63db7a19e2533a97b8affad52b735d7cfcbbd2b69ce87ef4c36c 
encountered an error during CreateContainer: failure in a Windows system 
call: The virtual machine could not be started because a required feature 
is not installed. (0xc0370102) extra info:
 {"SystemType":"Container","Name":"6f6e2c37142a63db7a19e2533a97b8affad52b735
d7cfcbbd2b69ce87ef4c36c","Owner":"docker","IgnoreFlushesDuringBoot":true,
"LayerFolderPath":"C:\\ProgramData\\Docker\\windowsfilter\\6f6e2c37142a63db7
a19e2533a97b8affad52b735d7cfcbbd2b69ce87ef4c36c","Layers":[{"ID":"4457d5ac-
9bda-5212-b74c-
---snippet end---

The key words out of all that are:

CreateContainer: failure in a Windows system call: The virtual machine could not be started because a required feature is not installed. (0xc0370102)

…. but search as you might, you won't find anything on it. Through my troubleshooting process I tried creating a Hyper-V VM, which I could, but it wouldn't start, with an error that a required Hyper-V component was not installed/running.

Thinking through what I was trying to do and the errors from both Hyper-V and Docker, I realized it needed to be something to do with Nested Virtualization.

Light-bulb Moment

Support for Nested Virtualization in Azure VM’s is dependent on the generation of the underlying host. When Microsoft started supporting the Intel Broadwell processors, they also enabled Nested Virtualization on some VM Sizes.

Changing my VM Size from a DS_v2 to a DS_v3 was all that I had to do. My Docker Containers are now running on a Windows 10 Virtual Machine in Azure Resource Manager.
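If you'd rather do the resize from PowerShell than the Portal, a minimal sketch using the AzureRM module is below (the resource group, VM name and target size are placeholders).

# List the sizes available in the region (the v3 sizes support Nested Virtualization)
Get-AzureRmVMSize -Location "australiaeast" | Where-Object { $_.Name -like "*_v3" }

# Resize the VM - if the new size isn't available on the current cluster you may need to
# Stop-AzureRmVM (deallocate) first
$vm = Get-AzureRmVM -ResourceGroupName "DockerRG" -Name "Win10Docker"
$vm.HardwareProfile.VmSize = "Standard_D4s_v3"
Update-AzureRmVM -VM $vm -ResourceGroupName "DockerRG"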

FYI, Azure Dv3 and Ev3 VMs support Nested Virtualization, and I'm sure there are now others. Hopefully I've saved someone else the two hours I've just lost.

PowerShell Progress Notifications that work in Visual Studio Code on Window 10

Earlier this year I made the switch from PowerShell ISE to Visual Studio Code (VSCode) and everything has been going just swimmingly. Well, until I was modifying some old scripts that perform some long running processes and to which I'd previously added Write-Progress statements to provide feedback as to how the script was going. Long story short, Write-Progress doesn't work in VSCode. I'm definitely not the first to notice this as there is an open issue on GitHub for it, but seeing as it's been open for 2 1/2 years I figured it probably wasn't going to be resolved soon.

So I went looking for alternatives. I’ve found two that I’m happy with and will be using moving forward. One is a text-based console output progress bar and the other uses Windows 10/Windows Server Notification Centre. This post explores both and how I’m using them.

psInlineProgress

psInlineProgress is authored by Øyvind Kallstad and has been around for a few years now, so does require a small tweak to get it to work with VSCode. Nothing major, just a change for the Console (details further below). The progress bar itself is similar to the Write-Progress one in PowerShell. Here is an example;

inLineProgress Bar Notification

You can get it from here if you need a manual install, otherwise it’s quick to install via Install-Module

Install-Module -Name psInlineProgress

To allow the progress bar to display in VSCode update the psInlineProgress.psd1 file to change Line 36 as shown below. You should be able to find it in C:\Program Files\WindowsPowerShell\Modules\psInlineProgress\1.1

#PowerShellHostName = 'ConsoleHost'
PowerShellHostName = 'Visual Studio Code Host'

Outputting to psInlineProgress

Updating the psInlineProgress bar is simply a case of giving the Progress Bar the text you want displayed, the progress as a percentage, and the characters to use for the progress position, the progressed portion and the un-progressed portion. In my example above I'm using the < symbol for the progress position, . for the progressed and - for the un-progressed. Sort of like a Pac-Man consuming dashes and outputting periods.

The PowerShell line looks like this, where $percentComplete is an integer between 0 and 100. E.g

Write-InlineProgress -Activity "Getting User Object $($obj.displayName)" -PercentComplete $percentComplete -ProgressCharacter '<' -ProgressFillCharacter '.' -ProgressFill '-'

Burnt Toast

Burnt Toast is a newer module written by Josh King that takes advantage of new features in Windows 10. The progress notification isn't a progress bar, but a Toast Notification via the Notification Centre. Here is an example;

Burnt Toast PowerShell Notification

You can get it from here if you need a manual install, otherwise it’s quick to install via Install-Module

Install-Module -Name BurntToast

Outputting to Burnt Toast

Updating Burnt Toast is similar to psInlineProgress, whereby you pass the text you want displayed, the progress completed as a fraction between 0 and 1, and optionally a graphic. E.g

$ProgressBar = New-BTProgressBar -Status 'Getting User Objects' -Value $progressDisplay
New-BurntToastNotification -Text 'IdentityNow Source Import' -ProgressBar $ProgressBar -Silent -UniqueIdentifier 'Get Users' -AppLogo "C:\Users\DarrenJRobinson\Images\sailpoint.png"

As Burnt Toast is a Windows Notification, after the toast notification disappears it can still be found in the Notification Centre.

Windows Notification Centre - Burnt Toast

Other Burnt Toast Features

Burnt Toast can do much more, such as Alarms, Sounds and Reminders. See examples for those and more on Github here.

Working Example

Here is a working sample from a project where I’m processing thousands of identities. The $idnObjects is a PowerShell Object Collection of all the identities.

The following script snippet comes from the processing loop that processes the identities. $i is incremented on each loop (e.g $i++) and then used to calculate the percentage and fraction completed.

psInlineProgress and Burnt Toast are only updated each whole % of progress (that is calculated in line 9).
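As the full script is specific to my project, here is a minimal sketch of the loop (it assumes $idnObjects is already populated and both modules are installed); your own per-identity processing goes inside the loop.

$i = 0
$total = $idnObjects.Count
$lastPercent = -1

foreach ($obj in $idnObjects) {
    $i++
    # Calculate the percentage (for psInlineProgress) and fraction (for Burnt Toast) completed
    $percentComplete = [math]::Floor(($i / $total) * 100)
    $progressDisplay = $percentComplete / 100

    # Only update the progress outputs on each whole percent of progress
    if ($percentComplete -ne $lastPercent) {
        $lastPercent = $percentComplete
        Write-InlineProgress -Activity "Getting User Object $($obj.displayName)" -PercentComplete $percentComplete -ProgressCharacter '<' -ProgressFillCharacter '.' -ProgressFill '-'
        $ProgressBar = New-BTProgressBar -Status 'Getting User Objects' -Value $progressDisplay
        New-BurntToastNotification -Text 'IdentityNow Source Import' -ProgressBar $ProgressBar -Silent -UniqueIdentifier 'Get Users'
    }

    # ... per-identity processing goes here ...
}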

The output in VSCode therefore looks like the following. In reality you’d only have one or the other depending on what you were looking to achieve. I’ve done both together to show a side by side comparison.

Toast and Progress Bar

Summary

Goodbye Write-Progress and Hello Burnt Toast and psInlineProgress

Creating SailPoint IdentityNow Certification Campaigns using PowerShell


This is the second post in the Certifications by API series. The last post detailed searching and retrieving campaigns. If you haven’t read that, check that out as it will give you the background for this one.

As per the last post, this one also assumes you are authenticated to IdentityNow as detailed in this post, and that you understand we are accessing Certifications using the non-versioned SailPoint IdentityNow APIs.

With that all said, this post details the creation of IdentityNow Certification Campaigns via the API using PowerShell. The Create Campaigns from IdentityNow Search process goes like this;

  • using the Search API, find the users connected to a Source (or a group of users based on other criteria)
  • iterate through them to identify their Role(s), Entitlement(s) and Source(s) and create a Manager Certification Campaign
  • Specify the period for the Campaign along with options such as notifications and revocation
  • Start the Campaign

Campaign Creation

As stated above the first task is to search and retrieve candidates for the campaign. This uses the Search function as I described in more detail in this post here.

If you have more than the searchLimit allowable via the API you will need to page the results over multiple queries. I’ll detail how to do that in a future post. In the query below I’m searching for users on the Source “Active Directory”.

$searchLimit = '2500'
# Search Identities URI
$searchURI = "https://$($orgName).api.identitynow.com/v2/search/identities?"
# Query for Source that Campaign is for
$query = '@accounts(Active Directory)'
# Search Accounts
$Accounts = Invoke-RestMethod -Method Get -Uri "$($searchURI)limit=$($searchLimit)&query=$($query)" -Headers @{Authorization = "Basic $($encodedAuth)" }
write-host -ForegroundColor Yellow "Search returned $($accounts.Count) account(s)"

With the users returned we need to iterate through each and look at the user's Entitlements, Roles and Access Profiles. We are creating a Manager campaign for these users covering all of these (you can reduce the scope if required). The PowerShell snippet to do that looks like this.

User Roles Access Profiles and Entitlements

For inclusion in the campaign we need to build a collection for the Roles, Entitlements and Access Profiles. Using PowerShell to iterate through the list obtained from the users above therefore looks like this:
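A minimal sketch of that enumeration is below. The property names on the identities returned by Search (roles, entitlements, accessProfiles) are assumptions, so inspect one of the $Accounts objects to confirm the names your query returns.

$campaignRoles = @()
$campaignEntitlements = @()
$campaignAccessProfiles = @()

foreach ($user in $Accounts) {
    # Build unique collections of the access items held by the users
    foreach ($role in $user.roles) { if ($campaignRoles -notcontains $role.id) { $campaignRoles += $role.id } }
    foreach ($ent in $user.entitlements) { if ($campaignEntitlements -notcontains $ent.id) { $campaignEntitlements += $ent.id } }
    foreach ($ap in $user.accessProfiles) { if ($campaignAccessProfiles -notcontains $ap.id) { $campaignAccessProfiles += $ap.id } }
}

write-host -ForegroundColor Yellow "Campaign scope: $($campaignRoles.Count) Role(s), $($campaignEntitlements.Count) Entitlement(s), $($campaignAccessProfiles.Count) Access Profile(s)"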

The summary after enumeration below shows the users for the source will cover 1 Role, 9 Entitlements and 2 Access Profiles.

Summary of Roles Entitlements and Access Profiles

Now we have most of the information defined for the scope of our campaign, we can specify the additional criteria and information such as duration, name, description, notification and revocation. Each of those settings is self-explanatory from its configuration setting name. We then create the Campaign and Activate it. I have a short delay after creation before activation as I found race conditions otherwise. You could lower the delay, but YMMV.
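A rough sketch of the creation and activation steps is below. The /cc/api/campaign/create and /cc/api/campaign/activate endpoint paths, the option names and the returned id property are assumptions, so verify them against the full script and your tenant.

# Campaign configuration options - names and values here are indicative only
$campaignOptions = @{}
$campaignOptions.Add("type", "MANAGER")
$campaignOptions.Add("name", "Dec 2018 Active Directory Campaign")
$campaignOptions.Add("description", "Manager review of Active Directory Roles, Entitlements and Access Profiles")
$campaignOptions.Add("deadline", (Get-Date).AddDays(14).ToString("yyyy-MM-dd"))
# Optional static reviewer in addition to the campaign creator (see below)
# $campaignOptions.Add("staticReviewerId", $reviewerUser.id)
$campaignBody = $campaignOptions | ConvertTo-Json

# Create the Campaign
$createCampaignURI = "https://$($orgName).api.identitynow.com/cc/api/campaign/create"
$newCampaign = Invoke-RestMethod -Method Post -Uri $createCampaignURI -Body $campaignBody -WebSession $IDN

# Short delay to avoid a race condition between creation and activation
Start-Sleep -Seconds 10

# Activate (start) the Campaign
$activateCampaignURI = "https://$($orgName).api.identitynow.com/cc/api/campaign/activate"
Invoke-RestMethod -Method Post -Uri $activateCampaignURI -Body (@{"campaignId" = $newCampaign.id} | ConvertTo-Json) -WebSession $IDN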

You can also add a Static Reviewer for the campaign as by default the owner will be the account you’re using to perform the creation. Add the following line into the configuration options and specify the ID for the identity.

$campaignOptions.Add("staticReviewerId", $reviewerUser.id )

The ID for an Identity can be obtained via Search. e.g.

# Get Campaign Reviewer in addition to the campaign creator
$usrQuery = '@accounts Rick.Sanchez'
$reviewerUser = Invoke-RestMethod -Method Get -Uri "$($searchURI)limit=$($searchLimit)&query=$($usrQuery)" -Headers @{Authorization = "Basic $($encodedAuth)" }

Putting it all together then looks like this in PowerShell.

The screenshot below shows the campaign being created.

Campaign Created

Looking at the Certifications section in the IdentityNow Portal we can see the newly created Campaign.

Created Campaign shows in the Portal

And we can see the two Managers requiring review.

Campaign details in Portal

As the reviewer for Ronnie I can then go and start the review and see the Entitlements that I need to review for the campaign.

Entitlements Certification

Summary

Using PowerShell we can search IdentityNow and find accounts on a Source and create a Certification Campaign for them based on Roles, Entitlements and Access Profiles. We can then also activate the campaign. Happy orchestrating.

Accessing SailPoint IdentityNow Certification Campaigns using PowerShell


This is the first post in a series covering SailPoint IdentityNow Certifications. Specifically listing and returning campaigns, creating campaigns and accessing campaign reports. This post will show Listing Active and Completed Campaigns, Searching for a specific Campaign and returning the full details for a Campaign.

The IdentityNow v1 and v2 APIs don't expose endpoints for IdentityNow Certification Campaigns, so access will be via the non-public (non-versioned) Certification APIs. In order to access these APIs you will need to be appropriately authenticated. This post here details getting up to speed with that and is a prerequisite for performing the campaign functions I detail in this post.

Retrieving Certification Campaigns

Now that you’re authorized to IdentityNow we can look to retrieve Certification Campaigns. This can be achieved by calling the /campaign/list API.

https://$($orgName).api.identitynow.com/cc/api/campaign/list

Using PowerShell, all Active Certification Campaigns can be returned by making the following API call and configuration. Note the Content-Type header is removed, as the API will error if a Content-Type is specified.

To retrieve Completed campaigns change $completedOnly = $false to $completedOnly = $true
# List Campaign Base URI
$GetCampaignBaseURI = "https://$($orgName).api.identitynow.com/cc/api/campaign/list"
$IDN.Headers.Remove("Content-Type")
$utime = [int][double]::Parse((Get-Date -UFormat %s))
$completedOnly = $false
$campaigns = 100
# Get Active Campaigns
$existingCampaigns = Invoke-RestMethod -Method Get -Uri "$($GetCampaignBaseURI)?_dc=$($utime)&completedOnly=$($completedOnly)&start=0&limit=$($campaigns)" -WebSession $IDN

Iterating through each result and outputting a summary to the console is then possible as shown below.
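For example, a simple loop over the returned items gives a quick summary (the status and deadline property names are assumptions, so check the objects returned by your tenant).

foreach ($campaign in $existingCampaigns.items) {
    # Output a one line summary per campaign
    write-host -ForegroundColor Green "$($campaign.name) | Status: $($campaign.status) | Deadline: $($campaign.deadline)"
}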

List Active SailPoint IdentityNow Campaigns

To retrieve an individual Campaign you need to know the ID of the Campaign. You can then retrieve it directly using the campaign/getCertifications API  e.g.

https://$($orgName).api.identitynow.com/cc/api/campaign/getCertifications?_dc=1542094205212&campaignId=2c9180856708ae38016709f4812345c3
Doing that via PowerShell looks like this
$IDN.Headers.Remove("Content-Type")
$utime = [int][double]::Parse((Get-Date -UFormat %s))
$campaignID = "2c9180856708ae38016709f4812345c3"
$Certs = Invoke-RestMethod -Method Get -Uri "https://$($orgName).api.identitynow.com/cc/api/campaign/getCertifications?_dc=$($utime)&campaignId=$($campaignID)" -WebSession $IDN
$Certs.items

Searching for Certifications

The new Search Beta does not extend to Certifications. Retrieving a Certification Campaign via the Campaign ID is fine, if you know it (which you won’t).  So here is my workaround for this. Retrieve all Campaigns (Active OR Completed) as detailed above using PowerShell and then use the power of PowerShell (Where-Object) to search and find the Campaign you want.

$completedOnly = $true
$existingCampaigns = Invoke-RestMethod -method Get -uri "$($GetCampaignBaseURI)?_dc=$($utime)&completedOnly=$($completedOnly)&start=0&limit=$($campaigns)" -WebSession $IDN
$myCampaign = $existingCampaigns.items | Select-Object | Where-Object {$_.name -like "*Dec 2018 Campaign*"}

Searching and returning campaigns and then retrieving the full details for a campaign therefore looks like this in PowerShell.

$myCampaignFull = Invoke-RestMethod -Method Get -Uri "https://$($orgName).api.identitynow.com/cc/api/campaign/getCertifications?_dc=$($utime)&campaignId=$($myCampaign.id)" -WebSession $IDN
$myCampaignFull.items
Search SailPoint IdentityNow Campaigns and Retrieve Full Campaign

Summary

Using PowerShell we can get a list of all Completed and Active Certification Campaigns. We can then find the campaign we are looking for information on and retrieve all its details. In the upcoming posts I’ll show how to create a Certification Campaign and also how to retrieve Reports from completed campaigns.

An Azure PowerShell Trigger Function for MAC Address Vendor / Manufacturer Lookup

Recently I started working on another side IoT Project. As part of that I needed to identify the Vendor / Manufacturer of networking equipment. As you are probably aware each network device has a unique MAC Address. A MAC Address looks like this: 60:5b:b4:f9:63:05. The first 24 bits (6 hex characters) identify the vendor / manufacturer.

There are a number of online lookup tools to determine who the vendor is from the MAC address. And some like that one have an API to allow lookup too. If you are only looking up small volumes that is all good, but after that you get into subscription fee costs. I needed more than 1000 per day, but I also had a good idea of what the vendors were likely to be for a lot of my requests. So I rolled my own using an Azure Trigger Function.

Overview

The IEEE standards body maintains a list of the manufacturers assigned the 24 bit identifiers. A full list can be found here which is updated regularly. I downloaded this list and wrote a simple parser that created a PowerShell Object with the Hex, Base16 and Name of each Manufacturer.

I then extracted the manufacturers I expect to need to reference/lookup into a PSObject that is easily exportable and importable (Export-Clixml / Import-Clixml) and use that locally in my application. The full list is too large to keep locally, so I exported the full list (again using Export-Clixml) and implemented the lookup as an Azure Function that reads in the full list as a PSObject (which takes ~1.7 seconds for 25,000+ records). The Function can then be queried with either the Hex or Base16 value as per the format in the IEEE list, and the vendor name is returned.

Converting the IEEE List to a PowerShell Object

This little script will download the latest version of the OUI list and convert to a PowerShell Object.  The resulting object looks like this:

vendor      base16 hex
------      ------ ---
Apple, Inc. F0766F F0-76-6F
Apple, Inc. 40CBC0 40-CB-C0
Apple, Inc. 4098AD 40-98-AD

Update:

  • Line 4 for the local location to output the OUI List to
  • Line 39 for the PSObject file to create
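A minimal sketch of the download-and-parse approach is below; the IEEE URL and output paths are placeholders, so adjust them for your environment.

$ouiURL = "http://standards-oui.ieee.org/oui/oui.txt"
$ouiFile = "C:\temp\oui.txt"
Invoke-WebRequest -Uri $ouiURL -OutFile $ouiFile

$vendors = @()
foreach ($line in (Get-Content $ouiFile)) {
    # base 16 lines look like: 'F0766F     (base 16)    Apple, Inc.'
    if ($line -match '^([0-9A-F]{6})\s+\(base 16\)\s+(.+)$') {
        $base16 = $matches[1]
        # Derive the hex (xx-xx-xx) form from the base16 value
        $hex = "$($base16.Substring(0,2))-$($base16.Substring(2,2))-$($base16.Substring(4,2))"
        $vendors += [PSCustomObject]@{ vendor = $matches[2].Trim(); base16 = $base16; hex = $hex }
    }
}

$vendors | Export-Clixml -Path "C:\temp\Vendors.xml"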

If you want to query the file locally using PowerShell you can do so like this:

$query="64-70-33"
$result = $vendors | Select-Object | Where-Object {$_.hex -like $query}
$result
which will output
vendor base16 hex
------ ------ ---
Apple, Inc. 647033 64-70-33

If you want to extract all entries associated with a hardware vendor (e.g Apple) you can do so like this;

$apple = $vendors | Select-Object | Where-Object {$_.vendor -like "Apple*"}

and FYI, Apple have 671 registrations. Yes they make a LOT of equipment.

Azure Function

Here is the Azure Trigger PowerShell Function that takes a JSON object with a query containing the Base16 or Hex values for the 24bit Vendor Manufacturer and returns the Vendor / Manufacturer. e.g

{"query": "0A-00-27"}

Don’t forget to upload the Vendors.xml exported above to your Azure Function (you can drag and drop using Kudu) and update the path in Line 7.
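A minimal sketch of such a run.ps1 (the classic v1 PowerShell HTTP trigger, where $req and $res are the default input/output binding names) is below. The Vendors.xml path is a placeholder; update it to wherever you upload the file.

# Read the incoming JSON request body, e.g. {"query": "0A-00-27"}
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
$query = $requestBody.query

# Load the full OUI list exported earlier (~25,000 records)
$vendors = Import-Clixml -Path "D:\home\site\wwwroot\MACVendorLookup\Vendors.xml"

# Match on either the hex (xx-xx-xx) or base16 (xxxxxx) form and return the vendor name
$result = $vendors | Where-Object { $_.hex -eq $query -or $_.base16 -eq $query } | Select-Object -First 1
Out-File -Encoding Ascii -FilePath $res -inputObject $result.vendor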

An example PowerShell script to query would be similar to the following. Update $queryURI with the URI to your Azure Function.

$queryURI = "https://FUNCTIONAPP.azurewebsites.net/api/AZUREFUNCTION?code=12345678/uiEx6kse6NujQG0b4OIcjx6B2wHLhZepBD/8Jy6fFawg=="
$query = "0A-00-27"
$body = @{"query" = $query} | ConvertTo-Json
$result = Invoke-RestMethod -Method Post -Uri $queryURI -Body $body
$result
The output will then return the manufacturer name. e.g
Microsoft Corporation

To lookup all MAC addresses from your local Windows computer the following snippet will do that, after updating $queryURI for your Azure Function.

# Query MAC Address
$queryURI = "https://FUNCTIONAPP.azurewebsites.net/api/AZUREFUNCTION?code=12345678/uiEx6kse6NujQG0b4OIcjx6B2wHLhZepBD/8Jy6fFawg=="
$netAdaptors = Get-NetAdapter

foreach ($adaptor in $netAdaptors){
    $mac=$adaptor.MacAddress
    $macV=$mac.Split("-")
    $macLookup="$($macV[0])$($macV[1])$($macV[2])"
    $body=@{"query"=$macLookup} |ConvertTo-Json
    $result = Invoke-RestMethod -Method Post -Uri $queryURI -Body $body -Headers @{"content-type"="application/text"}
    Write-Host -ForegroundColor Blue $result
}

Summary

With the power of PowerShell it is quick to take a large amount of information and transform it into a usable collection that can then also be quickly exported and re-imported. It is also quickly searchable, and thanks to Azure Functions supporting PowerShell it's simple to stand up the collection and query it as required programmatically.

 

Adding Delta Sync Support to the Microsoft Identity Manager PowerShell Management Agent for Workday HR

Recently I posted a sample Microsoft Identity Manager Management Agent for Workday HR. Subsequently I also posted about some updates I made to the WorkdayAPI PowerShell Module to enable functionality to specify the time period to return changes for. This post details updating  my sample Workday Management Agent to support Delta Synchronisation.

WorkdayAPI PowerShell Module

First up you will need the updated WorkdayAPI PowerShell Module that provides the Get-WorkdayWorkerAdv cmdlet and can take a time period to return information for. Get the updated WorkdayAPI PowerShell Module from here

Update the PowerShell Module on the MIM Sync Server. The module by default will be in the  C:\Program Files\WindowsPowerShell\Modules\WorkdayApi folder.

You will need to unblock the new files.

Get-ChildItem 'C:\Program Files\WindowsPowerShell\Modules\WorkdayApi' | Unblock-File
Get-ChildItem 'C:\Program Files\WindowsPowerShell\Modules\WorkdayApi\scripts' | Unblock-File

Updated Schema

In the updated Management Agent I’m also bringing into MIM additional attributes from the other enhancements I made to the PowerShell Module for HireDate, StartDate, EndDate, Supplier and WorkdayActive. The updates to the Schema.ps1 are shown below.

$obj | Add-Member -Type NoteProperty -Name "HireDate|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "StartDate|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "EndDate|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Supplier|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkdayActive|Boolean" -Value $True

The full updated Schema Script is below;
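A minimal sketch of what the Schema.ps1 looks like with the additions in place is below. The anchor and the pre-existing attribute names shown are indicative only, so keep the ones from your existing Schema.ps1 and just add the new lines.

$obj = New-Object -Type PSCustomObject
# Anchor and existing attributes (names here are placeholders, keep your existing ones)
$obj | Add-Member -Type NoteProperty -Name "Anchor-WorkerID|string" -Value "WorkerID"
$obj | Add-Member -Type NoteProperty -Name "objectClass|string" -Value "user"
$obj | Add-Member -Type NoteProperty -Name "GivenName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Surname|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkEmail|string" -Value "string"
# New attributes added in this update
$obj | Add-Member -Type NoteProperty -Name "HireDate|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "StartDate|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "EndDate|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Supplier|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkdayActive|Boolean" -Value $True
$obj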

With the Schema Script updated, refresh the Management Agent Schema.

Update Schema

You can then select the new attributes in the Workday MA under Select Attributes.

Select New Attributes

Then select Ok.

Attributes Selected

Updated Import Script

The Import Script has a number of changes to handle creating and updating a WaterMark File that is used to store the date stamp of the last run. Also updated in the Import Script is the change to use the Get-WorkdayWorkerAdv cmdlet over the Get-WorkdayWorker cmdlet so that a time period can be specified, and to retrieve the additional attributes we just added to the schema.

Update:

  • Line 11 for the path and name of the Watermark File you wish to use
  • Line 31 for the URI of your Workday Tenant
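A minimal sketch of the watermark handling is below. The watermark path is a placeholder and the Get-WorkdayWorkerAdv parameter names are assumptions, so use the names from the updated module; $OperationType is provided to the script by the PSMA run profile.

$waterMarkFile = "C:\PSMA\Workday\WaterMark.txt"
$runTime = Get-Date -Format o   # e.g. 2018-10-29T22:09:08.3628953+00:00

if ($OperationType -eq "Delta" -and (Test-Path $waterMarkFile)) {
    # Delta run: only retrieve Workers changed since the last run
    $lastRun = Get-Content $waterMarkFile
    $workers = Get-WorkdayWorkerAdv -FromDateTime $lastRun -ToDateTime $runTime -IncludePersonal -IncludeWork
} else {
    # No watermark (or a Full Import run profile) means a Full Import
    $workers = Get-WorkdayWorkerAdv -IncludePersonal -IncludeWork
}

# ... process $workers into connector space objects here ...

# Update the watermark ready for the next delta run
$runTime | Out-File $waterMarkFile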

Executing the Management Agent using a Delta Import Delta Sync Run Profile

After creating a Delta Import Delta Sync Run Profile we can now run a Delta Sync. The following graphic is after seeding the WaterMark file (with the last run time in a format like 2018-10-29T22:09:08.3628953+00:00). Without the WaterMark file present, a Full Import is performed by the MA, as it doesn't have a watermark to base the import time period on.

The changed records in Workday HR are then identified and those records obtained, imported and synchronised via the Management Agent.

Delta Sync

Summary

Using the Delta Synchronisation functionality allows for much quicker synchronisation from Workday HR to Microsoft Identity Manager.

Updated: Azure AD B2B Guest Invitations Microsoft Identity Manager Management Agent

In August I posted this that detailed Automating Azure AD B2B Guest Invitations using Microsoft Identity Manager. More recently Microsoft updated the Microsoft Graph to include additional information about Azure AD B2B Guest users and I wrote this that creates HTML Reports based off these new attributes.

That information is also handy when managing the lifecycle of Azure AD B2B Users. As we do that using Microsoft Identity Manager, I've updated my Azure AD B2B Guest Invitation Management Agent for these attributes so they can be used in the lifecycle logic.

Updated Schema

I've updated the Schema script to include three new attributes (creationType, userState and userStateChangedOn), shown below in an extract from the Microsoft Graph.

odata.type : Microsoft.DirectoryServices.User
objectType : User
objectId : 38154c4c-a539-4920-a656-b5f8413768b5
deletionTimestamp : 
accountEnabled : True
creationType : Invitation
displayName : Rick Sanchez
givenName : Rick
mail : Rick.Sanchez@customer.com.au
mailNickname : Rick.Sanchez_customer.com.au#EXT#
otherMails : {Rick.Sanchez@customer.com.au}
proxyAddresses : {SMTP:Rick.Sanchez@customer.com.au}
refreshTokensValidFromDateTime : 2018-08-26T02:05:36Z
showInAddressList : False
surname : Sanchez
userPrincipalName : Rick.Sanchez_customer.com.au#EXT#@corporationone.onmicrosoft.com
userState : PendingAcceptance
userStateChangedOn : 2018-08-26T02:05:36Z
userType : Guest

All are String attributes and I've named them;

  • B2BCreationType
  • B2BUpdatedDateTime
  • B2BExternalUserState

Here is the full updated Schema.ps1 Script.

Updated Import Script

The Import Script requires the following changes to bring in the B2B User State attributes.

 # B2B External User State for B2B Users from other AAD's 
if ($user.creationType) {$obj.Add("B2BCreationType", $user.creationType)} 
[string]$B2BUpdatedDateTime = $null 
if ($user.userStateChangedOn) {$B2BUpdatedDateTime = get-date($user.userStateChangedOn); $obj.Add("B2BUpdatedDateTime", $B2BUpdatedDateTime)} 
if ($user.userState) {$obj.Add("B2BExternalUserState", $user.userState)}

The full script with these additions is below. As per this post, make the following updates;

  • Change line 10 for your file path
  • Change line 24 for the version of an AzureAD or AzureADPreview PowerShell Module that you have installed on the MIM Sync Server so that the AuthN Helper Lib can be used. Note if using a recent version you will also need to change the AuthN calls as well as the modules change. See this post here for details.
  • Change line 27 for your tenant name
  • Change line 47/48 for a sync watermark file
  • The Import script also contains an attribute from the MA Schema named AADGuestUser that is a boolean attribute. I create the corresponding attribute in the MetaVerse and MIM Service Schemas for the Person/User objectClasses. This is used to determine when a Guest has been successfully created so their naming attributes can then be updated (using a second synchronisation rule).

Updating the Management Agent

With the updated Schema.ps1 and Import.ps1 scripts in place on the Synchronisation Server, using the Microsoft Identity Manager Synchronisation Service Manager right-click on the B2B Invitation PSMA and select Refresh Schema.

Refresh B2B MA Schema

Select the Properties of the MA and choose Select Attributes.  Select the new Attributes.

Select New Attributes

Select Ok.

Select New Attributes - Selected

With the Schema updated and the Attributes selected a Stage/Full Sync can be performed. We now see the External User State, User Creation Type and External User Updated DateTime.

Updates with B2B External State

Summary

With a change to the Schema and Import B2B Invitation PSMA scripts we can now leverage the new B2B Attributes from the Microsoft Graph for use in our lifecycle management logic.

Enrolling and using both Microsoft Authenticator and a YubiKey Physical Token with Azure MFA

Microsoft have just announced the Public Preview for Hardware OATH Tokens such as the Yubico YubiKey with Azure MFA. In this very long and graphic heavy post I show the end-to-end setup and use of a YubiKey physical token from Yubico as a Multi-Factor Authentication (MFA) second factor authentication method to Azure AD/Office 365.

Specifically I detail;

  • the user experience using a YubiKey Hardware Token with Azure MFA
  • the administrator configuration process for admin enabled YubiKey physical tokens for use with Azure MFA
  • a user enrolling a YubiKey physical token as an additional method for use with Azure MFA
  • switching second-factor authentication methods when authenticating to Azure AD / Office 365

For the process I show here;

  • the Admin account I’m using to do the configuration is a Global Admin
  • the user I’m enabling the token for
    • is assigned an Enterprise Mobility + Security E3 license
    • is enabled for MFA
    • was enrolled in MFA using the Microsoft Authenticator App.

Authenticating to Azure AD/Office 365 with a YubiKey for MFA

Before I get into the configuration and setup, here is the resulting process when complete. When authenticating to Azure AD/Office 365 I’m prompted for my Username and Password.

Using YubiKey Token Azure MFA 1

Then I’m prompted for my YubiKey One Time Password (OTP). With the key inserted in my computer ……

Laptop Yubikey Azure MFA

… and the Yubico Authenticator open, the Yubico Authenticator displays the OTP, which I can copy and paste into the Login Code window.

Note: The Yubico Authenticator will only display the OTP code for an appropriately configured YubiKey when it is inserted into the same computer running the Yubico Authenticator.

Admin Enabled OATH TOTP Token Assignment

In order to enable physical tokens for use with Azure MFA an Azure Administrator must configure token assignments for users in the Azure Portal. Like other functionality we’ve seen during Public Preview (such as Azure B2B) the method to configure these assignments is uploading a CSV with the necessary information. Hopefully we don’t have to wait too long for a Microsoft Graph/PowerShell Module to complete this step.

CSV Format

The CSV format is shown below, both raw and as viewed in VSCode. Essentially token assignment assigns a token to a UPN. The upload will fail if you don't specify a valid UPN for your tenant.

Note: The header row must be present and don’t use quotes. 

upn,serial number,secret key,timeinterval,manufacturer,model
YubiKey@darren.customer.com.au,9876543,PFXXKIDENFSG4J3UEB2GQ2LONMQGSJ3EEB2XGZJAMEQHEZLBNRWHSIDTMVRXEZLUEBSGSZBAPFXXKPY=,30,YubiKey,HardwareKey

File Format
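If you are generating the CSV for a batch of keys, a small sketch like the one below will build it. Export-Csv wraps every field in quotes (which the upload doesn't accept), so the lines are written out manually; the values shown are just the example from above.

$tokens = @(
    [PSCustomObject]@{
        upn          = 'YubiKey@darren.customer.com.au'
        serialnumber = '9876543'
        secretkey    = 'PFXXKIDENFSG4J3UEB2GQ2LONMQGSJ3EEB2XGZJAMEQHEZLBNRWHSIDTMVRXEZLUEBSGSZBAPFXXKPY='
        timeinterval = 30
        manufacturer = 'YubiKey'
        model        = 'HardwareKey'
    }
)

# Header row must be present; no quotes around any values
$csv = @("upn,serial number,secret key,timeinterval,manufacturer,model")
foreach ($t in $tokens) {
    $csv += "$($t.upn),$($t.serialnumber),$($t.secretkey),$($t.timeinterval),$($t.manufacturer),$($t.model)"
}
$csv | Out-File -FilePath "C:\temp\OATHTokens.csv" -Encoding ascii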

The file is uploaded via the Azure Active Directory => MFA Server => OATH Tokens configuration option. Once you've selected Upload and provided the file, the 'File upload is in progress' dialog is displayed. Select Refresh.

Uploaded for Processing

After successfully uploading the CSV and hitting the Refresh button we have the token assignment for the user.

Successfully Uploaded

In order to Activate the token you will need to have the Yubico Authenticator Application installed and the YubiKey token configured. That process is shown in the next section below. Once that is complete select Activate from the screen above and enter the OTP code displayed in the Yubico Authenticator for the token enrolled with the associated user.

Enrolling a YubiKey Physical Token with Azure MFA

With a hardware token associated with a user in Azure MFA the user can now enroll with that option. Head to Additional security verification options under the user’s profile and choose Setup Authenticator app

Enrol YubiKey Token Azure MFA 1

The following option will be displayed. Select the link for Configure app without notifications.
Enrol YubiKey Token Azure MFA 2

A slightly modified QR Code will be presented.
Enrol YubiKey Token Azure MFA 3

Open the Yubico Authenticator Application and, with the YubiKey inserted in the workstation, from the File menu select Scan QR Code. The Yubico Authenticator App will magically scan the QR code and configure the credential in the Authenticator App. Select Save Credential.
Enrol YubiKey Token Azure MFA 4

Verification is required. Select Verify now.
Enrol YubiKey Token Azure MFA 5

Copy and paste the OTP into the text box and select Verify.
Enrol YubiKey Token Azure MFA 6

We now have the 3rd MFA method enrolled (Phone, Microsoft Authenticator and YubiKey with Yubico Authenticator). Select Save.

Enrol YubiKey Token Azure MFA 7

As I now have multiple methods registered, and the method I just registered is now the default, I have to re-verify the new method. Select Verify preferred option.

Enrol YubiKey Token Azure MFA 8

Copy and paste the OTP code into the text box and select Verify.

Enrol YubiKey Token Azure MFA 9

Verified. Update is successful. Select Close.

Enrol YubiKey Token Azure MFA 10

Back in the MFA Server OATH tokens Admin console for the associated user, select Activate and enter the current OTP code displayed in the Yubico Authenticator.

Successfully Uploaded

The token can now be used for MFA as shown at the beginning of this post.

Multiple MFA Methods – Switching MFA Methods

As I already had the Microsoft Authenticator Application registered as an MFA method before I enrolled the physical token I now have multiple methods enrolled. The primary is the token which is fine when using a laptop but as an iPhone user, NFC isn’t an option with YubiKey v5 yet as NFC Write is not enabled on iOS.

On logon after providing my username and password I’m prompted for MFA and I can select Sign in another way

Sign In Another Way

This then provides me with my other enrolled methods and top of the list is the option for Microsoft Authenticator

Sign In Another Way 2

Selecting the Microsoft Authenticator App option I am prompted on my Microsoft Authenticator App for MFA rather than using the primary method of the YubiKey with the Yubico Authenticator. I select Approve and I’m authenticated.

Changing Primary Method

With multiple methods enrolled, at some point you may need to change your primary method. Back in the Additional security verification options for the user, the default method can be updated. The two key options at the bottom of the list are;

  • Notify me through app
    • for me this is the Microsoft Authenticator App
  • Use verification code from app
    • this is the Yubico Authenticator with the YubiKey enrolled

Change Primary Method

Summary

Microsoft Azure MFA now supports OATH TOTP Hardware Tokens. Compatible tokens can be registered by an Azure Administrator and assigned to users.

Hardware Tokens can be enrolled to a users profile in addition to other methods (phone call, SMS, Microsoft Authenticator).

Now what would be nice would be via Conditional Access the ability to specify the MFA Factor. e.g. If accessing Application X the only accepted MFA Method is physical token.

Creating SailPoint IdentityNow Source Configuration Backups and HTML Reports with PowerShell

In this post from earlier in the week I detailed leveraging the SailPoint IdentityNow APIs to retrieve IdentityNow Sources and their configuration. This post takes that a little further, backing up the configuration and also creating a friendly HTML Report with each Source's Configuration and Schema. The dynamically created HTML Report covers all Sources in an IdentityNow Tenant Org and looks like the image below.

Sample Report

After selecting a Source you can then expand a report section for the Source Details and another for the Schema. Each Source, and within it the Source Details and the Source Schema, is a collapsible DIV toggled by the link. A snippet of a Source Details output for a Generic Source looks like the image below.

Source Details

A snippet of a Source Schema output for a Generic Source looks like the image below.

Schema Example

The Script

This script assumes you are able to access the IdentityNow APIs as detailed in this post here. You will need to use that process to access the Sources APIs and have the necessary JWT Access Token to execute these API requests.

The report features an image. Here is the one I created. Download it and put it in the root of the output folder where the reports will be created.

SailPoint IdentityNow 240px.png

Make the following updates to the script:

  • Line 10 for the path to the Image file you saved from above
  • Line 17 for the base output path (sub directories are created for the date/time of each execution) for the Report and Configuration Backups
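As a rough sketch of the backup portion only (assuming $sources already holds the Source objects retrieved via the Sources API as per the earlier post), the per-execution output directory and XML exports look something like this.

# Create a date/time stamped sub-directory for this execution under the base output path
$reportDate = Get-Date -Format "yyyyMMdd-HHmm"
$outputPath = "C:\Reports\IdentityNow\$($reportDate)"
New-Item -ItemType Directory -Path $outputPath | Out-Null

foreach ($source in $sources) {
    # XML backup of the Source configuration; re-import later with Import-Clixml if needed
    $source | Export-Clixml -Path "$($outputPath)\$($source.name) - Details.xml"
}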

The Output

Following execution of the script a sub-directory is created under your directory path, and in it you will find the HTML Report along with two files for each Source: an XML export of the Source Details and one of the Source Schema. If you need to inspect a configuration that has been exported you can use Import-Clixml -Path "path to the exported xml file" to import it into PowerShell and inspect it.

Example Output Files

Summary

Put the execution of this script on a schedule whilst you are in the development/configuration phase of your IdentityNow deployment and you will get automated configuration reports and backups that can be reviewed if you need to roll-back or just see what changes have been made over time.