Lithnet Password Protection for Active Directory

Today Ryan Newington released the latest open source project from Lithnet: Lithnet Password Protection for Active Directory.

I’ve posted extensively about leveraging Lithnet services in conjunction with Microsoft Identity Manager. In fact, many of the solutions I’ve built for customers just wouldn’t be as functional without Ryan’s extensive contributions to the Microsoft Identity Management community under the Lithnet brand.

What is Lithnet Password Protection for Active Directory?

With this latest offering I had the opportunity to test a few elements of the solution before release, mostly around the Pwned (Compromised) Password functions that leverage the Have I Been Pwned NTLM datasets (available here). This functionality provides the ability to:

  • add the Have I Been Pwned NTLM dataset to the Lithnet Password Protection Compromised Passwords Data Store, prohibiting those passwords from being used
  • allow administrators to test an individual user’s Active Directory password against the Have I Been Pwned NTLM dataset to identify whether the password has been compromised
  • allow administrators to test the users of an entire Active Directory domain against the Have I Been Pwned NTLM dataset to identify whose passwords have been compromised
  • on password change against Active Directory, only permit passwords that don’t appear in the compromised or banned password lists in the Lithnet Password Protection for Active Directory data store

In addition, Lithnet Password Protection for Active Directory allows more granular definition of your Active Directory Password Policy, e.g. rewarding a longer password with relaxed complexity requirements.

Whilst Microsoft does have something similar (for banned passwords), it is still in preview, and for a subset of the functionality you will need to be a Microsoft Azure AD Premium licensed customer. You also can’t ingest the Have I Been Pwned password dataset as a Custom Banned Password List; you’re left hoping Microsoft’s lists have good overlap with those datasets.


If you’re like the majority of organisations I’ve consulted for, you are currently hoping that your existing password policies (length, complexity, rotation period etc) along with implementing Multi-Factor Authentication will provide you with a balance between end-user usability and security posture.

What you ultimately require, however, is the peace of mind that your end-users’ passwords in your on-premise Active Directory and Azure Active Directory aren’t passwords that will be consistently used in Password Spray and Password Brute Force attacks.

Until we get further down the path to Passwordless Authentication, this is the best protection you can have today against two of the most common password-based attacks.

SailPoint IdentityNow Roles Management Agent for Microsoft Identity Manager

This is the first post in a series where I will provide a number of base-level Management Agents for Microsoft Identity Manager to integrate with SailPoint IdentityNow. Whilst the two products have areas of competing/equivalent functionality, there are other aspects where integration of the two complements each other. Whilst that is not the purpose of this post, through the series of upcoming posts it will be relatively easy to extrapolate how the two products can happily co-exist and orchestrate each other for certain functions.

This Management Agent gives Microsoft Identity Manager visibility of IdentityNow Roles (see the customisation notes at the end for more functionality).

For more information on IdentityNow Roles see this post, where I detailed creating Roles as well as updating/managing them via the API. The MA also consumes whether the Role is requestable, which I covered in this post.


  • The Management Agent is a Full Sync only Management Agent, because the IdentityNow API doesn’t expose differential-style requests. That is also why this is a single-function Management Agent (just for Roles).
  • The Management Agent is configured to page the results back into Identity Manager. For more details on that see this post.


  • On your MIM Sync Server you will need the PowerShell Community Extensions (PSCX) for the Get-Hash cmdlet
  • The Management Agent uses IdentityNow v3 Authentication. You will need to request the API Keys from your SailPoint Customer Success Manager. Details on v3 Authentication can be found in this post
  • The Management Agent leverages the Granfeldt PowerShell Management Agent. Start here to get up to speed with that. As detailed above this is an Import-only MA, so I’m not providing an Export script, and the Password script is redundant. The script files need to be present but can be empty

Schema Script

The Schema Script below covers the core attributes associated with IdentityNow Roles.
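As a sketch of the Granfeldt PSMA schema convention (the attribute names below are illustrative assumptions for a Role object, not the exact script):

```powershell
# Sketch of a Granfeldt PSMA Schema script for IdentityNow Roles.
# Attribute names are illustrative; align them with the Role attributes you need.
$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-id|String" -Value "abc123"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "Role"
$obj | Add-Member -Type NoteProperty -Name "name|String" -Value "Role Name"
$obj | Add-Member -Type NoteProperty -Name "description|String" -Value "Role Description"
$obj | Add-Member -Type NoteProperty -Name "requestable|Boolean" -Value $true
$obj
```

The pipe-separated suffix on each name (e.g. |String, |Boolean) tells the Granfeldt MA the attribute type, and the Anchor- prefix marks the anchor attribute.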

Import Script

As IdentityNow v3 API Authentication requires a number of artifacts, we need to make sure we secure them all appropriately.

For the Admin Username and Password we will do that by exporting them to an XML file using Export-CLIXML and then, in the Import Script, importing them using Import-CLIXML. Those cmdlets respect the context under which the credentials were exported and will only be able to access them when imported under that same context. As our Management Agent will be run by the MIM Sync Server Service Account, we need to create the credentials file using that login. To do that:

  • temporarily reconfigure the MIM Sync Service Account so that it can log on locally
    • On the MIM Sync Server open Local Security Policy => Local Policies => User Rights Assignment => Deny log on locally and remove the MIM Sync Server Service Account
    • repeat for Deny access to this computer from the network
  • Logon to the MIM Sync Server using the MIM Sync Server Service Account
  • Run the following to create the credential file and put the credential file in the Extensions\yourRolesMA directory
$adminUSR = [string]"Partner_Admin".ToLower()
$adminPWDClear = 'myStr0ngP@$$w0rd'
$adminPWD = ConvertTo-SecureString $adminPWDClear -AsPlainText -Force
$Credentials = New-Object System.Management.Automation.PSCredential $adminUSR,$adminPWD
$Credentials | export-clixml c:\temp\RoleAdminCred.xml
  • IMPORTANT: Add the MIM Sync Server Service Account back into the Deny access to this computer from the network and Deny log on locally policies
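The Import Script can then re-hydrate those credentials. A minimal sketch (the full path below assumes the default MIM installation directory; adjust for your environment):

```powershell
# Import the admin credentials exported above; this only succeeds when run
# under the same account (the MIM Sync Service Account) that exported them
$credFile = "C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Extensions\yourRolesMA\RoleAdminCred.xml"
$Credentials = Import-Clixml $credFile
$adminUSR = $Credentials.UserName
$adminPWD = $Credentials.GetNetworkCredential().Password
```

Because Export-CLIXML protects the password with DPAPI under the exporting user’s context, copying the file to another machine or account will not work.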

The IdentityNow v3 API Credentials are stored on the Management Agent Connectivity Configuration page. The Username and Password Authentication options take the v3 API Client ID and API Client Secret respectively.

[Screenshot: MA configuration – the Username and Password fields hold the Client ID and Client Secret]

Make the following updates for your implementation:

  • Line 24 for your IdentityNow Organisation name
  • Line 27 for the location and name of the credentials file created above


What you want to do with the data will determine how you want Identity Manager to consume it. You will likely want to:

  • Create a new ObjectType in the Metaverse along with the attributes associated with the Roles
  • Flow the information in and perform any logic
  • Create Roles in IdentityNow
  • Update Roles in IdentityNow


Using this base Management Agent we get connectivity to, and visibility of, IdentityNow Roles in Microsoft Identity Manager.

Managing SailPoint IdentityNow Tasks with PowerShell

SailPoint IdentityNow Tasks via API using PowerShell

In SailPoint IdentityNow when using the Request Center, tasks are created for activities that are not able to be automatically (directly) fulfilled. Essentially completion of the request requires someone to do something, then return to the IdentityNow Portal and flag the Task as complete. What if we want to see what Tasks are open and flag them as complete through external automation?

Well, this SailPoint IdentityNow Compass article gives the only background to using the API to get visibility of Tasks. But what we needed to do was;

  • enumerate Tasks that were pending for Flat File Sources
  • understand the pending operations and complete them
  • mark the Task(s) as complete

This post will cover the first and last bullet points. Performing the operation will depend on what you have integrated with and what is being requested for an Entitlement.


I’m using a v3 Token to access the IdentityNow API’s. I detailed that in this post here. If you don’t have API keys for the v3 endpoint you can use this method to get the oAuth JWT Token. You will also need to make sure that you don’t have any Content-Type set in your headers. If you do, you will get an error message like this;

  • Missing or invalid arguments

[Screenshot: API error returned when Content-Type is set]

Enumerating all Tasks

To enumerate all tasks we need to call the API /task/listAll. Using PowerShell and the access token from one of the methods listed in the prerequisites we can make the following call.

$orgName = "myOrgName"
$tasksURI = "https://$($orgName).api.identitynow.com/cc/api/task/listAll"   # /cc/api base assumed per the v1 private API
$tasksList = Invoke-RestMethod -method Get -uri $tasksURI -Headers @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"}

A task then looks like this;

[Screenshot: Manual Task details]

Searching for and retrieving an Individual Task

To retrieve an individual task we need to know the ID of the task. If we know what we are looking for we can use PowerShell to locate the specific task, get the ID then get that individual task.

The following command shows looking through the Task items and finding tasks that are ManualAction, are not completed and contain Luke in the description.

$manualActions = $tasksList.items | Where-Object {$_.type -eq "ManualAction" -and $_.complete -eq $false -and $_.description -like "*Luke*"}
$taskID = $manualActions[0].id

With our Task identified we can then retrieve it using the API /task/get/{taskID}

$utime = [int][double]::Parse((Get-Date -UFormat %s))
$getIdvTaskbyIDURI = "https://$($orgName).api.identitynow.com/cc/api/task/get/$($taskID)?_dc=$($uTime)"   # /cc/api base assumed per the v1 private API
$indvTask = Invoke-RestMethod -method Get -uri $getIdvTaskbyIDURI -Headers @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"}

Looking at it, it is the one we wanted.

[Screenshot: Individual Task]

Completing a Task

With the task ID we can then update the Task and mark it as completed. To complete the task we make a POST request to /task/complete/{taskID}.

$completeTaskURI = "https://$($orgName).api.identitynow.com/cc/api/task/complete/$($taskID)"   # /cc/api base assumed per the v1 private API
$completeTask = Invoke-RestMethod -method Post -uri $completeTaskURI -Headers @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"}

The API will return the object on Success.

[Screenshot: Complete Task response]

Looking in the IdentityNow Portal we can see that the Task is showing as Completed.

[Screenshot: the Task showing as Completed in the IdentityNow Portal]


Using the IdentityNow Task API’s we can get a list of Tasks, search for and find the Task we are looking for, and retrieve all its details before finally updating the status of the Task to complete.

Batching Microsoft Graph API Requests with JSON Batching and PowerShell

Microsoft Graph JSON Batching

Late in 2018 I became aware of new functionality in the Microsoft Graph API for batching API requests into a single request. As I predominantly use PowerShell for scripting against Microsoft Graph, parallel requests have historically required extra functions to achieve something similar, such as the use of Invoke-Parallel, which I’ve previously discussed in posts like How to create an Azure Function App to Simultaneously Start|Stop all Virtual Machines in a Resource Group.

Fast forward to 2019 and I’ve been building a bunch of reports from Microsoft Graph that aggregate data from multiple API endpoints such as /users, /auditLogs and /security. A perfect time to explore JSON Batching with Microsoft Graph more deeply.

Getting Started

Essentially your Graph API requests are the same, but instead of calling the API endpoint directly, you make a POST request to the Batch endpoint https://graph.microsoft.com/v1.0/$batch …

… with your request in the body that includes the suffix for the API URI for the request, the request method and an ID. The example below is to query the User API endpoint for users whose displayName starts with the string provided.

$myRequest = [pscustomobject][ordered]@{
    id     = "1"
    method = "GET"
    url    = "/users?`$filter=startswith(displayName,'John')"   # 'John' is an example search string
}

Additional requests are added as required to build the Request Body and passed as part of the Post request to the Batch endpoint. Each additional request in the batch requires a different ID and the respective particulars for the request(s).

Batching Requests using PowerShell

Below is an example for generating a batch in PowerShell with two requests.
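As a hedged sketch (the filter values and request IDs are placeholders, and the access-token variable name is an assumption), the batch body and headers might be built like this:

```powershell
# Two Graph requests batched into a single POST body
$batchUrl = "https://graph.microsoft.com/v1.0/`$batch"

$requests = @(
    [pscustomobject][ordered]@{ id = "1"; method = "GET"; url = "/users?`$filter=startswith(displayName,'John')" }
    [pscustomobject][ordered]@{ id = "2"; method = "GET"; url = "/users?`$filter=startswith(displayName,'Jane')" }
)

# The batch endpoint expects a JSON document with a top-level 'requests' array
$batchBody = [pscustomobject]@{ requests = $requests } | ConvertTo-Json -Depth 4

$requestHeaders = @{Authorization = "Bearer $($Global:accesstoken)"; "Content-Type" = "application/json"}
```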

Executing the request looks like this;

$getBatchRequests = Invoke-RestMethod -Method Post -Uri $batchUrl -Body $batchBody -headers $requestHeaders


The responses come back under the responses property, as shown in the graphic below. Each response carries the ID of its request so you can correlate a response with a request.

[Screenshot: Microsoft Graph API JSON Batching responses]

The request responses can then be parsed by iterating through them.
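A minimal way to do that iteration (property names as returned by the $batch endpoint; the handling is a placeholder):

```powershell
# Correlate each batch response back to its request by ID
foreach ($response in $getBatchRequests.responses) {
    Write-Output "Request $($response.id) returned HTTP status $($response.status)"
    # $response.body contains the payload for that individual request
}
```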


Using Microsoft Graph JSON Batching we can submit multiple requests as a single job, have the requests performed on the backend in parallel and get the results back.

Keep in mind that a request can also use the dependsOn option to make it dependent on another request that must complete first. Also, if a response contains a paged dataset, check the nextLink property on the response and make subsequent calls to return the remaining data.

Is Identity Management still relevant in 2019?


The last three years have been a blur. Over the holiday period I’ve been reflecting on my professional experiences over the last few years whilst also considering the future of identity and access management from my perspective as an architect and consultant. Is Identity Management still relevant in 2019? More on that further below, but first a quick recap.

2016 & 2017 Recap

After changing employer and role at the end of 2015 I assimilated into an organisation with a much different culture, and by mid 2016 started to embrace the company values of supporting the industry and community that supports us. This was quite a change from my previous 23 years of employment, where experience, skills and knowledge were considered internal intellectual property and not to be shared or discussed on public forums.

Identity and Access Management was also a changing industry. Enterprise customers embracing cloud computing were driving requests for solutions relating to hybrid identity management. The incoming work relating to cloud identity enablement was over and above the traditional identity sync and messaging services that I’d been seeing for the previous 5+ years (with respect to Cloud). Enterprises were asking for solutions that couldn’t be achieved out of the box with just configuration (but then Identity never really has been).

Personal Commentary

In early 2017 I took out a web hosting plan and installed WordPress and started to cross-post the content I write for Kloud along with additional posts. Those posts along with numerous other community activities I undertook saw me rewarded with Microsoft MVP status for Enterprise Mobility (Identity and Access Management).

My posts on this blog are mostly a narrative on my professional activities, along with my tangential exploration of new and emerging technical areas of interest. My posts cover many topics, but a few recur regularly: Identity & Access Management, Azure Serverless Services, Microsoft Identity Manager, SailPoint IdentityNow, Internet of Things, PowerShell and Containerization.

At the start of 2018 I made a conscious effort to write more, specifically my experiences learning new technologies/services primarily as a reference for myself but also with a greater desire to give back to the industry and community I’d been forced to be a passive participant of for too long.

2018 Summary and Reflections

In 2017 I wrote 55 posts (in line with my target of an average of one post a week), and in 2018 that increased to a crazy 72. With it my blog saw an increase in traffic, with visitors up 177% and views up 203% from 2017. I don’t write for any reasons other than those stated above, but it is nice to have some data to show it has some relevance and is of benefit to others.

Visitors and Views 2017 vs 2018

What I do find interesting is where the readers are coming from: 41% from the USA, just under 14% from Australia, 5% from the UK, then followed by India, Germany, the Netherlands…

I find this interesting as there may be some correlation between the content and the location of like-minded individuals. Leading a growing team of Identity Professionals working on projects that aren’t the traditional On Premise Identity Sync style of projects we’ve been doing for the last 20 years comes with the increased difficulty of talent acquisition. It is rare to find Identity Professionals who have the traditional IDAM skills but also understand Cloud Services, SaaS and PaaS offerings and how to integrate with API’s.


Blog Audience Summary

My top 3 posts for 2018 were;

It isn’t surprising then that 2 of the top 3 posts are associated with integration of Identity between tenants in an Identity as a Service (IDaaS) offering. It has become one of the common themes our customers are requesting solutions for, in order to solve their inter- and intra-company collaboration enablement problems. The other post, ironically, is about when Cloud Services that are expected to always be on go into a transient state.

What will 2019 bring? Is Identity Management still relevant?

Extrapolating from posts I’ve made in the last two years, customer requests, and current and planned projects, there are a couple of themes developing. Identity is more relevant than it ever was;

  1. In Australia we are seeing traditional enterprises moving their Human Capital Management (HCM) services to Cloud SaaS providers (think SAP Success Factors and Workday)
    • this requires a re-think on authoritative source integration for Identity Services
    • it is an opportune time to plan for re-thinking identity data models to drive higher capabilities driven by identity such as Role Based Provisioning and Role Based Access Control
  2. Hybrid Identity (On Premise <=> Cloud) needs to be solved by all Enterprises
    • We are seeing enterprises with traditionally lower levels of investment in IT Services finally having sweated their assets to the point they almost need to re-architect their entire estate
    • When an environment;
      • is still on physical infrastructure
      • is up to 4 major versions behind the current offerings (think Windows Server 2008/2008 R2, Microsoft Exchange 2010)
      • requires users to use legacy VPN style connections to access resources remotely
      • doesn’t support modern workplace working models and mobility isn’t even an option
        • then it requires new blood to sponsor and drive the re-organisation, along with fresh thinking to develop the Strategy and Roadmap with new patterns to accelerate the adoption to continually deliver improvements
          • and one of the key cornerstone drivers of these projects is identity
    • We are also seeing large enterprises that have embraced Cloud Services, but have approached them as tactical integrations into their existing environments. At scale though this often results in a disjoint series of silo’d identity repositories and a very confusing user experience from differing Login ID’s and Passwords.
      • We are assisting customers with planning for a holistic identity driven end-state and then re-architecting the integration of services to provide a consistent and predictable experience for Provisioning, Lifecycle Management, Separation as well as the end-user experience
  3. Historical On Premise Identity Management implementations need an uplift
    • Similar to the sweating your assets comment above we are often evaluating customers existing IDAM implementations based on heritage On Premise releases (such as Microsoft Forefront Identity Manager, Novell Identity Manager, Tivoli Identity Manager and Oracle Identity Manager).
    • Having designed and built numerous IDAM implementations on these products it is disheartening to still see implementations only doing what they were originally developed to do when implemented ~7+ years ago
      • When functionality has been enhanced it is often via a different solution in parallel with the foundation one, or even worse via manual operations performed to achieve the desired outcome
    • Many of the products listed in the first bullet point above have evolved, but so has the requirements of an enterprise organisation.
      • Do you rip and replace or uplift and extend? We are seeing requests for both

What will I be working on in 2019?

Hybrid Identity Management Solutions

My last two major Identity projects have seen me architect solutions that are a hybrid of traditional On Premise Identity Management products with Cloud PaaS and SaaS services, along with IDaaS providers. As the PaaS and SaaS offerings mature and IDaaS services achieve higher functionality, there will be more demand to augment existing IDAM implementations with them and, in the case of IDaaS versus traditional IDAM products, to decide where functionality X best resides. The upside to all of this is less bespoke design and development, and more configuration and dissemination of IDAM functionality into micro-services.


In 2018 I entered my first Hackathon with my Voice Assistant for Microsoft Identity Manager. Hackathons were something I had been observing for a while but not something I ever thought I would be a part of. It is highly likely I will be part of another in 2019, naturally in my sphere of Identity Management.

Internet of Things

IoT is something I’ve messed with long before the term IoT became a three-letter acronym. In 2018 I even gave a talk on the Internet of YOUR Things at the Global Azure Bootcamp.

IoT integration with Cloud Services obviously also has an Identity component, and one that I’m keen to keep on top of. From physical IoT Devices to Bots integrated with IDAM implementations, I foresee myself continuing to tinker and work out what it will mean, from an IDAM perspective, to manage non-heartbeat identities for corporate enterprises in the future.

In closing

Whilst this post started as a simple reflection on my last three years and a changing industry with a viewpoint on the future it seems to have got a lot more deep and meaningful in the middle. If you have made it this far, well done. You have the endless patience of an Identity Consultant.

Is Identity Management still relevant in 2019? Absolutely. What and how we define Identity Management is changing quickly, but it is more relevant than ever.

What is your analysis of the current state of Identity and Identity Management in 2019? Let me know on Twitter, LinkedIn or in the comments below.

Azure Self Service Password Reset Reporting using PowerShell

Just over 18 months ago I wrote this post on using PowerShell and oAuth to access the Azure AD Reports API to retrieve MIM Hybrid Report data. This week I went to re-use that for Azure Password Reset Reporting and found that the API had been deprecated.

[Screenshot: API deprecated error]

Using the error information (which was actually informative) I proceeded to the new API. Having authenticated as I had in the previous article, I executed the following to retrieve a list of the available Audit Reports.

$TenantDomain = ""
$url = 'https://graph.windows.net/' + $tenantdomain + "/activities/auditActivityTypes?api-version=beta"
# Returns a JSON document for the Audit Activity Types
$auditActivities = (Invoke-WebRequest -Headers $headerParams -Uri $url).Content | ConvertFrom-Json

Before I go any further it’s a good time to mention you can easily access the Audit log events using the Azure Portal, retrieve the data you may be looking for and download a report. There is also the Azure Audit logs content pack for PowerBI, as detailed here. Those are awesome solutions, but if you want to do something a little more bespoke and programmatic then keep reading.

Azure AD Password Events Audit Log Data

For the record (as at 18 Dec 2018) there are 1023 different Activity Resource Types. The full list is available here. The ones I’m interested in right now though are a subset of the Self-service Password Management category;

SSPR Audit Reports
After investigation, the API to get the information I was after is now available on the Microsoft Graph beta Audit Logs endpoint auditLogs/directoryAudits.

My Azure AD Registered App needed to be updated with the additional permission (AuditLog.Read.All), which was done via the Registered Applications blade under Azure Active Directory in the Azure Portal;


My script then needed to be updated to talk to the Microsoft Graph and the new scope;

$resource = "https://graph.microsoft.com"
$scope = "AuditLog.Read.All; Directory.Read.All"

So to get Azure Password Reset events for the last week the following calls can be made;

# Get Reset Password Events
$DirectoryAuditURL = "https://graph.microsoft.com/beta/auditLogs/directoryAudits"
$passwordMgmtAuditData = Invoke-RestMethod -Method Get -Uri "$($DirectoryAuditURL)?`$filter=category eq `'UserManagement`' and activityDateTime ge 2018-12-11T03:15:10.6837342Z and startswith(activityDisplayName%2C+`'Reset password`')" -Headers @{Authorization = "Bearer $($Global:accesstoken)"}

To retrieve the other events contained in the auditLog we just need to alter the event type being queried and the timeframe of interest. The events from the table above associated with Azure Self Service Password Reset and Azure Change Password are;

Blocked from self-service password reset
Change password (self-service)
Credentials Registered and Password Reset Status of User
Reset password (by admin)
Reset password (self-service)
Security info saved for self-service password reset
Self-serve password reset flow activity progress
Self-service password reset flow activity progress
Unlock user account (self-service)
User completed security info registration for self-service password reset
User registered for self-service password reset
User started security info registration for self-service password reset

The Script

Here is the modified script from my previous post here that uses oAuth to retrieve Azure Password Reset events. As per the other script, it enables the scopes required. If you’re not a Global Admin, get the script run initially by someone who has the Global Admin role, or get them to assign the AuditLog.Read.All permission to the Azure AD Application you have created. You can then login and get an Access Token and a Refresh Token.


  • Line 5 for your Application ClientID
  • Line 6 for your Application Secret
  • Line 18 for where you want to store the Refresh Token
  • Line 97 can be un-commented after getting a token, to use the Refresh Token (with Line 96 commented out). To get a new Access Token using the stored Refresh Token (Line 18), call the Get-NewTokens function

If, due to your time filter, you have more than the maximum number of events (1,000) that can be returned per call, you can use the skipToken to get the next page. If you have more than 2,000, use the subsequent skipToken in additional calls in a do { get data } while ($skipToken) loop, adding the returned data to a collection each time.

$skipToken = $passwordMgmtAuditData.'@odata.nextLink'
$results = Invoke-RestMethod -Method Get -Uri $skipToken -Headers @{Authorization = "Bearer $($Global:accesstoken)"}
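Sketched out in full (this assumes $passwordMgmtAuditData from the call above returned an @odata.nextLink; if fewer than 1,000 events matched, there is no nextLink and the loop isn’t needed):

```powershell
# Page through the audit results by following @odata.nextLink
$allEvents = @()
$allEvents += $passwordMgmtAuditData.value
$skipToken = $passwordMgmtAuditData.'@odata.nextLink'
do {
    $results = Invoke-RestMethod -Method Get -Uri $skipToken -Headers @{Authorization = "Bearer $($Global:accesstoken)"}
    $allEvents += $results.value
    $skipToken = $results.'@odata.nextLink'
} while ($skipToken)
```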

If you want to automate the time period to retrieve events for then you can use the Get-Date function and set the time window.

# Date Time for the report. Last 14 Days
[string]$strDateTimeNow = Get-Date -Format yyyy-MM-ddTHH:mm:ss
[datetime]$dateTimeNow = Get-Date -Format yyyy-MM-ddTHH:mm:ss
[datetime]$minus14Days = $dateTimeNow.AddDays(-14)
[string]$14daysAgo = Get-Date($minus14Days) -Format yyyy-MM-ddTHH:mm:ss

So then the call for the last 14 days of events would be;

$passwordMgmtAuditData = Invoke-RestMethod -Method Get -Uri "$($DirectoryAuditURL)?`$filter=category eq `'UserManagement`' and activityDateTime ge $($14daysAgo)z and activityDateTime le $($strDateTimeNow)z and startswith(activityDisplayName%2C+`'Reset password`')" -Headers @{Authorization = "Bearer $($Global:accesstoken)"}


Using the Microsoft Graph auditLog API we can retrieve using PowerShell events of interest for reporting or other requirements.

Granfeldt PowerShell Management Agent Schema HRESULT: 0x80231343 Error

Yesterday I was modifying the Schema configuration on a Granfeldt PowerShell Management Agent on a Microsoft Identity Manager 2016 SP1 Server.

I was changing the Anchor attribute to a different attribute, and on attempting to refresh the schema or view the configuration I got the following error;

Unable to retrieve schema. Error: Exception from HRESULT 0x80231343

[Screenshot: Unable to retrieve Schema error]

I knew I’d seen this before, but nothing was jumping to mind. And this was a particularly large Schema script.

After some debugging I realised it was because the attribute I had changed the Anchor attribute to was also listed as a regular attribute in the Schema script. It wasn’t obvious, as the attribute entry was multiple pages deep in the script.

Essentially if you are seeing the error Unable to retrieve schema. Error: Exception from HRESULT 0x80231343 there are two likely causes;

  • you haven’t declared an Object Class, e.g. User
    • $obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "User"
  • the attribute you have as your Anchor is also listed as an attribute in the schema script, e.g.
    • $obj | Add-Member -Type NoteProperty -Name "Anchor-ObjectId|String" -Value "333a7e07-e321-42ea-b0a5-820598f2adee"
    • $obj | Add-Member -Type NoteProperty -Name "ObjectId|String" -Value "333a7e07-e321-42ea-b0a5-820598f2adee"
      • you don’t need this second entry, just the Anchor entry

Hopefully this helps me quickly find the reason next time I make a simple mistake like this in the Schema script.


Using SailPoint IdentityNow v3 API’s with PowerShell

The SailPoint IdentityNow SaaS product is evolving. I’ve previously posted about integrating with the IdentityNow API’s using PowerShell;

IdentityNow now has v3 API’s which are essentially the v2 and non-Published API’s with the added benefit of being able to obtain an oAuth token from a new oAuth Token endpoint. Unlike the v2 process for enabling API integration, v3 currently requires that SailPoint generate and provide you with the ClientID and Secret. This Compass document (at the very bottom) indicates that this will be the preferred method for API access moving forward.

The process to get an oAuth Token is;

  • Generate Credentials using your IdentityNow Admin Username and Password as detailed in my v1 Private API post
    • Lines 1-12
  • Use the credentials from the step above in the oAuth token request
  • Use the ClientID and Secret as Basic AuthN to the Token endpoint
  • Obtain an oAuth Token contained in the resulting $Global:v3Token variable

Authentication Script


  • Line 2 with your Org name
  • Line 5 with your Admin Login Name
  • Line 6 with your Admin Password
  • Line 15 with your SailPoint supplied ClientID
  • Line 16 with your SailPoint supplied Secret
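The flow those line references correspond to can be sketched as follows (the variable names and the token endpoint path here are assumptions based on the steps above, not the exact script; $adminUSR and $adminPWD come from the credential-generation step in the v1 Private API post):

```powershell
# Sketch only: v3 oAuth token request using Basic AuthN of the ClientID/Secret
$orgName      = "yourOrgName"
$clientID     = "yourClientID"      # supplied by SailPoint
$clientSecret = "yourClientSecret"  # supplied by SailPoint

# Build the Basic AuthN header from the ClientID and Secret
$pair    = "$($clientID):$($clientSecret)"
$encoded = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($pair))
$headers = @{Authorization = "Basic $($encoded)"}

# Request the token using the admin credentials generated earlier
$tokenURI = "https://$($orgName).api.identitynow.com/oauth/token?grant_type=password&username=$($adminUSR)&password=$($adminPWD)"
$Global:v3Token = Invoke-RestMethod -Method Post -Uri $tokenURI -Headers $headers
```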

Your resulting token will then look like this;

[Screenshot: SailPoint v3 API oAuth Token]

Using the v3 oAuth Access Token

So far I’ve found that I can use the oAuth Token to leverage the v2 and non-published API’s simply by including the JWT oAuth Token in the header of the web request, e.g.

@{Authorization = "Bearer $($Global:v3Token.access_token)"}

Depending on which API you are interacting with, you may also require Content-Type, e.g.

@{Authorization = "Bearer $($Global:v3Token.access_token)"; "Content-Type" = "application/json"}


Talk to your friendly SailPoint Support Rep, get your v3 API ClientID and Secret, and discard the previous hack of scraping the Admin Portal for the oAuth Token, saving a few hundred lines of code.

Enabling Requestable Roles in SailPoint IdentityNow using PowerShell

Recently I wrote this post about Retrieving, Creating, and Managing SailPoint IdentityNow Roles using PowerShell.

Last week SailPoint enhanced Roles with the ability to request them. The details are located on Compass here.

I had a number of Roles that we wanted to make requestable, so rather than opening each and using the Portal UI to enable them, I did it via the API using PowerShell.

As per my other Roles post, a JWT Bearer Token is required to leverage the Roles API’s. That is still the same. I covered how to obtain a JWT Bearer Token specifically for interacting with these API’s in this post here. I’m not going to cover that here so read that post to get up to speed with that process.

Enabling Roles to be Requestable

The following script queries all Roles and iterates through them, making each requestable. Update;

  • Line 2 for your IdentityNow Org Name
  • after Line 9 you can refine the roles you wish to make requestable
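A sketch of that script (the role/list and role/update endpoint paths here are assumptions based on the IdentityNow private API pattern, not confirmed paths, and $v3Token comes from the JWT Bearer Token post referenced above):

```powershell
# Sketch only: enumerate Roles and mark each as requestable
$orgName = "yourOrgName"
$baseURI = "https://$($orgName).api.identitynow.com/cc/api"
$headers = @{Authorization = "Bearer $($v3Token.access_token)"; "Content-Type" = "application/json"}

# Get all Roles
$roles = Invoke-RestMethod -Method Get -Uri "$($baseURI)/role/list" -Headers $headers

# Refine $roles.items here if you only want a subset, then update each Role
foreach ($role in $roles.items) {
    $update = @{ id = $role.id; requestable = $true } | ConvertTo-Json
    Invoke-RestMethod -Method Post -Uri "$($baseURI)/role/update" -Headers $headers -Body $update
}
```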


Using the API we can quickly enable existing IdentityNow Roles to be requestable. When creating new Roles, we can include the attribute requestable with the value true if we want them to be requestable.

Using Invoke-WebRequest calls within a Granfeldt PowerShell MA for Microsoft Identity Manager


If you use PowerShell extensively you should be familiar with the Invoke-RestMethod cmdlet and the ability for PowerShell to call API’s and receive information. The great thing about Invoke-RestMethod is the inbuilt conversion of the results to PowerShell objects. However, there are times when you need the raw response from Invoke-WebRequest (probably because you are trying to bend things in directions they aren’t supposed to go; the story of many of my integrations).

Granfeldt PowerShell Management Agent script(s) that use Invoke-WebRequest calls will in turn leverage the Internet Explorer COM API on the local machine (unless you pass the -UseBasicParsing switch). That means the account performing those tasks needs the necessary permissions and Internet Explorer configuration to do so.

Enabling Invoke-WebRequest for the FIM/MIM Synchronization Server Service Account

Logon to your Microsoft Identity Manager Synchronization Server using the Service Account associated with the Forefront Identity Manager Synchronization Service. If you’ve secured the service account properly you will need to temporarily allow that service account to Log on Locally.

[Screenshot: FIM Sync Service Account]

Open Internet Explorer and open Internet Options. Select the Security Tab, the Internet zone and select Custom Level.

Locate the Scripting section and set Active scripting to Enable. Set Allow websites to prompt for information using scripted windows to Disable. Select OK and OK.

[Screenshot: MIM Sync Service Account IE security settings]

With the configuration completed, go back and change back the Forefront Identity Manager Service Accounts’ local permissions so it can’t logon locally.

Your Import / Export Granfeldt PowerShell Management Agent Scripts can now use Invoke-WebRequest where the requests/responses use the Internet Explorer COM API and respect the Internet Explorer security settings for the user profile that the requests are being made by.

Hopefully this helps someone else that is wondering why the scripts that work standalone fail when operating under the FIM/MIM Sync Service Account by the Granfeldt PowerShell Management Agent.