Error 25009 HResult 0x80131700 when installing Microsoft Identity Manager

This week I was installing Microsoft Identity Manager in a new environment and wasn’t using my usual scripts that semi-automate the process. During the installation of the Microsoft Identity Manager Synchronization Service I got Error 25009 HResult 0x80131700 as shown below.

FIM Sync Server Installation Failed Configurating SQL.PNG

As mentioned above I normally do this semi-automated, but this time I was updating a number of those scripts so I was starting with a fresh install on a Windows Server 2016 host.

Note: Windows Server 2019 isn’t an officially supported platform currently.

Re-running the install with an installation log (as I detail in this post) didn’t help much; the install log showed an error but not much more.

Additional research indicated that this error can be caused by three different scenarios;

  • insufficient permissions on the SQL Server
  • missing SQL Native Client on the Microsoft Identity Manager Sync Server
  • missing .NET Framework 3.5

Checking the SQL Server I could see that the login for the Sync Server Service had been created, so that discounted the first two. Looking at the installed applications on the Sync Server confirmed that I did not have the .NET Framework 3.5 installed.

Install NET 3.5.PNG

Looking back at my automation scripts one of my first lines is;

Install-WindowsFeature NET-Framework-Core
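
If you just want to confirm whether the feature is already present before re-running setup, a quick query of the feature state works; a minimal sketch using the built-in ServerManager cmdlets on Server 2016:

# Check the install state of the .NET Framework 3.5 core feature
Get-WindowsFeature -Name NET-Framework-Core | Select-Object Name, InstallState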
Following installation of the .NET Framework 3.5, re-running the setup got me up and running.
Completed Successfully.PNG

Summary

In 2019 installing Microsoft Identity Manager 2016 with SP1 still has the same dependencies that Identity Lifecycle Manager had in 2007. .NET Framework 3.5 however isn’t installed by default on Server 2016 (.NET Framework 4.x is). If nothing else this will jog my memory for next time.

Loading and Querying Data in Azure Table Storage using PowerShell

As part of both a side project and a work project I recently had a couple of larger datasets that I needed to put into a database and be able to search. I had previously used Azure Blob Storage but hadn’t done much with Azure Table Storage. Naturally I needed to use PowerShell to perform this, and I quickly found out that the AzureRM PowerShell Module could do the basics but wasn’t going to scale to the size of the datasets I had. Some trial and error later I got it to do what I needed, and then using the Azure Table Service REST API I was able to easily query the dataset.

Note: I initially loaded one of the datasets (~35k rows) a row at a time, which took close to 5 hours. Once I got Batch Operations working for dataset insertion into Table Storage I got that down to ~4 minutes. This post details how I did it.

Prerequisites

You will need;

  • the Azure Storage Explorer, which you can get from here
  • an Azure Storage Account
    • you can create one through many different methods such as the Azure Portal UI, Azure Portal CLI, Azure CLI, PowerShell etc.
  • PowerShell 5.1 or later and the AzureRM PowerShell Module

Creating an Azure Table Storage Table

Using the Azure Storage Explorer, authenticate to Azure and navigate to your Storage Account. Expand the Storage Account, select Tables, right-click and select Create Table. Give the Table a name and hit enter. You can also create the Table via the PowerShell AzureRM Module or the Azure CLI (a sketch follows below), but it is super quick and easy using the Azure Storage Explorer, which will also be used later to verify the loaded dataset.
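
For reference, a minimal sketch of the AzureRM route; the resource group, storage account and table names below are placeholders for your own:

# Get the Storage Account context and create the Table
$storageAccount = Get-AzureRmStorageAccount -ResourceGroupName "myResourceGroup" -Name "mystorageaccount"
New-AzureStorageTable -Name "NICVendors" -Context $storageAccount.Context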

Using the Azure Storage Explorer I created a Table named NICVendors.

Create Azure Table Service Table.PNG

Loading Data into Azure Table Service

The example data I will use here is the dataset from a post last year for MAC Address Vendors lookup. Rather than exporting to XML I will load it into Azure Table Storage. The following script will obtain the Vendors list from here and save it to your local disk. This will provide ~26k entries and is a good test for loading into Azure Table Service.

Update Line 3 for where you want to output the file to.
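
The script itself is embedded in the original post; as a stand-in, the download step looks something like this minimal sketch (the source URL here is a placeholder for the one linked above):

# Download the MAC Address Vendors list and read it into memory
$vendorListUrl = "https://example.com/oui.txt"      # placeholder for the real source URL
$outputFile = "C:\temp\MACVendors.txt"              # Line 3: where to output the file
Invoke-WebRequest -Uri $vendorListUrl -OutFile $outputFile
$vendorList = Get-Content -Path $outputFile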

With the dataset in memory we can parse it and insert each row into the table. The quickest method is to batch the inserts. The maximum number of rows allowed in a batch is 100, and every row in a batch must use the same PartitionKey. A sketch of the batching pattern follows the list below.

Update:

  • Line 2 for your Azure Subscription
  • Line 3 for the Resource Groups your Storage Account is located in
  • Line 4 for the name of your Storage Account
  • Line 5 for the name of the Table you created
  • Line 6 for the name of the Partition for the dataset
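
The full script is embedded in the original post, but the core of the batching pattern looks something like this minimal sketch; the row collection, partition name and property/row-key names are assumptions for illustration (the Microsoft.WindowsAzure.Storage types come with the AzureRM module):

# Get a reference to the Table via the Storage Account context
$saContext = (Get-AzureRmStorageAccount -ResourceGroupName "myResourceGroup" -Name "mystorageaccount").Context
$table = Get-AzureStorageTable -Name "NICVendors" -Context $saContext

# Build a batch of up to 100 inserts, all sharing the same PartitionKey
$batch = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.TableBatchOperation
foreach ($row in $batchOf100Rows) {
    # PartitionKey = dataset partition, RowKey = a unique key (here, the MAC prefix)
    $entity = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.DynamicTableEntity -ArgumentList "MACVendors", $row.MACPrefix
    $entity.Properties.Add("VendorName", [Microsoft.WindowsAzure.Storage.Table.EntityProperty]::GeneratePropertyForString($row.VendorName))
    $batch.Add([Microsoft.WindowsAzure.Storage.Table.TableOperation]::InsertOrReplace($entity))
}
# Execute the whole batch as a single operation
$table.CloudTable.ExecuteBatch($batch)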

With my Storage Account being in Central US and myself in Sydney Australia loading the ~26k entries took 4 mins 27 seconds to insert.

Azure Table Storage Rows Inserted.PNG

Querying Azure Table Service using the REST API and PowerShell

To then query the Table entries to find results for a Vendor, the following script can be used (a minimal sketch follows the list). Change;

  • Line 2 for your Storage Account name
  • Line 3 for your Storage Account Key (which you obtain from the Azure Portal for your Storage Account)
  • Line 4 for your Table name
  • Line 20 for the Vendor to query for
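
Again the full script is embedded in the original post; a minimal sketch of the same idea, querying the Table Service REST API with SharedKeyLite authentication, looks something like this (the account, key, table and VendorName property names are assumptions):

$storageAccount = "mystorageaccount"           # Line 2: your Storage Account name
$accountKey = "<your storage account key>"     # Line 3: your Storage Account Key
$tableName = "NICVendors"                      # Line 4: your Table name
$vendor = "Dell Inc."                          # Line 20: the Vendor to query for

# Build the SharedKeyLite signature: Date + newline + canonicalized resource
$date = (Get-Date).ToUniversalTime().ToString("R")
$stringToSign = "$date`n/$($storageAccount)/$($tableName)()"
$hmac = New-Object System.Security.Cryptography.HMACSHA256
$hmac.Key = [Convert]::FromBase64String($accountKey)
$signature = [Convert]::ToBase64String($hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToSign)))

$headers = @{
    "x-ms-date"     = $date
    "x-ms-version"  = "2017-04-17"
    "Accept"        = "application/json;odata=nometadata"
    "Authorization" = "SharedKeyLite $($storageAccount):$($signature)"
}

# Query entities in the Table filtered on the VendorName property
$filter = [uri]::EscapeDataString("VendorName eq '$vendor'")
$uri = "https://$($storageAccount).table.core.windows.net/$($tableName)()" + '?$filter=' + $filter
(Invoke-RestMethod -Method Get -Uri $uri -Headers $headers).value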

Executing the script to query for Dell Inc. returns 113 entries. The graphic below shows one.

Search Azure Table Storage Response.PNG

Summary

Using the AzureRM PowerShell Module and the TableBatchOperation class from Microsoft.WindowsAzure.Storage.dll we are able to batch the record inserts into batches of 100 rows.

Using the Azure Table Service REST API we are able to quickly search the Table for the records we are looking for.

SailPoint IdentityNow Governance Groups Management Agent for Microsoft Identity Manager

Last week I posted a SailPoint IdentityNow Roles Management Agent for Microsoft Identity Manager. Today I’m posting a sister to it, an IdentityNow Governance Groups Management Agent.

I’ve posted about Governance Groups before. See Managing SailPoint IdentityNow Governance Groups via the API with PowerShell. That post details creating and managing Governance Groups via the API.

This Management Agent is essentially the enumeration of Governance Groups in IdentityNow via API wrapped up in a PowerShell Management Agent. You can extend the management agent for managing Governance Groups to fit your needs.

Prerequisites

  • On your MIM Sync Server you will need the PowerShell Community Extensions (PSCX) for the Get-Hash cmdlet
  • Authentication to IdentityNow for Governance Groups can leverage the v2 Authentication method. I cover enabling that in this post here. You can also use the v3 Authentication method I detail here. If you do that you will need to appropriately secure the extra credentials as I show in the Roles Management Agent.
  • The Management Agent leverages the Granfeldt PowerShell Management Agent. Start here to get up to speed with that. As detailed above this is an Import only MA, so I’m not providing an Export Script and the Password script is redundant. The script files need to be present but will be empty

Schema Script

The Schema Script below covers the core attributes associated with IdentityNow Governance Groups.
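
The actual script is embedded in the original post; a minimal sketch of a Granfeldt PSMA schema script for a Governance Group object looks something like this (the attribute names are assumptions based on the IdentityNow Governance Group object):

# Granfeldt PSMA schema convention: emit one template object, with the anchor
# attribute prefixed "Anchor-" and each attribute name suffixed with its data type
$obj = New-Object -TypeName PSCustomObject
$obj | Add-Member -MemberType NoteProperty -Name "Anchor-id|String" -Value "1"
$obj | Add-Member -MemberType NoteProperty -Name "objectClass|String" -Value "GovernanceGroup"
$obj | Add-Member -MemberType NoteProperty -Name "name|String" -Value "string"
$obj | Add-Member -MemberType NoteProperty -Name "description|String" -Value "string"
$obj | Add-Member -MemberType NoteProperty -Name "memberCount|String" -Value "string"
$obj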

Import Script

The Import Script, unlike the one for the Roles Management Agent, can use v2 Authentication. As such we don’t need to perform additional effort to provide the necessary credentials.

Your v2 IdentityNow credentials need to be provided on the Management Agent Connectivity Configuration page. The Username and Password Authentication options take the v2 API Client ID and API Client Secret respectively.

IdentityNow Governance Groups MA Connectivity Username Password

NOTE: The Import Script is also configured to page the import of Governance Groups. The Page Size is configured in your Run Profile.

Make the following update for your implementation;

  • Line 24 for your IdentityNow Orgname

Customisation

What you want to do with it will determine how you want Identity Manager to consume the data. You will likely want to;

  • Create a new ObjectType in the Metaverse along with the attributes associated with IdentityNow Governance Groups
  • Flow the information in and perform any logic
  • Create an Export Script that will;
    • create Governance Groups in IdentityNow
    • update Governance Groups in IdentityNow

Summary

Using this base management agent we can get connectivity and visibility of IdentityNow Governance Groups in Microsoft Identity Manager.

Lithnet Password Protection for Active Directory


Today Ryan Newington released the latest open source project from Lithnet: Lithnet Password Protection for Active Directory.

I’ve posted extensively about leveraging Lithnet services in conjunction with Microsoft Identity Manager. In fact many of the solutions I’ve built for customers just wouldn’t be as functional without Ryan’s extensive contributions to the Microsoft Identity Management community under the Lithnet brand.

What is Lithnet Password Protection for Active Directory?

With the latest offering I had the opportunity to test a few elements of the solution before release, mostly around the Pwned (Compromised) Password functions that leverage the Have I Been Pwned NTLM datasets (available here). This functionality provides the ability to;

  • add the Have I Been Pwned NTLM dataset to the Lithnet Password Protection Compromised Passwords Data Store that will prohibit those passwords from being used
  • allow administrators to test an individual user’s Active Directory password against the Have I Been Pwned NTLM dataset to identify if the password has been compromised
  • allow administrators to test the users of an entire Active Directory domain against the Have I Been Pwned NTLM dataset to identify if their passwords have been compromised
  • on password change against Active Directory, only permit passwords that don’t appear in the compromised or banned password lists in the Lithnet Password Protection for Active Directory data store

In addition, the Lithnet Password Protection for Active Directory solution allows more granular definition of your Active Directory Password Policy, e.g. rewarding a longer password with relaxed complexity requirements.

Whilst Microsoft does have something similar (for banned passwords), it is still in preview, and for a subset of the functionality you will need to be a Microsoft Azure AD Premium licensed customer. You also can’t ingest the Have I Been Pwned password dataset as a Custom Banned Password List; you’re left hoping Microsoft’s list has a good overlap with those datasets.

Conclusion

If you’re like the majority of organisations I’ve consulted for, you are currently hoping that your existing password policies (length, complexity, rotation period etc) along with implementing Multi-Factor Authentication will provide you with a balance between end-user usability and security posture.

What you ultimately require however is the peace of mind that your end-users’ accounts in your on-premise Active Directory and Azure Active Directory don’t have passwords that will be consistently used in Password Spray and Password Brute Force attacks.

Until we get further down the path to Passwordless Authentication this is the best protection you can have today against two of the common password based attacks.

SailPoint IdentityNow Roles Management Agent for Microsoft Identity Manager

This is the first post in a series where I will provide a number of base-level Management Agents for Microsoft Identity Manager to integrate with SailPoint IdentityNow. Whilst the two products have areas of competing/equivalent functionality, there are other aspects where the two complement each other. Whilst that is not the purpose of this post, through the series of upcoming posts it will be relatively easy to extrapolate how the two products can happily co-exist and orchestrate each other for certain functions.

This Management Agent is for Microsoft Identity Manager to have visibility of IdentityNow Roles (see Customisation at the end for more functionality).

For more information on IdentityNow Roles see this post where I detailed creating Roles as well as updating/managing them via API. The MA also consumes whether the Role is requestable, which I covered in this post.

Notes

  • The Management Agent is a Full Sync only Management Agent. This is because the IdentityNow API doesn’t expose differential style requests. That is also why this is a single function Management Agent (just for Roles).
  • The Management Agent is configured for Paging the results back into Identity Manager. For more details on that see this post.

Prerequisites

  • On your MIM Sync Server you will need the PowerShell Community Extensions (PSCX) for the Get-Hash cmdlet
  • The Management Agent uses IdentityNow v3 Authentication. You will need to request the API Keys from your SailPoint Customer Success Manager. Details on v3 Authentication can be found in this post
  • The Management Agent leverages the Granfeldt PowerShell Management Agent. Start here to get up to speed with that. As detailed above this is an Import only MA, so I’m not providing an Export Script and the Password script is redundant. The script files need to be present but will be empty

Schema Script

The Schema Script below covers the core attributes associated with IdentityNow Roles.

Import Script

As IdentityNow v3 API Authentication requires a number of artifacts, we need to make sure we secure them all appropriately.

For the Admin Username and Password we will do that by exporting them to an XML file using Export-CLIXML and then in the Import Script, import them using Import-CLIXML. Those cmdlets respect the context by which the credentials were exported and will only be able to access them when imported under that same context. As our Management Agent will be run by the MIM Sync Server Service Account we need to create the credentials file using that login. To do that;

  • temporarily reconfigure the MIM Sync Service Account so that it can logon locally
    • On the MIM Sync Server open Local Security Policy => Local Policies => User Rights Assignment => Deny log on locally and remove the MIM Sync Server Service Account
    • repeat for Deny access to this computer from the network
  • Logon to the MIM Sync Server using the MIM Sync Server Service Account
  • Run the following to create the credential file and put the credential file in the Extensions\yourRolesMA directory
# Build the credential under the MIM Sync Service Account context
$adminUSR = [string]"Partner_Admin".ToLower()
$adminPWDClear = 'myStr0ngP@$$w0rd'
# Convert the clear-text password to a SecureString and build the PSCredential
$adminPWD = ConvertTo-SecureString $adminPWDClear -AsPlainText -Force
$Credentials = New-Object System.Management.Automation.PSCredential $adminUSR, $adminPWD
# Export the credential; it can only be re-imported under this same account context
$Credentials | Export-Clixml c:\temp\RoleAdminCred.xml
  • IMPORTANT: Add the MIM Sync Server Service Account back into the Deny access to this computer from the network and Deny log on locally policies
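
In the Import Script the credentials are then re-hydrated under the MIM Sync Service Account context; a minimal sketch (the path is illustrative):

# Import the credentials exported above; this only works under the same account context
$Credentials = Import-Clixml "C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Extensions\yourRolesMA\RoleAdminCred.xml"
$adminUSR = $Credentials.UserName
$adminPWD = $Credentials.GetNetworkCredential().Password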

The IdentityNow v3 API Credentials are stored on the Management Agent Connectivity Configuration page. The Username and Password Authentication options take the v3 API Client ID and API Client Secret respectively.

MA Configuration Username Password ClientID Client Secret.PNG

Make the following updates for your implementation:

  • Line 24 for your IdentityNow Organisation name
  • Line 27 for the location and name of the credentials file created above

Customisation

What you want to do with it will determine how you want Identity Manager to consume the data. You will likely want to;

  • Create a new ObjectType in the Metaverse along with the attributes associated with the Roles
  • Flow the information in and perform any logic
  • Create Roles in IdentityNow
  • Update Roles in IdentityNow

Summary

Using this base management agent we can get connectivity and visibility of IdentityNow Roles in Microsoft Identity Manager.

Managing SailPoint IdentityNow Tasks with PowerShell

SailPoint IdentityNow Tasks via API using PowerShell

In SailPoint IdentityNow when using the Request Center, tasks are created for activities that are not able to be automatically (directly) fulfilled. Essentially completion of the request requires someone to do something, then return to the IdentityNow Portal and flag the Task as complete. What if we want to see what Tasks are open and flag them as complete through external automation?

Well, this SailPoint IdentityNow Compass article gives the only background to using the API to get visibility of Tasks. But what we needed to do was;

  • enumerate Tasks that were pending for Flat File Sources
  • understand the pending operations and complete them
  • mark the Task(s) as complete

This post will cover the first and last bullet points. Performing the operation will be dependent on what you have integrated with and what is being requested for an Entitlement.

Prerequisites

I’m using a v3 Token to access the IdentityNow APIs. I detailed that in this post here. If you don’t have API keys for the v3 endpoint you can use this method to get the oAuth JWT Token. You will also need to make sure that you don’t have any Content-Type set in your headers. If you do then you will get an error message like this;

  • Missing or invalid arguments

API Error when ContentType is set.PNG

Enumerating all Tasks

To enumerate all tasks we need to call the API /task/listAll. Using PowerShell and the access token from one of the methods listed in the prerequisites we can make the following call.

$orgName = "myOrgName"
$tasksURI = "https://$($orgName).identitynow.com/api/task/listAll"
# List all Tasks, authenticating with the v3 oAuth token from the prerequisites
$tasksList = Invoke-RestMethod -Method Get -Uri $tasksURI -Headers @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"}
$tasksList.items

A task then looks like this;

Manual Task Details.PNG

Searching for and retrieving an Individual Task

To retrieve an individual task we need to know the ID of the task. If we know what we are looking for we can use PowerShell to locate the specific task, get the ID then get that individual task.

The following command shows looking through the Task items and finding tasks that are ManualAction, are not completed and contain Luke in the description.

$manualActions = $tasksList.items | Where-Object {$_.type -eq "ManualAction" -and $_.complete -eq $false -and $_.description -like "*Luke*"}
$taskID = $manualActions.id

With our Task identified we can then retrieve it using the API /task/get/{taskID}

$utime = [int][double]::Parse((Get-Date -UFormat %s))
$getIdvTaskbyIDURI = "https://$($orgName).api.identitynow.com/cc/api/task/get/$($taskID)?_dc=$($uTime)"
$indvTask = Invoke-RestMethod -method Get -uri $getIdvTaskbyIDURI -Headers @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"}
$indvTask

Looking at it, it is the one we wanted.

Individual Task.PNG

Completing a Task

With the task ID we can then update the Task and mark it as completed. To complete the task we make a POST request to task/complete/{taskID}

$completeTaskURI = "https://$($orgName).api.identitynow.com/cc/api/task/complete/$($taskID)"
$completeTask = Invoke-RestMethod -method Post -uri $completeTaskURI -Headers @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"}
$completeTask

The API will return the Task object on success.

Complete Task.PNG

Looking in the IdentityNow Portal we can see that the Task is showing as Completed.

Complete in Portal.PNG

Summary

Using the IdentityNow Task APIs we can get a list of Tasks, search and find the Task we are looking for, retrieve all of its details and finally update the status of the Task to complete.

Batching Microsoft Graph API Requests with JSON Batching and PowerShell

Microsoft Graph JSON Batching

Late in 2018, new functionality in the Microsoft Graph API for batching API requests into a single request came to my attention. As I predominantly use PowerShell for scripting against Microsoft Graph, parallel requests historically required extra functions to achieve something similar; use of Invoke-Parallel, for instance, which I’ve previously discussed in posts such as How to create an Azure Function App to Simultaneously Start|Stop all Virtual Machines in a Resource Group.

Fast forward to 2019 and I’ve been building a bunch of reports from Microsoft Graph that aggregate data from multiple API endpoints such as /users, /auditLogs and /security. A perfect time to explore JSON Batching with Microsoft Graph more deeply.

Getting Started

Essentially your Graph API requests are the same, but instead of calling the API endpoint directly, you make a POST request to the Batch endpoint …

https://graph.microsoft.com/v1.0/$batch

… with your request in the body that includes the suffix for the API URI for the request, the request method and an ID. The example below is to query the User API endpoint for users whose displayName starts with the string provided.

$myRequest = [pscustomobject][ordered]@{
   id=$requestID
   method="GET"
   url="/users?`$filter=(startswith(displayName,`'$($userSearchNameEncoded)`'))&`$top=10"
}

Additional requests are added as required to build the Request Body and passed as part of the Post request to the Batch endpoint. Each additional request in the batch requires a different ID and the respective particulars for the request(s).

Batching Requests using PowerShell

Below is an example for generating a batch in PowerShell with two requests.
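
The full script is embedded in the original post; a minimal sketch of building a two-request batch body looks something like this (the endpoints are illustrative, and $requestHeaders is assumed to hold your Authorization header plus Content-Type application/json):

$batchUrl = 'https://graph.microsoft.com/v1.0/$batch'

# Two independent requests, each with a unique id
$request1 = [pscustomobject][ordered]@{
    id     = "1"
    method = "GET"
    url    = '/users?$top=5'
}
$request2 = [pscustomobject][ordered]@{
    id     = "2"
    method = "GET"
    url    = '/auditLogs/directoryAudits?$top=5'
}

# Wrap the requests in the batch body and convert to JSON
$batchBody = @{requests = @($request1, $request2)} | ConvertTo-Json -Depth 4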

Executing the request then looks like this;

$getBatchRequests = Invoke-RestMethod -Method Post -Uri $batchUrl -Body $batchBody -headers $requestHeaders

Responses

The responses come back under the Responses property as shown in the graphic below. The responses have the ID of the requests associated with them so you can associate a response with a request.

Microsoft Graph API JSON Batching 1.PNG

The request responses can then be parsed by iterating through them.

Summary

Using Microsoft Graph JSON Batching we can submit multiple requests as a single job, have the requests performed on the backend in parallel and get the results back.

Keep in mind that a request can also use the dependsOn option to make it dependent on another request that must be completed first. Also, if a response contains a paged dataset, check the @odata.nextLink property on the response and make subsequent calls to return the remaining data.
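
For reference, dependsOn is just another property on a request object; a hedged sketch of a request that waits on request 1:

# This request will only execute after the request with id "1" has completed
$request2 = [pscustomobject][ordered]@{
    id        = "2"
    method    = "GET"
    url       = '/auditLogs/directoryAudits?$top=5'
    dependsOn = @("1")
}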

Update your Azure Sphere OS to 18.11 before Jan 15 2019

Azure Sphere

Update your Azure Sphere OS from 4.2.1 to 18.11 before January 15, 2019

Azure Sphere development kits became available in September of 2018. I had pre-ordered one and received it very quickly from Seeed Studio. I even wrote up my first impressions of it here.

On November 5 Microsoft announced that there was an OS update coming for the Azure Sphere development board the week of November 12.

After the release of 18.11, we encourage you to upgrade your device OS and SDK as soon as possible. After the release, devices that run the TP 4.2.1 release won’t receive any OTA updates for either device or application software. However, device authentication and attestation will continue to work to authenticate to Azure IoT Hub. TP 4.2.1 will continue to be supported until January 15, 2019. Thereafter, devices that are running the TP 4.2.1 OS won’t be able to authenticate to Azure IoT Hub.

Along with the change to the OS version naming, new features were detailed in the release notes here;

    • Real-time clock (RTC). A Beta API enables applications to set and use the internal clock and leverages support for using a coin-cell battery to ensure the RTC continues to keep time when power is lost.
    • Mutable storage. A Beta API provides access to a maximum of 64k for storage of persistent read/write data.
    • External MCU update. A reference solution shows how your application can update the firmware of additional connected MCUs.
    • Private Ethernet. The MT3620 now supports connecting to a private, 10 Mbps network via the Microchip Ethernet part over a serial peripheral interface (SPI). This functionality allows an application running on the A7 chip to communicate with devices on a private network via standard Transmission Control Protocol (TCP) or User Datagram Protocol (UDP) networking. Stay tuned to the Azure Updates website for more information about this capability.
    • Beta API targeting. Beta APIs are still in development and may change in or be removed from a later release. Starting with this release, we make them available for testing and feedback so that you can get a head start on using new features. You can target applications for either the production APIs or the production and Beta APIs.

Upgrading Azure Sphere OS to 18.11

The upgrade process is manual from 4.2.1 to 18.11. The details are described in the update section of the release notes here. Essentially the process is;

  • Update your Azure Sphere SDK
  • Move the Azure Sphere device to a System Software Only Group
  • Upgrade the Azure Sphere OS
  • Verify the device’s status

With the updated SDK installed, running the commands was quick and successful.

Move Device to System Only Group
Azure Sphere OS Updated

Summary

You’ve got 10 days left to update your Azure Sphere OS before it will no longer be able to connect to Azure IoT Hub. The update is quick and painless (unless you have dozens to update).

Is Identity Management still relevant in 2019?

is identity relevant in 2019

The last three years have been a blur. Over the holiday period I’ve been reflecting on my professional experiences over the last few years whilst also considering the future of identity and access management from my perspective as an architect and consultant. Is Identity Management still relevant in 2019? More on that further below, but first a quick recap.

2016 & 2017 Recap

After changing employer and role at the end of 2015 I assimilated into an organisation with a much different culture, and by mid 2016 I had started to embrace the company values of supporting the industry and community that supports us. This was quite a change from my previous 23 years of employment, where experience, skills and knowledge were considered internal intellectual property and not to be shared/discussed on public forums.

Identity and Access Management was also a changing industry. Enterprise customers embracing cloud computing were driving requests for solutions relating to hybrid identity management. The incoming work relating to cloud identity enablement was over and above the traditional identity sync and messaging services that I’d been seeing for the previous 5+ years (with respect to Cloud). Enterprises were asking for solutions that weren’t out-of-the-box and couldn’t be achieved with just configuration (but then Identity never really has been).

Personal Commentary

In early 2017 I took out a web hosting plan and installed WordPress and started to cross-post the content I write for Kloud along with additional posts. Those posts along with numerous other community activities I undertook saw me rewarded with Microsoft MVP status for Enterprise Mobility (Identity and Access Management).

My posts on this blog are mostly a narrative on my professional activities, along with tangential exploration of new and emerging technical areas of interest. They cover many topics, but a few recur regularly: Identity & Access Management, Azure Serverless Services, Microsoft Identity Manager, SailPoint IdentityNow, Internet of Things, PowerShell and Containerization.

At the start of 2018 I made a conscious effort to write more, specifically my experiences learning new technologies/services primarily as a reference for myself but also with a greater desire to give back to the industry and community I’d been forced to be a passive participant of for too long.

2018 Summary and Reflections

In 2017 I wrote 55 posts (in line with my target of an average of one post a week), and in 2018 that increased to a crazy 72. With that, my blog saw an increase in traffic, with visitors up 177% and views up 203% from 2017. I don’t write for any reasons other than those stated above, but it is nice to have some data to show it has some relevance and is of benefit to others.

Visitors and Views 2017 vs 2018

What I do find interesting is where the readers are coming from: 41% from the USA, just under 14% from Australia, 5% from the UK, then India, Germany, the Netherlands and so on.

I find this interesting as there may be some correlation between the content and the location of like-minded individuals. Leading a growing team of Identity Professionals working on projects that aren’t the traditional on-premise Identity Sync style of projects we’ve been doing for the last 20 years comes with an increased difficulty of talent acquisition. It is rare to find Identity Professionals that have the traditional IDAM skills but also understand Cloud Services, SaaS and PaaS offerings, and how to integrate with APIs.

Blog Audience Summary

My top 3 posts for 2018 were;

It isn’t surprising then that 2 of the top 3 posts are associated with integration of Identity between tenants in an IDentity as a Service (IDaaS) offering. It has become one of the common themes our customers are requesting solutions to, in order to solve their inter- and intra-company collaboration enablement problems. The other post, ironically, is about what happens when Cloud Services that are expected to always be on go into a transient state.

What will 2019 bring? Is Identity Management still relevant?

Extrapolating from posts I’ve made in the last two years, customer requests, and current and planned projects, there are a couple of themes developing. Identity is more relevant than it ever was;

  1. In Australia we are seeing traditional enterprises moving their Human Capital Management (HCM) services to Cloud SaaS providers (think SAP Success Factors and Workday)
    • this requires a re-think on authoritative source integration for Identity Services
    • it is an opportune time to plan a re-think of identity data models to drive higher-value identity-driven capabilities such as Role Based Provisioning and Role Based Access Control
  2. Hybrid Identity (On Premise <=> Cloud) needs to be solved by all Enterprises
    • We are seeing enterprises with traditionally lower levels of investment in IT Services finally having sweated their assets to the point they almost need to re-architect their entire estate
    • When an environment;
      • is still on physical infrastructure
      • is up to 4 major versions behind the current offerings (think Windows Server 2008/2008 R2, Microsoft Exchange 2010)
      • requires users to use legacy VPN style connections to access resources remotely
      • doesn’t support modern workplace working models and mobility isn’t even an option
        • then it requires new blood to sponsor and drive the re-organisation, along with fresh thinking to develop the Strategy and Roadmap, with new patterns to accelerate adoption and continually deliver improvements
          • and one of the key cornerstone drivers of these projects is identity
    • We are also seeing large enterprises that have embraced Cloud Services, but have approached them as tactical integrations into their existing environments. At scale though this often results in a disjointed series of siloed identity repositories and a very confusing user experience from differing Login IDs and Passwords.
      • We are assisting customers with planning for a holistic identity driven end-state and then re-architecting the integration of services to provide a consistent and predictable experience for Provisioning, Lifecycle Management, Separation as well as the end-user experience
  3. Historical On Premise Identity Management implementations need an uplift
    • Similar to the sweating your assets comment above we are often evaluating customers existing IDAM implementations based on heritage On Premise releases (such as Microsoft Forefront Identity Manager, Novell Identity Manager, Tivoli Identity Manager and Oracle Identity Manager).
    • Having designed and built numerous IDAM implementations on these products it is disheartening to still see implementations only doing what they were originally developed to do when implemented ~7+ years ago
      • When functionality has been enhanced it is often via a different solution in parallel with the foundation one, or even worse via manual operations performed to achieve the desired outcome
    • Many of the products listed in the first bullet point above have evolved, but so have the requirements of an enterprise organisation.
      • Do you rip and replace or uplift and extend? We are seeing requests for both

What will I be working on in 2019?

Hybrid Identity Management Solutions

My last two major Identity projects have seen me architect solutions that are a hybrid of traditional on-premise Identity Management products with Cloud PaaS and SaaS services along with IDaaS providers. As the PaaS and SaaS offerings mature and IDaaS services achieve higher functionality, there will be more demand to augment existing IDAM implementations with them; and in the case of IDaaS and traditional IDAM products, the question becomes where functionality X best resides. The upside to all of this is less bespoke design and development, and more configuration and dissemination of IDAM functionality into micro-services.

Hackathon

In 2018 I entered my first Hackathon with my Voice Assistant for Microsoft Identity Manager. Hackathons were something I had been observing for a while but not something I ever thought I would be a part of. It is highly likely I will be part of another in 2019, naturally in my sphere of Identity Management.

Internet of Things

IoT is something I’ve messed with long before the term IoT became a three-letter acronym. In 2018 I even gave a talk on the Internet of YOUR Things at the Global Azure Bootcamp.

IoT integration with Cloud Services obviously also has an Identity component, and one that I’m keen to keep on top of. From physical IoT Devices to Bots integrated with IDAM implementations, I foresee myself continuing to tinker and work out what managing non-heartbeat identities for corporate enterprises will mean from an IDAM perspective in the future.

In closing

Whilst this post started as a simple reflection on my last three years and a changing industry with a viewpoint on the future, it seems to have become a lot deeper and more meaningful in the middle. If you have made it this far, well done. You have the endless patience of an Identity Consultant.

Is Identity Management still relevant in 2019? Absolutely. What and how we define Identity Management is changing quickly, but it is more relevant than ever.

What is your analysis of the current state of Identity and Identity Management in 2019? Let me know on Twitter, LinkedIn or in the comments below.

Azure Self Service Password Reset Reporting using PowerShell

Just over 18 months ago I wrote this post on using PowerShell and oAuth to access the Azure AD Reports API to retrieve MIM Hybrid Report data. This week I went to re-use that for Azure Password Reset Reporting and found out that the API had been deprecated.

API Deprecated.PNG

Using the error information (which was actually informative) I proceeded to the new API. Having authenticated as I had in the previous article, I executed the following to retrieve a list of the Audit Reports available.

$TenantDomain = "customer.com.au"
$url = 'https://graph.windows.net/' + $TenantDomain + "/activities/auditActivityTypes?api-version=beta"
# Returns a JSON document for the Audit Activity Types
$auditActivities = (Invoke-WebRequest -Headers $headerParams -Uri $url).Content | ConvertFrom-Json
$auditActivities.activityTypes

Before I go any further it’s a good time to mention you can easily access the Audit log events using the Azure Portal, retrieve the data you may be looking for and download a report. There is also the Azure Audit logs content pack for PowerBI as detailed here. Those are awesome solutions, but if you want to do something a little more bespoke and programmatic then keep reading.

Azure AD Password Events Audit Log Data

For the record (as at 18 Dec 2018) there are 1023 different Activity Resource Types. The full list is available here. The ones I’m interested in right now though are a subset of the Self-service Password Management category;

SSPR Audit Reports
After investigation, the API to get the information I was after is now available on the Microsoft Graph Beta Audit Logs endpoint auditLogs/directoryAudits
https://graph.microsoft.com/beta/auditLogs/directoryAudits 

My Azure AD Registered App needed to be updated to have the additional role (AuditLog.Read.All) which was done via the Registered Applications blade under Azure Active Directory in the Azure Portal;

AuditLog.Read.All

My script then needed to be updated to talk to the Microsoft Graph and the new scope;

$resource = "https://graph.microsoft.com"
$scope = "AuditLog.Read.All; Directory.Read.All"

So to get Azure Password Reset events for the last week the following calls can be made;

# Get Reset Password Events
$DirectoryAuditURL = "https://graph.microsoft.com/beta/auditLogs/directoryAudits"
$passwordMgmtAuditData = Invoke-RestMethod -Method Get -Uri "$($DirectoryAuditURL)?`$filter=category eq `'UserManagement`' and activityDateTime ge 2018-12-11T03:15:10.6837342Z and startswith(activityDisplayName%2C+`'Reset password`')" -Headers @{Authorization = "Bearer $($Global:accesstoken)"}

To retrieve the other events contained in the auditLog we just need to alter the event to retrieve and the timeframe of interest. The events from the table above associated with Azure Self Service Password Reset and Azure Change Password are;

Blocked from self-service password reset
Change password (self-service)
Credentials Registered and Password Reset Status of User
Reset password (by admin)
Reset password (self-service)
Security info saved for self-service password reset
Self-serve password reset flow activity progress
Self-service password reset flow activity progress
Unlock user account (self-service)
User completed security info registration for self-service password reset
User registered for self-service password reset
User started security info registration for self-service password reset

The Script

Here is the modified script from my previous post here that uses oAuth to retrieve Azure Password Reset events. As per the other script it enables the scopes required. If you’re not a Global Admin, get the script run initially by someone who has the Global Admin role, or get them to assign the AuditLog.Read.All permission to the Azure AD Application you have created. You can then login and get an Access Token and a Refresh Token.

Update;

  • Line 5 for your Application ClientID
  • Line 6 for your Application Secret
  • Line 18 for where you want to store the Refresh Token
  • Line 97 can be uncommented after getting a token, to use the Refresh Token (and Line 96 commented out). To get a new Access Token using the stored Refresh Token (Line 18) call the Get-NewTokens function

If, due to your time filter, you have more than the maximum number of events (1000) that can be returned per call, you can use the skipToken to get the next page. If you have more than 2000, use the subsequent skipToken in additional calls in a do { get data } while ($skipToken) style loop, adding the returned data to a collection each time. A sketch of that loop follows the snippet below.

$skipToken = $passwordMgmtAuditData.'@odata.nextLink'
$results = Invoke-RestMethod -Method Get -Uri $skipToken -Headers @{Authorization = "Bearer $($Global:accesstoken)"}
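
Put together, a minimal sketch of that paging loop looks like this (it accumulates the pages into $allEvents):

# Collect the first page, then follow @odata.nextLink until exhausted
$allEvents = @($passwordMgmtAuditData.value)
$skipToken = $passwordMgmtAuditData.'@odata.nextLink'
while ($skipToken) {
    $page = Invoke-RestMethod -Method Get -Uri $skipToken -Headers @{Authorization = "Bearer $($Global:accesstoken)"}
    $allEvents += $page.value
    $skipToken = $page.'@odata.nextLink'
}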

If you want to automate the time period to retrieve events for then you can use the Get-Date function and set the time window.

# Date Time for the report. Last 14 days (note HH for 24-hour time)
[string]$strDateTimeNow = Get-Date -Format yyyy-MM-ddTHH:mm:ss
[datetime]$dateTimeNow = Get-Date -Format yyyy-MM-ddTHH:mm:ss
[datetime]$minus14Days = $dateTimeNow.AddDays(-14)
[string]$14daysAgo = Get-Date $minus14Days -Format yyyy-MM-ddTHH:mm:ss

The call for the last 14 days of events would then be;

$passwordMgmtAuditData = Invoke-RestMethod -Method Get -Uri "$($DirectoryAuditURL)?`$filter=category eq `'UserManagement`' and activityDateTime ge $($14daysAgo)z and activityDateTime le $($strDateTimeNow)z and startswith(activityDisplayName%2C+`'Reset password`')" -Headers @{Authorization = "Bearer $($Global:accesstoken)"}

Summary

Using the Microsoft Graph auditLogs API we can use PowerShell to retrieve events of interest for reporting or other requirements.