Microsoft Identity Manager PowerShell Management Agent for Oracle Internet Directory

Why a FIM/MIM PowerShell Management Agent for Oracle Internet Directory? Why not just use the Generic LDAP Connector for Microsoft Identity Manager? I needed an integration solution that could update an Oracle Database sitting behind Oracle Internet Directory. That meant I required a solution that could use LDAP to get visibility of who/what was in OID, but then make updates into an Oracle DB. I also wanted that functionality contained in a single Management Agent, not one MA for the Database and another for LDAP. Another perfect fit for the Granfeldt PowerShell Management Agent. This post details a Forefront / Microsoft Identity Manager PowerShell Management Agent for Oracle Internet Directory, with a working example to discover/import OID LDAP objects.

If you haven’t used the Granfeldt PowerShell Management Agent (PSMA) before, see the Getting Started with the Granfeldt PowerShell Management Agent section of my Identity Manager Management Agents page here.

Schema Script

Below is my Schema Script for Oracle Internet Directory for the Person/inetOrgPerson objectclass. Depending on what you are using OID for and what the requirements for the OID Management Agent are, you may need to add additional attributes or remove any superfluous ones. I’m using the OID GUID as the anchor.
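As a rough guide, a cut-down sketch of a PSMA Schema script for inetOrgPerson is shown below. The Granfeldt PSMA derives the schema from the object the script returns; the attribute list and sample values here are illustrative only, with the orclguid property marking the anchor.

# Sketch only - trim/extend the attribute list to suit your OID and MA requirements
$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-orclguid|String" -Value "888bba8ef0ad4b59a911aeb63bfc12ab"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "inetOrgPerson"
$obj | Add-Member -Type NoteProperty -Name "cn|String" -Value "Jane Citizen"
$obj | Add-Member -Type NoteProperty -Name "givenName|String" -Value "Jane"
$obj | Add-Member -Type NoteProperty -Name "sn|String" -Value "Citizen"
$obj | Add-Member -Type NoteProperty -Name "uid|String" -Value "jcitizen"
$obj | Add-Member -Type NoteProperty -Name "mail|String" -Value "jane.citizen@customer.com.au"
$obj | Add-Member -Type NoteProperty -Name "telephoneNumber|String" -Value "+61 2 9999 9999"
$obj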

Import Script

Key functions of the Import Script are:

  • Delta Sync (using OID Change Log)
  • Full Sync (based off an LDAP Filter)
  • Paging of Results through the MA

Authentication

Authentication credentials are provided from the Management Agent through to the Import script via the Connectivity tab Username and Password configuration items.

Microsoft Identity Manager Oracle Internet Directory Management Agent Credentials
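Those Connectivity tab values arrive in the Import script through the standard Granfeldt PSMA parameter block, along the lines of the sketch below (the paging parameters are included as they are used later).

# Standard PSMA parameters passed to the Import script (sketch)
param (
    $Username,             # Connectivity tab Username
    $Password,             # Connectivity tab Password
    $OperationType,        # FULL or DELTA, set by the Run Profile
    [bool]$usepagedimport, # true when the Run Profile pages the import
    $pagesize              # page size configured on the Run Profile
)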

Delta Sync

The Import Script uses the OID Change Log to determine objects of interest that have changed since the last sync. The import script writes a watermark file that contains the last changenumber used so it knows on the next sync what to look for. This post here has more details around Changelog.
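As a minimal sketch (the file path and variable names are illustrative, not the MA’s actual values), the watermark handling looks something like this:

# Read the changenumber recorded by the previous delta sync; default to 0 on a first run
$watermarkFile = 'C:\FIMScripts\OID\lastChangeNumber.txt'
$lastChangeNumber = 0
if (Test-Path $watermarkFile) { $lastChangeNumber = [int](Get-Content $watermarkFile) }

# ... query cn=changelog for entries with changenumber >= $lastChangeNumber and process them ...

# Persist the highest changenumber processed so the next delta sync starts from there
$highestChangeNumber = $lastChangeNumber   # updated as changelog entries are processed
$highestChangeNumber | Out-File $watermarkFile -Force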

Full Sync

Full Sync is performing an LDAP query against OID based on an LDAP Filter and bringing through to the Management Agent attributes specified on the MA Configuration. Essentially it is a Management Agent version of the PowerShell LDAP query I detailed here.

Paging of Results

If you have a large OID it’s always a good idea to page the results through the MA. The Import Script below utilises Paging on the Management Agent to process the objects. The method I’ve used in this example is a little different from what I’ve previously posted here and here. Objects returned from OID as per your LDAPFilter (line 207) are split into groups based on the PageSize you have configured for your Run Profile. This is done using the technique shown here for splitting a large collection into manageable chunks.
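A simplified sketch of that chunking approach is below; the input collection is a stand-in for the objects returned by the LDAP search, and $pagesize is the PSMA paging parameter.

# Split the full result set into pages of $pagesize objects (sketch)
$pagesize   = 500
$allResults = 1..1750                       # stand-in for the objects returned from OID
$pages      = [System.Collections.ArrayList]::new()

for ($i = 0; $i -lt $allResults.Count; $i += $pagesize) {
    $end = [Math]::Min($i + $pagesize - 1, $allResults.Count - 1)
    [void]$pages.Add($allResults[$i..$end])
}

# Each element of $pages is then handed back to the Sync Engine one page per import call
$pages.Count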

Configuration Updates

Using the sample Import.ps1 script below, update:

  • Line 10 for the Debug Output Log location
  • Line 12 for the Delta Sync OID Change Log watermark file location
  • Line 161 for the OID Server Name
  • Line 162 for the OID Server LDAP Port
  • Line 179 for the BaseDN to search from
  • Line 207 for the LDAP Filter for OID Objects of interest for the MA

Password and Export Scripts

As per the Getting Started with the Granfeldt PowerShell Management Agent section of my Identity Manager Management Agents page here, these scripts need to be present (but can be empty). What your MA needs to do will determine whether, and how, you implement them. For example, I have implemented Password Sync to an Oracle DB using this method.

Summary

Using the flexibility of the Granfeldt PowerShell Management Agent for Microsoft Identity Manager we can integrate with diverse systems in bespoke ways. Hopefully this post gives you a leg up if you need to integrate with Oracle Internet Directory, keeping in mind that Exports or Password Sync could go to Oracle DBs, not just to OID using LDAPModify.

Querying Oracle Internet Directory (LDAP) with PowerShell

If you are an IT Professional it is highly likely you are very familiar with Microsoft Active Directory and, in turn, PowerShell and LDAP. At some point though you may need to integrate with another LDAP directory such as Oracle Internet Directory, and find it isn’t as straightforward as Active Directory with its rich tooling. I’ve had to create interfaces to numerous LDAP directories over the years, but it’s been quite a long time since I had to integrate with Oracle Internet Directory. That changed recently (as also seen in this post) and I had to get up to speed with it again and work through the gotchas.

This post details a few steps for discovering and integrating with Oracle Internet Directory using PowerShell and the .NET System.DirectoryServices.Protocols.LdapConnection class. We start by connecting using LDP to validate our connectivity and credentials before translating that to PowerShell. You will need:

  • LDAP Servername (or IP Address)
    • check to see you have connectivity to it by being able to resolve the DNS name
  • LDAP Server Port
    • 389 and 636 are default ports for Standard and SSL connections. Chances are OID is on a different Port though
  • Username
    • e.g. cn=ldapUser
  • Password
    • password for the ldapUser Account
  • Bind DN
    • the namespace of the LDAP Directory. e.g. dc=customer,dc=com,dc=au

Testing Connectivity to Oracle Internet Directory using Microsoft LDP

Microsoft LDP (which comes with the Remote Server Administration Tools (RSAT) for Windows operating systems) is the best tool to start with when connecting to a foreign LDAP Directory such as Oracle Internet Directory.

Using the Connection => Connect function and providing the LDAP Server and Port returns the RootDSE information. As shown below, that immediately tells us two important pieces of information: the version of Oracle Internet Directory (11.1.1.5.0) and where the ChangeLog is (more on that later).

LDP Connect to OID

With a connection to Oracle Internet Directory now established we can Bind (connect with credentials). From the Connection menu select Bind.

Simple BIND to OID

With the credentials correct we can then go to the View menu and select Tree.

Tree View of OID

Now we can see the OU Structure of Oracle Internet Directory.


Connecting to Oracle Internet Directory with PowerShell

Now that we have verified that the information we have for the LDAP Server, Directory and connection information is all correct we can try connecting using PowerShell.

Sample PowerShell LDAP Connection Script

Below is a sample PowerShell script to connect to Oracle Internet Directory. Change:

  • Line 3 for your LDAP Username
  • Line 4 for your LDAP Account password
  • Line 5 for the LDAP Servername
  • Line 6 for the LDAP Port
  • Line 9 for the base OU to start searching for users from

Line 18 is configured to only search one level under the base OU. If you have a complex OU structure you may need to change this to Subtree.

The script will then connect to Oracle Internet Directory and find the account we connected with displaying the values of its attributes.
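The full script is built around the .NET LdapConnection class; a minimal, self-contained sketch of the approach is below. The server, port, credentials, search base and filter are all illustrative values you would replace with your own.

# Sketch: connect to OID and search for the account we bound with
Add-Type -AssemblyName System.DirectoryServices.Protocols

$ldapUserName     = 'cn=ldapUser'                          # your LDAP account
$ldapPassword     = 'password'                             # your LDAP account password
$ldapServer       = 'oid.customer.com.au'                  # your LDAP servername
$ldapPort         = 3060                                   # your LDAP port
$ldapSearchBase   = 'cn=Users,dc=customer,dc=com,dc=au'    # base OU to search from
$ldapSearchFilter = '(&(objectclass=inetOrgPerson)(cn=ldapUser))'

# Build the connection and perform a simple (Basic) bind
$identifier     = New-Object System.DirectoryServices.Protocols.LdapDirectoryIdentifier($ldapServer, $ldapPort)
$creds          = New-Object System.Net.NetworkCredential($ldapUserName, $ldapPassword)
$ldapConnection = New-Object System.DirectoryServices.Protocols.LdapConnection($identifier, $creds, [System.DirectoryServices.Protocols.AuthType]::Basic)
$ldapConnection.SessionOptions.ProtocolVersion = 3
$ldapConnection.Bind()

# Search one level under the base OU (change OneLevel to Subtree for a deeper OU structure)
$searchRequest = New-Object System.DirectoryServices.Protocols.SearchRequest($ldapSearchBase, $ldapSearchFilter, [System.DirectoryServices.Protocols.SearchScope]::OneLevel, $null)
$ldapResponse  = $ldapConnection.SendRequest($searchRequest)
$ldapResponse.Entries[0].Attributes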

Below shows running the script and returning the details of the account we connected with.

LDAP PowerShell Connection to Oracle Internet Directory

Immediately you can see the first problem connecting to OID: the attribute values are returned as a Byte Array. This isn’t ideal.
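If you just want to inspect a single value, one way to make it readable (assuming the $ldapResponse object from the sketch above) is to pull the value out as a byte array and decode it:

# Decode one attribute value from the returned byte array into a string
$entry = $ldapResponse.Entries[0]
[System.Text.Encoding]::UTF8.GetString($entry.Attributes['cn'].GetValues([byte[]])[0])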

LDAP Helper Functions

From the PowerShell Gallery and this LDAP Module we can leverage the Get-LdapObject and Expand-Collection functions, which will convert the LDAP responses for us. Put these two functions at the top of your script (or in a separate .ps1 file that you dot-source at the beginning of your PowerShell LDAP script).

The LDAP Request and Response change a little to use these functions but still leverage the same LDAP Connection. The timeout is specified against our connection and we call Get-LdapObject using that connection and our previous Filter and SearchBase. Scope is still OneLevel but can be changed to SubTree if required.

# Connect and Search
$ldapConnection.Timeout = new-timespan -Seconds 60
$ldapResponse = Get-LdapObject -LdapConnection $ldapConnection -LdapFilter $ldapSearchFilter -SearchBase $ldapSearchBase -Scope OneLevel
$ldapResponse

PowerShell Get-Ldap Object Script

Here is the full script with the two helper Functions.

The screenshot below shows the output in text rather than Byte Array. Excellent.

LDAP PowerShell Result in Text from Oracle Internet Directory

Oracle Internet Directory Change Log

Change Log is a function of many LDAP directories. It is especially useful when we are synchronising an LDAP Directory to another system as it means we don’t have to return all objects in it each time, but we can get the incremental changes (hence Change Log).

To query the Change Log in Oracle Internet Directory there are a couple of gotchas. You will need:

  • to query OID to see the latest Changenumber
  • to only use OneLevel as the Scope for queries – anything else and OID won’t return the info
  • the Base DN is what is shown when we initially connected using LDP (e.g. cn=changelog)

Again, starting with LDP we can query and get the Changenumbers. Using LDP Search:

  • use cn=changelog for the Base DN
  • objectclass = * for the Filter
  • One Level for the Scope
  • * for Attributes

And the last of the results will have the most recent changenumber.

Change Number from Oracle Internet Directory

Change Log with PowerShell

Now that we know we can get the ChangeLog with LDP, let’s do it with PowerShell.

The following script re-uses our previous connection and all you should need to do is change the changeNumber in line 2 in line with your environment.
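As a sketch, re-using the $ldapConnection and $ldapSearchBase variables from the earlier connection script, the Change Log query looks something like this (the changenumber and filter values are illustrative):

# Query cn=changelog for entries at or after a given changenumber, scoped to our user searchBase
$changeNumber = 1234567
$changeFilter = "(&(changenumber>=$changeNumber)(targetdn=*$ldapSearchBase*))"

$changeRequest  = New-Object System.DirectoryServices.Protocols.SearchRequest('cn=changelog', $changeFilter, [System.DirectoryServices.Protocols.SearchScope]::OneLevel, $null)
$changeResponse = $ldapConnection.SendRequest($changeRequest)
$changeResponse.Entries.Count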

The screenshot below shows retrieving Change Log entries that are equal to or newer than the Changenumber we used in our Filter. The changeFilter scopes the changes down to users in the searchBase so that we see changes for users rather than other operational changes.

Change Log Results from Oracle Internet Directory

The individual users that have changed can be identified using the following one-liner.

$uniqueUsers = $changeResponse.Entries | ForEach-Object { $_.attributes["targetdn"][0] } | Get-Unique
$uniqueUsers

The last Changenumber present in the Change Log can be found with:

$lastChangeNumber = [int]$changeResponse.Entries[($changeResponse.Entries.Count-1)].Attributes["changeNumber"][0]

Summary

I recently had to reacquaint myself with OID. I’ve written it up so that next time isn’t as painful.

 

SailPoint IdentityNow to ServiceNow Ticketing Integration


SailPoint IdentityNow comes with many connectors to allow provisioning and lifecycle management of entities in connected systems. However there will always be those systems that require some manual tasks/input. In those instances SailPoint IdentityNow to ServiceNow Ticketing Integration can create a ticket in ServiceNow that can then be tracked whilst those manual steps are fulfilled.

Integration of IdentityNow with ServiceNow doesn’t use a connector in the same sense as the other Sources in IdentityNow do; it uses an Integration Module. The SailPoint ServiceNow Integration Module (SIM) is configured using the SailPoint IdentityNow integration APIs. The Integration Module Configuration Guide on Compass here provides the basis of what is required to List, Create, Update and Delete Integrations. However I had a few difficulties completing this due to a couple of configuration items that are ambiguous in the sample documentation. This post details how I got it configured so I can find it next time.

All the following API calls leverage authentication using the v3 API AuthN method I detail in this post here.

List Integrations

This call does exactly what it says: it lists any integrations, such as the IdentityNow to ServiceNow Ticketing Integration. If you haven’t configured any yet it will return nothing, otherwise you will get the full configuration for each integration. To list integrations the /integration/listSimIntegrations API is called using a GET operation.

$orgName = 'yourIdentityNowOrgName'
$integrationBaseURI = "https://$($orgName).api.identitynow.com/cc/api/integration"
$listIntegrations = Invoke-RestMethod -Method GET -Uri "$($integrationBaseURI)/listSimIntegrations" -Headers @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"}

The output below is from an integration for an Application from IdentityNow to ServiceNow that also brings through details of the request. More details on that below under Create an Integration.

Create an Integration

To create an integration the /integration/createSimIntegration API is called using a POST request with a JSON Body containing the Integration configuration.

$createIntegration = Invoke-RestMethod -Method Post -Uri "https://$($orgName).api.identitynow.com/cc/api/integration/createSimIntegration" -Headers @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"; "Content-Type" = "application/json"} -Body $createBody

Create ServiceNow Integration Configuration Document

A lot of the configuration is prescriptive as per the IdentityNow documentation. However there are a few items that aren’t always obvious.

The configuration object further below is for integration from IdentityNow to ServiceNow using Basic authentication.

  • Line 4 is the ServiceNow Service Account created for IdentityNow with the permissions detailed in the IdentityNow documentation
  • Line 5 is the password for the Service Account
  • Line 7 is a piece that isn’t (or wasn’t) in the documentation when we configured this.

Important: In order for IdentityNow to pass through all the details of the account the request is for, you also need a ServiceNow Source configured. Make sure you have your Correlation Rules set up so that accounts in ServiceNow match/join to IdentityNow identities. Essentially this will match the ServiceNow record for the person the request is for and populate the Service Request with all their details (from ServiceNow). The Source is required to pass the ServiceNow Account ID of the identity associated with the IdentityNow request.

The Source Configuration screenshot below shows the basic ServiceNow Source configured using Basic Auth. Make sure your Correlation configuration is set up to appropriately join Accounts. Take note of the name you give the Source and the Source ID (visible in the Browser URL when configuring the Source).

  • Line 9 is the mapping from the IdentityNow Source (Flat File/Generic) that you will be sending Service Requests through to ServiceNow for, and the ServiceNow Catalog Item. The IdentityNow Source ID is the externalID. You will need to get the Source Configuration via API to get this as detailed in this post.
  • Line 12 is the Virtual Appliance Cluster that the Integration will be configured for. The clusterExternalId can be retrieved via API as detailed in this post; it can be found under Configuration on a VA Cluster object.

  • Lines 13 – 23 are what you want to pass to ServiceNow for the Service Request. Modify accordingly but this example will pass through the details of the request from IdentityNow. Create or Update x, y, z etc.
  • Line 26 is the IdentityNow Source ID of the Generic/Flat file source you are configuring for integration with ServiceNow. It’s the same as you used on Line 9 for the IdentityNow to ServiceNow Catalog Item mapping.
  • Lines 29 – 34 are the status mappings for the requests. You can configure how often ServiceNow is polled for status updates through the integration/setStatusCheckDetails API. Send a POST request to the API with provisioningStatusCheckIntervalMinutes and provisioningMaxStatusCheckDays as shown below, which checks every 15 minutes for a maximum of 90 days (dev environment type settings).
# Schedule for Status Checks
$schConfig = '{"provisioningStatusCheckIntervalMinutes":15,"provisioningMaxStatusCheckDays":90}'

$scheduleIntegration = Invoke-RestMethod -Method Post -Uri "https://$($orgName).identitynow.com/cc/api/integration/setStatusCheckDetails" -Headers @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"; "Content-Type" = "application/json"} -Body $schConfig

ServiceNow Integration Configuration Document

Below is a sample IdentityNow to ServiceNow integration configuration.

Example Request in ServiceNow

With all that detail and how-to, this is what you actually get. Here is an example of a request generated in ServiceNow from IdentityNow via the ServiceNow Integration.

Get an Integration

If you know the ID of an integration you can get it directly using the /getSimIntegration/{ID} Get API call. The ID can be retrieved using List Integrations as detailed at the beginning of this post.

# Get Integration
$getIntegration = Invoke-RestMethod -Method Get -Uri "https://$($orgName).api.identitynow.com/cc/api/integration/getSimIntegration/2c9180846a6a22c8016a75adafake" -Headers @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"; "Content-Type" = "application/json"}

Delete an Integration

Deleting an integration is similar to the Get Integration call, except the API endpoint is /deleteSimIntegration/{ID} and the operation is a DELETE rather than a GET.

# Delete Integration
$deleteIntegration = Invoke-RestMethod -Method Delete -Uri "https://$($orgName).api.identitynow.com/cc/api/integration/deleteSimIntegration/2c9180856a6a22d0016a6ec2a3fake" -Headers @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"; "Content-Type" = "application/json"}

Summary

Rather a long post, but hopefully it will give anyone else trying to do this integration a leg up on getting it operational a lot quicker than it took us.

Get/Update SailPoint IdentityNow Global Reminders and Escalation Policies

SailPoint IdentityNow Access Requests for Roles or Applications usually require approvals, which are configured on the associated Role or Application. The approval could be by the Role/Application Owner, a Governance Group or the Requestor’s Manager. The reminders and escalation policies, however, can only be retrieved and set via the API. The SailPoint IdentityNow api/v2/org API is used to configure these Global Reminders and Escalation Policies.

This post details how to get the configuration of your IdentityNow Org along with updating the Global Reminders and Escalation Policies.

The PowerShell script below uses the v3 API Authentication process detailed here.

Update the script below for:

  • line 2 for your IdentityNow Orgname
  • line 5 for your IdentityNow Admin ID
  • line 6 for your IdentityNow Admin Password
  • line 16 for your Org v3 ClientID (obtained from SailPoint)
  • line 17 for your Org v3 ClientSecret (obtained from SailPoint)

Executing the script, line 35 returns the current configuration for your SailPoint IdentityNow Org:

$listOrgConfig = Invoke-RestMethod -Method GET -Uri "https://$($orgName).identitynow.com/api/v2/org" -Headers @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"}

  • lines 39-43 specify the configuration values for:
    • daysBetweenReminders – Number of days between reminders or escalations
    • daysTillEscalation – Number of days from when the request is created to when the reminder/escalation process begins
    • maxReminders – Maximum number of reminders sent before starting the escalation process
    • fallbackApprover – The alias of the identity that will review the request if no one else reviews it
  • lines 46-50 build the configuration object to write back to IdentityNow (a sketch of this object follows below)
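As a sketch, the configuration object built in those lines looks something like the following. The nesting under approvalConfig is an assumption based on the structure returned by the GET call above, so compare it against your own $listOrgConfig output.

# Sketch only - property names per the list above, nesting assumed; values are examples
$approvalConfigBody = @{
    approvalConfig = @{
        daysBetweenReminders = 3                # days between reminders/escalations
        daysTillEscalation   = 7                # days until the reminder/escalation process begins
        maxReminders         = 3                # reminders sent before escalation starts
        fallbackApprover     = 'fallback.admin' # hypothetical alias of the fallback reviewer
    }
}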

Finally, line 53 updates the configuration in IdentityNow:

$updateOrgConfig = Invoke-RestMethod -Method Patch -Uri "https://$($orgName).identitynow.com/api/v2/org" -Headers @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"; 'Content-Type' = 'application/json'} -Body ($approvalConfigBody | convertto-json)

The updated configuration is returned in the $updateOrgConfig variable. The following snippet shows the written config for Reminders and Escalations.

SailPoint IdentityNow Global Reminders and Escalation Policies

The Script

With all the details described above, here is the script.

Summary

Using PowerShell with the v3 Authentication method and the v2 IdentityNow Org API we can quickly get the Organisation configuration and then update the Global Reminders and Escalation Policies. With a few changes, other customer-configurable options on the Org (the majority are read-only) can also be updated.

Azure AD Log Analytics KQL queries via API with PowerShell

Log Analytics is a fantastic tool in the Azure Portal for querying Azure Monitor events, and it lets you quickly create queries using KQL (Kusto Query Language). Once you’ve created a query, however, you may want to run it through automation, negating the need to use the Azure Portal every time you want to get the associated report data.

In this post I detail:

  • creating a Log Analytic Workspace
  • enabling API Access
  • querying Log Analytics using the REST API with PowerShell
  • outputting data to CSV

Create a Workspace

We want to create a Workspace for our logs and queries. I created mine using the Azure Cloud Shell in the Azure Portal. I’m using an existing Resource Group; if you want it in a new Resource Group, either create the RG through the portal or via the CLI using New-AzResourceGroup.

$rgName = 'MYLogAnalytics-REPORTING-RG'
$location = 'australiaeast'
New-AzOperationalInsightsWorkspace -ResourceGroupName $rgName -Name Azure-Active-Directory-Logs -Location $location -Sku free

The Workspace will be created.

Create LogAnalytics Workspace.PNG

Next we need to get the logs into our Workspace. In the Azure Portal under Azure Active Directory => Monitoring => Diagnostic settings select + Add Diagnostic Setting and configure your Workspace to get the SignInLogs and AuditLogs.

API Access

In order to access the Log Analytics Workspace via API we need to create an Azure AD Application and assign it permissions to the Log Analytics API. I already had an Application I was using to query the Audit Logs so I added the Log Analytics to it.

On your Azure AD Application select Add a permission => APIs my organization uses and type Log Analytics => select Log Analytics API => Application permissions => Data.Read => Add permissions

Finally select Grant admin consent (for your Subscription) and take note of the URI for your Log Analytics API endpoint (westus2.api.loganalytics.io for me, as shown below).

API Access to Log Analytics with KQL

Under Certificates and secrets for your Azure AD Application create a Client Secret and record the secret for use in your script.

Azure AD Application Secret.PNG

Link Log Analytics Workspace to Azure AD Application

On the Log Analytics Workspace that we created earlier we need to link our Azure AD App so that it has permissions to read data from Log Analytics.

On your Log Analytics Workspace select Access Control (IAM) => Add => Role = Reader and select your Azure AD App => save

Link Log Analytics Workspace to Azure AD Application.PNG

I actually went back and also assigned Log Analytics Reader access to my Azure AD Application, as I encountered a couple of instances of "InsufficientAccessError – The provided credentials have insufficient access to perform the requested operation".

API Access to Log Analytics with KQL - Log Analytics Reader.PNG

Workspace ID

In order to query Log Analytics using KQL via REST API you will need your Log Analytics Workspace ID. In the Azure Portal search for Log Analytics then select your Log Analytics Workspace you want to query via the REST API and select Properties and copy the Workspace ID.
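If you prefer PowerShell to the portal, the Workspace ID is the CustomerId property on the workspace object we created earlier:

# The Log Analytics Workspace ID (CustomerId) for the workspace created earlier
(Get-AzOperationalInsightsWorkspace -ResourceGroupName $rgName -Name 'Azure-Active-Directory-Logs').CustomerId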

WorkspaceID for REST API Query.PNG

Querying Log Analytics via REST API

With the setup and configuration all done, we can now query Log Analytics via the REST API. I’m using my oAuth2 quick start method to make the requests. For the first Authentication request use the Get-AzureAuthN function to authenticate and authorise the application. Subsequent authentication events can use the stored refresh token to get a new access token using the Get-NewTokens function. The script further below has the parameters for the oAuth AuthN/AuthZ process.

#Functions
Function Get-AuthCode {
...
}
function Get-AzureAuthN ($resource) {
...
}
function Get-NewTokens {
...
}

#AuthN
Get-AzureAuthN ($resource)
# Future calls can just refresh the token with the Get-NewTokens Function
Get-NewTokens

To call the REST API we use our Workspace ID we got earlier, our URI for our Log Analytics API endpoint, a KQL Query which we convert to JSON and we can then call and get our data.

$logAnalyticsWorkspace = "d03e10fc-d2a5-4c43-b128-a067efake"
$logAnalyticsBaseURI = "https://westus2.api.loganalytics.io/v1/workspaces"
$logQuery = "AuditLogs | where SourceSystem == `"Azure AD`" | project Identity, TimeGenerated, ResultDescription | limit 50"
$logQueryBody = @{"query" = $logQuery} | convertTo-Json

$result = invoke-RestMethod -method POST -uri "$($logAnalyticsBaseURI)/$($logAnalyticsWorkspace)/query" -Headers @{Authorization = "Bearer $($Global:accesstoken)"; "Content-Type" = "application/json"} -Body $logQueryBody

Here is a sample script that authenticates to Azure as the Application, queries Log Analytics and then outputs the data to CSV.
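The CSV output step is worth showing on its own: the query API returns a tables/columns/rows structure, so the rows need to be flattened into objects before Export-Csv. A sketch, using the $result variable from the call above (the output path is illustrative):

# Flatten the Log Analytics query result (tables -> columns/rows) into objects and export to CSV
$columns = $result.tables[0].columns.name
$rows    = $result.tables[0].rows

$report = foreach ($row in $rows) {
    $obj = [ordered]@{}
    for ($i = 0; $i -lt $columns.Count; $i++) { $obj[$columns[$i]] = $row[$i] }
    [pscustomobject]$obj
}

$report | Export-Csv -Path .\AuditLogQueryResults.csv -NoTypeInformation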

Summary

If you need to use the power of KQL to obtain data from Log Analytics programmatically, leveraging the REST API is a great approach. And with a little PowerShell magic we can output the resulting data to CSV. If you are just getting started with KQL queries this document is a good place to start.

Output Log Analytics to CSV.PNG

Microsoft Build – ‘Build apps that integrate, automate, and manage security operations’ Presentation

User Secure Score Risk Profile - 640px

At Microsoft Build last week I was honoured to co-present the "Build apps that integrate, automate, and manage security operations" session on the Microsoft Security Graph with Preeti Krishna and Sarah Fender.

The session was recorded and is available here.

The PowerPoint presentation itself is here: BUILD19_SecurityDeveloperPlatform.

I provided a demo of my Microsoft U.S.E.R (User Security Evaluation Reporter) Application, which leverages the Microsoft Security Graph amongst other APIs.

The GitHub Repo for the project is available here.

Querying SailPoint IdentityNow Virtual Appliance Clusters with PowerShell

Today I was configuring an Integration Module for SailPoint IdentityNow. As part of that integration I needed the ID of an IdentityNow Virtual Appliance Cluster. It seems I hadn’t previously documented doing this and I couldn’t find my previous script, so here’s a quick post for the next time I need to get info about the SailPoint IdentityNow Virtual Appliance Clusters in my environment.

The following script uses v3 Authentication as detailed in this post.

Update:

  • line 2 with your IdentityNow Orgname
  • line 5 with your Admin Account Name
  • line 6 with your Admin Password
  • line 16 with your IdentityNow v3 API Client ID
  • line 17 with your IdentityNow v3 API Client Secret
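
The query itself is only a couple of lines; a sketch is below. Note the /cc/api/cluster/list endpoint is part of the private (cc) API and is an assumption on my part, so verify it against your org.

# List the Virtual Appliance Clusters for the Org (endpoint assumed from the private cc API)
$vaClusters = Invoke-RestMethod -Method Get -Uri "https://$($orgName).api.identitynow.com/cc/api/cluster/list" -Headers @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"}

# The cluster ID needed for integrations is on each returned cluster object
$vaClusters | Select-Object name, id, description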

 

 

Forefront/Microsoft Identity Manager – Attempted to access an unloaded AppDomain

This post is more a note-to-self for future me in case I’m in this scenario again. Today I encountered the error Attempted to access an unloaded AppDomain.

I have a custom Forefront/Microsoft Identity Manager Management Agent that requires multiple credentials for the Web Service it is integrating with. In order to secure the parts of the credentials that cannot be provided via the Connectivity configuration tab on the Management Agent Properties, I have generated and exported them using Export-Clixml as detailed in this post here.
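As a quick sketch of that method (the path is illustrative): the export and the import both need to run in the context of the account that will use the credentials, because Export-Clixml protects the password with DPAPI for that user and machine.

# Run as the MIM Synchronisation Service account (e.g. via runas) to generate the protected file
Get-Credential | Export-Clixml -Path 'D:\Scripts\webServiceCredential.xml'

# Later, inside the Management Agent script running as that same service account
$webServiceCred = Import-Clixml -Path 'D:\Scripts\webServiceCredential.xml'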

Today I was migrating a Management Agent from one environment to another and was sure I’d regenerated the credentials correctly. However the Management Agent wasn’t working as expected. Looking into the Applications and Services Logs => Forefront Identity Manager Management Agent Log ….

Forfront Identity Manager Management Agent.PNG

I found the following error:

Unhandled exception, the CLR will not terminate: System.AppDomainUnloadedException: Attempted to access an unloaded AppDomain

Attempted to access an unloaded AppDomain.PNG

Retracing my steps, I had logged on to the Synchronisation Server with the incorrect credentials to generate my protected credentials XML file.

When the Synchronisation Server ran the Management Agent and attempted to run Import-Clixml "credentialFilename", the account that had exported the credentials did not match the service account the Synchronisation Server was running as, and the error listed above was thrown.

Summary

Import-Clixml and Export-Clixml do exactly what they are supposed to: they respect the context under which the credentials were exported, and the credentials can only be accessed when imported under that same context. The error doesn’t really tell you that, but it does hint at it with Attempted to access an unloaded AppDomain if you know what you are trying to do.

 

Winner: Microsoft Graph Security Hackathon

Recently I entered my second hackathon. My submission, my first ever Web Application, was for the Devpost / Microsoft Graph Security Hackathon. This morning (Australian time) the winners were announced and ……. I WON.

To say I’m thrilled and honoured would be an understatement as the hackathon was judged by the esteemed Ann Johnson, Scott Hanselman, Troy Hunt, Rick Howard, Mark Russinovich and Olli Vanhoja. Big thanks also have to go to Preeti Krishna from Microsoft who was super responsive on queries around the Microsoft Security Graph.

Microsoft Security Graph Hackathon Judges.PNG

My entry, Microsoft USER (User Security Evaluation Reporter), is an application that reports on any user within Active Directory/Azure Active Directory and their associated security posture with respect to Risk Events, Enrolled MFA Methods, Password Pwned Status and Azure Password Reset Events. The UI provides a quick summary overview (as shown immediately below) with the ability to then drill down into each evaluation area.

User Secure Score Summary

Additional details about my submission, including the architecture and technical details, can be found here, but here is a walk-through demo of it.

 

Indexing a SailPoint IdentityNow Attribute in an Identity Cube for use in Correlation Rules

Joining/Matching rules in any Identity and Access Management Solution can make or break an Identity Lifecycle Management implementation. Out of the box SailPoint IdentityNow provides a number of common Identity Attributes that can be used for Correlation rules (joining/matching) from Identity Sources (connected systems).

Often though you want to add additional attributes to the list of Identity Attributes that can be used for correlation. The IdentityNow Portal does not provide this functionality, but it is possible via the IdentityNow API. However the documentation and guidance around this is a little sparse. This post details how to use the API to enable additional attributes for use with correlation rules.

NOTE: Be pragmatic with the number of additional attributes that you add as Identity Attributes for Correlation; the guidance is a maximum of 7 additional attributes.

Prerequisites

  1. Authentication: I’m using the v3 method that I detail here.
  2. On an Identity Profile you’ll need to map the attribute from a Source to an attribute in the Identity Profile.
    • In the IdentityNow Portal go to Admin => Identities => Identity Profiles => Your Identity Profile => Mappings => Add Attribute

Process

I’m using PowerShell, but the process can be transposed to any language that you can use to make Web Requests. The high-level process is:

  1. Get the list of Identity Attributes
  2. Locate the attribute you want to be available for Correlation rules
    • this is required as addressing the attribute to make it available is case-sensitive
  3. Update the object to make it Searchable
    • making the attribute Searchable promotes it to being available for Correlation Rules

The Script

The steps in the script below assume you are authenticated to the IdentityNow API as detailed in the prerequisites. You should only need to then update line 12 for the attribute name you want to make searchable and available for correlation rules.

The script steps (which you will want to manually step through) are:

  •  Get the list of Identity Attributes
    • The attribute you want to promote to be searchable should be present, as you’ve mapped it in an Identity Profile as detailed in the prerequisites
  • Get the attribute to modify
  • Modify the attribute object to make it searchable
    • modify other aspects of the Identity Attribute to allow the attribute changes to be written back to IdentityNow
  • Update the attribute in IdentityNow
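
A rough sketch of those steps is below. The /cc/api/identityAttribute list and update endpoints are the private API the Portal uses and should be treated as assumptions (as should the attribute name), so confirm them for your org before running anything.

# Sketch only - endpoints and payload handling assumed; step through manually
$attrName = 'employeeNumber'   # hypothetical attribute to promote for correlation

# 1. Get the list of Identity Attributes (attribute names are case-sensitive)
$idnAttributes = Invoke-RestMethod -Method Get -Uri "https://$($orgName).api.identitynow.com/cc/api/identityAttribute/list" -Headers @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"}

# 2. Locate the attribute to modify
$attribute = $idnAttributes | Where-Object { $_.name -eq $attrName }

# 3. Flag it as searchable so it is promoted for use in Correlation Rules
$attribute.searchable = $true

# 4. Write the updated attribute object back to IdentityNow
Invoke-RestMethod -Method Post -Uri "https://$($orgName).api.identitynow.com/cc/api/identityAttribute/update?name=$($attribute.name)" -Headers @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"; 'Content-Type' = 'application/json'} -Body ($attribute | ConvertTo-Json -Depth 10)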

Summary

As the script above shows, the process to update an attribute to make it searchable and available for correlation rules is a little more involved than just flipping an attribute value, but once you know how, it is reasonably trivial.