How to access Microsoft Identity Manager Hybrid Report data using PowerShell, Graph API and OAuth2

Hybrid Reporting is a great little feature of Microsoft Identity Manager. A small agent installed on the MIM Sync Server will send reporting data to Azure for MIM SSPR and MIM Group activities. See how to install and configure it here.

But what if you want to get the reporting data without going to the Azure Portal and looking at the Audit Reports? Enter the Azure AD Reports and Events REST API, which is currently in preview. It took me a couple of cracks to get this working, because the documentation is a little vague, especially around accessing it via PowerShell and OAuth2. So I’ve written it up and hope it helps anyone else looking to go down this route.

Gotchas

Accessing the Reports via the API has a couple of caveats that I had to work through:

  • Having the correct permissions to access the report data. Pretty much everything you read tells you that you need to be a Global Admin. Once I had my OAuth tokens I messed around a little and was able to get the following back from the API when purposely using an identity that didn’t have the right permissions. The key piece is “Api request is not from global admin or security admin or security reader role”. I authorized the WebApp using an account that is in the Security Reader role, and can successfully access the report data.

  • Reading the documentation here on MSDN, I incorrectly assumed each category was the report name. Only when I called “https://graph.windows.net//$metadata?api-version=beta” and looked at the list of reports did I notice that each report name was plural. The three that I wanted to access (and report on) are obviously the MIM Hybrid Reports;
"Name":  "mimSsgmGroupActivityEvents",
"Name":  "mimSsprActivityEvents",
"Name":  "mimSsprRegistrationActivityEvents",

Here is the full list of Reports available as of 24 May 2017.

{
    "Name":  "b2cAuthenticationCountSummary",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cMfaRequestCount",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cMfaRequestEvent",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cAuthenticationEvent",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cAuthenticationCount",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cMfaRequestCountSummary",
    "LicenseRequired":  "False"
}
{
    "Name":  "tenantUserCount",
    "LicenseRequired":  "False"
}
{
    "Name":  "applicationUsageDetailEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "applicationUsageSummaryEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "b2cUserJourneySummaryEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cUserJourneyEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "cloudAppDiscoveryEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "mimSsgmGroupActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "ssgmGroupActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "mimSsprActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "ssprActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "mimSsprRegistrationActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "ssprRegistrationActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "threatenedCredentials",
    "LicenseRequired":  "False"
}
{
    "Name":  "compromisedCredentials",
    "LicenseRequired":  "False"
}
{
    "Name":  "auditEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "accountProvisioningEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "signInsFromUnknownSourcesEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "signInsFromIPAddressesWithSuspiciousActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "signInsFromMultipleGeographiesEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "signInsFromPossiblyInfectedDevicesEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "irregularSignInActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "allUsersWithAnomalousSignInActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "signInsAfterMultipleFailuresEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "applicationUsageSummary",
    "LicenseRequired":  "True"
}
{
    "Name":  "userActivitySummary",
    "LicenseRequired":  "False"
}
{
    "Name":  "groupActivitySummary",
    "LicenseRequired":  "True"
}

How to Access the Reporting API using PowerShell

What you need to do is;

  • Register a WebApp
    • Assign a Reply URL of https://localhost
    • Assign it the Read directory data permission
  • Get an OAuth2 Authentication Code using an account that is either a Global Admin or in the Security Admin or Security Reader Azure roles
  • Use your Bearer and Refresh tokens to query for the reports you’re interested in

Register your WebApp

In the Azure Portal create a new Web app/API app and assign it https://localhost as the Reply URL. Record the Application ID for use in the PowerShell script.

Assign the Read Directory data permission as shown below

Obtain a key from the Keys option on your new Web App.  Record it for use in the PowerShell script.

Generate an Authentication Code, get a Bearer and Refresh Token

Update the following script, changing the ApplicationID/ClientID and Client Secret values to those of the WebApp you created above.
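A minimal sketch of that script follows. The ClientID, Client Secret and refresh.token path values are placeholders you will need to substitute, and the flow shown (the AzureAD v1 authorize and token endpoints with graph.windows.net as the resource) is one way to obtain the tokens.

# Minimal sketch: OAuth2 authorization code flow against the AzureAD v1 endpoints.
# The ClientID, Client Secret and refresh.token path are placeholders; substitute your own.
Add-Type -AssemblyName System.Web

$clientId = "<ApplicationID/ClientID of your WebApp>"
$clientSecret = "<Key (Client Secret) of your WebApp>"
$redirectUri = "https://localhost"
$resource = "https://graph.windows.net"
$tokenPath = "C:\Scripts\refresh.token"

# Open a browser for the interactive sign-in, then paste the returned code back in
$authUrl = "https://login.microsoftonline.com/common/oauth2/authorize" +
    "?response_type=code&client_id=$clientId" +
    "&redirect_uri=$([System.Web.HttpUtility]::UrlEncode($redirectUri))" +
    "&resource=$([System.Web.HttpUtility]::UrlEncode($resource))"
Start-Process $authUrl
$authCode = Read-Host "Paste the 'code' value from the https://localhost redirect URL"

# Exchange the AuthCode for Bearer and Refresh tokens
$body = @{
    grant_type    = "authorization_code"
    client_id     = $clientId
    client_secret = $clientSecret
    code          = $authCode
    redirect_uri  = $redirectUri
    resource      = $resource
}
$Authorization = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/common/oauth2/token" -Body $body

# Store the Refresh token so new tokens can be obtained without re-authenticating
$Authorization.refresh_token | Out-File $tokenPath

function Get-NewTokens {
    # Redeem the stored Refresh token for a new set of Authorization tokens
    $body = @{
        grant_type    = "refresh_token"
        client_id     = $clientId
        client_secret = $clientSecret
        refresh_token = (Get-Content $tokenPath)
        resource      = $resource
    }
    $script:Authorization = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/common/oauth2/token" -Body $body
    $script:Authorization.refresh_token | Out-File $tokenPath
}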

Run the script and you will be prompted to authenticate. Use an account in the tenant where you created the Web App that is a Global Admin or in the Security Admin or Security Reader Azure roles. You will also need to change the location where you want the refresh.token stored.

If you’ve done everything correctly you will have authenticated, obtained an AuthCode, and used it to get your Authorization Tokens. The value of the $Authorization variable should look similar to this;

Now you can use the Refresh token to generate new Authorization Tokens when they time out, simply by calling the Get-NewTokens function included in the script above.

Querying the Reporting API

Now that you have the necessary prerequisites sorted you can query the Reporting API.

Here are a couple of simple queries to return some data to get you started; update them with the tenant name of your AzureAD. With the $Authorization values from the script above you can get data for the MIM Hybrid Reports.
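A sketch of those queries, where “yourtenant.onmicrosoft.com” is a placeholder for your own tenant name:

# Query the beta Reports API using the tokens obtained from the script above.
# "yourtenant.onmicrosoft.com" is a placeholder for your AzureAD tenant name.
$tenant = "yourtenant.onmicrosoft.com"
$headers = @{ Authorization = "$($Authorization.token_type) $($Authorization.access_token)" }

# List the available reports (Name / LicenseRequired)
$reportList = Invoke-RestMethod -Method Get -Uri "https://graph.windows.net/$tenant/reports?api-version=beta" -Headers $headers
$reportList.value | Select-Object Name, LicenseRequired

# MIM SSPR activity events
$ssprActivity = Invoke-RestMethod -Method Get -Uri "https://graph.windows.net/$tenant/reports/mimSsprActivityEvents?api-version=beta" -Headers $headers
$ssprActivity.value

# MIM Group activity events
$groupActivity = Invoke-RestMethod -Method Get -Uri "https://graph.windows.net/$tenant/reports/mimSsgmGroupActivityEvents?api-version=beta" -Headers $headers
$groupActivity.value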

Synchronizing Exchange Online/Office 365 User Profile Photos with FIM/MIM

Introduction

This is Part Two of a two-part blog post on managing user profile photos with Microsoft FIM/MIM. Part one here detailed managing users’ Azure AD/Active Directory profile photos. This post delves deeper into photos, specifically around Office 365 and the reasons why you may want to manage these via FIM/MIM.

Background

User profile photos should be simple to manage. But in a rapidly moving hybrid cloud world it can be a lot more complex than it needs to be. The best summary I’ve found of this evolving target is from Paul Ryan here.

Following Paul’s sound advice, we too are advising our customers to let users manage their profile photo (within corporate guidelines) via Exchange Online. However, as described in this article, photos managed in OnPremise Active Directory are synchronized to Azure AD and on to other Office365 services only once. And of course we want them to be consistent across AD DS, Azure AD, Exchange Online and all other Office365 services.

This post details synchronizing user profile photos from Exchange Online to MIM for further synchronization to other systems. The approach uses a combination of Azure GraphAPI and Exchange Remote PowerShell to manage Exchange Online User Profile Photos.

The following graphic depicts the end goal;

Current State

  • Users historically had a photo in Active Directory. DirSync/ADSync/AzureADConnect then synchronized that to Azure AD (and once only into Office 365).
  • Users update their photo in Office365 (via Exchange Online and Outlook Web Access)
    • the photo is synchronized across Office365 Services

Desired State

  • An extension of the Current State is the requirement to be able to take the image uploaded by users in Exchange Online, and synchronize it back to the OnPremise AD, and any other relevant services that leverage a profile photo
  • Have AzureADConnect keep AzureAD consistent with the new photo obtained from Office365 that is synchronized to the OnPrem Active Directory
  • Sync the current photo to the MIM Portal

Synchronizing Office365 Profile Photos

Whilst Part-one dealt with the AzureAD side of profile photos as an extension to an existing AzureAD PowerShell Management Agent for FIM/MIM, I’ve separated out the Office365 side to streamline it and make it as efficient as possible. More on that later. As such I’ve created a new PowerShell Management Agent specifically for Office365 User Profile Photos.

I’m storing the Exchange Online photo in the MIM Metaverse as a binary object just as I did for the AzureAD photo (but in a different attribute). I’m also storing a checksum of the photo (as I did for the AzureAD photo, again in a different attribute) to make it easier to compare what is in Azure AD and Exchange Online, and then determine whether changes have been made (e.g. a user updated their profile photo).

Photo Checksum

For generating the hash of the profile photos I’m using Get-Hash from the PowerShell Community Extensions. Whilst PowerShell has Get-FileHash, I don’t want to write the profile photos out to disk and read them back in just to get the checksum; that slows the process down by 25%. You can get the checksum using a number of different methods and algorithms. Just be consistent and use the same method across both profile photos and you’ll be comparing apples with apples, and the comparison logic will work.
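As a sketch, hashing the photo bytes in memory (with no round-trip to disk) looks like the following. It uses .NET’s MD5 provider directly, which produces the same result as an MD5 hash from Get-Hash; $photoBytes is a placeholder for the photo byte array retrieved from Exchange Online.

# Checksum the photo bytes in memory, avoiding a write to and read from disk.
# $photoBytes is a placeholder for the photo byte array from Exchange Online.
$md5 = [System.Security.Cryptography.MD5]::Create()
$photoChecksum = [System.BitConverter]::ToString($md5.ComputeHash($photoBytes)).Replace("-", "")
$photoChecksum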

Some notes on Photos and Exchange Online (and MFA)

This is where things went off on a number of tangents. Initially I tried accessing the photos using Exchange Online Remote PowerShell.

CAVEAT 1: If your Office365 Tenant is enabled for Multi-Factor Authentication (which it should be) you will need to get the Exchange Online Remote PowerShell Module as detailed here. Chances are you won’t have full Office365 Admin access though, so as long as the account you will be using is in the Recipient Management Role, you should be able to go to the Exchange Control Panel using a URL like https://outlook.office365.com/ecp/?realm=<tenantname>&wa=wsignin1.0 where tenantname is something like customer.com.au. From the Hybrid menu in the right-hand pane you will then be able to download the Microsoft.Online.CSE.PSModule.Client.application. I had to use Internet Explorer to download the file and get it installed successfully. Once installed I used a few lines from this script here to load the Function and start my RPS session from within PowerShell ISE during solution development.

CAVEAT 2: The EXO RPS MFA PS Function doesn’t allow you to pass it your account password. You can pass it the identity you want to use, but not the password. That makes scheduled process automation with it impossible.

CAVEAT 3: The RPS session exposes the Get-UserPhoto cmdlet, which is great. But the RPS session leverages the GraphAPI, and the RPS PS Module doesn’t refresh its tokens, so if the import takes longer than 60 minutes then using this method you’re a bit stuffed.

CAVEAT 4: Using the Get-UserPhoto cmdlet detailed above, the syncing of photos is slow. As in, I was only getting ~4 profile photos per minute. This also compounds the token refresh issue, as for pretty much any environment of the size I deal with this is too slow and will time out.

CAVEAT 5: You can whitelist the IP Address (or subnet) of your host so MFA is not required using Contextual IP Addressing Whitelisting. At that point there isn’t really a need to use the MFA Enabled PREVIEW EXO RPS function anyway. That said I still needed to whitelist my MIM Sync Server(s) from MFA to allow integration into the Graph API. I configured just the single host. The whitelist takes CIDR format so that looks like /32 (eg. 11.2.33.4/32)

Performance Considerations

As I mentioned above,

  • using the Get-UserPhoto cmdlet was slow. ~4 per minute slow
  • using the GraphAPI into Exchange Online, looking at each user to determine if they had a photo and then downloading it, was also slow. Slow because at this customer only ~50% of their users have a photo on their mailbox. As such I was only able to retrieve ~145 photos in 25 minutes. *Note: all timings listed above were during development, when I was actually outputting the images to disk to verify functionality. 

Implemented Solution

After all my trial and error on this, here is my final approach and working solution;

  1. Use the Exchange Online Remote PowerShell (non-MFA version) to query and return a collection of all mailboxes with an image; a sketch of the session setup follows this list. *Note: add an exception for your MIM Sync host to the white-listed hosts for MFA (if your Office365 Tenant is enabled for MFA) so the process can be automated
  2. Use the Graph API to obtain those photos
    • with this I was able to retrieve ~1100 profile photos in ~17* minutes (after ~2 minutes to query and get the list of mailboxes with a profile photo)
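A minimal sketch of step 1, assuming an account that is in the Recipient Management Role (and a host white-listed from MFA):

# Establish a non-MFA Exchange Online Remote PowerShell session
$cred = Get-Credential
$exoSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $cred -Authentication Basic -AllowRedirection
Import-PSSession $exoSession -CommandName Get-Mailbox -AllowClobber | Out-Null

# HasPicture is true for mailboxes where the user has uploaded a profile photo
$mailboxesWithPhotos = Get-Mailbox -ResultSize Unlimited | Where-Object { $_.HasPicture } | Select-Object UserPrincipalName

Remove-PSSession $exoSession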

Pre-requisites

There’s a lot of info above, so let me summarize the pre-requisites;

  • The Granfeldt PowerShell MA
  • Whitelist your FIM/MIM Sync Server from MFA (if your Office 365 environment is enabled for MFA)
  • Add the account you will run the MA as (that will in turn connect to EXO via RPS) to the Recipient Management Role
  • Create a WebApp for the PS MA to use to access users’ Profile Photos via the Graph API (fastest method)
  • Powershell Community Extensions to generate the image checksum

Creating the WebApp to access Office365 User Profile Photos

Go to your Azure Portal and select the Azure Active Directory blade from the Resource Menu bar on the left. Then select App Registrations from the Manage section of the Azure Active Directory menu, and finally, from the top of the main pane, select “New Application Registration”.

Give it a name and select Web app/API as the type of app. Make the sign-in URL https://localhost and then select Create.

Record the ApplicationID that you see in the Registered App Essentials window. You’ll need this soon.

Now select All Settings => Required Permissions. Select Read all users’ basic profiles in addition to Sign in and read user profile. Select Save.

Under Required Permissions select Add, then 1 Select an API, choose Office 365 Exchange Online and click Select.

Choose 2 Select Permissions, then select Read user profiles and Read all users’ basic profiles. Click Select.

Select Grant Permissions

From Settings select Keys, give your key a Description, choose a key lifetime and select Save. RECORD the key value. You’ll need this along with the WebApp ApplicationID/ClientID for the Import.ps1 script.

Using the information from your newly registered WebApp, we need to perform the first authentication (and authorization of the WebApp) to the Graph API. Take your ApplicationID, Key (Client Secret) and the account you will use on the Management Agent (the one you have assigned the Recipient Management Role in Exchange Online) and run the script detailed in this post here. It will authenticate you to your new WebApp via the GraphAPI, after asking you to provide the account you will use on the MA and to authorize the permissions you selected when registering the app. It will also create a refresh.token file which we will give to the MA to automate our connection. The Authorization dialog looks like this.

Creating the Management Agent

Now we can create our Management Agent using the Granfeldt PowerShell Management Agent. If you haven’t created one before, check out a post like this one, which further down shows the creation of a Granfeldt PSMA. Don’t forget to provide blank export.ps1 and password.ps1 files in the directory where you place the PSMA scripts.

PowerShell Management Agent Schema.ps1
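A minimal sketch of the Schema.ps1, using the Granfeldt PSMA schema convention and the attribute names referenced later in this post (the sample values simply illustrate each attribute’s type):

# Granfeldt PSMA Schema.ps1 defining the EXOUser object class
$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-UserPrincipalName|String" -Value "user@customer.com.au"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "EXOUser"
$obj | Add-Member -Type NoteProperty -Name "EXOPhoto|Binary" -Value 0x20
$obj | Add-Member -Type NoteProperty -Name "EXOPhotoChecksum|String" -Value "9D8C2B00..."
$obj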

PowerShell Management Agent Import.ps1

As detailed above, the PSMA will leverage the WebApp to read users’ Exchange profile photos via the Graph API. The Import script also leverages Remote PowerShell into Exchange Online (for the reasons also detailed above). The account you run the Management Agent as will need to be added to the Recipient Management Role Group in order to use Remote PowerShell into Exchange Online and get the information required.

Take the Import.ps1 script below and update;

  • the path variables for where you have put your PSMA. Mine is under the Extensions directory in a directory named EXOPhotos.
  • copy the refresh.token generated when authenticating and authorizing the WebApp earlier into that same directory.
  • create a Debug directory under the directory you specified above so you can see what the MA is doing as you implement and debug it the first few times.
  • I’ve written the Import to use Paged Imports, so make sure you tick the Paged Imports checkbox on the configuration of the MA.
  • update the ApplicationID and Client Secret values with those you recorded when creating your WebApp.
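The full Import.ps1 also handles the paged imports and debug logging; as a condensed sketch, the core photo retrieval looks like the following. It assumes the $mailboxesWithPhotos collection from the RPS query shown earlier, an access token obtained via the refresh.token pattern described above, and uses the Microsoft Graph photo endpoint to fetch each image.

# Condensed sketch of the core of Import.ps1: retrieve each user's photo via the Graph API
$headers = @{ Authorization = "Bearer $($Authorization.access_token)" }
$md5 = [System.Security.Cryptography.MD5]::Create()

foreach ($mailbox in $mailboxesWithPhotos) {
    # Get the largest available photo for the user as a byte array
    $uri = "https://graph.microsoft.com/beta/users/$($mailbox.UserPrincipalName)/photo/`$value"
    $photoBytes = (Invoke-WebRequest -Uri $uri -Headers $headers).Content

    # Emit a hashtable per Connector Space object, as the Granfeldt PSMA expects
    $obj = @{
        "UserPrincipalName" = $mailbox.UserPrincipalName
        "objectClass"       = "EXOUser"
        "EXOPhoto"          = $photoBytes
        "EXOPhotoChecksum"  = [System.BitConverter]::ToString($md5.ComputeHash($photoBytes)).Replace("-", "")
    }
    $obj
}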

 

Running the Exchange User Profile Photos MA

Now that you have created the MA, you should have selected the EXOUser ObjectClass and the attributes defined in the schema. You should also create the EXOPhoto (as Binary) and EXOPhotoChecksum (as String) attributes in the Metaverse on the person ObjectType (assuming you are using the built-in person ObjectType).

Configure your flow rules to flow the EXOPhoto and EXOPhotoChecksum on the MA to their respective attributes in the MV.

Create a Stage Only run profile and run it. If you have done everything correctly you will see photos come into the Connector Space.

Looking at the Connector Space, I can see EXOPhoto and EXOPhotoChecksum have been imported.

After performing a Synchronization to get the data from the Connector Space into the Metaverse it is time to test the image that lands in the Metaverse. That is quick and easy via PowerShell and the Lithnet MIIS Automation PowerShell Module.

# Retrieve the person object from the Metaverse and inspect the photo attribute
$me = Get-MVObject -ObjectType person -Attribute accountName -Value "drobinson"
$me.Attributes.EXOPhoto.Values.ValueBinary
# Write the photo bytes out to a file to verify the image
[System.IO.File]::WriteAllBytes("c:\temp\myOutlookphoto.jpg", $me.Attributes.EXOPhoto.Values.ValueBinary)

The file is output to the directory with the filename specified.

Opening the file correctly reveals my profile photo.

Summary

In Part one we got the AzureAD/Active Directory photo. In this post we got the Office365 photo.

Now that we have the images from Office365, we need to synchronize any photo updates to Active Directory (and in turn, via AADConnect, to Azure AD). Keep in mind the image size limits for Active Directory, and that we retrieved the largest photo available from Office365 when synchronizing the photo onwards. There are a number of PowerShell modules for photo manipulation that will allow you to resize accordingly.

 

 

 

Getting started configuring the latest Microsoft Identity Manager IBM Notes Management Agent with Domino v9.x

Lotus Notes. My old nemesis as both a user and as an Administrator is back to haunt me again.

There’s a reasonable amount written by others on the trials and tribulations of getting the FIM/MIM Notes MA configured and working. However they all reference older versions of the MA and older versions of Domino. (If you are looking for details on the previous versions check out Michael’s great post here). The info on permissions is still valid, so make sure you’re on top of that.

The latest Notes MA (March 2017) is available from here and supports v9.x of Domino and v9.x of the Notes Client.

Here’s a couple of the differences straight up;

  • The v9 Lotus Notes Client does not carry the v8 client’s requirement to set up the MA using the x86 architecture. Use “Process” as the MA Architecture, as per the documentation
  • I managed to get the v9.0.0 and v9.0.1 FP4 clients to work. The latest is v9.0.1 FP8, available from here
  • There are two technical references from Microsoft for the Notes MA. This is the old version. After some hunting I located the documentation for the latest release here. Yes, the new version of the MA and the associated documentation only became available in mid-March 2017

Here’s a couple of quick tips for getting up and running with the latest MIM Notes MA;

  • Install the v9.x Lotus Notes Client on the MIM Sync Server
    • Select Single User Install
  • Open the names.nsf and schema.nsf databases using the Notes ID you will be supplying to the Notes MA configuration

Enable Logging

Like anything, knowing what is going on helps diagnose issues. This is how I enabled event log logging for the latest IBM Notes Management Agent.

  • Edit the miiserver.exe.config file in the BIN folder. More than likely that will be in this path C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Bin
  • Add the following lines as shown in the graphic below

<source name="ConnectorsLog" switchValue="Information">
  <listeners>
    <add name="ConnectorsLogListener"
    type="System.Diagnostics.EventLogTraceListener, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
    initializeData="ForefrontIdentityManager.ManagementAgent"
    traceOutputOptions="LogicalOperationStack, DateTime, Timestamp, Callstack" />
  </listeners>
</source>

Restart the Forefront Identity Manager Synchronization Service in services and when you run the Notes MA you will then see output in the Event Log under Forefront Identity Manager Management Agent.

Change the following line, substituting Error, Warning, Information or Verbose for what you want logged.

<source name="ConnectorsLog" switchValue="Information">

If you want it to log to the Application Log then change the initializeData value to whatever you want. I used IBM Domino MA;

<source name="ConnectorsLog" switchValue="Information">
  <listeners>
    <add name="ConnectorsLogListener" 
    type="System.Diagnostics.EventLogTraceListener, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
    initializeData="IBM Domino MA"
    traceOutputOptions="LogicalOperationStack, DateTime, Timestamp, Callstack" />
  </listeners>
</source>

… restart the Forefront Identity Manager Synchronization Service in services and when you run the Notes MA the logged data will land in the Application Log.

If you’ve used the Notes MA before, this should get you up and running with the latest.

An Azure Timer Function App to retrieve files via FTP and Remote PowerShell

Introduction

In an age of Web Services and APIs it’s almost a forgotten world where FTP Servers exist. However most recently I’ve had to travel back in time and interact with an FTP server to get a set of files that are produced by other systems on a daily basis. These files are needed for some flat-file imports into Microsoft Identity Manager.

Getting files off an FTP server is pretty simple. But needing to do it across a number of different environments (Development, Staging and Production) meant I was looking for an easy approach that I could also replicate quickly across multiple environments. As I already had Remote PowerShell set up on my MIM Servers for other Azure Function Apps, I figured I’d use an Azure Function for obtaining the FTP files as well.

Overview

My PowerShell Timer Function App performs the following:

  • Starts a Remote PowerShell session to my MIM Sync Server
  • Imports the PSFTP PowerShell Module
  • Creates a local directory to put the files into
  • Connects to the FTP Server
  • Gets the files and puts them into the local directory
  • Ends the session

Pre-requisites

From the overview above there are a number of pre-requisites, and other blog posts I’ve written detail nicely the steps involved to appropriately set them up and configure them. So I’m going to link to those. Namely;

  • Configure your Function App for your timezone so the schedule is correct for when you want it to run. Checkout the WEBSITE_TIME_ZONE note in this post.


  • You’ll need to configure your Server that you are going to put the files onto for Remote PowerShell. Follow the Enable Powershell Remoting on the FIM/MIM Sync Server section of this blogpost.
  • The credentials used to connect to the MIM Server are secured as detailed in the Using an Azure Function to query FIM/MIM Service section of this blog post.
  • Create a Timer PowerShell Function App. Follow the Creating your Azure App Service section of this post but choose a Timer Trigger PowerShell App.
    • I configured my Schedule for 1030 every day using the following CRON configuration
      0 30 10 * * *
  • On the Server you’ll be connecting to in order to run the FTP processes you’ll need to copy the PSFTP Module and files to the following directories. I unzipped the PSFTP files and copied the PSFTP folder and its contents to;
    • C:\Program Files\WindowsPowerShell\Modules
    • C:\Windows\System32\WindowsPowerShell\v1.0\Modules

Configuring the Timer Trigger Function App

With all the pre-requisites in place it’s time to configure the Timer Function App that you created in the pre-requisites.

The following settings are configured in the Function App Application Settings;

  • FTPServer (the server you will be connecting to, to retrieve files)
  • FTPUsername (username to connect to the FTP Server with)
  • FTPPassword (password for the username above)
  • FTPSourceDirectory (FTP directory to get the files from)
  • FTPTargetDirectory (the root directory under which the files will be put)


You’ll also need Application Settings for a Username and Password associated with a user that exists on the Server that you’ll be connecting to with Remote PowerShell. In my script below these application settings are MIMSyncCredUser and MIMSyncCredPassword

Function App Script

Finally, here is a raw script. You’ll need to add appropriate error handling for your environment. You’ll also want to change the file-name filter for the files you are looking to acquire, and the server name of the host you’ll be executing the process on.
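A minimal sketch of the Timer Function script follows, assuming the PSFTP module cmdlets (Set-FTPConnection, Get-FTPChildItem, Get-FTPItem); the “*.csv” filter and the “MIMSYNCSERVER” server name are placeholders to change for your environment.

# Read the FTP and target details from the Function App Application Settings
$ftpServer = $env:FTPServer
$ftpUsername = $env:FTPUsername
$ftpPassword = $env:FTPPassword
$ftpSourceDirectory = $env:FTPSourceDirectory
$ftpTargetDirectory = $env:FTPTargetDirectory

# Credentials for the Remote PowerShell session to the MIM Sync Server
$secPwd = ConvertTo-SecureString $env:MIMSyncCredPassword -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ($env:MIMSyncCredUser, $secPwd)

$ftpScript = {
    param($ftpServer, $ftpUsername, $ftpPassword, $sourceDir, $targetDir)
    Import-Module PSFTP

    # Create a dated local directory to put today's files into
    $localDir = Join-Path $targetDir (Get-Date -Format "yyyyMMdd")
    if (-not (Test-Path $localDir)) { New-Item -ItemType Directory -Path $localDir | Out-Null }

    # Connect to the FTP Server and retrieve the extract files
    $ftpPwd = ConvertTo-SecureString $ftpPassword -AsPlainText -Force
    $ftpCred = New-Object System.Management.Automation.PSCredential ($ftpUsername, $ftpPwd)
    Set-FTPConnection -Credentials $ftpCred -Server $ftpServer -Session FTPSession -UsePassive
    $session = Get-FTPConnection -Session FTPSession

    # "*.csv" is a placeholder; change it for the files you need to acquire
    $files = Get-FTPChildItem -Session $session -Path $sourceDir -Filter "*.csv"
    foreach ($file in $files) {
        Get-FTPItem -Session $session -Path $file.FullName -LocalPath $localDir | Out-Null
    }
}

# "MIMSYNCSERVER" is a placeholder for the server the FTP process runs on
$psSession = New-PSSession -ComputerName "MIMSYNCSERVER" -Credential $cred
Invoke-Command -Session $psSession -ScriptBlock $ftpScript -ArgumentList $ftpServer, $ftpUsername, $ftpPassword, $ftpSourceDirectory, $ftpTargetDirectory
Remove-PSSession $psSession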

Summary

A pretty quick and simple little Azure Function App that will run each day and obtain daily/nightly extracts from an FTP Server. Cleanup of the resulting folders and files I’m doing with other on-box processes.

This post is cross-blogged on both the Kloud Blog and Darren’s Blog.

Leveraging the Microsoft Graph API with PowerShell and OAuth 2.0

Background

Microsoft Graph is the evolution of APIs into Microsoft Cloud Services. For me, not being a developer, a key difference is interacting with the Graph API using OAuth 2.0 via PowerShell. Through a number of my previous posts I’ve interacted with the Graph API using client libraries such as the Microsoft.IdentityModel.Clients.ActiveDirectory library. This post details using PowerShell to talk directly to the Graph API, managing Authentication and Authorization using OAuth 2.0 and an Azure WebApp.

Leveraging the Graph API opens up access to the continually evolving Azure services as shown in the graphic below. Source graph.microsoft.io

Getting started with the Graph API, PowerShell and OAuth 2.0

The key difference between using a client library and going direct is you need to register and configure an Azure WebApp. It is super simple to do. Jump on over to the Office 365 App Registration Tool here. Sign in with an account associated with the Azure Tenant you are going to interact with. Depending on what you’re doing you’ll need to select the appropriate access.

Here are the settings I selected for access to user profiles. Give the WebApp a name; this is the name that you’ll see in the OAuth Authorization step later on. I’ve highlighted the other key settings. Don’t worry about the SignIn and Redirect URLs other than configuring HTTPS, as we’ll be using PowerShell to access the WebApp.

Once you’ve registered the WebApp successfully you’ll get a Client ID and Client Secret. RECORD/WRITE THESE DOWN NOW as you won’t get access to your Client Secret again.

If you want to change what your WebApp has access to after creating it, you can do so via the Classic Azure Portal. Select your Active Directory => Select Applications => Select the WebApp you created earlier => Select Configure => (scroll to the bottom) Select Add Application. Depending on what you have in your subscription you can add access to your services as shown below.

Authenticating & Authorizing

In order to access the Graph API we need to get our Authorization Code. We only need to do this the first time.

This little script (modify it with your Client ID and Client Secret obtained earlier when we registered our WebApp) will prompt you to authenticate.
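A minimal sketch of that script; the Client ID and resource values are placeholders:

# Build the authorization URL and launch an interactive sign-in.
# "<your Client ID>" is a placeholder; the admin_consent parameter is explained below.
Add-Type -AssemblyName System.Web
$clientId = "<your Client ID>"
$redirectUri = "https://localhost"
$resource = "https://graph.microsoft.com"

$authUrl = "https://login.microsoftonline.com/common/oauth2/authorize" +
    "?response_type=code&client_id=$clientId" +
    "&redirect_uri=$([System.Web.HttpUtility]::UrlEncode($redirectUri))" +
    "&resource=$([System.Web.HttpUtility]::UrlEncode($resource))" +
    "&prompt=admin_consent"
Start-Process $authUrl
$authCode = Read-Host "Paste the 'code' value from the redirect URL"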

Using the account associated with the Web App you registered in the previous step, authenticate.

You’ll be requested to authorize your application.

You will then have your AuthCode.

One thing to note above is admin_consent. This comes from the URL passed to get the Authorization Code: &prompt=admin_consent grants Admin Consent to all entities configured on the WebApp, rather than access for just a single user.

Accessing the Graph API with OAuth 2.0 Access Token

Using your ClientID, ClientSecret and AuthCode you can now get your access token. I tripped up here and was getting Invoke-RestMethod : {"error":"invalid_client","error_description":"AADSTS70002: Error validating credentials. AADSTS50012: Invalid client secret is provided." Tracing back my steps and looking at my “Client Secret” I noticed the special characters in it that I hadn’t URL encoded. After doing that I was successfully able to get my access token.
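A sketch of the token request, continuing from the variables in the script above; note the Client Secret is URL-encoded before being placed in the request body:

# Exchange the AuthCode for an access token. The Client Secret is URL-encoded
# first; unencoded special characters cause the AADSTS50012 error above.
$clientSecret = "<your Client Secret>"
$encodedSecret = [System.Web.HttpUtility]::UrlEncode($clientSecret)

$body = "grant_type=authorization_code&client_id=$clientId" +
    "&client_secret=$encodedSecret&code=$authCode" +
    "&redirect_uri=$([System.Web.HttpUtility]::UrlEncode($redirectUri))" +
    "&resource=$([System.Web.HttpUtility]::UrlEncode($resource))"

$Authorization = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/common/oauth2/token" -Body $body -ContentType "application/x-www-form-urlencoded"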

Looking at our AuthZ we can see that the Scope is what was selected when registering the WebApp.

Now using the Access Token I can query the Graph API. Here is getting my AAD Object.
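A sketch of those calls, using the access token in the Authorization header:

# Query the Graph API for my own AAD object using the Bearer token
$headers = @{ Authorization = "Bearer $($Authorization.access_token)" }
Invoke-RestMethod -Method Get -Uri "https://graph.microsoft.com/v1.0/me" -Headers $headers

# And, scope permitting, all users
Invoke-RestMethod -Method Get -Uri "https://graph.microsoft.com/v1.0/users" -Headers $headers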

Summary

In my case I can now access all users via the API. Here is what’s available. Using the Access Token and modifying the Invoke-RestMethod URI and Method (including -Body if you are doing a Post/Patch action) you are ready to rock and roll, all via PowerShell.

Your Access Token is valid for an hour. Before then you will need to refresh it. Just run the $Authorization = Invoke-RestMethod ….. line again. 

Follow Darren on Twitter @darrenjrobinson