Simple reporting from the FIM/MIM Metaverse to PowerBI using the Lithnet FIM/MIM Sync Service PowerShell Module

I have a customer that is looking to report on FIM/MIM identity information. The reports they are looking for aren’t overly complex and don’t necessarily justify the full FIM/MIM reporting infrastructure. So I spent a few hours over a couple of days looking at alternatives. In this blog post I give an overview of using the awesome Lithnet FIM/MIM Sync Service PowerShell Module, recently released by Ryan Newington, to do basic reporting on the Microsoft (Forefront) Identity Manager Metaverse into PowerBI.

I’ll briefly show how to leverage the Lithnet FIM/MIM Sync Service PowerShell Module to extract Person objects and their metadata (based on search filter criteria) from the MIM/FIM Metaverse and output them to a file for PowerBI.

I cover;

  • Building a query
  • Executing the query
  • Filtering the results for output to a file (CSV)
  • Importing to PowerBI as a dataset, creating a report showing results in a Dashboard

First up you’ll need to download and install the module from https://github.com/lithnet/miis-powershell

Using the FIM/MIM Sync Service PowerShell Module to query the Metaverse

The operators you can choose for your attribute types (boolean, string, integer, reference etc) in the Metaverse Search function of the Synchronisation Service Manager are the same operators you can use with the Lithnet FIM/MIM Sync Service PowerShell Module.

By creating a search with multiple criteria in the Metaverse Search you can filter the results from the Metaverse.

MVQuery1.jpg

As shown below you can see that we get 302 results.

MVQuery2.jpg

So let’s import the Lithnet FIM/MIM Sync Service PowerShell Module, create a filter, execute it and look at the results. As you’d expect, we get the same result. Excellent.

PSQuery1.jpg
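A minimal sketch of that, assuming the Lithnet MIIS Automation module’s New-MVQuery and Get-MVObject cmdlets; the attribute names and values here are illustrative rather than my actual criteria.

# Import the Lithnet FIM/MIM Sync Service PowerShell Module
Import-Module LithnetMiisAutomation
# Build the filter criteria; multiple criteria are combined as AND
$queries = @()
$queries += New-MVQuery -Attribute displayName -Operator IsPresent
$queries += New-MVQuery -Attribute domain -Operator Equals -Value 'CORP'
# Execute the query against the Metaverse and check the result count
$mvObjects = Get-MVObject -Queries $queries
$mvObjects.Count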

Remember that when using this PowerShell automation module, the backend is still the WMI interface behind the Synchronisation Service Manager. This means you can’t, for example, create a query filter using “greater than/less than” if you can’t do it in the UI.

Take my Twitter FriendsCount attribute of type Number/Integer as an example.

MVObject1.jpg

I can’t create a query filter that would return results where FriendsCount > 20,000. I can only use the IsPresent, IsNotPresent and Equals operators.

QueryOperators1.jpg

As a side note, the PowerShell error message will give you a hint as to which operators you can use, as shown below.

PSError.jpg

However, if you try to use StartsWith on an Integer attribute, the search will execute but return no results. My tip then is to define your query in the Metaverse Search GUI and, when you get the results you want/expect, create the equivalent query in PowerShell and validate that you get the same number of results.

A final note on query filters: multiple criteria are combined as an AND operation, NOT OR.

QueryAndNotOr.jpg

Let’s do something with the results

Now that we have a query sorted, let’s do something with the results. The result set is the full attribute list and values for each object that matched our query in the Metaverse. That’s way more info than I (and probably you) need. So we iterate through the results, pull out the attribute values we want, and export them as a CSV file, as per the sketch below.

Results1.jpg
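A hedged sketch of that filtering; the exact shape of the returned objects may vary between module versions, and the attribute names and output path are illustrative.

$report = foreach ($mvObject in $mvObjects) {
    # Pull out just the attribute values we care about
    [pscustomobject]@{
        DisplayName  = $mvObject.Attributes['displayName'].Values.ValueString
        Domain       = $mvObject.Attributes['domain'].Values.ValueString
        FriendsCount = $mvObject.Attributes['FriendsCount'].Values.ValueString
    }
}
# Export the selected columns for PowerBI
$report | Export-Csv -Path 'C:\Reports\MVReport.csv' -NoTypeInformation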

What to do with the output ?

For this overview I’ve just used the local file (CSV) that I exported as part of the script as the input dataset in PowerBI. https://app.powerbi.com

On the right hand side I’ve chosen the columns that were exported to the CSV and they appear in the main window.

PBi1.jpg

Click Pin to Live Page. You’ll be prompted to save the report first, so do that. I then chose New Dashboard for the report and clicked Pin live.

PBi2.jpg

I can then quickly refine and build visual reports using text-based queries with keywords from the dataset columns, like Top 10 by number of friends from the dataset.

PBi3.jpg

Create a couple of queries and pin them to the Dashboard and the data comes to life.

PBi4.jpg

Summary

The Lithnet FIM/MIM Sync Service PowerShell Module provides a really easy way to expose information from the Metaverse that may satisfy many reporting and other requirements. Taking the concept further it wouldn’t be too complex to export the data to an Azure SQL DB on a schedule and have the results dynamically update on a PowerBI Dashboard.
The concept of exporting data for reporting is just one practical example using the tools. Huge thanks to Ryan for creating the Lithnet tools and publishing to the community. Keep in mind the tools disclaimer too.

Here is the sample PowerShell.

Follow Darren on Twitter @darrenjrobinson

 

Goodbye Set-MsolUser, Hello Set-AzureADUser & Azure Graph API

Update: April 13 2017. See this post for adapting to changes in the AzureAD PowerShell Module Helper Libraries.

Recently Microsoft released the preview of the v2.0 Azure AD PowerShell cmdlets. https://azure.microsoft.com/en-us/updates/azure-ad-new-powershell-cmdlets-preview/

I’ve got a project coming up where I’m looking to change my approach for managing users in Azure using Microsoft Identity Manager. It’s good timing to do a quick proof of concept to manage users with the new cmdlets and directly via the Graph API, in preparation for moving away from the msol cmdlets.

New Modules

First up, the Azure AD v2.0 PowerShell module was released in public preview on July 13, 2016. There will likely be changes before they become GA, so keep that in mind.

The v2.0 Azure AD PowerShell module itself is available for download from here https://www.powershellgallery.com/packages/AzureADPreview/1.1.143.0

If you have Windows Management Framework v5 installed you can download and install from PowerShell (as below).
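It’s a one-liner along these lines (module name as per the gallery link above):

# Install the Azure AD v2.0 preview module from the PowerShell Gallery
Install-Module -Name AzureADPreview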

Once installed, pretty quickly you can import the module, authenticate to your tenant, retrieve a user and update a few attributes (as below).
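A hedged example of that flow; the user and attribute values are illustrative.

Import-Module AzureADPreview
# Authenticate to the tenant
Connect-AzureAD
# Retrieve a user, then update a few attributes
$user = Get-AzureADUser -ObjectId 'darren@customer.onmicrosoft.com'
Set-AzureADUser -ObjectId $user.ObjectId -Department 'Identity' -JobTitle 'Consultant'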

Whilst functional, it doesn’t really work for how we need to interact with Azure from an Identity Management perspective. So how can we still use PowerShell but enumerate and manipulate identities in Azure?

Now that we have the AzureAD v2.0 module installed we can reference the Active Directory Authentication Library it installs (Microsoft.IdentityModel.Clients.ActiveDirectory.dll), authenticate to our Tenant, retrieve users, and update them. That’s exactly what is shown in the commands below.
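A hedged sketch of that approach. The ADAL DLL path, tenant name and attribute values are assumptions to adapt for your environment; the client ID is the well-known Microsoft PowerShell client ID. Note the April 2017 update at the top of this post: newer versions of the helper library replace AcquireToken with AcquireTokenAsync.

# Load the ADAL library that ships with the AzureADPreview module (path varies by module version)
Add-Type -Path 'C:\Program Files\WindowsPowerShell\Modules\AzureADPreview\Microsoft.IdentityModel.Clients.ActiveDirectory.dll'
$tenant = 'customer.onmicrosoft.com'
$resource = 'https://graph.windows.net'
$clientId = '1950a258-227b-4e31-a9cf-717495945fc2' # well-known PowerShell client ID
$redirectUri = [Uri]'urn:ietf:wg:oauth:2.0:oob'
$authContext = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext("https://login.microsoftonline.com/$tenant")
$authResult = $authContext.AcquireToken($resource, $clientId, $redirectUri, [Microsoft.IdentityModel.Clients.ActiveDirectory.PromptBehavior]::Auto)
$headers = @{ Authorization = $authResult.CreateAuthorizationHeader() }
# Enumerate users via the Graph API
$users = Invoke-RestMethod -Uri "https://graph.windows.net/$tenant/users?api-version=1.6" -Headers $headers
# Update a user with a PATCH of just the attributes to change
$body = @{ department = 'Identity' } | ConvertTo-Json
Invoke-RestMethod -Uri "https://graph.windows.net/$tenant/users/darren@customer.onmicrosoft.com?api-version=1.6" -Headers $headers -Method Patch -Body $body -ContentType 'application/json'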

Where interacting with the Graph API directly really shines, however, is at the directory services layer and the Differential Query functionality. https://msdn.microsoft.com/en-us/Library/Azure/Ad/Graph/howto/azure-ad-graph-api-differential-query
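A hedged example of a differential query, per the MSDN article above: an empty deltaLink parameter starts a new cycle, and the returned aad.deltaLink token is stored and replayed to retrieve only subsequent changes.

# Initial differential query - returns current state plus a deltaLink token
$delta = Invoke-RestMethod -Uri "https://graph.windows.net/$tenant/users?api-version=1.6&deltaLink=" -Headers $headers
# Later, replay the saved token to get only what has changed since
$changes = Invoke-RestMethod -Uri $delta.'aad.deltaLink' -Headers $headers
$changes.value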

As such, this is the approach I’ll be taking for integrating Azure with Microsoft Identity Manager to manage users and their entitlements (such as Azure licensing).
I hope this also saves a few people time in working out how to use PowerShell to manage Azure AD objects via the Graph API (using either the PowerShell Module or the REST API).

Exception from HRESULT 0x80230729 creating a new FIM/MIM Management Agent

Another day, another piece of FIM/MIM experimentation. I had built a fresh MIM 2016 environment in Azure to test a few scenarios out. That all went quickly and seamlessly thanks to some great templates and a few scripts. Until I came to create the management agent (the purpose of today’s experimentation).

It didn’t matter if I tried to Create a New Management Agent or Import the Management Agent. I just got “Exception from HRESULT 0x80230729”. The common element was that the Management Agent I was creating was a 3rd party MA built on Microsoft’s Extensible Connectivity Management Agent (ECMA) framework. Specifically, I was using Søren Granfeldt’s PowerShell MA.

HResult 0x80230729

Now I’ve used this MA extensively and not had a problem previously.

So I retraced my steps, clean build, pre-requisites etc. All good. I then tried creating an MA from the out of the box connectors. That worked. I successfully created an Active Directory Management Agent.

In the Windows Application Log I found the following from when I was trying to create the PSMA. A little more to go on from that information.

AppLog

The link in the error message provides some info https://msdn.microsoft.com/en-us/library/dd409252(VS.100).aspx but it is a generic .NET article. Having experience with MIIS/ILM/FIM/MIM, I figured the Sync Engine’s config file would be the appropriate place for the setting described in the MSDN link.

The Fix

The miiserver.exe.config file located in the default installation directory C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Bin is what you need to edit.

Near the end of the miiserver.exe.config file find the <runtime> section. Insert the line <loadFromRemoteSources enabled="true"/> as shown below.
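The edited section ends up looking something like this (any existing elements inside <runtime> stay as they are):

<runtime>
  <loadFromRemoteSources enabled="true"/>
  <!-- existing runtime elements remain unchanged -->
</runtime>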

RemoteSources

Restart the Forefront Identity Manager Synchronization Service from the Services Control Panel and you’re back in action.
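If you’d rather restart it from PowerShell, the following should work, assuming the default service name:

# Restart the FIM/MIM Synchronization Service so the config change takes effect
Restart-Service -Name FIMSynchronizationService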

Management Agent created and back to the task at hand. Happy days.

MACreated

Follow Darren on Twitter @darrenjrobinson

Consuming CSV files from an Exchange Mailbox via Exchange Web Services and FIM/MIM 2016 using the Granfeldt PowerShell MA

On first look this solution seems quite random. A management agent that consumes a flat file (comma separated file) isn’t groundbreaking, but when the twist is that the CSV file is in an email in an Exchange Inbox, it’s quite a different scenario.

Background

My customer uses a Cloud Service for their recruitment processes. The cloud service does have a SOAP API that I could potentially develop a FIM/MIM solution for using the Microsoft Web Services Management Agent; however my customer does not have API access to their tenant, the vendor isn’t overly responsive, and I need a solution in days not weeks.

On the upside, my customer can quickly create reports in the SaaS Portal, and schedule them to be delivered (via CSV/Excel) to an email address. So, what if I was able to integrate FIM/MIM to the inbox that receives the emails with attached reports that contain the information I require and process it accordingly? This blog post is that solution.

Overview

Once a day there is a scheduled process that generates a report (CSV) of new staff from a SaaS provider. That CSV is emailed to an Inbox we created to receive these reports. Using the Granfeldt PowerShell Management Agent I created a solution that;

  • Connects to the specified Exchange Mailbox using Exchange Web Services
    • Enumerates the inbox looking for emails with attachments
    • Validates the emails with attachments by looking for the sender and attachment type we are expecting
    • Extracts the attachment to a file share
    • Moves all messages with attachments to a Processed subfolder
  • Processes the most recent report attachment (CSV) (in case the MA hasn’t run for a few days or the reports start coming more than once a day) or, if there is no new email message with an attachment in the inbox, processes the most recent attachment we previously put in the file share
    • Each report is cumulative so the MA logic stays simple
  • Imports to MIM the new staff that are due to start in the next 7 days (to allow for all access to be setup prior to their first day of employment) and kicks off the MIM Provisioning processes
    • Triggers entitlements and access through the system accordingly (not covered in this post, but includes provisioning of mailbox, home directory, group memberships etc)

Notes:

  • The MIM Synchronisation Service Account will need access permissions to save files into the File Share
  • The MIM Server and this PSMA require the Microsoft EWS Managed API 2.2 to be installed on the MIM Synchronisation Server. It is available from here https://www.microsoft.com/en-us/download/details.aspx?id=42951

Getting Started with the Granfeldt PowerShell Management Agent

First up, you can get it from here. Søren’s documentation is pretty good but does assume you have a working knowledge of FIM/MIM and this blog post is no different.

Three items I had to work out that I’ll save you the pain of are;

  • You must have a Password.ps1 and Export.ps1 file. Even though we’re not doing password management, or exporting back to the SaaS Provider on this MA, the PS MA configuration requires a file for these fields. The .ps1 doesn’t need to have any logic/script inside it. It just needs to be present.
  • The credentials you give the MA to run the scripts as, needs to be in the format of just ‘accountname’ NOT ‘domain\accountname’. I’m using the AD Account associated with the Exchange Mailbox that receives the emails with the CSV reports.
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use dir /x to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\PageUp

Schema.ps1

My schema is essentially the columns that are in the CSV report that I’m importing, as per the sketch below.
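A hedged sketch of a Granfeldt PSMA schema.ps1, following the module’s ‘attribute|type’ naming convention; the column names are illustrative rather than the customer’s actual report layout.

$obj = New-Object -TypeName PSCustomObject
# The anchor attribute for the connector space
$obj | Add-Member -Type NoteProperty -Name 'Anchor-StaffID|String' -Value '12345'
$obj | Add-Member -Type NoteProperty -Name 'objectClass|String' -Value 'user'
# One entry per CSV column we want in MIM
$obj | Add-Member -Type NoteProperty -Name 'FirstName|String' -Value 'FirstName'
$obj | Add-Member -Type NoteProperty -Name 'LastName|String' -Value 'LastName'
$obj | Add-Member -Type NoteProperty -Name 'StartDate|String' -Value 'StartDate'
$obj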

Password Script (password.ps1)

Empty as described above

Import.ps1

Connect to the Exchange Mailbox and find messages from the defined sender where the attachment is of the expected naming and format. Extract the CSV file to a file share. Move emails with attachments to a processed folder. Parse the CSV, perform some logic on the data and import objects and values for new employees. A hedged sketch of the EWS portion follows.
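The DLL path below matches the EWS 2.2 API download above, while the credentials, mailbox, sender and file share are illustrative.

# Load the EWS Managed API and connect to the mailbox
Add-Type -Path 'C:\Program Files\Microsoft\Exchange\Web Services\2.2\Microsoft.Exchange.WebServices.dll'
$service = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService([Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2010_SP2)
$service.Credentials = New-Object System.Net.NetworkCredential('svc-mim-reports', $password)
$service.AutodiscoverUrl('reports@customer.com', {$true})
# Enumerate the Inbox looking for messages with attachments
$inbox = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($service, [Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Inbox)
$items = $inbox.FindItems((New-Object Microsoft.Exchange.WebServices.Data.ItemView(50)))
foreach ($item in ($items | Where-Object { $_.HasAttachments })) {
    $item.Load()
    # Validate the sender and attachment type we are expecting
    if ($item.From.Address -eq 'reports@saasvendor.com') {
        foreach ($att in ($item.Attachments | Where-Object { $_.Name -like '*.csv' })) {
            # Extract the attachment to the file share
            $att.Load("\\fileserver\MIMReports\$($att.Name)")
        }
    }
}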

Export.ps1

Empty as we’re not writing anything back to the SaaS provider.

Wiring it all together

In order to wire the functionality all together there are the usual number of configuration steps to be completed. Below I’ve shown a number of the key points associated with making it all work. This is all Synchronisation Engine MA configuration tasks. Basically create the PS MA, import attributes from the PS MA, create your MA Run Profiles and let it loose.

Management Agent Configuration

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using the same service account as my AD MA.

Password and Export scripts must be specified but as we’re not doing password management or exporting they’re empty as detailed above.

If your schema.ps1 file is formatted correctly, you can select your attributes/columns that will be coming in from the CSV file.

My join rule is simple. StaffID to AccountName in the MetaVerse.

My import flows are direct flows with a Boolean flag to kick off a bunch of declarative rules out of the Portal.

Summary

Thinking outside of the box and using the Granfeldt PowerShell MA I was able to quickly consume a CSV file from an Exchange Inbox to kick off the provisioning process.

Follow Darren on Twitter @darrenjrobinson

Managing SharePoint Online (SPO) User Profiles with FIM/MIM 2016 and the Granfeldt PowerShell MA

Forefront / Microsoft Identity Manager does not come with an out-of-the-box management agent for managing SharePoint Online.

Whilst the DirSync/AADConnect solution will allow you to synchronise attributes from your On Premise Active Directory to AzureAD, SharePoint only leverages a handful of them. It then has its own set of attributes that it leverages. Many are similarly named to the standard Azure AD attributes but with the SPS- prefix.

For example, here is a list of SPO attributes and a couple of references to associated Azure AD attributes;

  • UserProfile_GUID
  • SID
  • SPS-PhoneticFirstName
  • SPS-PhoneticLastName
  • SPS-PhoneticDisplayName
  • SPS-JobTitle
  • SPS-Department
  • AboutMe
  • PersonalSpace
  • PictureURL
  • UserName
  • QuickLinks
  • WebSite
  • PublicSiteRedirect
  • SPS-Dotted-line
  • SPS-Peers
  • SPS-Responsibility
  • SPS-SipAddress
  • SPS-MySiteUpgrade
  • SPS-ProxyAddresses
  • SPS-HireDate
  • SPS-DisplayOrder
  • SPS-ClaimID
  • SPS-ClaimProviderID
  • SPS-ClaimProviderType
  • SPS-SavedAccountName
  • SPS-SavedSID
  • SPS-ResourceSID
  • SPS-ResourceAccountName
  • SPS-ObjectExists
  • SPS-MasterAccountName
  • SPS-PersonalSiteCapabilities
  • SPS-UserPrincipalName
  • SPS-O15FirstRunExperience
  • SPS-PersonalSiteInstantiationState
  • SPS-PersonalSiteFirstCreationTime
  • SPS-PersonalSiteLastCreationTime
  • SPS-PersonalSiteNumberOfRetries
  • SPS-PersonalSiteFirstCreationError
  • SPS-DistinguishedName
  • SPS-SourceObjectDN
  • SPS-FeedIdentifier
  • SPS-Location
  • Certifications
  • SPS-Skills
  • SPS-PastProjects
  • SPS-School
  • SPS-Birthday
  • SPS-Interests
  • SPS-StatusNotes
  • SPS-HashTags
  • SPS-PictureTimestamp
  • SPS-PicturePlaceholderState
  • SPS-PrivacyPeople
  • SPS-PrivacyActivity
  • SPS-PictureExchangeSyncState
  • SPS-TimeZone
  • SPS-EmailOptin
  • OfficeGraphEnabled
  • SPS-UserType
  • SPS-HideFromAddressLists
  • SPS-RecipientTypeDetails
  • DelveFlags
  • msOnline-ObjectId
  • SPS-PointPublishingUrl
  • SPS-TenantInstanceId

My customer has AADConnect in place that is synchronising their On Premise AD to Office 365. They also have a MIM 2016 instance that is managing user provisioning and lifecycle management. I’ll be using that MIM 2016 instance to manage SPO User Profile Attributes.

The remainder of this blog post describes the PS MA I’ve developed to manage the SPO attributes to allow their SPO Online Forms etc to leverage business and organisation user metadata.

Using the Granfeldt PowerShell Management Agent to manage SharePoint Online User Profiles

In this blog post I detail how you can synchronise user attributes from your On Premise Active Directory to an associated user’s SharePoint Online user profile utilising Søren Granfeldt’s extremely versatile PowerShell Management Agent. Provisioning and licensing of users for SPO is performed in parallel by the DirSync/AADConnect solution. This solution just provides attribute synchronisation to SPO User Profile attributes.

Overview

In this solution I’m managing the attributes that are pertinent to the customer. If you need an additional attribute, or you have created custom attributes, it is easy enough to extend.

Getting Started with the Granfeldt PowerShell Management Agent

First up, you can get it from here. Søren’s documentation is pretty good but does assume you have a working knowledge of FIM/MIM and this blog post is no different.

Three items I had to work out that I’ll save you the pain of are;

  • You must have a Password.ps1 file. Even though we’re not doing password management on this MA, the PS MA configuration requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it. It just needs to be present
  • The credentials you give the MA to run this MA are the credentials for the account that has permissions to manage SharePoint Online User Profiles. More detail on that further below.
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use dir /x to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\SPO

Managing SPO User Profiles

In order to use this working example there are a couple of items to note;

  • At the top of the Import and Export scripts you’ll need to enter your SPO Tenant Admin URL. If your tenant URL is ‘CORP.sharepoint.com’ then at the top of the scripts enter ‘CORP-admin.sharepoint.com’. The Import script will work with corp.sharepoint.com but the export won’t.
  • Give the account you’re using to connect to SPO via your MIM permissions to manage/update SPO User Profiles

Schema.ps1

As mentioned above I’m only syncing attributes pertinent to my customers’ requirements. That said I’ve selected a number of attributes that are potentials for future requirements.

Password Script (password.ps1)

Empty as described above

Import.ps1

A key part of the import script is connecting to SPO and accessing the full User Profile. In order to do this, you will need to install the SharePoint Online Client Components SDK. It’s available for download here https://www.microsoft.com/en-us/download/details.aspx?id=42038

The import script then imports two libraries that give us access to the SPO User Profiles.

Import-Module 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.UserProfiles.dll'

Import-Module 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll'

It then imports values for the attributes defined in the schema, along the lines of the sketch below.
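A hedged sketch of the profile read using the CSOM PeopleManager class from the libraries above; the tenant admin URL, credentials and account claim are illustrative.

$ctx = New-Object Microsoft.SharePoint.Client.ClientContext('https://CORP-admin.sharepoint.com')
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($spoAdminUser, $spoSecurePassword)
$peopleManager = New-Object Microsoft.SharePoint.Client.UserProfiles.PeopleManager($ctx)
# SPO account names are claims-encoded
$account = 'i:0#.f|membership|jane.citizen@customer.com'
$userProfile = $peopleManager.GetPropertiesFor($account)
$ctx.Load($userProfile)
$ctx.ExecuteQuery()
# Individual values come from the UserProfileProperties dictionary
$userProfile.UserProfileProperties['SPS-Department']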

Export.ps1

The business part of the MA: take attribute value changes from the MV to the SPO MA and export them to SPO. In the example below I’m only exporting three attributes. Add as many as you need.
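A hedged sketch of those export calls, reusing the PeopleManager connection from the import sketch and the SPS- property names from the list earlier in this post.

# Update the three SPO User Profile properties, then commit
$peopleManager.SetSingleValueProfileProperty($account, 'SPS-JobTitle', $title)
$peopleManager.SetSingleValueProfileProperty($account, 'SPS-Location', $location)
$peopleManager.SetSingleValueProfileProperty($account, 'SPS-Department', $department)
$ctx.ExecuteQuery()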

Wiring it all together

In order to wire the functionality together, I’m doing it just using the Sync Engine MA configuration as we’re relying on AADConnect to create the users in Office365, and we’re just flowing through attribute values.

Basically, create the PS MA, create your MA Run Profiles, import users and attributes from the PS MA, validate your joins and Export to update SPO attributes as per your flow rules.

Management Agent Configuration

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using the Office 365 account we gave permissions to manage user profiles in SPO earlier.

Password script must be specified but as we’re not doing password management it’s empty as detailed above.

If your schema.ps1 file is formatted correctly you can select your attributes.

I have a few join rules. In the pre-prod environment though I’m joining on WorkEmail => mail.

My import flow is just bringing back in users’ mobile numbers, which users are able to modify in SPO. I’m exporting Title, Location and Department to SPO.

Summary

Using the Granfeldt PowerShell MA it was very easy to manage user SharePoint Online User Profile attributes.

Follow Darren on Twitter @darrenjrobinson

Creating Microsoft Identity Manager (MIM) Run Profiles using PowerShell (post MIM rollup build 4.3.2124.0)

A new hotfix rollup was released on the 11th of March for Microsoft Identity Manager that contains a number of fixes and some new functionality.

One new feature according to the release notes is a new cmdlet Add-MIISADMARunProfileStep

This cmdlet allows the creation of MIM Synchronisation Management Agent Run Profiles using PowerShell.

From the MS Documentation

Add-MIISADMARunProfileStep -MAName 'AD_MA' -Partition 'DC=CONTOSO,DC=COM' -StepType 'FI' -ProfileName 'ADMA_FULLIMPORT'

Possible values of the StepType parameter (short or long form can be used):
"FI", "FULL IMPORT"
"FS", "FULL SYNCHRONIZATION"
"FIFS", "FULL IMPORT AND FULL SYNCHRONIZATION"
"FIDS", "FULL IMPORT AND DELTA SYNCHRONIZATION"
"DI", "DELTA IMPORT"
"DS", "DELTA SYNCHRONIZATION"
"DIDS", "DELTA IMPORT AND DELTA SYNCHRONIZATION"
"EXP", "EXPORT"

The neat feature of this cmdlet is that it will create the Run Profile if it doesn’t exist.

Cool, so let’s have a play with it. I installed the rollup on the Synchronisation server in my dev environment. I then went looking for the PS module, and it couldn’t be seen.

That’s a bit weird. I re-read the release notes. New cmdlet. Ah, maybe the new cmdlet has been added to the old PSSnapIn?

Yes, there it is. Add-PSSnapin MIIS.MA.Config gives us access to the new cmdlet.

Creating Run Profiles using PowerShell.

Here is a Management Agent from my Dev environment with no Run Profiles configured.

Using ISE I used the new cmdlet to create a Run Profile to run a Stage (Full Import) on the ADMA, as below. Note: you need to put quotes around the Partition name.
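Something along these lines, reusing the example values from the documentation above:

Add-PSSnapin MIIS.MA.Config
Add-MIISADMARunProfileStep -MAName 'AD_MA' -Partition 'DC=CONTOSO,DC=COM' -StepType 'FI' -ProfileName 'ADMA_FULLIMPORT'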

And there you go. The Run Profile didn’t exist so it got created.

A nice enhancement would be the ability to create multi-step Run Profiles. Being able to script that would really save some time.

Follow Darren on Twitter @darrenjrobinson

Automating the simultaneous deployment of AzureRM Virtual Machines for a development environment

This post details my method for automating the creation of AzureRM virtual machines for use in a development environment. I’m using this process to quickly stand up an environment for testing configurations.

In summary this process;

  • creates the AzureRM Virtual Machines in parallel
  • gives all machines the same configuration
    • NIC, Disks etc
  • creates all machines in a new Resource Group, with an associated Virtual Network

Simultaneously Creating the AzureRM Virtual Machines for MIM 2016

For my MIM 2016 Lab I’m going to create 5 Virtual Machines. Their roles are;

  • Active Directory Domain Controllers (2)
  • MIM 2016 Portal Servers (2)
  • MIM 2016 Synchronization Server & collocated SQL Server (1)

In order to stand up the Virtual Machines as quickly as possible I spawn the VM create process in parallel, leveraging the brilliant Invoke-Parallel PowerShell script from Cookie.Monster, just as I did for my Simultaneous Start|Stop all AzureRM VM’s script detailed here.

VM Creation Script

In my script at the bottom of this post I haven’t included the ‘invoke-parallel.ps1’. The link for it is in the paragraph above. You’ll need to either reference it at the start of your script, or include it in your script. If you want to keep it all together in a single script include it like I have in the screenshot below.

Parts you’ll need to edit

The first part of the script defines the what, the where and the names associated with the development environment: where the lab will be deployed (‘Australia East’), the Resource Group name (‘MIM2016-Dev10’), the virtual network name in the resource group (‘MIM-Net10’) and the storage account name (‘mimdev16021610’). Update each of these for the names you’ll be using for your environment.

#Global Variables
# Where do we want to put the VM's
$global:locName = 'Australia East'
# Resource Group name
$global:rgName = 'MIM2016-Dev10'
# Virtual Network Name
$global:virtNetwork = 'MIM-Net10'
# Storage account names must be between 3 and 24 characters in length and use numbers and lower-case letters only
$global:stName = 'mimdev16021610'
# VMName
$global:NewVM = $null

The VM’s that will be deployed are added to an array. The names used here will become the ComputerName for each VM. Change for the number of machines and the names you want.

# MIM Servers to Auto Deploy
$VMRole = @()
$VMRole += ,('MIMPortal1')
$VMRole += ,('MIMPortal2')
$VMRole += ,('MIMSync')
$VMRole += ,('ADDC1')
$VMRole += ,('ADDC2')

The script will then prompt you to authN to AzureRM.

# Authenticate to the Azure Portal
Add-AzureRmAccount

The script will then prompt you provide a username and password that will be associated with each of the VM’s.

# Get the UserID and Password info that we want associated with the new VM's.
$global:cred = Get-Credential -Message "Type the name and password for the local administrator account that will be created for your new VM(s)."

The Resource Group will be created. The environment subnet and virtual network will be created in the resource group.

# Create Resource Group
New-AzureRmResourceGroup -Name $rgName -Location $locName
# Create RG Storage Account
$storageAcc = New-AzureRmStorageAccount -ResourceGroupName $rgName -Name $stName -Type "Standard_GRS" -Location $locName
# Create RG Subnet
$singleSubnet = New-AzureRmVirtualNetworkSubnetConfig -Name singleSubnet -AddressPrefix 10.0.0.0/24
# Create RG Network
$global:vnet = New-AzureRmVirtualNetwork -Name $virtNetwork -ResourceGroupName $rgName -Location $locName -AddressPrefix 10.0.0.0/16 -Subnet $singleSubnet

A configuration for each of the VM’s that we listed in the array earlier will be created and added to a new array, $VMConfig.

Update the -VMSize "Standard_A1" for a different Azure VM spec, and DiskSizeInGB for a different data disk size.

# VM Config for each VM
$VMConfig = @()
# Create VMConfigs and add to an array
foreach ($NewVM in $VMRole) {
    # ******** Create IP and Network for the VM *****************************
    # ******** We do this upfront before the bulk create of the VM **********
    $pip = New-AzureRmPublicIpAddress -Name "$NewVM-IP1" -ResourceGroupName $rgName -Location $locName -AllocationMethod Dynamic
    $nic = New-AzureRmNetworkInterface -Name "$NewVM-NIC1" -ResourceGroupName $rgName -Location $locName -SubnetId $vnet.Subnets[0].Id -PublicIpAddressId $pip.Id
    $vm = New-AzureRmVMConfig -VMName $NewVM -VMSize "Standard_A1"
    $vm = Set-AzureRmVMOperatingSystem -VM $vm -Windows -ComputerName $NewVM -Credential $cred -ProvisionVMAgent -EnableAutoUpdate
    $vm = Set-AzureRmVMSourceImage -VM $vm -PublisherName MicrosoftWindowsServer -Offer WindowsServer -Skus 2012-R2-Datacenter -Version "latest"
    $vm = Add-AzureRmVMNetworkInterface -VM $vm -Id $nic.Id
    # VM Disks. Deploying an OS and a Data Disk for each
    $osDiskUri = $storageAcc.PrimaryEndpoints.Blob.ToString() + "vhds/WindowsVMosDisk$NewVM.vhd"
    $DataDiskUri = $storageAcc.PrimaryEndpoints.Blob.ToString() + "vhds/WindowsVMDataDisk$NewVM.vhd"
    $vm = Set-AzureRmVMOSDisk -VM $vm -Name "windowsvmosdisk" -VhdUri $osDiskUri -CreateOption fromImage
    $vm = Add-AzureRmVMDataDisk -VM $vm -Name "windowsvmdatadisk" -VhdUri $DataDiskUri -CreateOption Empty -Caching 'ReadOnly' -DiskSizeInGB 10 -Lun 0

    # Add the Config to an Array
    $VMConfig += ,($vm)
    # ******** End NEW VM ***************************************
}

And finally BAM, we let it loose. Using the Invoke-Parallel script each of the VM’s are created in AzureRM simultaneously.

# In Parallel Create all the VM's
$VMConfig | Invoke-Parallel -ImportVariables -ScriptBlock { New-AzureRmVM -ResourceGroupName $rgName -Location $locName -VM $_ }

The screenshot below is near the end of the create VM process. The MIMPortal1 VM that was first in the $VMConfig array is finished and the others are close behind.

The screenshot below shows all resources in the resource group. The Storage Account, the Network, each VM, its associated NIC and Public IP Address.

For me all that (including authN to AzureRM and providing the ‘username’ and ‘password’ for the admin account associated with each VM) executed in just under 13 minutes. Nice.

The full script? Sure thing, here it is.

Follow Darren Robinson on Twitter

Simultaneously Start|Stop all Azure Resource Manager Virtual Machines in a Resource Group

Problem

How many times have you wanted to Start or Stop all Virtual Machines in an Azure Resource Group? For me it seems to be quite often, especially for development environment resource groups. It’s not that difficult though. You can just enumerate the VM’s then cycle through them and call ‘Start-AzureRMVM’ or ‘Stop-AzureRMVM’. However, the more VM’s you have, the longer that serial approach takes, as that is how PowerShell runs it. Go to the Portal and right-click on each VM and start|stop?

There has to be a way of starting/shutting down all VM’s in a Resource Group in parallel via PowerShell, right?

Some searching suggested it’s common to use Azure Automation and Workflows to accomplish it. But I don’t want to run this on a schedule or necessarily mess around with Azure Automation for development environments, or have to connect to the portal and kick off the workflow.

What I wanted was a script that was portable. That led me to messing around with ‘ScriptBlocks’ and ‘Start-Job’ functions in PowerShell. Passing variables in for locally hosted jobs running against Azure, though, was painful. So I found a quick, clean way of doing it, which I detail in this post.

Solution

I’m using the brilliant Invoke-Parallel Powershell Script from Cookie.Monster, to in essence multi-thread and run in parallel the Virtual Machine ‘start’ and ‘stop’ requests.

In my script at the bottom of this post I haven’t included the ‘invoke-parallel.ps1’. The link for it is in the paragraph above. You’ll need to either reference it at the start of your script, or include it in your script. If you want to keep it all together in a single script include it like I have in the screenshot below.

My rudimentary PowerShell script takes two parameters;

  1. Power state. Either ‘Start’ or ‘Stop’
  2. Resource Group. The name of the Azure Resource Group containing the Virtual Machines you are wanting to start/stop. eg. ‘RG01’

Example: .\AzureRGVMPowerGo.ps1 -power 'Start' -azureResourceGroup 'RG01' or PowerShell .\AzureRGVMPowerGo.ps1 -power 'Start' -azureResourceGroup 'RG01'

Note: If you don’t have a session to Azure in your current environment, you’ll be prompted to authenticate.

Your VM’s will simultaneously start/stop.

What’s it actually doing?

It’s pretty simple. The script enumerates the VM’s in the Resource Group you’ve specified. It looks for VM’s whose status (Running or Deallocated) is the inverse of the ‘Power’ state you’ve specified when running the script. It’ll start stopped VM’s in the Resource Group when you run it with ‘Start’, or stop all started VM’s in the Resource Group when you run it with ‘Stop’. Simples.

This script could also easily be updated to do other similar tasks. Like, delete all VM’s in a Resource Group.

Here it is
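A minimal sketch of the approach, assuming invoke-parallel.ps1 sits alongside the script; this simplified version starts or stops everything in the Resource Group rather than checking each VM’s current power state first.

param(
    [Parameter(Mandatory = $true)][ValidateSet('Start','Stop')][string]$power,
    [Parameter(Mandatory = $true)][string]$azureResourceGroup
)
# Multi-threading helper - see the link earlier in this post
. .\invoke-parallel.ps1
$vms = Get-AzureRmVM -ResourceGroupName $azureResourceGroup
if ($power -eq 'Start') {
    # Start all the VM's in the Resource Group in parallel
    $vms | Invoke-Parallel -ImportVariables -ScriptBlock { Start-AzureRmVM -ResourceGroupName $azureResourceGroup -Name $_.Name }
}
else {
    # Stop all the VM's in the Resource Group in parallel
    $vms | Invoke-Parallel -ImportVariables -ScriptBlock { Stop-AzureRmVM -ResourceGroupName $azureResourceGroup -Name $_.Name -Force }
}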

Enjoy.

Follow Darren Robinson on Twitter

Dynamic Active Directory User Provisioning placement (OU) using the Granfeldt Powershell Management Agent

When using Forefront / Microsoft Identity Manager to provision users into Active Directory, determining which organisational unit (OU) to place the user in varies from customer to customer. Some AD OU structures are flat, others hierarchical based on business unit, department, functional role or geography. Basically every implementation I’ve done has been different.

That said, the most recent implementation I’ve done is for an organisation that is growing, and as such the existing structure is in flux and based on differing logic depending on who you talk to. They’re restructuring their OU hierarchy soon, but not before MIM goes live. Damn.

This blog post details my interim solution for provisioning new users into AD within an existing complex OU structure, until the customer simplifies their hierarchy with the upcoming work, and without having to maintain a matrix of mapping tables.

Overview

In this blog post I’ll document how you can perform dynamic Active Directory user provisioning placement utilising Søren Granfeldt’s extremely versatile PowerShell Management Agent.

Using the Synchronisation Rule for AD Outbound I land the new AD account in a landing OU. My Placement MA (using the PS MA) sees accounts in this landing OU and then queries back into AD using the new user’s Title (title) and State (st) attributes to find other AD users with the same values and their locations (DNs). It orders the results and moves our new AD account to the location with the greatest number of users sharing that metadata.

My User Placement MA is used in conjunction with an Active Directory MA and Declarative Rules in the MIM Portal.

Getting Started with the Granfeldt PowerShell Management Agent

In case you haven’t read my other PS MA blog posts here is the background on getting started with the Granfeldt PS MA. If you have you can skip through this section.

First up, you can get it from here. Søren’s documentation is pretty good but does assume you have a working knowledge of FIM/MIM, and this blog post is no different. Configuration tasks like adding additional attributes to the User Object Class in the MIM Portal, updating MPR’s, flow rules, Workflows, Sets etc are assumed knowledge, and if not, are easily Bing’able for you to work out.

Three items I had to work out that I’ll save you the pain of are;

  • You must have a Password.ps1 file. Even though we’re not doing password management on this MA, the PS MA requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it. It just needs to be present
  • The credentials you give the MA to run the scripts as, needs to be in the format of just ‘accountname’ NOT ‘domain\accountname’. I’m using the service account that I’ve used for the Active Directory MA. The target system is the same directory service and the account has the permissions required (you’ll need to add the service account to the appropriate Lync role group for user management)
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use “dir /x” to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\Placement

Schema Script (schema.ps1)

As I’m using the OOTB (out of the box) Active Directory MA to provision the AD account and only using this MA to move the AD Object the schema only consists of the attributes needed to workout where the user should reside and update them.

Password Script (password.ps1)

Just a placeholder file as detailed above.

Import Script (import.ps1)

The import script looks to see if the user is currently residing in the LandingOU that the AD MA puts new users into. If the user is, the import script gets the user’s ‘Title’ and ‘State’ attribute values and searches AD to find any other users with the same State and Title, pulling back their OUs. The script discounts any users that are in the Expired OU structure and any users (including the user being processed) that are in the Landing OU. It orders the list by occurrence and then selects the first entry. This becomes the location where we will move the user to. We also update the targetOU custom attribute with the user’s new location once moved, to keep the synchronisation rule happy and no longer trigger a move on the user (unless someone moves the user back to the LandingOU). The location of the LandingOU and the Expired OU’s path are set near the top of the script. If we don’t find any matches then I have a catchall per State (st) where we will put the user; they can then be moved manually later if required.

On the $request line you’ll see I have a filter for the User Object Class and users with an account name starting with ‘U’. For the customer I wrote this for, all users start with U. You can obviously also have Filters on the MA. I have a few there as well, but limiting the size of the returned array makes full imports quicker to run. A hedged sketch of the placement lookup follows.
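This sketch assumes the ActiveDirectory module; the OU paths are illustrative, and $title/$state come from the user being processed.

Import-Module ActiveDirectory
$landingOU = 'OU=Landing,DC=CORP,DC=LOCAL'
$expiredOU = 'OU=Expired,DC=CORP,DC=LOCAL'
# Find other users with the same Title and State, discounting the Landing and Expired OU structures
$candidates = Get-ADUser -Filter "Title -eq '$title' -and st -eq '$state'" -Properties Title, st |
    Where-Object { $_.DistinguishedName -notlike "*$landingOU" -and $_.DistinguishedName -notlike "*$expiredOU" }
# Take each candidate's parent OU, order by occurrence and select the most common
$targetOU = $candidates |
    ForEach-Object { ($_.DistinguishedName -split ',', 2)[1] } |
    Group-Object |
    Sort-Object Count -Descending |
    Select-Object -ExpandProperty Name -First 1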

Export Script (export.ps1)

The export script is basically doing the move of the user’s AD account, as per the sketch below.
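The move itself is a one-liner with the ActiveDirectory module; a hedged sketch, with $targetOU coming from the import logic above:

# Move the user object to the OU determined by the placement logic
Move-ADObject -Identity $userDN -TargetPath $targetOU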

Wiring it all together

In order to wire the functionality all together there are the usual number of configuration steps to be completed. Below I’ve shown a number of the key points associated with making it all work.

Basically create the PS MA, import attributes from the PS MA, add any additional attributes to the Portal Schema, update the Portal Filter to allow Administrators to use the attribute, update the Synchronisation MPR to allow the Sync Engine to flow in the new attributes, create the Set used for the transition, create your Synchronisation Rule, create your Terminal Services Workflow, create your Terminal Services MPR, create your MA Run Profiles and let it loose.

Management Agent Configuration

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using the same service account as my AD MA.

Password script must be specified but as we’re not doing password management it’s empty as detailed above.

If your schema.ps1 file is formatted correctly you can select your attributes.

My join rule is simple. AccountName (which as you’ll see in the Import.ps1 is aligned with sAMAccountName) to AccountName in the MetaVerse.

My import flow is just bringing back in the DN to a custom MV attribute.

Synchronisation Rules

My Placement MA Outbound Sync rule relies on the value of the new OU that was brought in by the Import script.

We flow out the new DN for the user object.

Set

I created a Set that I use as a transition to trigger the move of the user object on my Placement MA. My Set looks to see if the user account is in AD and active (I have a Boolean attribute not shown and another rule in the MIM Portal that is set based on an advanced flow rule in the Sync engine that has some logic to determine if employment date as sourced from my HR Management Agent is current and the user should be active) and that the user is currently in the LandingOU.

Workflow

An action based workflow that will trigger the Synchronisation rule on the Placement MA.

MPR

Finally my MPR for moving the AD object is based on the transition set,

and my Placement Workflow.

Summary

Using the Granfeldt PowerShell MA it was very easy to accomplish something that would have been a nightmare to try and do with declarative rules or un-maintainable using a matrix of mapping tables. It would also be possible to use this as a basis to move users when their State or Title changes. This example could also be reworked easily to use different attributes.

Follow Darren on Twitter @darrenjrobinson

Managing AD Terminal Services Configuration with FIM / MIM using the Granfeldt PowerShell Management Agent

Forefront / Microsoft Identity Manager contains numerous Management Agents (MA’s) out of the box. However, an MA for managing AD Terminal Services user configuration isn’t one of them. At first pass you’d think you could just manipulate a few attributes in AD on an AD MA, like you do for home directories (aside from creating the directory and permissions on the filesystem), and you’d be done. Don’t worry, I made that wrong assumption too.

Overview

In this blog post I’ll document how you can enable Active Directory users with the necessary attributes and file system elements utilising Søren Granfeldt’s extremely versatile PowerShell Management Agent. I’ll show you the minimum: provisioning a terminal services home directory and a terminal services profile directory, updating the associated user’s AD account with those paths and the terminal services drive letter, whilst also allowing the user to logon. Once you understand how this is done you can easily extend the functionality for lifecycle management (e.g. changing the location of terminal services directories, deletion for de-provisioning). For the file system directories and permissions I’ve re-used logic from my AD home directory PS MA detailed here, which is based on Kent Nordström’s home directory MA, stripped back with a few other subtle changes.

My Terminal Services MA is used in conjunction with an Active Directory MA and Declarative Rules in the MIM Portal.

A note on managing Terminal Services Configuration

Terminal Services configuration properties are, historically, held in a single attribute ‘userParameters’ (a BLOB, a Binary Large OBject). Inside the userParameters BLOB are the four attributes we’re interested in;

  • allowLogon
  • terminalServicesHomeDirectory
  • terminalServicesProfilePath
  • terminalServicesHomeDrive

What you see via LDAP in the userParameters attribute is either an empty attribute (no terminal services configuration) or an attribute full of encoded characters. We’ll use this as our trigger for the MIM rules to determine who has an existing configuration and who doesn’t.

Getting Started with the Granfeldt PowerShell Management Agent

In case you haven’t read my other PS MA blog posts here is the background on getting started with the Granfeldt PS MA. If you have you can skip through this section.

First up, you can get it from here. Søren’s documentation is pretty good but does assume you have a working knowledge of FIM/MIM, and this blog post is no different. Configuration tasks like adding additional attributes to the User Object Class in the MIM Portal, updating MPR’s, flow rules, Workflows, Sets etc are assumed knowledge, and if not, are easily Bing’able for you to work out.

Three items I had to work out that I’ll save you the pain of are;

  • You must have a Password.ps1 file. Even though we’re not doing password management on this MA, the PS MA requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it. It just needs to be present
  • The credentials you give the MA to run the scripts as, needs to be in the format of just ‘accountname’ NOT ‘domain\accountname’. I’m using the service account that I’ve used for the Active Directory MA. The target system is the same directory service and the account has the permissions required (you’ll need to add the service account to the appropriate Lync role group for user management)
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use “dir /x” to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\TermServ

Schema Script (schema.ps1)

As I’m using the OOTB (out of the box) Active Directory MA to provision the AD account and only showing provisioning of terminal services configuration the schema only consists of the attributes needed to create the terminal services directories, the terminal services drive letter and update the user object in AD.

Password Script (password.ps1)

Just a placeholder file as detailed above.

Import Script (import.ps1)

The import script looks to see if there is a terminal services config for the user (in the userParameters BLOB) and then uses the ADSI provider, which gives us access to the terminal services configuration attributes, to read and flow through the four attributes we’re interested in. A hedged sketch follows.
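This sketch reads the values via the ADSI provider (the IADsTSUserEx interface); the DN is illustrative.

$adUser = [ADSI]'LDAP://CN=Jane Citizen,OU=Staff,DC=CORP,DC=LOCAL'
# Only attempt the read if a terminal services config (userParameters) exists
if ($adUser.userParameters) {
    $allowLogon = $adUser.psbase.InvokeGet('allowLogon')
    $tsHomeDir  = $adUser.psbase.InvokeGet('TerminalServicesHomeDirectory')
    $tsProfile  = $adUser.psbase.InvokeGet('TerminalServicesProfilePath')
    $tsDrive    = $adUser.psbase.InvokeGet('TerminalServicesHomeDrive')
}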

On the $request line you’ll see I have a filter for the User Object Class and users with an account name starting with ‘U’. For the customer I wrote this for, all users start with U. You can obviously also have Filters on the MA. I have a few there as well; limiting the size of the returned array makes full imports quicker to run.

Export Script (export.ps1)

The export script is doing a couple of functions. It creates the terminal services home and profile directories, sets the necessary permissions on the home directory for the user, and then updates the AD object with the terminal services home and profile directory paths and the terminal services drive letter, whilst allowing the user to logon. A hedged sketch follows.
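In this sketch the file server paths, domain and drive letter are illustrative.

# Create the terminal services home and profile directories
New-Item -Path "\\fileserver\TSHome$\$accountName" -ItemType Directory -Force | Out-Null
New-Item -Path "\\fileserver\TSProfile$\$accountName" -ItemType Directory -Force | Out-Null
# Grant the user Modify rights on their home directory
$acl = Get-Acl "\\fileserver\TSHome$\$accountName"
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule("CORP\$accountName", 'Modify', 'ContainerInherit,ObjectInherit', 'None', 'Allow')
$acl.AddAccessRule($rule)
Set-Acl "\\fileserver\TSHome$\$accountName" $acl
# Update the AD object via ADSI and allow the user to logon
$adUser.psbase.InvokeSet('TerminalServicesHomeDirectory', "\\fileserver\TSHome$\$accountName")
$adUser.psbase.InvokeSet('TerminalServicesProfilePath', "\\fileserver\TSProfile$\$accountName")
$adUser.psbase.InvokeSet('TerminalServicesHomeDrive', 'H:')
$adUser.psbase.InvokeSet('allowLogon', 1)
$adUser.SetInfo()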

Wiring it all together

In order to wire the functionality all together there are the usual number of configuration steps to be completed. Below I’ve shown a number of the key points associated with making it all work.

Basically create the PS MA, import attributes from the PS MA, add any additional attributes to the Portal Schema, update the Portal Filter to allow Administrators to use the attribute, update the Synchronisation MPR to allow the Sync Engine to flow in the new attributes, create the Set used for the transition, create your Synchronisation Rule, create your Terminal Services Workflow, create your Terminal Services MPR, create your MA Run Profiles and let it loose.

Management Agent Configuration

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using the same service account as my AD MA.

Password script must be specified but as we’re not doing password management it’s empty as detailed above.

If your schema.ps1 file is formatted correctly you can select your attributes.

My join rule is simple. AccountName (which as you’ll see in the Import.ps1 is aligned with sAMAccountName) to AccountName in the MetaVerse.

My import flow is just bringing back in the DN to a custom MV attribute.

Synchronisation Rules

My Terminal Services Outbound Sync rule doesn’t need to be very complex. All it is doing is sync’ing out the Terminal Services Profile and Directory paths, Terminal Services Drive Letter and Allow Logon attributes.

Set

I created a Set that I use as a transition to trigger provisioning to the Terminal Services MA. My Set looks to see if the user account is in AD and active (I have a Boolean attribute not shown and another rule in the MIM Portal that is set based on an advanced flow rule in the Sync engine that has some logic to determine if employment date as sourced from my HR Management Agent is current and the user should be active) and that the Terminal Services Config is not already set (this is also based on a Boolean attribute I set via an Advanced Flow rule on my AD MA based on whether there is a value in the userParameters attribute). See my Exchange mailbox provisioning post using the PS MA for an example of setting a boolean attribute based on attribute being present or not.

Workflow

An action based workflow that is used to trigger the Synchronisation rule on the Terminal Services MA.

MPR

Finally my MPR for provisioning Terminal Services config is based on the transition set,

and my Terminal Services Workflow.

Summary

Using the Granfeldt PowerShell MA it was very easy to provision a terminal services configuration to new users and permission the directories correctly. It is also easy to extend the solution to manage moving terminal services directories to different locations, deletion etc.

 

Follow Darren on Twitter @darrenjrobinson