At first look this solution seems quite random. A management agent that consumes a flat file (a comma-separated values file) isn't ground-breaking, but when the twist is that the CSV file arrives as an attachment to an email in an Exchange Inbox, it's quite a different scenario.
My customer uses a cloud service for their recruitment processes. The cloud service does have a SOAP API that I could potentially develop a FIM/MIM solution for using the Microsoft Web Services Management Agent; however, my customer does not have API access to their tenant, the vendor isn't overly responsive, and I need a solution in days, not weeks.
On the upside, my customer can quickly create reports in the SaaS portal and schedule them to be delivered (as CSV/Excel attachments) to an email address. So, what if I could integrate FIM/MIM with the inbox that receives the emails whose attached reports contain the information I require, and process it accordingly? This blog post is that solution.
Once a day a scheduled process generates a report (CSV) of new staff from the SaaS provider. That CSV is emailed to an Inbox we created to receive these reports. Using the Granfeldt PowerShell Management Agent I created a solution that:
- connects to that Exchange Inbox and finds the emails with the attached reports;
- extracts the CSV attachments to a file share and moves the processed emails aside;
- parses the CSV and imports the new employees into FIM/MIM to kick off provisioning.
First up, you can get the Granfeldt PowerShell Management Agent from here. Søren's documentation is pretty good, but it does assume you have a working knowledge of FIM/MIM, and this blog post is no different.
Three items I had to work out that I'll save you the pain of are:
- the paths to the MA scripts must not contain spaces (I use 8.3 format);
- the Password and Export scripts must be specified even though they're empty for this solution;
- the Schema script must be formatted correctly so the MA can read your attributes/columns.
Schema script: my schema is essentially the columns that are in the CSV report that I'm importing.
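In case it helps, here's a minimal sketch of what a schema.ps1 for a report like this can look like. The column/attribute names below are illustrative examples only, not my customer's actual report columns; the Granfeldt MA derives the schema from the property names of the object the script outputs, using the "Name|Type" convention with an "Anchor-" prefix on the anchor attribute.

```powershell
# schema.ps1 - a minimal sketch; attribute names are examples only.
# The MA reads the schema from the property names emitted here ("Name|Type",
# with "Anchor-" marking the anchor attribute). Values are just sample data.
$obj = New-Object -TypeName PSCustomObject
$obj | Add-Member -MemberType NoteProperty -Name "Anchor-StaffID|String" -Value "12345"
$obj | Add-Member -MemberType NoteProperty -Name "objectClass|String" -Value "user"
$obj | Add-Member -MemberType NoteProperty -Name "FirstName|String" -Value "Jane"
$obj | Add-Member -MemberType NoteProperty -Name "LastName|String" -Value "Doe"
$obj | Add-Member -MemberType NoteProperty -Name "EmailAddress|String" -Value "jane.doe@example.com"
$obj | Add-Member -MemberType NoteProperty -Name "StartDate|String" -Value "2016-01-01"
$obj
```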
Password script: empty, as described above.
Import script: connect to the Exchange Mailbox, find messages from the defined sender where the attachment matches the expected name and format, extract the CSV file to a file share, move the processed emails (with their attachments) to a Processed folder, then parse the CSV, apply some logic to the data and import objects and values for the new employees.
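To give a feel for it, here's a heavily trimmed sketch of that import script. The mailbox address, sender, file-share path, attachment naming and CSV columns are all placeholders, I'm using the EWS Managed API for the mailbox access, and I've left out the error handling and additional logic; the param block follows the parameters the Granfeldt MA passes to the import script.

```powershell
# import.ps1 - a trimmed-down sketch only; names, paths and columns are placeholders.
param ($Username, $Password, $Credentials, $OperationType, [bool]$usepagedimport, $pagesize)

# EWS Managed API 2.2 - assumes it is installed locally
Add-Type -Path "C:\Program Files\Microsoft\Exchange\Web Services\2.2\Microsoft.Exchange.WebServices.dll"

$mailbox        = "reports@customer.com"        # placeholder: inbox that receives the reports
$expectedSender = "noreply@saasprovider.com"    # placeholder: report sender
$dropFolder     = "\\fileserver\FIMReports"     # placeholder: file share for extracted CSVs

$service = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService
$service.Credentials = New-Object Microsoft.Exchange.WebServices.Data.WebCredentials($Username, $Password)
$service.AutodiscoverUrl($mailbox, {$true})

# Locate the 'Processed' folder so handled emails can be moved out of the Inbox
$folderView   = New-Object Microsoft.Exchange.WebServices.Data.FolderView(10)
$folderFilter = New-Object Microsoft.Exchange.WebServices.Data.SearchFilter+IsEqualTo([Microsoft.Exchange.WebServices.Data.FolderSchema]::DisplayName, "Processed")
$processed    = $service.FindFolders([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::MsgFolderRoot, $folderFilter, $folderView).Folders[0]

# Work through the Inbox looking for messages from the expected sender with attachments
$inbox    = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($service, [Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Inbox)
$itemView = New-Object Microsoft.Exchange.WebServices.Data.ItemView(50)
foreach ($item in $inbox.FindItems($itemView).Items) {
    $message = [Microsoft.Exchange.WebServices.Data.EmailMessage]::Bind($service, $item.Id)
    if ($message.From.Address -ne $expectedSender -or -not $message.HasAttachments) { continue }

    foreach ($attachment in $message.Attachments) {
        if ($attachment -is [Microsoft.Exchange.WebServices.Data.FileAttachment] -and $attachment.Name -like "NewStaff*.csv") {
            $csvPath = Join-Path $dropFolder $attachment.Name
            $attachment.Load($csvPath)   # save the CSV to the file share

            # Parse the CSV and emit one hashtable per new employee for the Sync Engine
            foreach ($row in Import-Csv -Path $csvPath) {
                $obj = @{}
                $obj.Add("objectClass", "user")
                $obj.Add("StaffID", $row.StaffID)
                $obj.Add("FirstName", $row.FirstName)
                $obj.Add("LastName", $row.LastName)
                $obj.Add("EmailAddress", $row.EmailAddress)
                $obj
            }
        }
    }
    # Move the handled email to the Processed folder
    $message.Move($processed.Id) | Out-Null
}
```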
Export script: empty, as we're not writing anything back to the SaaS provider.
In order to wire all the functionality together there is the usual number of configuration steps to complete. Below I've shown the key points of making it all work. These are all Synchronisation Engine MA configuration tasks: create the PS MA, import the attributes from the PS MA, create your MA Run Profiles and let it loose.
As per the tips above, the script paths must not contain spaces; I'm using 8.3 format, and I'm using the same service account as my AD MA.
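If your scripts do end up under a path that contains spaces, one way (among others) to find the 8.3 short form to use in the MA configuration is via the Scripting.FileSystemObject COM object; the path below is just an example.

```powershell
# Example only - resolve the 8.3 short path for a folder whose full path has spaces
$fso = New-Object -ComObject Scripting.FileSystemObject
$fso.GetFolder("C:\Program Files\Microsoft Forefront Identity Manager").ShortPath
# Returns something like C:\PROGRA~1\MICROS~1 - use that format in the MA script paths
```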
The Password and Export scripts must be specified, but as we're not doing password management or exporting anything, they're empty, as detailed above.
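For example, empty placeholder files alongside the other scripts will do; the paths below are illustrative only (and space-free, as per the tip above).

```powershell
# Create empty placeholder Export and Password scripts for the MA configuration
New-Item -ItemType File -Path "C:\FIMPSMA\NEWSTAFF\export.ps1" -Force
New-Item -ItemType File -Path "C:\FIMPSMA\NEWSTAFF\password.ps1" -Force
```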
If your schema.ps1 file is formatted correctly, you can select your attributes/columns that will be coming in from the CSV file.
My join rule is simple: StaffID to AccountName in the Metaverse.
My import flows are direct flows, with a Boolean flag to kick off a bunch of declarative rules from the Portal.
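With the MA, attribute flows and Run Profiles in place, one way to let it loose on a schedule is via the Sync Service WMI provider. This is only a sketch; the MA name below is a placeholder for whatever you called your PS MA.

```powershell
# Sketch only - kick off the PS MA's Full Import run profile via the Sync Service WMI provider
$ma = Get-WmiObject -Namespace "root\MicrosoftIdentityIntegrationServer" -Class "MIIS_ManagementAgent" |
      Where-Object { $_.Name -eq "NewStaff CSV PS MA" }   # placeholder MA name
$result = $ma.Execute("Full Import")
Write-Output "Run Profile completed with status: $($result.ReturnValue)"
```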
By thinking outside the box and using the Granfeldt PowerShell MA, I was able to quickly consume a CSV file from an Exchange Inbox to kick off the provisioning process.
Follow Darren on Twitter @darrenjrobinson