Darren’s PowerShell Snippets Volume 1

I live in PowerShell and my memory is pretty good. The common PowerShell commands, one-liners and functions that I use a lot I can remember. Then there are the ones I use less regularly, where I find myself trying to recall the last script I used them in just to retrieve those couple of lines. So I'm posting a bunch of them here, if for nothing else than to help me find them quickly. Consider this Volume 1 of my common PowerShell snippets.

Unix Time

For a number of APIs I interact with I need to provide the current time in Unix format (seconds since the Unix epoch) as part of the API request. This one-liner around Get-Date does that.

$utime = [int][double]::Parse((Get-Date -UFormat %s))

The output looks like this:

1551906706

URL Encode

Often you need to encode a URL or query. The following shows taking a query and URL encoding it.

$query = "attributes.firstname='Darren' AND attributes.lastname='Robinson'"
Add-Type -AssemblyName System.Web
$queryEncoded = [System.Web.HttpUtility]::UrlEncode($query)

The encoded query then looks like

attributes.firstname%3d%27Darren%27+AND+attributes.lastname%3d%27Robinson%27

Basic Authentication Header

The following will create a Basic Authentication Header from a ClientID and Client Secret that can then be used with Invoke-RestMethod or Invoke-WebRequest.

$clientID = 'abcd1234567'
$clientSecret = 'abcd12345sdkslslfjahd'
$Bytes = [System.Text.Encoding]::utf8.GetBytes("$($clientID):$($clientSecret)")
$encodedAuth = [Convert]::ToBase64String($Bytes)
$header = @{Authorization = "Basic $($encodedAuth)" }

You then use the $header variable in your web request, e.g.

Invoke-RestMethod -Method Get -Uri "https://webservice.com" -Headers $header

Converting a String to Proper/Title Case

Sometimes you get a string that is SHOUTING at you, or is just badly formatted, and you need to make it look as it should. The following will convert a string to Title Case.

$Surname = (Get-Culture).textinfo.totitlecase("BaDlY-ForMAtted SurNAME".tolower())

The BaDlY-ForMAtted SurNAME will then become

Badly-Formatted Surname

TLS

I have written this up before in more detail for a slightly different scenario here, but often the quick one-liner to force PowerShell to use TLS 1.2 for web requests is

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

Splitting a Large Collection into manageable chunks

Way too regularly I have a dataset of 100k+ objects. In order to parallelise the processing of that data across multiple threads I need to break that 100k+ collection into smaller chunks.

The following takes $LARGEDATASET, say a collection of tens or hundreds of thousands of objects, and splits it into collections of 1000.

$counter = [pscustomobject] @{ Value = 0 }
$groupSize = 1000
$groups = $LARGEDATASET | Group-Object -Property { [math]::Floor($counter.Value++ / $groupSize) }

You can then determine how many it created with

$groups.Count

and then check out each group by indexing through the collection array

$groups[0].Group

Parallel Processing

As an extension of splitting a large collection into smaller collections, you may then want to process the collection with multiple threads (e.g. in parallel).

The following snippet leverages the Invoke-Parallel Function from Rambling Cookie Monster that I’ve mentioned previously such as here.

Change the Throttle parameter for the number of threads, and change the timeout if required (if, for instance, you are smashing your own, not someone else's, API).

# Dot-source the script so the Invoke-Parallel function is available in the current session
. .\PATH-TO\invoke-parallel.ps1
$result = Invoke-Parallel -InputObject $CollectionOfObjects -Throttle 10 -RunspaceTimeout 60 -ImportVariables -ImportModules -ScriptBlock {
    try {
        $query = Invoke-RestMethod -Method Get -Uri "https://api.application.com" -Headers $header
    } catch {
        Write-Host $_
    }
    @($query)
}
$result.Count

Join Large Collections

I mentioned this previously in this post here. This snippet quickly joins objects between two large collections. It utilises the Join-Object function, again from Rambling Cookie Monster.

$reportData = Join-Object -Left $collection1 -Right $collection2 -LeftJoinProperty UserPrincipalName -RightJoinProperty UserPrincipalName -Type AllInLeft

Un-escaping a JSON Document

Using ConvertTo-Json will escape special characters as shown below.

{"description": "Australian G\u0026f Logistics Ltd"}

Using the following will unescape the JSON document (if the system you’re interfacing with doesn’t unescape on consumption).

$myObject | ConvertTo-Json | ForEach-Object { [System.Text.RegularExpressions.Regex]::Unescape($_) }

The JSON document will then become

{"description": "Australian G&f Logistics Ltd"}

That’s it for Vol 1 of Darren’s PowerShell snippets. I’ll start compiling others as I search for them and don’t find them in this Vol.

Aggregating SailPoint IdentityNow Sources via API with PowerShell

Aggregating an IdentityNow Source can be achieved in a number of ways, but when you are in a development environment there will be times when you need to add additional attributes for a Source to load. If the additional attribute(s) are used for Correlation, you will need to perform a full aggregation of the Source to re-evaluate each object against the new Correlation rules and bring in the additional attributes for each identity on the Source.

The LoadAccounts API is briefly mentioned in this SailPoint IdentityNow Compass document. That document also details the option that needs to be disabled, disableOptimization, so that each identity is imported and re-evaluated.

This post details how to call the LoadAccounts API using PowerShell and disable optimization.

Prerequisites

The LoadAccounts API is a Private API that has a different authentication process than the v2 and v3 APIs. In this post I detail accessing the v1 Private APIs using PowerShell. That post gives some more detail around the v1 Private APIs.

The following script will aggregate a SailPoint IdentityNow Source disabling optimisation.

Update;

  • Line 2 with your API ClientID
  • Line 4 with your API Client Secret
  • Line 8 with your IdentityNow Org Name
  • Line 10 with your IdentityNow Admin Account Name
  • Line 11 with your IdentityNow Admin Account Password
  • Line 25 with the SourceID (5 digit number) for the Source to Aggregate

Line 29 contains the Web Request POST body disableOptimization=true that disables optimisation for the aggregation. If you just require a standard aggregation of a Source, omit the body.
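As a guide, the core of the aggregation call looks like the following minimal sketch. It assumes you have already authenticated to the v1 Private API (per the post linked above) and have a $v1Headers variable holding the resulting Authorization header; the Org name and SourceID are placeholders, and the endpoint path follows the cc Private API convention.

# Minimal sketch of the LoadAccounts aggregation call.
# Assumes $v1Headers already contains a valid v1 Private API Authorization header.
$orgName  = "yourOrgName"   # your IdentityNow Org name
$sourceID = "12345"         # the 5 digit SourceID of the Source to aggregate

$aggUri = "https://$($orgName).api.identitynow.com/cc/api/source/loadAccounts/$($sourceID)"

# disableOptimization=true forces each identity to be imported and re-evaluated.
# Omit the -Body parameter if you just require a standard aggregation.
$aggregate = Invoke-RestMethod -Method Post -Uri $aggUri -Headers $v1Headers -Body "disableOptimization=true"
$aggregate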

Executing the script with valid credentials and a Source will result in the $aggregate variable containing a summary of the triggered aggregation. You will notice in the attributes that optimizedAggregation is disabled.

Returned Object when initiating IdentityNow Aggregation via API with optimisation disabled.PNG

This is also reflected in the Aggregation Summary from the Portal when completed.

IdentityNow Aggregation via API with optimisation disabled.PNG

Summary

Using the above script it is possible to quickly perform a full aggregation and re-evaluation of correlation rules for an IdentityNow Source.

 

Creating SailPoint IdentityNow Access Profiles via API and PowerShell

Managing SailPoint IdentityNow Access Profiles is easy enough to do using the SailPoint IdentityNow Portal. But what if you need to update, report on, or create numerous Access Profiles? That's where the SailPoint IdentityNow API comes into play. The Access Profiles API is documented here but doesn't go into a lot of detail. In this post I'll detail interfacing with it using PowerShell, primarily to create and update Access Profiles.

Prerequisites

You will need to Authenticate to the IdentityNow API. Both the v2 and v3 authentication methods work. I detail the v2 method here and the v3 method here. Personally I'm using the v3 method. Just make sure you change the Headers on your requests to match whichever method you use, along with the naming of your variables.

For reference my v3token variable is $v3Token so my Authentication Header is then @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"}

Getting Access Profiles

The Access Profiles API URI is

https://$($yourOrgName).api.identitynow.com/v2/access-profiles

If you know the ID of the Access Profile you can return just that Access Profile via its ID (where $accessProfileID is the ID, which looks like 2c91808466a64e330112a96902ff1f69)

https://$($yourOrgName).api.identitynow.com/v2/access-profiles/$($accessProfileID)

The following script will return Access Profiles from your SailPoint IdentityNow Tenant. Update;

  • Line 2 for your IdentityNow Org name
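Putting that together, a minimal sketch of the request (assuming the $v3Token variable and header format described above) looks like this:

# Minimal sketch: return Access Profiles from your Tenant.
$orgName = "yourOrgName"   # Line 2 equivalent: your IdentityNow Org name
$headers = @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"}

$accessProfilesUri = "https://$($orgName).api.identitynow.com/v2/access-profiles"
$accessProfiles = Invoke-RestMethod -Method Get -Uri $accessProfilesUri -Headers $headers
$accessProfiles.Count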

Updating Access Profiles

To update an Access Profile the API URI is;

https://$($orgName).api.identitynow.com/v2/access-profiles/$($accessProfileID)

The following will update an existing Access Profile to set Request Comments Required and Denied Comments Required to False.

Update;

  • Line 2 for your IdentityNow Org name
  • Line 6 for the ID of the Access Profile you want to update
  • Lines 9 and 10 for the settings to update
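A minimal sketch of that update, reusing the $headers variable from above, follows. It assumes the endpoint accepts a PATCH containing just the changed settings; the Access Profile ID is a placeholder.

# Minimal sketch: set Request Comments Required and Denied Comments Required to False.
$orgName = "yourOrgName"
$accessProfileID = "2c91808466a64e330112a96902ff1f69"   # ID of the Access Profile to update

$update = @{
    requestCommentsRequired = $false
    deniedCommentsRequired  = $false
} | ConvertTo-Json

$updateUri = "https://$($orgName).api.identitynow.com/v2/access-profiles/$($accessProfileID)"
Invoke-RestMethod -Method Patch -Uri $updateUri -Headers $headers -Body $update -ContentType "application/json"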

Preparing to create an Access Profile

In order to create an Access Profile, there are a number of configuration items that you will need to provide. The key items are;

  • SourceID is the (currently) five digit ID of a source, which you can get from the IdentityNow Portal when looking at the properties of a Source, or via the API as I detailed in this post.
  • OwnerID is the Identity ID for the user you will make the owner. To do that you will need to query IdentityNow for the user (see below for an example)
  • Entitlements
    • In order to get the Entitlement ID(s) to assign to the Access Profile you will need to query the Source. This post details querying Sources to get Entitlements, from which you can get the ExternalID of each Entitlement.

Search for Owner ID Request Object

Here is an example Search Request JSON Object required for the Search User call. Update it with a unique attribute of the Identity you want to query and return.

Update for your criteria, e.g. if you copy the JSON below;

  • create a variable name
    • $requestFilter = 'JSON snippet content from below'
  • Update the search criteria for your search
    • $newRequestFilter = $requestFilter.Replace("darren.robinson@customer.com.au","yourUser@mydomain.com")
  • Then search for the user and get the ID of the identity
  • Update
    • Line 2 for your Orgname
    • Line 8 for the user to search for that matches the JSON object from above
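As a rough sketch of those steps (the search URI and query body shape follow the v3 search post linked above, so treat both, and the attribute searched on, as placeholders for your environment):

# Minimal sketch: search for an identity and capture its ID for use as the Owner.
$orgName = "yourOrgName"   # Line 2 equivalent: your Org name

$requestFilter = '{"query": {"query": "@accounts(accountId:darren.robinson@customer.com.au)"}}'
$newRequestFilter = $requestFilter.Replace("darren.robinson@customer.com.au","yourUser@mydomain.com")

$searchUri = "https://$($orgName).api.identitynow.com/v3/search/identities"
$identity = Invoke-RestMethod -Method Post -Uri $searchUri -Headers $headers -Body $newRequestFilter -ContentType "application/json"
# Depending on the response shape this may be $identity[0].id
$ownerID = $identity.id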

Creating an Access Profile

Finally, now that we have the prerequisite information, we can create the Access Profile. Modify for your environment based on the information retrieved via the processes above. Namely;

  • Line 1 for your Orgname
  • Line 2 for the SourceID associated with the Access Profile
  • Line 3 for the Access Profile Owner’s ID
  • Lines 7-10 for your Access Profile Details
  • Line 14 for the Entitlements
  • Line 19 for the Approver (see below for more details)
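A minimal sketch of the creation call follows. The property names are illustrative of the settings discussed above rather than a definitive schema, and all IDs are placeholders gathered via the preceding steps.

# Minimal sketch: create an Access Profile via the v2 API.
$orgName  = "yourOrgName"                               # Line 1: your Org name
$sourceID = "12345"                                     # Line 2: associated SourceID
$ownerID  = "2c918083606d670c01606f35bb6f0383"          # Line 3: Owner's Identity ID

$accessProfile = @{
    name         = "My New Access Profile"              # Lines 7-10: Access Profile details
    description  = "Access Profile created via API"
    sourceId     = $sourceID
    ownerId      = $ownerID
    entitlements = @("2c91808a67ea7de50167ee026f1e2b66") # Line 14: Entitlement ID(s)
}
$accessProfile.Add("approvalSchemes","accessProfileOwner, manager")   # Line 19: Approver(s)

$createUri = "https://$($orgName).api.identitynow.com/v2/access-profiles"
Invoke-RestMethod -Method Post -Uri $createUri -Headers $headers -Body ($accessProfile | ConvertTo-Json) -ContentType "application/json"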

Access Profile Approvers

For Approvers you can provide the order of approval. For approval by the Access Profile Owner and then the Manager, use the following when creating the Access Profile at Line 19 above.

  • $accessProfile.add("approvalSchemes","accessProfileOwner, manager")

Other options are:

  • SourceOwner
  • appOwner
  • Governance Group. See managing Governance Groups here to get the Governance Group ID (GUID format)
    • workgroup: 86929844-3391-4ce2-80ef-760127e15813

Summary

Whilst the creation of an Access Profile via API does require some configuration, if you have multiple to create and you know the criteria it is possible to automate the task. I hope this helps others.

 

Searching and Returning SailPoint IdentityNow Entitlements using the API and PowerShell

Entitlements on IdentityNow Sources can be leveraged for many purposes within IdentityNow. Recently I've been looking to automate the creation of some Access Profiles that will in turn have entitlements associated with them.

This post details how to query for Entitlements in IdentityNow using the v3 API and PowerShell.

Prerequisites

You will need to Authenticate to the IdentityNow API. The v3 authentication method is required. I detail the v3 method here. The Headers for the requests detailed in this post use the following variables for the JWT oAuth Token.

My v3token variable is $v3Token so my Authentication Header is then
@{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"; "Content-Type" = "application/json"}

Searching for Entitlements

The Base API URI to search for entitlements is;

https://$($org).api.identitynow.com/cc/api/entitlement/list

You will also need to provide a timestamp and the source for which you want to retrieve entitlements.

Generating the Timestamp

The timestamp is in Unix format which can be generated in PowerShell like this;

$utime = [int][double]::Parse((Get-Date -UFormat %s))

Getting a list of Sources

I’ve previously described listing IdentityNow Sources in this post and this post. Essentially though you can return a list of all sources by performing a GET request to

https://$($orgName).api.identitynow.com/cc/api/source/list

Obtain the Source ExternalID for the source you wish to return entitlements for.

Entitlement Results

You can limit the number of entitlements returned by using the limit option. The following will return the first 1000 entitlements for a source starting at 0

&start=0&limit=1000

If the source has more than 1000 entitlements then you will need to page the results to return the next 1000. Continue until you've returned them all.

&start=1000&limit=1000

Of course, you can simply omit the limit and all entitlements will be returned in a single call.
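Putting the pieces together, a minimal sketch of the retrieval follows. The _dc timestamp parameter and the CISApplicationId source parameter names are assumptions based on the cc API convention, so verify them against your tenant.

# Minimal sketch: return the first 1000 entitlements for a Source.
$orgName  = "yourOrgName"
$sourceID = "12345"   # Source ExternalID from the source/list call above
$utime    = [int][double]::Parse((Get-Date -UFormat %s))
$headers  = @{Authorization = "$($v3Token.token_type) $($v3Token.access_token)"; "Content-Type" = "application/json"}

# Parameter names _dc and CISApplicationId are assumptions; check your tenant.
$entitlementUri = "https://$($orgName).api.identitynow.com/cc/api/entitlement/list?_dc=$($utime)&CISApplicationId=$($sourceID)&start=0&limit=1000"
$sourceEntitlements = Invoke-RestMethod -Method Get -Uri $entitlementUri -Headers $headers
$sourceEntitlements.items.Count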

Finding an Entitlement on a Source

Using the power of PowerShell it is quick to find the Entitlement you want if you know some of the information about it. For referencing an Entitlement when creating an Access Profile via the API you will need the Entitlement ID, e.g.

$myEntitlement = $sourceEntitlements.items | Where-Object {$_.displayName -like "*Sydney*"}
$myEntitlement.id

Summary

Using the IdentityNow API and the v3 endpoints we can retrieve entitlements for a Source and quickly locate the entitlement and the entitlement ID for use during automation of IdentityNow tasks such as Access Profile creation.

Azure AD/Active Directory User Security Evaluation Reporter

From December 2018 to February 2019, Microsoft ran an online Microsoft Graph Security Hackathon on Devpost.

The criteria of the hackathon was;

  • Build or update a functioning Microsoft Graph-powered solution that leverages the Microsoft Graph Security API

Following the announcement of the Hackathon I was encouraged by Kloud management to enter. During the busy month of December I started to formulate a concept for entry, taking learnings from the hackathon I entered in 2018. Over the Xmas holiday period I started working on my entry, which continued into January and February at nights and on weekends.

Problem

A Security Administrator within an Organisation enables security-related configuration options on an Azure Tenant to implement security controls that align the organisation with Microsoft recommendations and best practice.

The Azure Security Score provides an evaluation of an organisation's alignment with best practice; however, to some extent it still requires end users to have the right configuration for the security-related elements of their profile. And as a Service Desk Operator or Cyber Security Officer there isn't a single view of a user's security posture that gives you an individual user security score summary. My solution…

Microsoft User Security Evaluation Reporter (USER)

Microsoft User Security Evaluation Reporter (USER) is an Azure AD and Active Directory tool for use by Service Desk Operators and Cyber Security Officers, giving them instant visibility of an organisation's Azure Security Score and allowing them to evaluate current risks within the organisation right down to individual users.

When Microsoft USER loads, the current Azure Security Score is retrieved, evaluated and displayed for alignment with Microsoft recommendations. Also on load, the last five active Security Risk Events are displayed.

Microsoft USER Recent Risk Events and Azure Secure Score.PNG

The Service Desk Operator or Cyber Security Officer can select one of the recent Security Events, or search for a user and drill down into the associated identity. They will quickly be able to understand the user's individual security posture and its alignment with best practice.

What are the recent Security Risk Events for that user? Does that user;

  • Have MFA enabled? Is MFA enabled with an Authenticator App as the primary method?
  • Is the user's Active Directory password in the Pwned Passwords v4 list from Have I Been Pwned?
  • Has the user recently been attempting Azure Password Reset functions?
  • What are the last 10 logins for that user?
  • What is the base user information for that user and what devices are registered to that user? Are they Azure AD Joined?

User Secure Score Summary.PNG

The clip below gives a more detailed walk-through of my Microsoft USER tool.

How I built it

The solution is built using;

  • NodeJS and JavaScript
  • Azure Functions to interface with Azure AD, Microsoft Graph and Azure Table Service
  • Lithnet Password Protection for Active Directory, which in turn leverages the Have I Been Pwned v4 dataset
  • Azure Key Vault for all secrets
  • Application Insights, enabled on the WebApp
  • Docker, with the WebApp deployed as a container into Azure App Service

The architecture is shown below.

MS User Security Evaluation Reporter Architecture

The Code

A Repo with the code can be found here. Keep in mind I'm not a developer, and this is my first WebApp; it was put together late at night and over weekends, and only tested in Chrome and Edge. The Readme contains, hopefully, everything you need to deploy it.

 

Recovering from USB device driver is still in memory / USB Composite Device has error (Code 38)

Something you don't often think about is how many devices you plug into your computer… until you plug in a device and it doesn't show up or interact as expected. This post details how I recovered from such a situation so I can find it next time, and hopefully it helps others recover quickly rather than going through the numerous dead-ends I did to fix the problem.

My Situation

I've been evaluating a number of different YubiKeys recently and out of the blue they stopped being recognised. Below is a message from the YubiKey Manager indicating that there is no device inserted (when in actual fact there is).

Insert YubiKey

To prove the point, plugging in two YubiKeys informs me I should only have one at a time.

Insert only one YubiKey

Going into Device Manager and selecting Show hidden devices reveals the plethora of USB devices I’ve previously inserted into my computer.

Yubikey Device Manager Properties

Looking into the USB Composite Devices with the warning symbols reveals

Windows cannot load the device driver for this hardware because a previous instance of the device driver is still in memory. (Code 38)

USB Composite Device Manager

The Fix

After trying numerous different avenues, this is how I repaired the issue and got up and running again.

Navigate to Windows => Settings => Update and Security => Troubleshoot => Run the troubleshooter

Hardware Troubleshooter.PNG

The troubleshooter will detect the USB Composite Device has error condition for the problematic devices. Select one and continue.

USB Composite Devices has error

Select Apply this fix

Apply Fix

Repeat for the other USB Composite Device has error entries, then restart the computer.

Restart PC

After the restart you should be back in action, like I was.

YubiKey Manager - Success.PNG

Hope this helps someone else, and will help me find it next time too.

 

Configuring the Lithnet REST API for the FIM/MIM Service post MIM Version 4.4.x.x

Last year I wrote this post on installing and configuring the Lithnet REST API for the FIM/MIM Service and integrating it with Azure API Management.

This week, on a fresh installation of Microsoft Identity Manager with SP1, I was installing the Lithnet REST API for the FIM/MIM Service and was getting errors from the WCF Web Service about locating the correct version of the Microsoft.ResourceManagement.dll.

Error finding Microsoft.ResourceManagement DLL.PNG

After a little troubleshooting and no progress, I recalled Kent Nordström posting the following tweet last month.

Kent Nordstrom Tweet.PNG

Looking at my own environment, the version of the Microsoft.ResourceManagement.dll installed in the product directory C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Bin was 4.5.285.0. A different version to what Kent had.

ResourceManagement DLL version from Program Files.PNG

Looking for the Microsoft.ResourceManagement.dll under c:\windows\assembly\gac_msil\Microsoft.ResourceManagement the version that was on my installation was 4.4.1302.0.

ResourceManagement DLL version from GAC.PNG
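If you want to check the version in the GAC quickly without browsing through Explorer, a one-liner like the following does it (using the standard GAC_MSIL path):

# Report the file version of each Microsoft.ResourceManagement.dll in the GAC
Get-ChildItem 'C:\Windows\assembly\GAC_MSIL\Microsoft.ResourceManagement' -Recurse -Filter 'Microsoft.ResourceManagement.dll' |
    Select-Object FullName, @{Name = 'Version'; Expression = { $_.VersionInfo.FileVersion }}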

The Lithnet REST API for FIM/MIM Service web.config, updated as detailed in my previous post on the Lithnet REST API for FIM/MIM Service, therefore needed to reference 4.4.1302.0. After making that change everything worked as expected.

Version for Resource Management Web Service

Summary

Big thanks to Kent for saving me hours of fault finding. If you are on MIM version 4.4.x.x or later keep in mind that the version of the Microsoft.ResourceManagement.dll located in the product installation directory ( C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Bin ) differs from the version of the file that the installation puts in the GAC.

Also, if you subsequently update your Microsoft Identity Manager installation (maybe because of this obscure reason), don't forget to go back and update the Lithnet REST API for the FIM/MIM Service web.config to reflect the latest version of the Microsoft.ResourceManagement.dll.

Multiple Versions of Microsoft.ResourceManagement DLL.PNG

Error: Failed to connect to the specified database when creating a Microsoft Identity Manager Service MA

Last week I was installing Microsoft Identity Manager into a development environment. The install was using Microsoft Identity Manager 2016 with SP1, version 4.5.285.0. The install had gone well: SQL, Synchronisation Server, MIM Service and Portal, etc. I had even created a couple of Management Agents. However, when it came time to create the Microsoft Identity Manager Service MA, the Synchronisation Server returned the error "Failed to connect to the specified database".

Failed to connect to the specified database.PNG

Jumping over to the Event Log I found the error below: "The current version of database is not compatible with the one expected by Forefront Identity Manager service. The current version of the database is : 2008. The expected version is : 2015."

Now this was a fresh install. That error usually indicates the database has been restored from a previous version. But speed reading, I was thinking SQL Server, not database. My SQL Server is 2016.

Current version of the database is 2008..PNG

Validating that via SQL Management Studio returned what I expected.

Actual SQL Version

Looking at the database itself, it showed a compatibility level of 2008.

FIMService Database Version

With nothing to lose I set the compatibility level to 2016. On the next attempt to create the MIM Service MA I still got my database error.

Change version from 2008 to 2016
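For reference, the check and the change can also be done from PowerShell rather than SQL Management Studio. A sketch, assuming the SqlServer module's Invoke-Sqlcmd and the default FIMService database name (your server instance will differ):

# Check the current compatibility level of the FIMService database
Invoke-Sqlcmd -ServerInstance 'YOURSQLSERVER' -Query "SELECT name, compatibility_level FROM sys.databases WHERE name = 'FIMService'"

# Set it to SQL Server 2016 (compatibility level 130)
Invoke-Sqlcmd -ServerInstance 'YOURSQLSERVER' -Query "ALTER DATABASE [FIMService] SET COMPATIBILITY_LEVEL = 130"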

At this point I was short on options. This was a brand new installation, so I had no backups yet.

I downloaded the latest hotfix for Microsoft Identity Manager (currently 4.5.286.0) from here and updated my Synchronisation Server and MIM Service and Portal.

Following that I was able to create the MIM Service MA and successfully perform a Stage of data from the MIM Service.

MIM Service MA Created and working

Summary

If on a fresh install of Microsoft Identity Manager you receive the error "Failed to connect to the specified database", all your configuration settings are correct, and the event logs indicate a database version error, install the latest hotfix and you should be back in action.

Error 25009 HResult 0x80131700 when installing Microsoft Identity Manager

This week I was installing Microsoft Identity Manager in a new environment and wasn't using my usual scripts that semi-automate the process. During the installation of the Microsoft Identity Manager Synchronization Service I got Error 25009 HResult 0x80131700, as shown below.

FIM Sync Server Installation Failed Configurating SQL.PNG

As mentioned above, I normally do this semi-automated, but this time I was updating a bunch of those scripts so was starting with a fresh install on a Windows Server 2016 host.

Note: Windows Server 2019 isn’t an officially supported platform currently.

Re-running the install with an installation log, as I detail in this post, didn't help much; the install log did show an error, but not much more.

Additional research indicated that this error can be caused by three different scenarios;

  • insufficient permissions on the SQL Server
  • missing SQL Native Client on the Microsoft Identity Manager Sync Server
  • missing .NET Framework 3.5

Checking the SQL Server I could see that the login for the Sync Server service account had been created, so that discounted the first two. Looking at the installed applications on the Sync Server confirmed that I did not have .NET Framework 3.5 installed.

Install NET 3.5.PNG

Looking back at my automation scripts, one of the first lines is;

Install-WindowsFeature NET-Framework-Core

Following installation of .NET Framework 3.5, re-running the setup got me up and running.
Completed Successfully.PNG

Summary

In 2019, installing Microsoft Identity Manager 2016 with SP1 still has the same dependencies that Identity Lifecycle Manager had in 2007. .NET Framework 3.5, however, isn't installed by default on Server 2016 (.NET Framework 4.x is). If nothing else this will jog my memory for next time.

Loading and Querying Data in Azure Table Storage using PowerShell

As part of both a side project and a work project I recently had a couple of larger datasets that I needed to put into a database and be able to search. I had previously used Azure Blob Storage but hadn't done much with Azure Table Storage. Naturally I needed to use PowerShell to perform this, and I quickly found that the AzureRM PowerShell Module could do the basics but wasn't going to scale to the size of the datasets I had. Some trial and error later I got it to do what I needed, and then using the Azure Table Service REST API I was able to easily query the dataset.

Note: Initially I loaded one of the datasets (~35k rows) a row at a time, which took close to 5 hours. Once I got Batch Operations working for dataset insertion into Table Storage I got that down to ~4 minutes. This post details how I did it.

Prerequisites

You will need;

  • the Azure Storage Explorer, which you can get from here
  • An Azure Storage Account
    • you can create one though many different methods such as the Azure Portal UI, Azure Portal CLI, Azure CLI, PowerShell …..
  • PowerShell 5.1 or later and the AzureRM PowerShell Module

Creating an Azure Table Storage Table

Using the Azure Storage Explorer, authenticate to Azure and navigate to your Storage Account. Expand the Storage Account, select Tables and right-click and select Create Table. Give the Table a name and hit enter. Whilst you can also create the Table via the PowerShell AzureRM Module or the Azure CLI, it is also super quick and easy using the Azure Storage Explorer which will be used later to verify the loaded dataset.

Using the Azure Storage Explorer I created a Table named NICVendors.

Create Azure Table Service Table.PNG

Loading Data into Azure Table Service

The example data I will use here is the dataset from a post last year for MAC Address Vendors lookup. Rather than exporting to XML I will load it into Azure Table Storage. The following script will obtain the Vendors list from here and save it to your local disk. This will provide ~26k entries and is a good test for loading into Azure Table Service.

Update Line 3 for where you want to output the file to.
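The download itself is a simple Invoke-WebRequest; a minimal sketch is below. The IEEE OUI list URL shown is an assumption for illustration; use the source linked above.

# Minimal sketch: download the MAC Address Vendors (OUI) list to local disk.
# The URL is illustrative; substitute the source referenced in the linked post.
$outputFile = 'C:\Temp\oui.txt'   # Line 3 equivalent: where to output the file
Invoke-WebRequest -Uri 'http://standards-oui.ieee.org/oui.txt' -OutFile $outputFile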

With the dataset in memory we can parse it and insert each row into the table. The quickest method is to batch the inserts. The maximum number of rows allowed in a batch is 100, and each row in a batch must also use the same Partition Key.

Update:

  • Line 2 for your Azure Subscription
  • Line 3 for the Resource Groups your Storage Account is located in
  • Line 4 for the name of your Storage Account
  • Line 5 for the name of the Table you created
  • Line 6 for the name of the Partition for the dataset
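The batching pattern itself uses the TableBatchOperation class from the Microsoft.WindowsAzure.Storage.dll that ships with the AzureRM module. A minimal sketch follows; the entity property names Prefix and Vendor are illustrative for the NIC Vendors dataset, and $dataset is assumed to hold the parsed rows.

# Minimal sketch of batched inserts into Azure Table Storage.
# Assumes you are already logged in via Login-AzureRmAccount.
$subscriptionId     = 'your-subscription-id'    # Line 2
$resourceGroup      = 'your-resource-group'     # Line 3
$storageAccountName = 'yourstorageaccount'      # Line 4
$tableName          = 'NICVendors'              # Line 5
$partitionKey       = 'NICVendors'              # Line 6

Select-AzureRmSubscription -SubscriptionId $subscriptionId | Out-Null
$storageAccount = Get-AzureRmStorageAccount -ResourceGroupName $resourceGroup -Name $storageAccountName
$table = Get-AzureStorageTable -Name $tableName -Context $storageAccount.Context

$batch = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.TableBatchOperation

foreach ($row in $dataset) {
    # Row Key here is the MAC prefix; Prefix and Vendor are illustrative property names
    $entity = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.DynamicTableEntity -ArgumentList $partitionKey, $row.Prefix
    $entity.Properties.Add('Vendor', [Microsoft.WindowsAzure.Storage.Table.EntityProperty]::GeneratePropertyForString($row.Vendor))
    $batch.Add([Microsoft.WindowsAzure.Storage.Table.TableOperation]::InsertOrReplace($entity))

    # A batch holds a maximum of 100 operations, all with the same Partition Key
    if ($batch.Count -eq 100) {
        $table.CloudTable.ExecuteBatch($batch) | Out-Null
        $batch = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.TableBatchOperation
    }
}
if ($batch.Count -gt 0) { $table.CloudTable.ExecuteBatch($batch) | Out-Null }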

With my Storage Account being in Central US and myself in Sydney Australia loading the ~26k entries took 4 mins 27 seconds to insert.

Azure Table Storage Rows Inserted.PNG

Querying Azure Table Service using the RestAPI and PowerShell

To then query the Table entries to find results for a Vendor the following script can be used. Change;

  • Line 2 for your Storage Account name
  • Line 3 for your Storage Account Key (which you obtain from the Azure Portal for your Storage Account)
  • Line 4 for your Table name
  • Line 20 for the Vendor to query for
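A minimal sketch of such a query, using SharedKeyLite authentication against the Table Service REST API, is below; the Vendor property name matches the load sketch above and is illustrative.

# Minimal sketch: query Azure Table Service via the REST API using SharedKeyLite.
$storageAccount = 'yourstorageaccount'        # Line 2
$accessKey      = 'your-storage-account-key'  # Line 3
$tableName      = 'NICVendors'                # Line 4
$vendor         = 'Dell Inc.'                 # Line 20

$date     = (Get-Date).ToUniversalTime().ToString('R')
$resource = "$($tableName)()"
$filter   = [uri]::EscapeDataString("Vendor eq '$($vendor)'")
$uri      = "https://$($storageAccount).table.core.windows.net/$($resource)?`$filter=$($filter)"

# SharedKeyLite for the Table service signs the date and the canonicalized resource
$stringToSign = "$($date)`n/$($storageAccount)/$($resource)"
$hmac = New-Object -TypeName System.Security.Cryptography.HMACSHA256
$hmac.Key = [Convert]::FromBase64String($accessKey)
$signature = [Convert]::ToBase64String($hmac.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($stringToSign)))

$headers = @{
    'x-ms-date'    = $date
    'x-ms-version' = '2017-04-17'
    Authorization  = "SharedKeyLite $($storageAccount):$($signature)"
    Accept         = 'application/json;odata=nometadata'
}
$results = Invoke-RestMethod -Uri $uri -Headers $headers -Method Get
$results.value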

Executing the script to query for Dell Inc. returns 113 entries. The graphic below shows one.

Search Azure Table Storage Response.PNG

Summary

Using the AzureRM PowerShell Module and the TableBatchOperation class from the Microsoft.WindowsAzure.Storage.dll we are able to batch the record inserts into 100 row batches.

Using the Azure Table Service REST API we are able to quickly search the Table for the records we are looking for.