Recently I started working on another side IoT project. As part of that I needed to identify the vendor / manufacturer of networking equipment. As you are probably aware, each network device has a unique MAC Address. A MAC Address looks like this: 60:5b:b4:f9:63:05. The first 24 bits (6 hex characters) identify the vendor / manufacturer.
There are a number of online lookup tools to determine the vendor from a MAC address, and some of them have an API to allow lookups too. If you are only doing small volumes that is all good, but beyond that you get into subscription fees. I needed more than 1000 lookups per day, but I also had a good idea of what the vendors were likely to be for a lot of my requests. So I rolled my own using an Azure Trigger Function.
Overview
The IEEE standards body maintains a list of the manufacturers assigned the 24 bit identifiers. A full list can be found here which is updated regularly. I downloaded this list and wrote a simple parser that created a PowerShell Object with the Hex, Base16 and Name of each Manufacturer.
I then extracted the manufacturers I expect to need to reference/lookup into a PSObject that is easily exportable and importable (Export-Clixml / Import-Clixml) and use that locally in my application. The full list is too large to keep locally, so I exported it (again using Export-Clixml) and implemented the lookup as an Azure Function that reads in the full list as a PSObject (which takes ~1.7 seconds for 25,000+ records) and can then be queried with either the Hex or Base16 value as per the format in the IEEE list, returning the vendor name.
Converting the IEEE List to a PowerShell Object
This little script will download the latest version of the OUI list and convert it to a PowerShell Object (a sketch of the approach is shown further below). The resulting object looks like this:
vendor base16 hex
------ ------ ---
Apple, Inc. F0766F F0-76-6F
Apple, Inc. 40CBC0 40-CB-C0
Apple, Inc. 4098AD 40-98-AD
Update:
Line 4 for the local location to output the OUI List to
Line 39 for the PSObject file to create
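For reference, a minimal sketch of such a download-and-parse script (not the original script the line numbers above refer to; the download URL and output paths are assumptions you will want to adjust) looks like this:
# Download the IEEE OUI list and convert it to a PowerShell Object (sketch only)
$ouiURL = "http://standards-oui.ieee.org/oui.txt"
$ouiFile = "C:\temp\oui.txt"
Invoke-WebRequest -Uri $ouiURL -OutFile $ouiFile
$vendors = @()
foreach ($line in (Get-Content $ouiFile)) {
    # Match lines of the form "F0-76-6F   (hex)    Apple, Inc."
    if ($line -match "^\s*([0-9A-F]{2}-[0-9A-F]{2}-[0-9A-F]{2})\s+\(hex\)\s+(.+)$") {
        $vendors += [pscustomobject]@{
            vendor = $matches[2].Trim()
            base16 = $matches[1] -replace "-", ""
            hex    = $matches[1]
        }
    }
}
# Export the full list for the Azure Function to consume
$vendors | Export-Clixml "C:\temp\Vendors.xml"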
If you want to query the file locally using PowerShell, you can do so like this:
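Something along these lines works (the path is wherever you exported Vendors.xml to):
# Sketch of a local lookup against the exported object
$vendors = Import-Clixml "C:\temp\Vendors.xml"
$vendors | Where-Object { $_.hex -eq "40-CB-C0" }
# or count how many registrations a vendor has
($vendors | Where-Object { $_.vendor -like "Apple*" }).Count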
And FYI, Apple have 671 registrations. Yes, they make a LOT of equipment.
Azure Function
Here is the Azure Trigger PowerShell Function that takes a JSON object containing either the Base16 or Hex value for the 24-bit vendor prefix and returns the Vendor / Manufacturer name, e.g.
{"query": "0A-00-27"}
Don’t forget to upload the Vendors.xml exported above to your Azure Function (you can drag and drop using Kudu) and update the path in Line 7.
An example PowerShell script to query would be similar to the following. Update $queryURI with the URI to your Azure Function.
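Something like this (the function key shown is a placeholder):
# Single vendor lookup against the Azure Function
$queryURI = "https://FUNCTIONAPP.azurewebsites.net/api/AZUREFUNCTION?code=YOURFUNCTIONKEY"
$body = @{"query" = "0A-00-27"} | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri $queryURI -Body $body -Headers @{"content-type" = "application/text"}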
The output will then return the manufacturer name, e.g.
Microsoft Corporation
To look up all MAC addresses from your local Windows computer, the following snippet will do that after updating $queryURI for your Azure Function.
# Query MAC Address
$queryURI = "https://FUNCTIONAPP.azurewebsites.net/api/AZUREFUNCTION?code=12345678/uiEx6kse6NujQG0b4OIcjx6B2wHLhZepBD/8Jy6fFawg=="
$netAdaptors = Get-NetAdapter
foreach ($adaptor in $netAdaptors) {
    $mac = $adaptor.MacAddress
    $macV = $mac.Split("-")
    # First three octets are the vendor prefix
    $macLookup = "$($macV[0])$($macV[1])$($macV[2])"
    $body = @{"query" = $macLookup} | ConvertTo-Json
    $result = Invoke-RestMethod -Method Post -Uri $queryURI -Body $body -Headers @{"content-type" = "application/text"}
    Write-Host -ForegroundColor Blue $result
}
Summary
With the power of PowerShell it is quick to take a large amount of information and transform it into a usable collection that can then also be quickly exported and re-imported. It is also quickly searchable, and thanks to Azure Functions supporting PowerShell it's simple to stand up the collection and query it as required programmatically.
This weekend I was attempting to rework some older Azure Automation tasks I wrote some time ago that were a combination of PowerShell scripts and Azure (PowerShell Functions). I was looking to leverage Microsoft Flow so that I could have them handy as ‘Buttons’ in the Microsoft Flow mobile app.
Quite quickly I realized that Microsoft Flow didn’t have the capability to perform some of the automation I required, so I handed that off to an Azure Function. The Azure Function then needed to leverage a Registered AAD Application. That required an Application ID and Secret (or a certificate). This wasn’t going the way I wanted so I took a step back.
The Goals I was attempting to achieve were;
A set of Azure Functions that perform small repetitive tasks that can be re-used across multiple Flows
Separation of permissions associated with function/object orientated Azure Functions
The Constraints I encountered were;
Microsoft Flow doesn’t currently have Azure Key Vault Actions
The Flows I was looking to build required functionality that isn’t currently covered by available Actions within Flow
With my goal to have a series of Functions that can be re-used for multiple subscriptions I came up with the following workaround (until Flow has actions for Key Vault or Managed Service Identity).
Current working Workaround/Bodge;
I created an Azure Function that can be passed Key Vault URIs for credential and subscription information (a sketch of this Function follows the list below)
typically this is the Application ID, Application Secret and Azure Subscription. These are retrieved from Azure Key Vault using Managed Service Identity
returns to the Flow the parameters listed above
Flow calls another Azure Function to perform required tasks
that Azure Function can be leveraged for an AAD App in any Subscription as credentials are passed to it
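A minimal sketch of that first credential-retrieval Function (classic v1 PowerShell HTTP trigger conventions, assuming MSI is enabled on the Function App; the property names for the secret URIs in the request body are placeholders):
# Sketch only: retrieve secrets from Key Vault using the Function App's Managed Service Identity
$requestBody = Get-Content $req -Raw | ConvertFrom-Json    # contains the Key Vault secret URIs
# Get an access token for Key Vault from the local MSI endpoint
$tokenURI = "$($env:MSI_ENDPOINT)?resource=https://vault.azure.net&api-version=2017-09-01"
$token = (Invoke-RestMethod -Method Get -Uri $tokenURI -Headers @{ Secret = $env:MSI_SECRET }).access_token
$authHeader = @{ Authorization = "Bearer $token" }
# Retrieve each secret (URI format: https://VAULTNAME.vault.azure.net/secrets/SECRETNAME)
$appID        = (Invoke-RestMethod -Method Get -Uri "$($requestBody.AppIDURI)?api-version=2016-10-01" -Headers $authHeader).value
$appSecret    = (Invoke-RestMethod -Method Get -Uri "$($requestBody.AppSecretURI)?api-version=2016-10-01" -Headers $authHeader).value
$subscription = (Invoke-RestMethod -Method Get -Uri "$($requestBody.SubscriptionURI)?api-version=2016-10-01" -Headers $authHeader).value
# Return the three values to Flow
$output = @{ AppID = $appID; AppSecret = $appSecret; Subscription = $subscription } | ConvertTo-Json
Out-File -Encoding Ascii -FilePath $res -InputObject $output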
Example Scenario (as shown above);
Microsoft Flow triggered using a Flow Button in the mobile application to report on Azure Virtual Machines
Flow calls Azure Function (Get-Creds) to get credentials associated with the Flow for the environment being reported on
Managed Service Identity used from Azure Function to obtain credentials from Azure Key Vault
Application ID, Application Secret and Azure Subscription returned to Flow
Flow calls Azure Function (Get-VM-Status) that authenticates to Azure AD based on the credentials and subscription passed to it
Azure Resource Group(s) and VM’s queried from the Function App with the details returned to Flow
Concerns/thoughts;
Passing credentials between integration elements isn’t the best idea
obfuscation is the best that can be done for now
having the information stored in three different secrets means all information isn’t sent in one call
but three web requests are required to get the necessary creds
A certificate for AAD App Authentication would reduce the Key Vault calls to one
would this be considered better or worse?
At least the credentials aren’t at rest anywhere other than in the Key Vault.
Summary
We've come a long way in a year. Previously we just had Application Settings in Azure Functions and we were obfuscating credentials stored there using encryption techniques. Now with Managed Service Identity and Azure Key Vault we have that part sorted for Functions. Leveraging modular Azure Functions to perform actions not possible in Flow, though, still seems like a gap. How are you approaching such integration?
As mentioned in this post yesterday I needed to copy a bunch of Azure WebApps and Functions from one Tenant to another. With anything cloud based, things move fast. Some of the methods I found were too onerous and more complex than they needed to be. There is of course the Backup option as well for Azure Functions. This does require a storage account associated with the Function App Plan. My Functions didn’t have the need for storage and the plan tier they were on meant that wasn’t a prerequisite. I didn’t have the desire to add a storage account to backup to then migrate.
Overview
In this post I show my method to quickly copy Azure Functions from one Azure Tenant to another. My approach is;
In the Source Tenant from the Azure Functions App
Using Kudu take a backup of the wwwroot folder (that will contain one or more Functions)
In the Target Tenant
Create an Azure Function App
Using Kudu locate the wwwroot archive in the new Azure Function App
Configure Azure Function Run From Zip
Backing up the Azure Functions in the Source Tenant
Using the Azure Portal in the Source Tenant go to your Function App => Application Settings and select Advanced Tools. Select Debug Console – Powershell and navigate to the Site Folder. Next to wwwroot select the download icon to obtain an archive of your functions.
Copying the Azure Functions to the Target Tenant
In the Target Tenant first create a New Azure Function App. I did this as I wanted to change the naming, the plan and a few other configuration items. Then using the Azure Portal go to your new Function App, Application Settings and select Advanced Tools.
Create a folder under D:\home\data named SitePackages.
Drag and drop your wwwroot.zip file into the SitePackages Folder.
In the same folder select the + icon to create a file named siteversion.txt
Inside the file give the name of your archive file e.g. wwwroot.zip Select Save.
Back in your new Function App select Application Settings
Under Application Settings add a new setting for Website_Use_Zip with a setting value of ‘1’.
Refresh your Function App and you’ll notice it is now Read Only as it is running from Zip. All the Functions that were in the Zip are displayed.
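For reference, the non drag-and-drop parts of the steps above can also be done from the Kudu PowerShell console, along these lines:
# Sketch: create the SitePackages folder and the version file from the Kudu PowerShell console
New-Item -ItemType Directory -Path "D:\home\data\SitePackages"
Set-Content -Path "D:\home\data\SitePackages\siteversion.txt" -Value "wwwroot.zip"
# then drag and drop wwwroot.zip into D:\home\data\SitePackages
# and add the Application Setting Website_Use_Zip = 1 in the portal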
Summary
This is a quick and easy method to get your functions copied from one Tenant to another. Keep in mind if your functions are using Application Settings, KeyVaults, Managed Service Identity type options you’ll need to add those settings, certificates, credentials in the target environment.
This is the third and final post on my recent experiments integrating small micro controllers (ESP8266) running Mongoose OS integrated with Azure IoT Services.
Now that we have end to end functionality it’s time to do something with it.
I have two teenagers who've been trained well to use headphones. Whilst this is great for not having to hear the popular teen bands of today, or the numerous FaceTime, Skype, Snapchat and similar conversations, it does come with the downside of them not hearing us when we require their attention and they are at the other end of the house. I figured that to avoid the need to shout to get attention, a simple visual notification could be built to achieve the desired result. Different colours for different requests? Sure, why not. This is that project, and the end device looks like this.
IoT Notifier using Neopixel
Overview
Quite simply the solution goes like this;
With the Microsoft Flow App on our phones we can select the Flow that will send a notification
Send IoT Notification Message
Choose the Notification intent which will drive the color displayed on the Teenager Notifier.
IoT Notifier Task Message
The IoT Device will then display the color in a revolving pattern as shown below.
The Architecture
The end to end architecture of the solution looks like this.
IoT Message Cloud to Device
Using the Microsoft Flow App on a mobile device gives a nice way of having a simple interface that can be used to trigger the notification. Microsoft Flow sends the desired message and details of the device to send it to, to an Azure Function that puts a message into an MQTT queue associated with the Mongoose OS driven Azure IoT Device (ESP8266 based NodeMCU micro controller) connected to an Azure IoT Hub. The Mongoose OS driven Azure IoT Device takes the message and displays the visual notification in the color associated with the notification type chosen in Microsoft Flow at the beginning of the process.
The benefits of this architecture are;
the majority of the orchestration happens in Azure, yet thanks to Azure IoT and MQTT no inbound connection is required where the IoT device resides. No port forwarding / inbound rules to configure on your home router. The micro controller is registered with our Azure IoT Hub and makes an outbound connection to subscribe to its MQTT topic. As soon as there is a message for the device it triggers its logic and does what we’ve configured
You can initiate a notification from anywhere in the world (most simply using the Flow mobile app as shown above)
And using Mongoose OS allows the device to be managed remotely via the Mongoose OS Dashboard. This means that if I want to add an additional notification (color), I can update Flow with a new option to select and update the configuration on the Notifier device to display the new color if it receives such a command.
Solution Prerequisites
This post builds on the previous two. As such the prerequisites are;
you have an Azure account and have set up an IoT Hub, and registered an IoT Device with it
an IoT device (micro controller) that can run Mongoose OS. I'm using a NodeMCU ESP8266 that I purchased from Amazon here.
3D printer if you want to print an enclosure for the IoT device
With those sorted we can;
Install and configure my Mongoose OS Application. It includes all the necessary libraries and sample config to integrate with a Neopixel, Azure IoT, Mongoose Dashboard etc.
Create the Azure PowerShell Function App that will publish the MQTT message the IoT Device will consume
Create the Microsoft Flow that will kick off the notifications and give us a nice interface to send what we want
Build an enclosure for our IoT device
How to build this project
The order I’ve detailed the elements of the architecture here is how I’d recommend approaching this project. I’d also recommend working through the previous two blog posts linked at the beginning of this one as that will get you up to speed with Mongoose OS, Azure IoT Hub, Azure IoT Devices, MQTT etc.
Installing the AzureIoT-Neopixel-js Application
I’ve made the installation of my solution easy by creating a Mongoose OS Application. It includes all the libraries required and sample code for the functionality I detail in this post.
Clone it from Github here and put it into your .mos directory that should be in the root of your Windows profile directory. e.g C:\Users\Darren\.mos\apps-1.26 then from the MOS Configuration page select Projects, select AzureIoT-Neopixel-JS then select the Rebuild App spanner icon from the toolbar. When it completes select the Flash icon from the toolbar. When your micro controller restarts select the Device Setup from the top menu bar and configure it for your WiFi network. Finally configure your device for Azure MQTT as per the details in my first post in this series (which will also require you to create an Azure IoT Hub if you don’t already have one and register your micro controller with it as an Azure IoT Device). You can then test sending a message to the device using PowerShell or Device Explorer as shown in post two in this series.
I have the Neopixel connected to D1 (GPIO 5) on the NodeMCU. If you use a different micro controller and a different GPIO then update the init.js configuration accordingly.
Creating the Azure Function App
Now that you have the micro controller configured and working with Azure IoT, let's abstract the sending of the MQTT messages into an Azure Function. We can't send MQTT messages from Microsoft Flow, so I've created an Azure Function that uses the AzureIoT PowerShell module to do that.
Note: You can send HTTP messages to an Azure IoT device but …
I'm using the Managed Service Identity functionality to access the Azure Key Vault where the credentials for the identity that can interact with my Azure IoT Hub are stored. To enable and use that (which I highly recommend) follow the instructions in my blog post here to configure MSI on an Azure Function App. If you don't already have an Azure Key Vault then follow my blog post here to quickly set one up using PowerShell.
Azure PowerShell Function App
The Function App is an HTTP Trigger Based one using PowerShell. In order to interact with Azure IoT Hub and integrate with the IoT Device via Azure I’m using the same modules as in the previous posts. So they need to be located within the Function App.
Specifically they are;
AzureIoT v1.0.0.5
AzureRM v5.5.0
AzureRM.IotHub v3.1.0
AzureRM.profile v4.2.0
I’ve put them in a bin directory (which I created) under my Function App. Even though AzureRM.EventHub is shown below, it isn’t required for this project. I uploaded the modules from my development laptop (C:\Program Files\WindowsPowerShell\Modules) using WinSCP after configuring Deployment Credentials under Platform Features for my Azure Function App. Note the path relative to mine as you will need to update the Function App script to reflect this path so the modules can be loaded.
Azure Function PS Modules
The configuration in WinSCP to upload to the Function App for me is
WinSCP Configuration
Edit the AzureRM.IotHub.psm1 file
By default the AzureRM.IotHub.psm1 module will find the older version of the AzureRM.Profile PowerShell module that ships within Azure Functions. As we've uploaded the version we need, we need to comment out the following lines in AzureRM.IotHub.psm1 so that it doesn't do the version check. See below the lines to comment out (put a # in front of the lines indicated below) near the start of the module. The AzureRM.IotHub.psm1 file can be edited via WinSCP & Notepad.
#$module = Get-Module AzureRM.Profile
#if ($module -ne $null -and $module.Version.ToString().CompareTo("4.2.0") -lt 0)
#{
# Write-Error "This module requires AzureRM.Profile version 4.2.0. An earlier version of AzureRM.Profile is imported in the current PowerShell session. Please open a new session before importing this module. This error could indicate that multiple incompatible versions of the Azure PowerShell cmdlets are installed on your system. Please see https://aka.ms/azps-version-error for troubleshooting information." -ErrorAction Stop
#}
#elseif ($module -eq $null)
#{
# Import-Module AzureRM.Profile -MinimumVersion 4.2.0 -Scope Global
#}
HTTP Trigger Azure PowerShell Function App
Here is my Function App Script. You’ll need to update it for the location of your PowerShell Modules (I created a bin directory under my Function App D:\home\site\wwwroot\myFunctionApp\bin), your Key Vault details and the user account you will be using. The User account will need permissions to your Key Vault to retrieve the password (credential) for the account you will run the process as and to your Azure IoT Hub.
You can test the Function App from within the Azure Portal where you created the Function App as shown below. Update for the names of the IoT Hub, IoT Device and the Resource Group in your associated environment.
Test Function App
Microsoft Flow Configuration
The Flow is very simple. A manual button and a resulting HTTP Post.
Microsoft Flow Configuration
For the message I have configured a list. This is where you can choose the color of the notification.
Microsoft Flow Manual Trigger
The Action is an HTTP Post to the Azure Function URL. The body has the configuration for the IoTHub, IoTDevice, Resource Group Name, IoTKeyName and the Message selected from the manual button above. You will have the details for those settings from your initial testing via the Function App (or PowerShell).
The Azure Function URL you get from the top of the Azure Portal screen where you configure your Function App. Look for “Get Function URL”.
Microsoft Flow HTTP Post
Testing
Now you have all the elements configured, install the Microsoft Flow app on your mobile if you don't already have it (Apple iOS App Store and Android Google Play). Log in with the account you created the Flow as, select the Flow, the message, and done. Depending on your internet connectivity you should see the notification displayed on the Notifier device in < 10 seconds.
Case 3D Printer Files
Lastly, we need to make it look all pretty and make the notification really pop. I’ve created a housing for the neopixel that sits on top of a little case for the NodeMCU.
As you can see from the final unit, I’ve printed the neopixel holder in a white PLA that allows the RGB LED light to be diffused nicely and display prominently even in brightly lit conditions.
Neopixel Enclosure
I’ve printed the base that holds the micro controller in a different color. The top fits snugly through the hole in the micro controller case. The wires from the neopixel to connect it to the micro controller slide through the shaft of the top housing. It also has a backplate that attaches to the back of the enclosure that I secure with a little hot glue.
Depending on your micro controller you will also need an appropriately sized case for that. I’ve designed the neopixel light holder top assembly to sit on top of my micro controller case. Also available on Thingiverse here.
Summary
Using a combination of Azure IoT, Azure PaaS Services, Mongoose OS and a cheap micro controller with an RGB LED light ring we have a very versatile Internet of Things device. The application here is a simple visual notifier. A change of output device or even in conjunction with an input device could change the application, whilst still re-using all the elements of the solution that glues it all together (micro-controller, Mongoose OS, Azure IoT, Azure PaaS). Did you build one? Did you use this as inspiration to build something else? Let me know.
In the Microsoft / Forefront Identity Manager Synchronization Service Manager under Tools we have a Statistics Report. This gives a break down of each of the Management Agents and the Connectors on each MA.
I had a recent requirement to expose this information for a customer but I didn’t want them to have to connect to the Synchronization Server (and be given the permissions to allow them to). So I looked into another way of providing a subset of this information in the MIM Portal itself. This post details that solution.
MIM / FIM Synchronization Server Management Agent & Metaverse Statistics
Overview
I approached this in a similar way I did for the User Object Report I recently developed. The approach is;
Azure PowerShell Function App that uses Remote PowerShell to connect to the MIM Sync Server and leverage the Lithnet MIIS Automation PowerShell Module to enumerate all Management Agents and build a report on the information required in the report
A NodeJS WebApp calls the Azure PowerShell Function App onload to generate the report and display it
The NodeJS WebApp is embedded in the MIM Portal as a new Nav Bar Resource and Page
The graphic below details the basic logical integration.
Prerequisites
The prerequisites to perform this I've covered in other posts. In concept, as described above, it is similar to the User Object report, which has the same prerequisites; I did a pretty good job of detailing those here. To implement this, that post is the required reading to get you ready.
Azure PowerShell Function App
Below is the raw script from my Function App that connects to the MIM Sync Server and retrieves the Management Agent Statistics for the report.
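The raw script itself is specific to my environment; a cut-down sketch of the approach (server name, credential app setting names and the statistics you pull are assumptions, and it assumes the Lithnet MIIS Automation module's Get-ManagementAgent cmdlet) looks like this:
# Sketch only: remote to the MIM Sync Server and enumerate the Management Agents
$username = $env:MIMSyncCredUser          # app setting names are assumptions
$password = ConvertTo-SecureString $env:MIMSyncCredPassword -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($username, $password)
$session = New-PSSession -ComputerName "mimsyncserver.yourdomain.com" -Credential $credential
$maStats = Invoke-Command -Session $session -ScriptBlock {
    Import-Module LithnetMIISAutomation
    # The exact properties/statistics you report on will vary for your environment
    Get-ManagementAgent | Select-Object -Property Name
}
Remove-PSSession $session
# Return the results to the caller (classic v1 PowerShell Function conventions)
Out-File -Encoding Ascii -FilePath $res -InputObject ($maStats | ConvertTo-Json)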
NodeJS Web App
The NodeJS Web App is the app that gets embedded in the MIM Portal; it calls the Azure Function to retrieve the data and then displays it. To get started you'll want to begin with a base NodeJS WebApp. This post will get you started: Implementing a NodeJS WebApp using Visual Studio Code.
The only extension I’m using on top of what is listed there is JQuery. So once you have NodeJS up and running in your VSCode Terminal type npm install jquery and then npm install.
I’ve kept it simple and contained all in a single HTML file using JQuery.
In your NodeJS project you will need to reference your report.html file. It should look like this (assuming you name your report report.html):
var express = require('express');
var router = express.Router();

/* GET - Report page */
router.get('/', function(req, res, next) {
    res.sendFile('report.html', { root: './public' });
});

module.exports = router;
The Embedded Report
This is what my report looks like embedded in the MIM Portal.
Microsoft Identity Manager Statistics Report
Summary
Integration of FIM / MIM with Azure Platform as a Service Services opens a world of functionality including the ability to expose information that was previously only obtainable by the FIM / MIM Administrator.
This isn't an out of the box solution. This is a bespoke solution that takes a number of elements and puts them together in a unique way. I'm not expecting anyone to implement this specific solution (but you're more than welcome to), but rather to take inspiration from it to implement solutions relevant to your environment(s).
This post supports a presentation I did to The MIM Team User Group on 14 June 2017.
This post describes a solution that;
Leverages an Azure WebApp (NodeJS) to present a simple website. That site can be integrated easily in the FIM/MIM Portal
The NodeJS website leverages an Azure Function App to get a list of users from the FIM/MIM Synchronization Server and allows the user to use typeahead functionality to find the user they want to generate a FIM/MIM object report on
On selection of a user, a request will be sent to another Azure Function App to generate and return the report to the user in a new browser window
This is shown graphically below.
Integration of Microsoft Identity Manager with Azure Serverless Services
Report Request UI
The NodeJS WebApp is integrated into the FIM/MIM portal. Bootstrap Typeahead is used to find the user to generate a report on. The Typeahead userlist is fulfilled by an Azure Function querying the MIM Sync Metaverse. The Generate Report button fires off a call to FIM/MIM via another Azure Function into the MIM Sync and MIM Service to generate the report.
The returned report opens in a new tab in the user's browser. The report contains details of the FIM/MIM connectors the user is represented on.
The values of all attributes for the user's hologram from the Metaverse are displayed along with the MA the value came from and the last modified date.
Finally the metadata report from the MIM Service MA Connector Space and the MIM Service.
Prerequisites
These are numerous, but I’ve previously posted about them. You will need;
this will give you the base NodeJS site on which you can then build out your report request UI (more details on that further below)
I encourage you to digest those posts to understand how to configure the prerequisites for this solution.
Additional Solution Requirements
To bring all the individual components together, there are a few additional tasks to enable this solution.
Enable CORS on your Azure Function App Configuration (see details further below)
If you want to display User Object Photos as part of the report, you will likely need to synchronize them into FIM/MIM from an authoritative source (e.g. Office365/Exchange Online) Checkout this post and additional details further below
In order to embed the NodeJS WebApp into the FIM/MIM Portal, this post provides the details. Change the target URL from PowerBI URL to your NodeJS site
Object Report Request WebApp (see below for sample site)
You will need to configure CORS to allow the NodeJS WebApp to access the Azure Functions (from both local and Azure). Reflect your port number if it is different from 3000, and use the DNS name for your Azure WebApp.
Sample UI NodeJS HTML
Here is a sample HTML file for your NodeJS WebApp with the UI to provide Input for LoginID fulfilled by the NodeJS Javascript file further below.
Sample UI NodeJS JavaScript
The following NodeJS JavaScript supports the HTML UI above. It populates the LoginID typeahead box and handles the Submit Report button to generate the report for the desired object(s). And yes, if you use the UI to select (individually) multiple different objects, all will be returned in their separate output windows.
As the HTML file above indicates you will need to obtain and make available as part of your NodeJS project the typeahead.bundle.js library.
Azure PowerShell Trigger Function App for AccountNames Lookup
The following Azure Function takes the call from the load of the NodeJS WebApp to populate the typeahead userlist.
Azure PowerShell Trigger Function App for User Object Report
Similar in structure to the Username List Lookup Azure Function above, but in the ScriptBlock you embed the Report Generation Script that is detailed here. Modify for what you want to report on.
Photos in the Report
If you want to display images in your report, you will need to determine if the user has an image during the MV metadata report generation part of the script. Add the following lines (updating for the name of your Image attribute; mine is named EXOPhoto) after the Try {} Catch {} in this section: $obj = @() ; foreach ($attr in $attributes.Keys)
# Display the Objects Photo rather than Base64 string
if ($attr.equals("EXOPhoto")){
$objectphoto = "<img src=$([char]0x22)data:image/jpeg;base64,$($attributes.$attr.Values.Valuestring)$([char]0x22)>"
$val = "System.Byte[]"
}
Then in the output of the HTML report at the end of the report generation insert the $objectphoto variable into the HTML stream.
# Output MIM Service Object Data
$MIMServiceObjOut = $MIMServiceObjectMetaData | Sort-Object -Property Attribute | ConvertTo-Html -Fragment
$htmlreport = ConvertTo-HTML -Body "$htmlcss<h1>Microsoft Identity Manager User Object Report</h1><h2>Query</h2>$sourcequery</br><b><center>$objectphoto</br>NOTE: Only attributes with values are displayed.</center></b><h2>Connector(s) Summary</h2>$connectorsummary<h2>MetaVerse Data</h2>$objectmetadata <h2>MIM Service CS Object Data</h2>$MIMServiceCSobjectmetadata <h2>MIM Service Object Data</h2>$MIMServiceObjOut" -Title "MIM Object Report"
As you can see above I’ve also injected the CSS ($htmlcss) into the output stream at the beginning of the Body section. Somewhere in your script block you will need to define your CSS values. e.g.
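A minimal example of defining that CSS (the styling itself is purely illustrative):
# Define the CSS injected into the HTML report
$htmlcss = @"
<style>
body  { font-family: Segoe UI, Arial, sans-serif; font-size: 12px; }
table { border-collapse: collapse; }
th    { background-color: #4472c4; color: #ffffff; padding: 4px; text-align: left; }
td    { border: 1px solid #dddddd; padding: 4px; }
</style>
"@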
An interesting solution integrating Azure PaaS Services with Microsoft Identity Manager via PowerShell and the extremely versatile Lithnet FIM/MIM PowerShell Modules.
Please share your implementations enhancing your FIM/MIM Solution.
In an age of Web Services and APIs it's almost a forgotten world where FTP Servers exist. However most recently I've had to travel back in time and interact with an FTP server to get a set of files that are produced by other systems on a daily basis. These files are needed for some flat-file imports into Microsoft Identity Manager.
Getting files off an FTP server is pretty simple. But needing to do it across a number of different environments (Development, Staging and Production) meant I was looking for an easy approach that I could also replicate quickly across multiple environments. As I already had Remote PowerShell set up on my MIM Servers for other Azure Function Apps, I figured I'd use an Azure Function for obtaining the FTP files as well.
Overview
My PowerShell Timer Function App performs the following:
Starts a Remote PowerShell session to my MIM Sync Server
Gets the files and puts them into the local directory
Ends the session
Pre-requisites
From the overview above there are a number of pre-requisites, and other blog posts I've written detail nicely the steps involved to appropriately set them up and configure them. So I'm going to link to those. Namely;
Configure your Function App for your timezone so the schedule is correct for when you want it to run. Checkout the WEBSITE_TIME_ZONE note in this post.
WEBSITE_TIME_ZONE
You’ll need to configure your Server that you are going to put the files onto for Remote PowerShell. Follow the Enable Powershell Remoting on the FIM/MIM Sync Server section of this blogpost.
The credentials used to connect to the MIM Server are secured as detailed in the Using an Azure Function to query FIM/MIM Service section of this blog post.
Create a Timer PowerShell Function App. Follow the Creating your Azure App Service section of this post but choose a Timer Trigger PowerShell App.
I configured my Schedule for 1030 every day using the following CRON configuration
0 30 10 * * *
On the Server you’ll be connecting to in order to run the FTP processes you’ll need to copy the PSFTP Module and files to the following directories. I unzipped the PSFTP files and copied the PSFTP folder and its contents to;
With all the pre-requisites in place it’s time to configure the Timer Function App that you created in the pre-requisites.
The following settings are configured in the Function App Application Settings;
FTPServer (the server you will be connecting to, to retrieve files)
FTPUsername (username to connect to the FTP Sever with)
FTPPassword (password for the username above)
FTPSourceDirectory (FTP directory to get the files from)
FTPTargetDirectory (the root directory under which the files will be put)
You'll also need Application Settings for a Username and Password associated with a user that exists on the Server that you'll be connecting to with Remote PowerShell. In my script below these application settings are MIMSyncCredUser and MIMSyncCredPassword.
Function App Script
Finally here is a raw script. You'll need to add appropriate error handling for your environment. You'll also want to change lines 48 and 51 for the naming of the files you are looking to acquire, and line 59 for the server name you'll be executing the process on.
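A cut-down sketch of the shape of that script (not the original the line numbers above refer to; the server name, paths and the PSFTP cmdlet usage are assumptions to adjust for your environment and module version):
# Sketch only - runs in the Timer Trigger Function
$username = $env:MIMSyncCredUser
$password = ConvertTo-SecureString $env:MIMSyncCredPassword -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($username, $password)
# Remote PowerShell session to the MIM Sync Server (server name is a placeholder)
$session = New-PSSession -ComputerName "mimsyncserver.yourdomain.com" -Credential $credential
Invoke-Command -Session $session -ArgumentList $env:FTPServer, $env:FTPUsername, $env:FTPPassword, $env:FTPSourceDirectory, $env:FTPTargetDirectory -ScriptBlock {
    param ($ftpServer, $ftpUser, $ftpPass, $sourceDir, $targetDir)
    Import-Module PSFTP
    $ftpSecure = ConvertTo-SecureString $ftpPass -AsPlainText -Force
    $ftpCred = New-Object System.Management.Automation.PSCredential ($ftpUser, $ftpSecure)
    # Assumes the PSFTP module cmdlets; adjust for the module/version you deploy
    Set-FTPConnection -Server $ftpServer -Credentials $ftpCred -Session FTPSession -UsePassive
    $ftpSession = Get-FTPConnection -Session FTPSession
    Get-FTPChildItem -Session $ftpSession -Path $sourceDir | ForEach-Object {
        Get-FTPItem -Session $ftpSession -Path $_.FullName -LocalPath $targetDir
    }
}
Remove-PSSession $session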
Summary
A pretty quick and simple little Azure Function App that will run each day and obtain daily/nightly extracts from an FTP Server. Cleanup of the resulting folders and files I’m doing with other on-box processes.
Just on a year ago I wrote this blog post that detailed a method to “Simultaneously Start|Stop all Azure Resource Manager Virtual Machines in a Resource Group”. It’s a simple script that I use quite a lot and I’ve received a lot of positive feedback on it.
One year on though and there are a few enhancements I’ve been wanting to make to it. Namely;
host the script in an environment that is in a known state. Often I'm authenticated to different Azure Subscriptions: my personal, my employer's and my customers'.
prioritize the order the virtual machines startup|shutdown
allow for a delay between starting each VM (to account for environments where the VMs have roles with cross dependencies; e.g. a Domain Controller, an SQL Server, Application Servers). You want the DC to be up and running before the SQL Server, and so forth
and if I do all those the most important;
secure it so not just anyone can start|stop my environments at their whim
Overview
This blog post executes the first part: implementing the script in an environment that is in a known state, aka implementing it as an Azure Function App. This won't be a perfect implementation as you will see, but it will set the foundation for the other enhancements. Subsequent posts (as I make time to develop the enhancements) will add the new functionality. This post covers;
Creating the Azure Function App
Creating the foundation for automating management of Virtual Machines in Azure using Azure Function Apps
Starting | Stopping all Virtual Machines in an Azure Resource Group
Create a New Azure Function App
First up we are going to need a Function App. Through your Azure Resource Manager Portal create a new Function App.
For mine I’ve created a new Resource Group and a new Storage Account as this solution will flesh out over time and I’d like to keep everything organised.
Now that we have the Azure App Plan setup, create a New PowerShell HTTP Trigger Function App.
Give it a name and hit Create.
Create Deployment Credentials
In order to get some of the dependencies into the Azure Function we need to create deployment credentials so we can upload them. Head to the Function App Settings and choose Go to App Service Settings.
Create a login and give it a password. Record the FTP/Deployment username and the FTP hostname along with your password as you’ll need this in the next step.
Connect to your Azure Function App using your favourite FTP Client using the credentials you created earlier. I’m using WinSCP. Create a new sub-directory under /site/wwwroot/ named “bin” as shown below.
Upload the Invoke-Parallel.ps1 file from wherever you extracted it to on your local machine to the bin folder you just created in the Function App.
We are also going to need the AzureRM PowerShell modules. Download those via PowerShell to your local machine (e.g. Save-Module -Name AzureRM -Path c:\temp\azurerm). There are a lot of modules obviously and you're not going to need them all. At a minimum for this solution you'll need;
AzureRM
AzureRM.profile
AzureRM.Compute
Upload them under the bin directory also as shown below.
Test that our script dependencies are accessible
Now that we have our dependent modules uploaded, let's test that we can load and utilise them. Below are the commands to load the Invoke-Parallel script and test that it has loaded by getting its help.
# Load the Invoke-Parallel Powershell Script
. "D:\home\site\wwwroot\RG-Start-Stop-VirtualMachines\bin\Invoke-Parallel.ps1"
# See if it is loaded by getting some output
Get-Help Invoke-Parallel -Full
Put those lines into the code section, hit Save and Run and select Logs to see the output. If successful you’ll see the help. If you don’t you probably have a problem with the path to where you put the Invoke-Parallel script. You can use the Kudu Console from the Function App Settings to get a command line and verify your path.
Mine worked successfully. Now to test that our AzureRM modules load. Update the Function to load the AzureRM.Profile module as per below and test you have your path correct.
# Import the AzureRM Powershell Module
import-module 'D:\home\site\wwwroot\RG-Start-Stop-VirtualMachines\bin\AzureRM.profile\2.4.0\AzureRM.Profile.psm1'
Get-Help AzureRM
Success. Fantastic.
Create an Azure Service Principal
In order to automate the access and control of the Azure Virtual Machines we are going to need to connect to Azure using a Service Principal with the necessary permissions to manage the Virtual Machines.
The following script does just that. You only need to run this once as part of the setup for the Azure Function so we have an account we can use for our automation tasks. Update line 6 for your naming and the password you want to use. I'm assigning the Service Principal the "DevTest Labs User" Azure Role (line 17) as that allows the ability to manage the Virtual Machines. You can find a list of the available roles here.
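A sketch of creating such a Service Principal and assigning the role (not the original script the line numbers above refer to; the display name, URIs and password are placeholders, and depending on your AzureRM version -Password may need to be a SecureString rather than a plain string):
# Sketch: create an AAD Application, Service Principal and assign the DevTest Labs User role
$password = "Sup3rS3cretP@ssw0rd!"   # placeholder - use your own
$app = New-AzureRmADApplication -DisplayName "AzureAutomationVMs" -HomePage "https://azureautomationvms.local" -IdentifierUris "https://azureautomationvms.local" -Password $password
New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId
Start-Sleep -Seconds 15   # give AAD a moment to provision the Service Principal
New-AzureRmRoleAssignment -RoleDefinitionName "DevTest Labs User" -ServicePrincipalName $app.ApplicationId
# The outputs you need to note
$app.ApplicationId
(Get-AzureRmSubscription).TenantId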
Take note of the key outputs from this script. You will need to note the;
ApplicationID
TenantID
I’m also securing the credential that has the permissions to Start|Stop the Virtual Machines using the example detailed here in Tao’s post.
For reference here is an example to generate the keyfile. Update your path in line 5 if required and make sure the password you supply in line 18 matches the password you supplied for the line in the script (line 6) when creating the Security Principal.
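The pattern looks something like this (the key file path and password are placeholders; the password must match the one used for the Service Principal):
# Generate a 256-bit AES key file and encrypt the Service Principal password with it
$keyFile = "C:\temp\PassEncryptKey.key"
$AESKey = New-Object Byte[] 32
[Security.Cryptography.RNGCryptoServiceProvider]::Create().GetBytes($AESKey)
Set-Content -Path $keyFile -Value $AESKey
$securePassword = ConvertTo-SecureString "Sup3rS3cretP@ssw0rd!" -AsPlainText -Force
$encryptedPassword = $securePassword | ConvertFrom-SecureString -Key $AESKey
$encryptedPassword   # this string becomes the AzureAutomationPWD Application Setting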
Take note of the password encryption string from the end of the script to pair with the ApplicationID and TenantID from the previous steps. You’ll need these shortly in Application Settings.
Additional Dependencies
I created another sub-directory under the function app site named ‘keys’ again using WinSCP. Upload the passkey file created above into that directory.
Whilst we’re there I also created a “logs” directory for any erroneous output (aka logfiles created when you don’t specify them) from the invoke-parallel script.
Application Variables
Using the identity information you have created and generated we will populate variables on the Function App, Application Settings that we can then leverage in our Function App. Go to your Azure Function App, Application Settings and add an application setting (with the respective values you have gathered in the previous steps) for;
AzureAutomationPWD
AzureAutomationAppID
AzureAutomationTennatID (bad speed typing there)
Don’t forget to click Save up the top of the Application Settings screen.
The Function App Script
Below is the sample script for your testing purposes. If you plan to use something similar in a production environment you’ll want to add more logging and error handling.
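A cut-down sketch of the shape of it (paths, the key file name and module versions are assumptions, and where the original uses Invoke-Parallel to process the VMs concurrently this sketch uses a simple loop for clarity):
# Sketch only - read the request (classic v1 PowerShell Function conventions)
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
$mode = $requestBody.mode                    # "start" or "stop"
$resourceGroup = $requestBody.resourcegroup
# Load the AzureRM modules uploaded to the bin directory (adjust paths/versions for what you uploaded)
Import-Module "D:\home\site\wwwroot\RG-Start-Stop-VirtualMachines\bin\AzureRM.profile\2.4.0\AzureRM.Profile.psm1"
Import-Module "D:\home\site\wwwroot\RG-Start-Stop-VirtualMachines\bin\AzureRM.Compute\2.4.0\AzureRM.Compute.psm1"
# Decrypt the Service Principal credential using the key file and Application Settings
$AESKey = Get-Content "D:\home\site\wwwroot\RG-Start-Stop-VirtualMachines\keys\PassEncryptKey.key"
$securePassword = $env:AzureAutomationPWD | ConvertTo-SecureString -Key $AESKey
$credential = New-Object System.Management.Automation.PSCredential ($env:AzureAutomationAppID, $securePassword)
Login-AzureRmAccount -ServicePrincipal -TenantId $env:AzureAutomationTennatID -Credential $credential
# Start or stop each VM in the Resource Group
$vms = Get-AzureRmVM -ResourceGroupName $resourceGroup
foreach ($vm in $vms) {
    if ($mode -eq "start") { Start-AzureRmVM -ResourceGroupName $resourceGroup -Name $vm.Name }
    else { Stop-AzureRmVM -ResourceGroupName $resourceGroup -Name $vm.Name -Force }
}
Out-File -Encoding Ascii -FilePath $res -InputObject "Processed $($vms.Count) VMs in $resourceGroup ($mode)"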
Testing the Function
Select the Test option from the right-hand side pane and update the request body for what the Function takes (mode and resourcegroup) as below. Select Run and watch the logs. You will need to select Expand to get more screen real estate for them.
You will see the VMs enumerate and then the script starting them all up. My script has a 30 second timeout for the Invoke-Parallel Runspace as the VMs will take longer than 30 seconds to start up. And you pay for use, so we want to keep this lean. Increase the timeout if you have more VMs or latency that doesn't see all your VMs' state transitioning.
Checking in the Azure Portal I can see my VM’s all starting up (too fast on the screenshot for the spfarm-mim host).
Sample Remote PowerShell Invoke Script
Below is a sample PowerShell script that is remotely calling the Azure Function and providing the info the Function takes (mode and resourcegroup) the same as we did in the Test Request Body script in the Azure Function Portal. This time to stop the VMs.
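Something along these lines (the URI and function key are placeholders):
# Remotely invoke the Function to stop all VMs in a Resource Group
$uri = "https://FUNCTIONAPP.azurewebsites.net/api/RG-Start-Stop-VirtualMachines?code=YOURFUNCTIONKEY"
$body = @{"mode" = "stop"; "resourcegroup" = "myResourceGroup"} | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri $uri -Body $body -ContentType "application/json"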
Looking in the Azure Portal and we can see all the VMs shutting down.
Summary
A foundational implementation of an Azure Function App to perform orchestration of Azure Virtual Machines.
The Function App is rudimentary in that the script exits (as described in the Runspace timeout) after 30 seconds which is prior to the VMs fully returning after starting|stopping. This is because the Function App will timeout after 5mins anyway.
Now to workout the enhancements to it.
Finally, yes I have renewed/changed the Function Key so no-one else can initiate my Function 🙂
This blog post details how to use a Powershell Azure Function App to get information from a RestAPI and send a social media update.
The data can come from anywhere, and in the case of this example I’m getting the data from WioLink IoT Sensors. This builds upon my previous post here that details using Powershell to get environmental information and put it in Power BI. Essentially the difference in this post is outputting the manipulated data to social media (Twitter) whilst still using a TimerTrigger Powershell Azure Function App to perform the work and leverage the “serverless” Azure Functions model.
Prerequisites
The following are prerequisites for this solution;
Create a folder on your local machine for the PowerShell module, then save the module to your local machine using the PowerShell command 'Save-Module' as per below.
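For example (the InvokeTwitterAPIs module name here is an assumption; substitute whichever Twitter PowerShell module you're using, and your own path):
# Save the Twitter PowerShell module locally so it can be uploaded to the Function App
Save-Module -Name InvokeTwitterAPIs -Path "C:\temp\modules"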
If you don’t already have a Function App Plan create one by searching for Function App in the Azure Management Portal. Give it a Name, Select Consumption so you only pay for what you use, and select an appropriate location and Storage Account.
Create a Twitter App
Head over to http://dev.twitter.com and create a new Twitter App so you can interact with Twitter using their API. Give your Twitter App a name. Don't worry about the URL too much or the need for the Callback URL. Select Create your Twitter Application.
Select the Keys and Access Tokens tab and take a note of the API Key and the API Secret. Select the Create my access token button.
Take a note of your Access Token and Access Token Secret. We’ll need these to interact with the Twitter API.
Create a Timer Trigger Azure Function App
Create a new TimerTrigger Azure Powershell Function. For my app I’m changing from the default of a 5 min schedule to hourly on the top of the hour. I did this after I’d already created the Function App as shown below. To update the schedule I edited the Function.json file and changed the schedule to “schedule”: “0 0 * * * *”
Give your Function App a name and select Create.
Configure Azure Function App Application Settings
In your Azure Function App select “Configure app settings”. Create new App Settings for your Twitter Account, Twitter Account AccessToken, AccessTokenSecret, APIKey and APISecret using the values from when you created your Twitter App earlier.
Deployment Credentials
If you haven’t already configured Deployment Credentials for your Azure Function Plan do that and take note of them so you can upload the Twitter Powershell module to your app in the next step.
Take note of your Deployment Username and FTP Hostname.
Upload the Twitter Powershell Module to the Azure Function App
Create a sub-directory under your Function App named bin and upload the Twitter Powershell Module using a FTP Client. I’m using WinSCP.
From the Applications Settings option start Kudu.
Traverse the folder structure to get the path to the Twitter PowerShell Module and note it.
Validating our Function App Environment
Update the code, replacing the sample from the creation of the Trigger Azure Function as shown below, to import the Twitter PowerShell Module. Include the get-help line for the module so we can see in the logs that the module was imported and we can see the cmdlets it contains. Select Save and Run.
Below is my output. I can see the output from the Twitter Module.
Function Application Script
Below is my sample script. It has no error handling etc. so isn't production ready, but it gives a working example of getting data in from an API (in this case IoT sensors) and sending a tweet out to Twitter.
Viewing the Tweet
And here is the successful tweet.
Summary
This shows how easy it is to utilise Powershell and Azure Function Apps to get data and transform it for use in other ways. In this example a social media platform. The input could easily be business data from an API and the output a corporate social platform such as Yammer.
This blog post details using a Powershell Azure Function App to get IoT data from a RestAPI and update a table in Power BI with that data for visualization.
The data can come from anywhere, however in the case of this post I’m getting the data from WioLink IoT Sensors. This builds upon my previous post here that details using Powershell to get environmental information and put it in Power BI. Essentially the major change is to use a TimerTrigger Azure Function to perform the work and leverage the “serverless” Azure Functions model. No need for a reporting server or messing around with Windows scheduled tasks.
Prerequisites
The following are the prerequisites for this solution;
Create a folder on your local machine for the PowerShell modules, then save the modules to your local machine using the PowerShell command 'Save-Module' as per below.
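For example (the module names are assumptions based on the modules referenced later in this post; adjust the path to suit):
# Save the Power BI and Azure AD Preview PowerShell modules locally so they can be uploaded to the Function App
Save-Module -Name PowerBIPS -Path "C:\temp\modules"
Save-Module -Name AzureADPreview -Path "C:\temp\modules"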
If you don’t already have a Function App Plan create one by searching for Function App in the Azure Management Portal. Give it a Name, Select Consumption Plan for the Hosting Plan so you only pay for what you use, and select an appropriate location and Storage Account.
Register a Power BI Application
Register a Power BI App if you haven’t already using the link and instructions in the prerequisites. Take a note of the ClientID. You’ll need this in the next step.
Configure Azure Function App Application Settings
In this example I'm using Azure Functions Application Settings for the Azure AD AccountName, Password and the Power BI ClientID. In your Azure Function App select "Configure app settings". Create new App Settings for your UserID and Password for Azure (to access Power BI) and your Power BI Application Client ID. Select Save.
Not shown here I’ve also placed the URL’s for the RestAPI’s that I’m calling to get the IoT environment data as Application Settings variables.
Create a Timer Trigger Azure Function App
Create a new TimerTrigger Azure Powershell Function App. The default of a 5 min schedule should be perfect. Give it a name and select Create.
Upload the Powershell Modules to the Azure Function App
Now that we have created the base of our Function App we’re going to need to upload the Powershell Modules we’ll be using that are detailed in the prerequisites. In order to upload them to your Azure Function App, go to App Service Settings => Deployment Credentials and set a Username and Password as shown below. Select Save.
Take note of your Deployment Username and FTP Hostname.
Create a sub-directory under your Function App named bin and upload the Power BI Powershell Module using a FTP Client. I’m using WinSCP.
To make sure you get the correct path to the PowerShell module, start Kudu from Application Settings.
Traverse the folder structure to get the path to the Power BI Powershell Module and note the path and the name of the psm1 file.
Now upload the Azure AD Preview Powershell Module in the same way as you did the Power BI Powershell Module.
Again using Kudu validate the path to the Azure AD Preview Powershell Module. The file you are looking for is the “Microsoft.IdentityModel.Clients.ActiveDirectory.dll” file. My file after uploading is located in “D:\home\site\wwwroot\MyAzureFunction\bin\AzureADPreview\2.0.0.33\Microsoft.IdentityModel.Clients.ActiveDirectory.dll”
This library is used by the Power BI Powershell Module.
Validating our Function App Environment
Update the code to replace the sample from the creation of the Trigger Azure Function as shown below to import the Power BI Powershell Module. Include the get-help line for the module so we can see in the logs that the modules were imported and we can see the cmdlets they contain. Select Save and Run.
Below is my output. I can see the output from the Power BI Module get-help command. I can see that the module was successfully loaded.
Function Application Script
Below is my sample script. It has no error handling etc. so isn't production ready, but it gives a working example of getting data in from an API (in this case IoT sensors) and putting the data directly into Power BI.
Viewing the data in Power BI
In Power BI it is then quick and easy to select our Inside and Outside temperature readings referenced against time. This timescale is overnight so both sensors are reading quite close to each other.
Summary
This shows how easy it is to utilise Powershell and Azure Function Apps to get data and transform it for use in other ways. In this example a visualization of IoT data into Power BI. The input could easily be business data from an API and the output a real time reporting dashboard.