I have a number of Azure IoT Hubs each with a number of devices configured on them. I wanted to export the details for each IoT Device. This can’t be done via the Azure Portal (May 2018) so I looked to leverage the Azure.IoTHub New-AzureRmIotHubExportDevices cmdlet.
Now the documentation for New-AzureRmIotHubExportDevices is a little light on detail. Whenever I ran New-AzureRmIotHubExportDevices I kept getting the error 'Operation returned an invalid status code "InternalServerError"'.
After many attempts (over weeks) I was finally able to export my IoT devices using PowerShell. The key was to generate the SAS Token for the Storage Container, rather than creating a blob file to export to and generating a SAS Token for that file. Simply specify the Storage Container to export to.
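In essence the key calls look like this (a minimal sketch; the account, container and hub names are placeholders and the container must already exist):

$key = (Get-AzureRmStorageAccountKey -ResourceGroupName "myResourceGroup" -Name "mystorageaccount")[0].Value
$ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey $key
# SAS Token generated for the Container itself, with enough rights for the export to write to it
$containerUri = New-AzureStorageContainerSASToken -Name "iotexport" -Context $ctx -Permission rwd -ExpiryTime (Get-Date).AddHours(1) -FullUri
# Point the device export at the Container URI (not at a blob)
New-AzureRmIotHubExportDevices -ResourceGroupName "myResourceGroup" -Name "myIoTHub" -ExportBlobContainerUri $containerUri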
My sample script below uses the latest (as of May 2018) version of the Azure.IoTHub Module (v3.1.3). It:
enumerates all Resource Groups in an Azure Subscription, looking for IoT Hubs, and puts them into a collection
then iterates through each IoT Hub and creates an associated Storage Account (if one doesn't exist)
Exports the IoT Devices associated with the IoT Hub to Azure Storage
Downloads the IoT Devices Blob File, opens it and displays through PowerShell console output the IoT Device Names and Status
To use the script you will just need to:
change your Subscription Name in Line 4
change the location where you want to download the blob files to in Line 31
if you want to display additional info on each device, or do something else with it, change Line 71 accordingly
The exported file can be found using Azure Storage Explorer as shown below.
And the script outputs the status to the PowerShell console as shown below.
The exported object contains all the details for each IoT Device as shown below in the IoT Device PSObject.
It is obvious once you work out how the cmdlet works. Hopefully this working example will save someone else a few hours of head scratching.
The CentOS image that SailPoint provide for the IdentityNow Virtual Appliance, which performs the integration between 'Sources' and IdentityNow, is VMware based. I don't have any VMware infrastructure to run it on and really didn't want to stand up a VMware environment just for this component. All my other infrastructure is in Azure, and I'd love to run my VA(s) in Azure too.
In discussions with SailPoint I understand it is simply that they haven't certified their CentOS image on Azure. So I figured I'd convert the VM, get it into Azure and see if it works in my Sandpit environment. This blog post details how I got it working.
Disclaimer: If you use this for more than a Sandpit/Test environment let your SailPoint CSM know. This isn't an approved process or a supported configuration. That said, it works for me.
This is the high-level process I threw together that worked for me.
Obtain the CentOS image from the IdentityNow Virtual Appliance setup
Convert the VMware VMDK image to Hyper-V VHD format using VirtualBox VBoxManage (free)
From the Azure Marketplace create a Seed VM based on CentOS (with a new Resource Group, Storage Account, Virtual Network etc)
Upload the VHD to the Azure Storage Account (associated with the VM from Step 3) using Azure Storage Explorer
Create a new VM based on the VM from Step 3, using the disk from Step 4 as the Operating System disk
To download the CentOS VMware image, log in to the Admin section of your IdentityNow Tenant. Under Admin => Connections => Virtual Appliances create a New Cluster. Select that Cluster, then Virtual Appliances => New
Download the Appliance Package
Converting the CentOS VMware Virtual Disk to a Fixed Hyper-V Virtual Disk
I already had VirtualBox installed on my computer. I had to give the full path to VBoxManage (by default C:\Program Files\Oracle\VirtualBox\VBoxManage.exe) and call it with the switches to convert the image:
vboxmanage clonehd <source>.vmdk <target>.vhd --format VHD --variant Fixed
The --variant Fixed switch takes the dynamic image and converts it to fixed, as Azure requires fixed-size VHDs.
The image conversion started and completed in under ten minutes.
Creating an Azure CentOS VM
In the Azure Portal I created a New Resource and chose CentOS.
I gave it a name, chose HDD as the disk type and gave it a Username and Password.
I chose sizing in line with the recommendations for a Virtual Appliance.
And kept everything else simple (for my sandpit environment).
After the VM had deployed I had a Resource Group with the necessary Virtual Network, Storage Account etc.
Upload the Converted Disk to Azure Storage
I created a vhd container (in the Storage Account associated with the VM I just created) to hold the new VHD. Using Azure Storage Explorer I then uploaded the converted image. Select Page Blob for the blob type.
You'll want a decent internet connection to do this. I converted the SailPoint image on an Azure VM (to which I added a 256 GB data disk). I then uploaded the new 128 GB VHD disk image from within Azure to the target Resource Group in about 75 minutes.
Below I show the SailPoint Virtual Appliance CentOS OS converted disk image uploaded to Azure Storage Account Blob Storage.
Generate SAS Token / Get Blob URI
We won't actually use the SAS Token, but this gives easy access to the Storage Blob URL. Right click on the VHD Blob and select Generate Shared Access Signature, then select Create.
Copy the URL. We’ll need parts of this for the script to create a new CentOS VM with our VA Disk Image.
Create the new VM for our Virtual Appliance
Update the script below for:
The Resource Group you created the Seed VM in (line 2)
The Seed VM Name (line 4)
The Seed VM Subnet Name (line 6)
Each of those is easily obtained from the Seed VM Summary as highlighted below.
Update the Disk Blob details in lines 8 and 10 with the values copied earlier.
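For reference, here is a minimal sketch of what such a script does (AzureRM cmdlets of the time; all names are placeholders):

$rg = "SailPointVA"                                   # Resource Group of the Seed VM (line 2)
$seedVM = Get-AzureRmVM -ResourceGroupName $rg -Name "SeedVM"       # Seed VM (line 4)
$vnet = Get-AzureRmVirtualNetwork -ResourceGroupName $rg -Name "SeedVM-vnet"
$nic = New-AzureRmNetworkInterface -Name "VA-nic" -ResourceGroupName $rg -Location $seedVM.Location -SubnetId $vnet.Subnets[0].Id
# The uploaded VHD becomes the (Linux) Operating System disk of the new VM
$vhdUri = "https://mystorageaccount.blob.core.windows.net/vhds/identitynow-va.vhd"   # Blob URL copied earlier (lines 8 and 10)
$vm = New-AzureRmVMConfig -VMName "IdentityNowVA" -VMSize "Standard_A4_v2"
$vm = Set-AzureRmVMOSDisk -VM $vm -Name "IdentityNowVA-osdisk" -VhdUri $vhdUri -CreateOption Attach -Linux
$vm = Add-AzureRmVMNetworkInterface -VM $vm -Id $nic.Id
New-AzureRmVM -ResourceGroupName $rg -Location $seedVM.Location -VM $vm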
After stepping through the script to create the new VM, and happy with the new name etc, I executed the New-AzureRmVM command.
And the VM was created in a couple of minutes.
Accessing the new VM
Getting the IP address from the new VM Summary I SSH’d into it.
Change the password on your Virtual Appliance (passwd)
Create a DNS Name, update the configuration as per SailPoint VA Configuration tasks
Create the VA and Test the Connection from the IdentityNow Portal
Delete your original Seed VM as it is no longer required
Add an NSG to the new VM
Create another VM in a different location for High Availability and configure it in IdentityNow
Below shows my Azure based Virtual Appliance connected and all setup.
Whilst not officially supported, it is possible to convert the SailPoint Virtual Appliance VMware based image to an Azure compatible Hyper-V image and assign it as the Operating System disk on an Azure Linux (CentOS) Virtual Machine. If you need to do something similar I hope my approach gives you some ideas.
If you then need to create another Virtual Appliance in Azure you already have the converted disk image, which you can upload to wherever it needs to be and assign to a new VM for the creation of another Virtual Appliance.
Microsoft have documented a number of scenarios for implementing the new Microsoft Azure B2B Graph Management Agent. The scenarios the MA has been built for are valid, and I have customers that will benefit from the new MA immediately. There is however another scenario I'm seeing from a number of customers that is possible but not detailed in the release notes: B2B Sync between Azure Tenants; using Microsoft Identity Manager to automate the creation of Guests in an Azure Tenant.
This could be one-way or multi-way depending on what you are looking to achieve. Essentially this is the Azure equivalent of using FIM/MIM for Global Address List Sync.
The changes are minimal to the documentation provided with the Management Agent. Essentially:
ensure you enable Write Permissions on the Application you create in the AAD Tenant you will be writing to
Enable the Invite Guest users to the organization permission on the AAD Application
Create an Outbound Sync Rule to an AAD Tenant with the necessary mandatory attributes
Configure the Management Agent for Export Sync Profiles
In the scenario I’m detailing here I’m showing taking a number of users from Org2 and provisioning them as Guests in Org1.
When setting up the Graph Permissions you will need to have Write permissions to the Target Azure AD for at least Users. If you plan to also synchronize Groups or Contacts you’ll need to have Write permissions for those too.
In addition as we will be automating the invitation of users from one Tenant to another we will need to have the permission ‘Invite guest users to the organization’.
With those permissions selected and while authenticated as an Administrator select the Grant Permissions button to assign those permissions to the Application.
Repeat this in both Azure AD Tenants if you are going to do bi-directional sync. If not you only need write and invite permissions on the Tenant you will be creating Guest accounts in.
Creating the Import/Inbound Sync Rules for the Azure Tenants
Here is an example of my Import Sync Rules to get Members (Users) in from an Azure Tenant. I have an inbound sync rule for both Azure Tenants.
Make sure you have 'Create Resource in FIM' configured on the source (or both, if doing bi-directional sync) Graph Connector.
The attribute flow rules I’ve used are below. They are a combination of the necessary attributes to create the corresponding Guest account on the associated management agent and enough to be used as logic for scoping who gets created as a Guest in the other Tenant. I’ve also used existing attributes negating the need to create any new ones.
Creating the Export/Outbound Sync Rule to a Partner B2B Tenant
For your Export/Outbound rule make sure you have ‘Create resource in external system’ configured.
There are a number of mandatory attributes that need to be flowed out in order to create Guests in Azure AD. The key attributes are:
userType = Guest
accountEnabled = True
displayName is required
password is required (and not export_password as normally required on AD-style MAs in FIM/MIM)
mailNickname is required
for dn and id, initially I'm using the id (flowed on import to employeeID) from the source tenant. This needs to be provided to the MA to get the object created. Azure will generate new values on export, so we'll see a rename come back in on the confirming import
userPrincipalName is in the format of
SOURCEUPN (with @ replaced by _ ) + #EXT#@ + DestinationUPNSuffix
Here is an example of building a UPN.
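As an illustration of that format in PowerShell (the values below are made up):

$sourceUPN = "jane.citizen@orgtwo.com.au"            # UPN in the source tenant (B2B Org 2)
$destSuffix = "orgone.onmicrosoft.com"               # UPN suffix of the target tenant (B2B Org 1)
$guestUPN = $sourceUPN.Replace("@","_") + "#EXT#@" + $destSuffix
# => jane.citizen_orgtwo.com.au#EXT#@orgone.onmicrosoft.com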
Sets, Workflows and MPRs
I didn't need to do anything special here. I just created a Set, based on attributes coming in from the source Azure Tenant, to scope who gets created in the target Tenant, and an MPR that looks for the transition into the Set and applies the Workflow that associates the Sync Rule.
End to End
After synchronizing in from the source (B2B Org 2) the provisioning rules triggered and created the Users as Guests on B2B Org 1.
Looking at the Pending Export we can see our rules have applied.
On Export the Guest accounts are successfully created.
On the confirming import we get the rename as Azure has generated a new CN and therefore DN for the Guest user.
Looking into Azure AD we can see one of our new Guest users.
Using the Microsoft Azure B2B Graph Management Agent we can leverage it to invite Users from one Tenant as Guests in another Tenant. Essentially an Azure version of GALSync.
This weekend I was attempting to rework some older Azure Automation tasks I wrote some time ago that were a combination of PowerShell scripts and Azure Functions (PowerShell). I was looking to leverage Microsoft Flow so that I could have them handy as 'Buttons' in the Microsoft Flow mobile app.
Quite quickly I realized that Microsoft Flow didn’t have the capability to perform some of the automation I required, so I handed that off to an Azure Function. The Azure Function then needed to leverage a Registered AAD Application. That required an Application ID and Secret (or a certificate). This wasn’t going the way I wanted so I took a step back.
The Goals I was attempting to achieve were:
A set of Azure Functions that perform small repetitive tasks that can be re-used across multiple Flows
Separation of permissions associated with function/object orientated Azure Functions
The Constraints I encountered were:
Microsoft Flow doesn’t currently have Azure Key Vault Actions
The Flows I was looking to build required functionality that isn’t currently covered by available Actions within Flow
With my goal to have a series of Functions that can be re-used for multiple subscriptions I came up with the following workaround (until Flow has actions for Key Vault or Managed Service Identity).
Current working Workaround/Bodge:
I created an Azure Function that can be passed Key Vault URIs for credential and subscription information (a sketch follows this list)
typically this is the Application ID, Application Secret and Azure Subscription; these are retrieved from Azure Key Vault using Managed Service Identity
the Function returns the parameters listed above to the Flow
Flow calls another Azure Function to perform the required tasks
that Azure Function can be leveraged for an AAD App in any Subscription as the credentials are passed to it
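A minimal sketch of that retrieval inside the first Function (the vault and secret names here are hypothetical; the secret URIs are what Flow passes in):

# Get an access token for Key Vault using the Function App's Managed Service Identity
$tokenUri = $env:MSI_ENDPOINT + "?resource=https://vault.azure.net&api-version=2017-09-01"
$token = Invoke-RestMethod -Method Get -Uri $tokenUri -Headers @{ Secret = $env:MSI_SECRET }
# Retrieve a secret (e.g. the Application Secret) from the Key Vault URI passed in by Flow
$secretUri = "https://myvault.vault.azure.net/secrets/AADAppSecret?api-version=2016-10-01"
$secret = Invoke-RestMethod -Method Get -Uri $secretUri -Headers @{ Authorization = "Bearer " + $token.access_token }
$secret.value    # returned to Flow along with the other parameters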
Example Scenario (as shown above):
Microsoft Flow triggered using a Flow Button in the mobile application to report on Azure Virtual Machines
Flow calls Azure Function (Get-Creds) to get credentials associated with the Flow for the environment being reported on
Managed Service Identity used from Azure Function to obtain credentials from Azure Key Vault
Application ID, Application Secret and Azure Subscription returned to Flow
Flow calls another Azure Function (Get-VM-Status) that authenticates to Azure AD based on the credentials and subscription passed to it (a sketch of that login follows this list)
Azure Resource Group(s) and VMs are queried from the Function App with the details returned to Flow
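The login in Get-VM-Status is then a standard Service Principal authentication using the values passed in; a minimal sketch (variable names are placeholders):

# Authenticate as the AAD Application (Service Principal) with the passed credentials
$secPass = ConvertTo-SecureString $appSecret -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential($appID, $secPass)
Add-AzureRmAccount -ServicePrincipal -Credential $cred -TenantId $tenantID | Out-Null
Select-AzureRmSubscription -SubscriptionId $subscriptionID | Out-Null
# Query the VMs and return the details to Flow
Get-AzureRmVM -Status | Select-Object Name, ResourceGroupName, PowerState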
Passing credentials between integration elements isn’t the best idea
obfuscation is the best that can be done for now
having the information stored in three different secrets means all information isn’t sent in one call
but three web requests are required to get the necessary creds
A certificate for AAD App Authentication would reduce the Key Vault calls to one
would this be considered better or worse?
At least the credentials aren’t at rest anywhere other than in the Key Vault.
We've come a long way in a year. Previously we just had Application Settings in Azure Functions and we were obfuscating credentials stored there using encryption techniques. Now with Managed Service Identity and Azure Key Vault we have Functions sorted. Leveraging modular Azure Functions to perform actions not possible in Flow though still seems like a gap. How are you approaching such integration?
In my session I gave an overview on where IoT is going and some of the amazing things we can look forward to (maybe). I then covered a number of IoT devices that you can buy now that can enrich your life.
As mentioned in this post yesterday, I needed to copy a bunch of Azure WebApps and Functions from one Tenant to another. As with anything cloud based, things move fast. Some of the methods I found were too onerous and more complex than they needed to be. There is of course the Backup option as well for Azure Functions, but that requires a storage account associated with the Function App Plan. My Functions didn't have the need for storage, and the plan tier they were on meant one wasn't a prerequisite. I didn't have the desire to add a storage account just to back up and then migrate.
In this post I show my method to quickly copy Azure Functions from one Azure Tenant to another. My approach is:
In the Source Tenant from the Azure Functions App
Using Kudu take a backup of the wwwroot folder (that will contain one or more Functions)
In the Target Tenant
Create an Azure Function App
Using Kudu, place the wwwroot archive into the new Azure Function App
Configure Azure Function Run From Zip
Backing up the Azure Functions in the Source Tenant
Using the Azure Portal in the Source Tenant go to your Function App => Application Settings and select Advanced Tools. Select Debug Console – Powershell and navigate to the Site Folder. Next to wwwroot select the download icon to obtain an archive of your functions.
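If you'd rather script the backup, the same archive can be retrieved via Kudu's zip API using the Function App's deployment credentials (the app name and credentials below are placeholders):

$user = "deployUser"; $pass = "deployPassword"       # Kudu deployment credentials
$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$($user):$($pass)"))
Invoke-RestMethod -Uri "https://myfunctionapp.scm.azurewebsites.net/api/zip/site/wwwroot/" -Headers @{ Authorization = "Basic $auth" } -OutFile .\wwwroot.zip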
Copying the Azure Functions to the Target Tenant
In the Target Tenant first create a New Azure Function App. I did this as I wanted to change the naming, the plan and a few other configuration items. Then using the Azure Portal go to your new Function App, Application Settings and select Advanced Tools.
Create a folder under D:\home\data named SitePackages.
Drag and drop your wwwroot.zip file into the SitePackages Folder.
In the same folder select the + icon to create a file named siteversion.txt
Inside the file put the name of your archive file, e.g. wwwroot.zip. Select Save.
Back in your new Function App select Application Settings
Under Application Settings add a new setting named WEBSITE_USE_ZIP with a setting value of '1'.
Refresh your Function App and you’ll notice it is now Read Only as it is running from Zip. All the Functions that were in the Zip are displayed.
This is a quick and easy method to get your functions copied from one Tenant to another. Keep in mind if your functions are using Application Settings, KeyVaults, Managed Service Identity type options you’ll need to add those settings, certificates, credentials in the target environment.
In the last couple of weeks I’ve had to copy a bunch of Azure WebApps and Functions from one Azure Tenant to another. I hadn’t had to do this for a while and went looking for the quickest and easiest way to accomplish it. As with anything cloud based, things move fast. Some of the methods I found were too onerous and more complex than they needed to be. There is of course the Backup option as well. However for WebApps that is only available if you are on a Standard or above tier Plan. Mine weren’t and I didn’t have the desire to uplift to get that feature.
In this post I show my method to quickly copy an Azure WebApp from one Azure Tenant to another. I cover copying Azure Functions in another post. My approach is:
In the Source Tenant from the WebApp
Download the Automation Scripts for the WebApp
Using Kudu take a backup of the wwwroot folder
In the Target Tenant
Create a new Resource from a Template
Import the Deployment Automation Scripts from above
Modify for any changes, Resource Group, Location etc
Use Zip Push Deploy to upload the wwwroot archive and deploy it
Backing up the WebApp in the Source Tenant
Open your WebApp in the Azure Portal. Select Automation Script
Download the Automation Script
Select Advanced Tools
Select the Site Folder then on the right menu of wwwroot select the download icon and save the backup of the WebApp.
Expand the Deployment Script archive file from the first step above. The contents will look like those below.
Deploy the WebApp to another Tenant
In the Azure Portal select Create a Resource from the top of the menu list on the left hand side. Type Template in the search box and select Template Deployment, then select Create. Select Build your own template in the editor. Select Load File and choose the parameters.json file. Then select Load File again and choose the template.json file. Select Save.
Make any changes to naming, and provide an existing or new Resource Group for the WebApp. Select Purchase.
The WebApp will be created. Once completed select it from the Resource Group you specified and select Advanced Tools. From the Tools menu select Zip Push Deploy.
Drag and drop the Zip file with the archive of the wwwroot folder you created earlier.
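If you'd rather script this step than drag and drop, the same archive can be posted to Kudu's zipdeploy endpoint (again, the app name and deployment credentials are placeholders):

$user = "deployUser"; $pass = "deployPassword"       # Kudu deployment credentials
$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$($user):$($pass)"))
Invoke-RestMethod -Uri "https://mywebapp.scm.azurewebsites.net/api/zipdeploy" -Method Post -InFile .\wwwroot.zip -ContentType "multipart/form-data" -Headers @{ Authorization = "Basic $auth" }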
The zip will be processed and the WebApp deployed.
Selecting the App in the new Tenant we can see it is deployed and running.
Hitting the App URL we can see that it is being served.
In less than 10 minutes the WebApp is copied. No modifying JSON files, no long command lines, no FTP clients. Pretty simple. In the next post I'll detail how I copied Azure Functions using a similar process.
Keep in mind if your WebApp is using Application Settings, KeyVaults, Managed Service Identity type options you’ll need to add those settings, certificates/credentials in the target environment.
Over the Easter break I enhanced it with the inclusion of a display. I was rummaging around in a box of parts when I found a few small displays I'd purchased on speculation some time ago. They are SSD1306-driven OLED units that can be found on Amazon here. A quick upgrade later and …
… scrolling text to go with rotating lights. The addition of the display requires the following changes to the previous project, which are detailed in this post:
a few changes in the Mongoose OS Init.js file to have the appropriate text displayed for the notification
a change to the Notifier base case to integrate the display
the updated case is available in the Thingiverse Project for this thing here, named NodeMCU with Display Window.stl
Incorporating the SSD1306 Library
Before starting, with your micro controller connected and using the MOS UI, take a copy of your Init.js configuration file by selecting Device Files, then Init.js, and copying the content to somewhere safe. Also back up the Device Config by choosing Device Config, Expert View and Save Configuration.
From the MOS UI select Projects, select the AzureIoT-Neopixel-js project then from the drop down menu select mos.yml.
Add the line - origin: https://github.com/mongoose-os-libs/arduino-adafruit-ssd1306 to the libs section, then select the Spanner icon to Rebuild the App. Once completed, select the Flash icon to update your micro controller.
Once written to your micro controller check your Init.js and copy back your backup. Check your Configuration and make sure your MQTT settings are still present. Copy your previous config back if required.
Configure your Micro Controller for the SSD1306 Display
We need to tell your micro controller which GPIO Pins we have attached the display to. I also moved the GPIO Pin used for the Neopixel as part of this. The configuration is:
Neopixel connected to GPIO 12
SSD1306 SDA connected to GPIO 4
SSD1306 SCL connected to GPIO 5
In the Expert Device Config mode update the I2C section as shown below. Save the configuration.
Looking at the NodeMCU diagram you can see where the connections need to be made for the NeoPixel and SSD1306 display. SSD1306 SCL to D1, SDA to D2. The Neopixel data connection is now on D6. Power and GND using the PWR and GND pins. I’m using them all on the same side of the NodeMCU to make it fit cleanly into the case later.
Init.js code additions
Incorporate the display library in your Init.js by including the line below to load the library's mJS API file.
load('api_arduino_ssd1306.js');
With that done we need to initialize the display, also in the Init.js. The following lines set the display's I2C address, create the display instance, and set the size and color of the text we are going to display. Put them before or after the initialization for the Neopixel.
//------------ Setting up Display ----------------
let oled_addr = 0x3C; // I2C Address for SSD1306
let oled = Adafruit_SSD1306.create_i2c(5 /* RST GPIO */, Adafruit_SSD1306.RES_128_32);
// Initialize the display, then set the text size and color
oled.begin(Adafruit_SSD1306.SWITCHCAPVCC, oled_addr, true);
oled.setTextSize(2);
oled.setTextColor(Adafruit_SSD1306.WHITE);
In the MQTT Subscriber section, where you are looking at the MQTT message sent from Microsoft Flow and displaying a color on the Neopixel, add lines like the following to send output to the display. The example below outputs Pink to the display; if Pink indicates some task then change oled.write('PINK'); to oled.write('TASK'); or similar.
Almost 18 months ago I wrote this post on integrating Twitter with Azure Functions to Tweet IoT data. A derivative of that solution has been successfully running for about the same period. Azure Functions have been bulletproof for me.
After recently implementing Microsoft Flow as detailed in my Teenager Notification Device post here I started looking at a number of the Azure Functions I have running and looked at what would be better suited to being implemented with Flow. What could I simplify by migrating to Microsoft Flow?
The IoT Twitter Function linked above was one of the simpler Functions I had running, so I transposed it, and it has been running seamlessly. I chose this particular Function to migrate as the actions it was performing were ones that Microsoft Flow supports. Keep in mind (see the Summary) that there isn't a one-size-fits-all; Flow and Functions each have their place and often work even better together.
Transposing the IoT Twitter Function App to Microsoft Flow provided me with the same outcome, however the effort to get to that outcome is considerably less. As a quick comparison I’ve compared the key steps I needed to perform with the Azure Function to enable the integration vs what it took to implement with Microsoft Flow.
That’s pretty compelling. For the Azure Function I needed to register an App with Twitter and I needed to create an Azure Function App Plan to host my Azure Function. With Microsoft Flow I just created a Flow.
To set up and configure the Azure Function I needed to set up Deployment Options to upload the Twitter PowerShell Module (a third-party module), and I needed to store the two credential sets associated with the Twitter Account/App. In Microsoft Flow I just chose Twitter as an Action and provided consent to the OAuth2 challenge.
Finally for the logic of the Azure Function I had to write the script to retrieve the data, manipulate it, and then post it to Twitter. In Microsoft Flow it was simply a case of configuring the workflow logic.
As detailed above, the logic is still the same. On a schedule, get the data from the IoT Devices via a RestAPI, manipulate/parse the response and output a Tweet with the environment info. Doing that in Flow though means selection of an action and configuring it. No code, no modules, no keys.
The schedule part is triggered hourly. Using Recurrence it is easy to set the schedule (much easier than the CRON format in Azure Functions), complete with timezone (within the advanced section). I then get the Current time to allow me to acquire the Date and Time in a format that I will use in the resulting tweet.
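For comparison, the equivalent hourly trigger on an Azure Function timer is the six-field NCRONTAB expression below (second, minute, hour, day, month, day-of-week):
0 0 * * * *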
Next is to perform the first RestAPI call to get the data from the first of the IoT devices. Parse the JSON response to get the temperature value.
Repeat the above step for the other IoT Device located in a different environment and parse that. Formulate the Tweet using elements of information from the Flow.
Looking at Twitter we see a resultant Tweet from the Flow.
This is a relatively simple flow. Bear in mind I haven't included any logic to validate what is returned or to perform any conditional operations during processing. But very quickly it is possible to retrieve, manipulate and output to a different medium.
So why don't I use Flow for everything? The recent post I mentioned at the beginning, for the Teenager Notification Device, uses an Azure Function as well as a Flow. For that use case the integration of the IoT Device with Azure IoT is via MQTT, and there isn't currently that capability in Flow. So Flow was used to trigger an Azure Function, which in turn sent an MQTT message to the IoT Device. The combination of Flow with Functions provides a lot of flexibility and power.
Now that we have end to end functionality it’s time to do something with it.
I have two teenagers who've been trained well to use headphones. Whilst this is great for not having to hear the popular teen bands of today, and the numerous Facetime, Skype, Snapchat and similar communications, it does come with the downside of them not hearing us when we require their attention and they are at the other end of the house. I figured that, to avoid the need to shout to get attention, a simple visual notification could be built to achieve the desired result. Different colours for different requests? Sure, why not. This is that project, and the end device looks like this.
Quite simply the solution goes like this:
With the Microsoft Flow App on our phones we can select the Flow that will send a notification
Choose the Notification intent which will drive the color displayed on the Teenager Notifier.
The IoT Device will then display the color in a revolving pattern as shown below.
The end to end architecture of the solution looks like this.
Using the Microsoft Flow App on a mobile device gives a nice way of having a simple interface that can be used to trigger the notification. Microsoft Flow sends the desired message, and the details of the device to send it to, to an Azure Function that puts a message into an MQTT queue associated with the Mongoose OS driven Azure IoT Device (an ESP8266 based NodeMCU micro controller) connected to an Azure IoT Hub. The device takes the message and displays the visual notification in the color associated with the notification type chosen in Microsoft Flow at the beginning of the process.
The benefits of this architecture are:
the majority of the orchestration happens in Azure, yet thanks to Azure IoT and MQTT no inbound connection is required where the IoT device resides. No port forwarding / inbound rules to configure on your home router. The micro controller is registered with our Azure IoT Hub and makes an outbound connection to subscribe to its MQTT topic. As soon as there is a message for the device it triggers its logic and does what we’ve configured
You can initiate a notification from anywhere in the world (most simply using the Flow mobile app as shown above)
And using Mongoose OS allows the device to be managed remotely via the Mongoose OS Dashboard. This means that if I want to add an additional notification (color) I can update Flow with a new option to select, and update the configuration on the Notifier device to display the new color when it receives such a command.
This post builds on the previous two. As such the prerequisites are:
you have an Azure account and have set up an IoT Hub, and registered an IoT Device with it
3D printer if you want to print an enclosure for the IoT device
With those sorted we can:
Install and configure my Mongoose OS Application. It includes all the necessary libraries and sample config to integrate with a Neopixel, Azure IoT, Mongoose Dashboard etc.
Create the Azure PowerShell Function App that will publish the MQTT message the IoT Device will consume
Create the Microsoft Flow that will kick off the notifications and give us a nice interface to send what we want
Build an enclosure for our IoT device
How to build this project
The order I’ve detailed the elements of the architecture here is how I’d recommend approaching this project. I’d also recommend working through the previous two blog posts linked at the beginning of this one as that will get you up to speed with Mongoose OS, Azure IoT Hub, Azure IoT Devices, MQTT etc.
Installing the AzureIoT-Neopixel-js Application
I’ve made the installation of my solution easy by creating a Mongoose OS Application. It includes all the libraries required and sample code for the functionality I detail in this post.
Clone it from Github here and put it into your .mos directory, which should be in the root of your Windows profile directory (e.g. C:\Users\Darren\.mos\apps-1.26). Then from the MOS Configuration page select Projects, select AzureIoT-Neopixel-JS, then select the Rebuild App spanner icon from the toolbar. When it completes, select the Flash icon from the toolbar. When your micro controller restarts, select Device Setup from the top menu bar and configure it for your WiFi network. Finally, configure your device for Azure MQTT as per the details in my first post in this series (which will also require you to create an Azure IoT Hub if you don't already have one and register your micro controller with it as an Azure IoT Device). You can then test sending a message to the device using PowerShell or Device Explorer, as shown in post two in this series.
I have the Neopixel connected to D1 (GPIO 5) on the NodeMCU. If you use a different micro controller and a different GPIO then update the init.js configuration accordingly.
Creating the Azure Function App
Now that you have the micro controller configured and working with Azure IoT, let's abstract the sending of the MQTT messages into an Azure Function. We can't send MQTT messages from Microsoft Flow, so I've created an Azure Function that uses the AzureIoT PowerShell module to do that.
Note: You can send HTTP messages to an Azure IoT device but …
I'm using the Managed Service Identity functionality to access the Azure Key Vault where the credentials for the identity that can interact with my Azure IoT Hub are stored. To enable and use that (which I highly recommend) follow the instructions in my blog post here to configure MSI on an Azure Function App. If you don't already have an Azure Key Vault then follow my blog post here to quickly set one up using PowerShell.
Azure PowerShell Function App
The Function App is an HTTP Trigger based one using PowerShell. In order to interact with Azure IoT Hub and integrate with the IoT Device via Azure I'm using the same modules as in the previous posts, so they need to be located within the Function App.
Specifically they are:
I’ve put them in a bin directory (which I created) under my Function App. Even though AzureRM.EventHub is shown below, it isn’t required for this project. I uploaded the modules from my development laptop (C:\Program Files\WindowsPowerShell\Modules) using WinSCP after configuring Deployment Credentials under Platform Features for my Azure Function App. Note the path relative to mine as you will need to update the Function App script to reflect this path so the modules can be loaded.
The configuration in WinSCP to upload to the Function App for me is shown below.
Edit the AzureRM.IotHub.psm1 file
The AzureRM.IotHub.psm1 will detect an older version of the AzureRM.Profile PowerShell module loaded within Azure Functions. As we've uploaded the version we need, we need to comment out the following lines in AzureRM.IotHub.psm1 so that it doesn't do the version check. See below the lines to comment out (put a # in front of the lines indicated below); they are near the start of the module. The AzureRM.IotHub.psm1 file can be edited via WinSCP and Notepad.
#$module = Get-Module AzureRM.Profile
#if ($module -ne $null -and $module.Version.ToString().CompareTo("4.2.0") -lt 0)
#{
#    Write-Error "This module requires AzureRM.Profile version 4.2.0. An earlier version of AzureRM.Profile is imported in the current PowerShell session. Please open a new session before importing this module. This error could indicate that multiple incompatible versions of the Azure PowerShell cmdlets are installed on your system. Please see https://aka.ms/azps-version-error for troubleshooting information." -ErrorAction Stop
#}
#elseif ($module -eq $null)
#{
#    Import-Module AzureRM.Profile -MinimumVersion 4.2.0 -Scope Global
#}
HTTP Trigger Azure PowerShell Function App
Here is my Function App Script. You’ll need to update it for the location of your PowerShell Modules (I created a bin directory under my Function App D:\home\site\wwwroot\myFunctionApp\bin), your Key Vault details and the user account you will be using. The User account will need permissions to your Key Vault to retrieve the password (credential) for the account you will run the process as and to your Azure IoT Hub.
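For reference, the module loading at the top of the script looks something like this (the bin path is from my Function App above; the version folders are whatever versions you uploaded):

# Load the modules uploaded to the Function App bin directory
$modulePath = "D:\home\site\wwwroot\myFunctionApp\bin"
Import-Module "$modulePath\AzureRM.Profile\4.2.0\AzureRM.Profile.psd1"
Import-Module "$modulePath\AzureRM.IotHub\3.1.0\AzureRM.IotHub.psd1"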
You can test the Function App from within the Azure Portal where you created the Function App as shown below. Update for the names of the IoT Hub, IoT Device and the Resource Group in your associated environment.
Microsoft Flow Configuration
The Flow is very simple. A manual button and a resulting HTTP Post.
For the message I have configured a list. This is where you can choose the color of the notification.
The Action is an HTTP Post to the Azure Function URL. The body has the configuration for the IoTHub, IoTDevice, Resource Group Name, IoTKeyName and the Message selected from the manual button above. You will have the details for those settings from your initial testing via the Function App (or PowerShell).
The Azure Function URL you get from the top of the Azure Portal screen where you configure your Function App. Look for “Get Function URL”.
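You can also test end to end from PowerShell by posting the same body the Flow sends (the values here are placeholders; substitute those from your environment and your Function URL):

$body = @{
    IoTHub = "MyIoTHub"
    IoTDevice = "MyNotifierDevice"
    ResourceGroupName = "MyIoT-ResourceGroup"
    IoTKeyName = "iothubowner"
    Message = "pink"
} | ConvertTo-Json
Invoke-RestMethod -Uri $functionURL -Method Post -Body $body -ContentType "application/json"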
Now you have all the elements configured, install the Microsoft Flow app on your mobile if you don't already have it (Apple iOS App Store and Android Google Play). Log in with the account you created the Flow as, select the Flow, then the message, and done. Depending on your internet connectivity you should see the notification displayed on the Notifier device in under 10 seconds.
Case 3D Printer Files
Lastly, we need to make it look all pretty and make the notification really pop. I’ve created a housing for the neopixel that sits on top of a little case for the NodeMCU.
As you can see from the final unit, I’ve printed the neopixel holder in a white PLA that allows the RGB LED light to be diffused nicely and display prominently even in brightly lit conditions.
I’ve printed the base that holds the micro controller in a different color. The top fits snugly through the hole in the micro controller case. The wires from the neopixel to connect it to the micro controller slide through the shaft of the top housing. It also has a backplate that attaches to the back of the enclosure that I secure with a little hot glue.
Depending on your micro controller you will also need an appropriately sized case for that. I’ve designed the neopixel light holder top assembly to sit on top of my micro controller case. Also available on Thingiverse here.
Using a combination of Azure IoT, Azure PaaS Services, Mongoose OS and a cheap micro controller with an RGB LED light ring, we have a very versatile Internet of Things device. The application here is a simple visual notifier. A change of output device, or even the addition of an input device, could change the application whilst still re-using all the elements that glue the solution together (micro controller, Mongoose OS, Azure IoT, Azure PaaS). Did you build one? Did you use this as inspiration to build something else? Let me know.