Evaluating the migration of Azure Functions to Microsoft Flow – Twitter IoT Integration

 

Introduction

Almost 18 months ago I wrote this post on integrating Twitter with Azure Functions to Tweet IoT data. A derivative of that solution has been successfully running for about the same period. Azure Functions have been bulletproof for me.

After recently implementing Microsoft Flow as detailed in my Teenager Notification Device post here, I started looking at a number of the Azure Functions I have running to see which would be better suited to Flow. What could I simplify by migrating to Microsoft Flow?

The IoT Twitter Function linked above was one of the simpler Functions I had running, so I transposed it and it has been running seamlessly. I chose this particular Function to migrate because the actions it performed are all actions that Microsoft Flow supports. Keep in mind (see the Summary) that there isn’t a one-size-fits-all; Flow and Functions each have their place and often work even better together.

Comparison

Transposing the IoT Twitter Function App to Microsoft Flow provided the same outcome; however, the effort to get to that outcome is considerably less. As a quick comparison, below are the key steps I needed to perform with the Azure Function to enable the integration versus what it took to implement with Microsoft Flow.

[Image: comparison of the key steps for the Azure Function versus Microsoft Flow]

That’s pretty compelling. For the Azure Function I needed to register an App with Twitter and create an Azure Function App Plan to host my Function. With Microsoft Flow I just created a Flow.

To set up and configure the Azure Function I needed to configure Deployment Options to upload the Twitter PowerShell Module (a third-party module), and I needed to store the two credential sets associated with the Twitter Account/App. In Microsoft Flow I just chose Twitter as an Action and provided consent to the OAuth2 challenge.

Finally for the logic of the Azure Function I had to write the script to retrieve the data, manipulate it, and then post it to Twitter. In Microsoft Flow it was simply a case of configuring the workflow logic.

Microsoft Flow

As detailed above, the logic is still the same: on a schedule, get the data from the IoT Devices via a RestAPI, manipulate/parse the response and output a Tweet with the environment info. Doing that in Flow, though, simply means selecting actions and configuring them. No code, no modules, no keys.

Below is the resulting Flow (overview) that achieves the same result as the Azure Function I originally implemented, as detailed here.

[Image: overview of the Microsoft Flow]

The Flow is triggered hourly. Using the Recurrence trigger it is easy to set the schedule (much easier than the CRON format in Azure Functions), complete with timezone (within the advanced section). I then use Get current time so I can obtain the date and time in a format I will use in the resulting tweet.

[Image: Recurrence schedule and Get current time actions]

Next is to perform the first RestAPI call to get the data from the first of the IoT devices. Parse the JSON response to get the temperature value.

[Image: HTTP GET and Parse JSON actions]

Repeat the above step for the other IoT Device located in a different environment and parse that. Formulate the Tweet using elements of information from the Flow.

[Image: second HTTP GET, Parse JSON and Post a tweet actions]

Looking at Twitter we can see the resulting Tweet from the Flow.

[Image: the resulting Tweet]

Summary

This is a relatively simple flow. Bear in mind I haven’t included any logic to validate what is returned or perform any conditional operations during processing. But it shows how quickly it is possible to retrieve, manipulate and output data to a different medium.

So why don’t I use Flow for everything? The Teenager Notification Device post I mentioned at the beginning uses a Flow, but it also uses an Azure Function. For that use case the integration of the IoT Device with Azure IoT is via MQTT, and Flow doesn’t currently have that capability. Instead, Flow was used to trigger an Azure Function that in turn sent an MQTT message to the IoT Device. The combination of Flow with Functions provides a lot of flexibility and power.

 

How to use a Powershell Azure Function to Tweet IoT environment data

 

Overview

This blog post details how to use a Powershell Azure Function App to get information from a RestAPI and send a social media update.

The data can come from anywhere, and in the case of this example I’m getting the data from WioLink IoT Sensors. This builds upon my previous post here that details using Powershell to get environmental information and put it in Power BI.  Essentially the difference in this post is outputting the manipulated data to social media (Twitter) whilst still using a TimerTrigger Powershell Azure Function App to perform the work and leverage the “serverless” Azure Functions model.

Prerequisites

The following are prerequisites for this solution;

  • an Azure subscription
  • a Twitter account
  • the InvokeTwitterAPIs Powershell Module (a third-party module, saved locally in the next step)

Create a folder on your local machine for the Powershell Module, then save the module to your local machine using the Powershell command Save-Module as per below.

Save-Module -Name InvokeTwitterAPIs -Path c:\temp\twitter

Create a Function App Plan

If you don’t already have a Function App Plan, create one by searching for Function App in the Azure Management Portal. Give it a name, select the Consumption plan so you only pay for what you use, and select an appropriate location and Storage Account.

Create a Twitter App

Head over to http://dev.twitter.com and create a new Twitter App so you can interact with Twitter using their API. Give your Twitter App a name. Don’t worry about the URL too much or the need for the Callback URL. Select Create your Twitter Application.

Select the Keys and Access Tokens tab and take a note of the API Key and the API Secret. Select the Create my access token button.

Take a note of your Access Token and Access Token Secret. We’ll need these to interact with the Twitter API.

Create a Timer Trigger Azure Function App

Create a new TimerTrigger Azure Powershell Function. For my app I’m changing from the default 5 minute schedule to hourly, on the top of the hour. I did this after I’d already created the Function, as shown below. To update the schedule I edited the function.json file and changed the schedule to “schedule”: “0 0 * * * *”.
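
For reference, the resulting function.json timer binding looks something like the following; the binding name (myTimer) is whatever the template generated for you, and the schedule is the hourly CRON expression above.

{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 * * * *"
    }
  ],
  "disabled": false
}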

Give your Function App a name and select Create.

Configure Azure Function App Application Settings

In your Azure Function App select “Configure app settings”. Create new App Settings for your Twitter Account, Twitter Account AccessToken, AccessTokenSecret, APIKey and APISecret using the values from when you created your Twitter App earlier.
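
Within the Function these App Settings surface as environment variables. A minimal sketch of reading them in Powershell is below; the setting names are examples, so use whatever names you created above.

# App Settings are exposed to the Function as environment variables.
# The setting names below are examples - use the names you created above.
$twitterAccount    = $env:TwitterAccount
$accessToken       = $env:TwitterAccessToken
$accessTokenSecret = $env:TwitterAccessTokenSecret
$apiKey            = $env:TwitterAPIKey
$apiSecret         = $env:TwitterAPISecret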

Deployment Credentials

If you haven’t already configured Deployment Credentials for your Azure Function App Plan, do that now and take note of them so you can upload the Twitter Powershell module to your app in the next step.

Take note of your Deployment Username and FTP Hostname.

Upload the Twitter Powershell Module to the Azure Function App

Create a sub-directory under your Function App named bin and upload the Twitter Powershell Module using an FTP client. I’m using WinSCP.

From the Application Settings option, start Kudu.

Traverse the folder structure to get the path to the Twitter Powershell Module and note it.

Validating our Function App Environment

Update the code, replacing the sample created with the TimerTrigger Function, to import the Twitter Powershell Module as shown below. Include a get-help line for the module so we can see in the logs that the module was imported and the cmdlets it contains. Select Save and Run.
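
A minimal sketch of that validation code (run.ps1) is below. The module path is an example; substitute the path you noted in Kudu, as your Function name and module version will differ.

# Import the Twitter Powershell Module from the bin folder uploaded via FTP.
# Example path only - substitute the path you noted in Kudu.
Import-Module "D:\home\site\wwwroot\TweetIoTEnvironmentData\bin\InvokeTwitterAPIs\2.1\InvokeTwitterAPIs.psm1"

# Confirm the module loaded and list the cmdlets it contains, with their help synopsis.
Get-Command -Module InvokeTwitterAPIs
foreach ($cmd in (Get-Command -Module InvokeTwitterAPIs)) {
    Get-Help $cmd.Name | Select-Object Name, Synopsis
}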

Below is my output. I can see the output from the Twitter Module.

Function Application Script

Below is the shape of my sample script. It has no error handling etc. so isn’t production ready, but it gives a working example of getting data in from an API (in this case IoT sensors) and sending a tweet out to Twitter.
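
The embedded script itself isn’t reproduced here, so the following is a minimal sketch of its shape. The Wio Link endpoint, the App Setting names, and the Invoke-TwitterRestMethod parameter names are assumptions; check them against the cmdlets reported by the Get-Command output above and the module’s help.

# Import the Twitter module uploaded to the Function App (example path - use the one noted in Kudu)
Import-Module "D:\home\site\wwwroot\TweetIoTEnvironmentData\bin\InvokeTwitterAPIs\2.1\InvokeTwitterAPIs.psm1"

# Twitter credentials from the Function App Settings (environment variables)
$oauth = @{
    ApiKey            = $env:TwitterAPIKey
    ApiSecret         = $env:TwitterAPISecret
    AccessToken       = $env:TwitterAccessToken
    AccessTokenSecret = $env:TwitterAccessTokenSecret
}

# Get the temperature from the IoT sensor's REST API.
# Example Wio Link endpoint; the response is assumed to contain a 'temperature' property.
$sensorUri = "https://us.wio.seeed.io/v1/node/GroveTempHumProD0/temperature?access_token=<your-node-token>"
$reading   = Invoke-RestMethod -Uri $sensorUri -Method Get
$temp      = [math]::Round($reading.temperature, 1)

# Build the tweet text and post it via the module.
# Cmdlet and parameter names follow the module's published examples - verify with Get-Help.
$now   = Get-Date -Format "HH:mm dd/MM/yyyy"
$tweet = "The office is currently $temp C at $now #IoT"
Invoke-TwitterRestMethod -ResourceURL 'https://api.twitter.com/1.1/statuses/update.json' `
    -RestVerb 'POST' -Parameters @{ status = $tweet } -OAuthSettings $oauth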

Viewing the Tweet

And here is the successful tweet.

Summary

This shows how easy it is to utilise Powershell and Azure Function Apps to get data and transform it for use in other ways; in this example, a social media platform. The input could easily be business data from an API and the output a corporate social platform such as Yammer.

Follow Darren on Twitter @darrenjrobinson

A Twitter Management Agent for Microsoft Identity Manager

In the last couple of weeks I’ve been evaluating a number of different approaches/concepts for some upcoming MIM development projects. Some of these I’ve blogged about already.

Having an Identity Manager Metaverse with identity data is a key dependency for being able to validate ideas and concepts. So what’s a good source of interesting and varied identity data with string, integer, reference, and boolean attributes? Twitter? Yeah, why not. There’s an API. Should be pretty quick to get some sample data, right?

In this blog post I’m going to give an overview of creating a PowerShell Management Agent to consume Twitter identities and their data into Microsoft Identity Manager. I’ll cover;

  • Obtaining Twitter user data from Twitter using Powershell and the Twitter RestAPI
  • Using Søren’s Powershell Management Agent to import Twitter user data obtained via the RestAPI
  • Manipulating the Twitter data into the MIM Synchronisation Engine

Twitter Data

Here’s an overview of my approach/rationale of what data I was looking for and how I got it from Twitter;

  • I don’t need real-time data. Just identity data
  • I need data of all different data-types
  • I need data with all the randomness that identity data often contains
  • I created a standalone script that took a seed Twitter identity (one of my accounts) and;
    • obtained the Twitter account info including the list of the Twitter accounts it followed
    • the Twitter accounts that follow it
  • The standalone script uses the Twitter RestAPI to obtain the data and respects the service rate-limits
  • To make the Twitter API calls easy I leveraged the awesome InvokeTwitterAPIs Powershell Module from Shannon Conley & Mehmet Kaya available here https://github.com/MeshkDevs/InvokeTwitterAPIs . I notice that there is an updated version from Marc R Kellerman available here https://www.powershellgallery.com/packages/InvokeTwitterAPIs/2.1/Content/InvokeTwitterAPIs.psm1 that was released after I had done most of my work. Notably it supports multiple OAuth keys and respects the rate-limit restrictions. The details below leverage this updated version.

Here is what a sample of some of the data looks like in the Metaverse.

[Image: sample Twitter data in the Metaverse]

Pre-requisites

You need to enable your Twitter Account for API access. Follow the details here.

Getting the Seed Twitter account info

By now you should have downloaded the Twitter PowerShell API Module and installed it. If you haven’t, get WMF5 installed and run the Install-Module command as shown below.
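
A minimal sketch of that step, installing the module from the PowerShell Gallery it is published to (per the link above);

# Requires WMF5 (PowerShellGet)
Install-Module -Name InvokeTwitterAPIs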

Modify the script below to provide the following;

  • The seed Twitter Account whose Friends and Followers you want to bring into the MIM Metaverse as users
  • The API keys associated with the Twitter account(s) you’re going to use to query the Twitter API
  • The directory you want to dump the account info out to

….. and let it loose.
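
As a rough sketch of what the configuration and query section of that script looks like; the variable names, OAuth hashtable keys and Invoke-TwitterRestMethod parameters are assumptions to verify against the module’s help, and the real script pages through the API cursors and respects the rate-limits rather than grabbing a single page.

# Seed Twitter account, API keys and output directory - values are examples
$seedAccount = 'darrenjrobinson'
$outputDir   = 'C:\Temp\Twitter'

$OAuthSettings = @{
    ApiKey            = 'yourAPIKey'
    ApiSecret         = 'yourAPISecret'
    AccessToken       = 'yourAccessToken'
    AccessTokenSecret = 'yourAccessTokenSecret'
}

# Friends (accounts the seed follows) and Followers via the Twitter REST API (first page only in this sketch)
$friends   = Invoke-TwitterRestMethod -ResourceURL 'https://api.twitter.com/1.1/friends/list.json' `
                -RestVerb 'GET' -Parameters @{ screen_name = $seedAccount; count = 200 } -OAuthSettings $OAuthSettings
$followers = Invoke-TwitterRestMethod -ResourceURL 'https://api.twitter.com/1.1/followers/list.json' `
                -RestVerb 'GET' -Parameters @{ screen_name = $seedAccount; count = 200 } -OAuthSettings $OAuthSettings

# Dump the account metadata out to XML for later processing
$friends.users   | Export-Clixml "$outputDir\friends.xml"
$followers.users | Export-Clixml "$outputDir\followers.xml"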

Now we have two XML files with a whole bunch of Twitter accounts and their metadata. It is almost certain that some of the accounts that follow the seed account are also followed by it, so we need to remove the duplicates before we import the Twitter accounts into MIM.

A basic, basic script to read in both files and spit out the unique Twitter Accounts is shown below.
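
A sketch along these lines; the file paths are the examples from earlier and screen_name is the Twitter attribute used to de-duplicate.

# Read both exports, combine them, and keep one record per unique Twitter account
$friends   = Import-Clixml 'C:\Temp\Twitter\friends.xml'
$followers = Import-Clixml 'C:\Temp\Twitter\followers.xml'

$unique = ($friends + $followers) | Sort-Object -Property screen_name -Unique
$unique | Export-Clixml 'C:\Temp\Twitter\TwitterAccounts.xml'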

Using the Granfeldt PowerShell Management Agent to import Twitter Identities

Consuming data obtained via PowerShell into the MIM Sync Engine is quick and simple utilising Søren Granfeldt’s extremely versatile PowerShell Management Agent. I’m just going to cover importing the data from the XML file we generated above.

Getting Started with the Granfeldt PowerShell Management Agent

First up, you can get it from here. Søren’s documentation is pretty good but does assume you have a working knowledge of FIM/MIM and this blog post is no different.

A few items of note are;

  • You must have a Password.ps1 file. Even though we’re not doing password management on this MA, the PS MA configuration requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it. It just needs to be present
  • Same for an Export.ps1 file. I’m not doing any exports on the MA, but an export script must be present.
  • The credentials you give the MA to run this MA are irrelevant as they aren’t used as part of the import as I’m bringing in data from files generated via separate PowerShell scripts
  • The path to the scripts in the PS MA Config must not contain spaces and must be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use dir /x to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~2\2010\SYNCHR~1\EXTENS~2\Twitter

Import Twitter Users into Microsoft Identity Manager

Using the guidance above on the Granfeldt PSMA here are the two key scripts for the Twitter MA.

The Schema Script to expose the core Twitter identity attributes.

Schema Script
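
As a sketch of what that script can look like, using the Granfeldt PSMA’s “Name|Type” attribute convention. The attribute set and objectClass below are illustrative; trim or extend them to the Twitter attributes you want.

# Schema.ps1 - expose the Twitter identity attributes to the Sync Engine
$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-id|String" -Value "123456"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "user"
$obj | Add-Member -Type NoteProperty -Name "screen_name|String" -Value "darrenjrobinson"
$obj | Add-Member -Type NoteProperty -Name "name|String" -Value "Darren Robinson"
$obj | Add-Member -Type NoteProperty -Name "description|String" -Value "bio"
$obj | Add-Member -Type NoteProperty -Name "location|String" -Value "location"
$obj | Add-Member -Type NoteProperty -Name "followers_count|Integer" -Value 0
$obj | Add-Member -Type NoteProperty -Name "friends_count|Integer" -Value 0
$obj | Add-Member -Type NoteProperty -Name "statuses_count|Integer" -Value 0
$obj | Add-Member -Type NoteProperty -Name "verified|Boolean" -Value $true
$obj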

Import Script

The Import Script that takes the rationalised XML file created earlier from the friends and followers queries and populates the connector space.
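
A minimal sketch of that import script is below. The parameter block, the [DN] key and the hashtable format follow the Granfeldt PSMA conventions, and the XML path is the example from earlier; check Søren’s documentation for the full parameter set your version of the PSMA passes in.

# Import.ps1 - read the de-duplicated XML and emit one hashtable per Twitter account
param ($Username, $Password, $OperationType, [bool] $UsePagedImport, $PageSize)

$accounts = Import-Clixml 'C:\Temp\Twitter\TwitterAccounts.xml'

foreach ($account in $accounts) {
    $obj = @{}
    $obj.Add("[DN]", $account.screen_name)           # connector space DN
    $obj.Add("id", $account.id_str)                  # anchor defined in the schema
    $obj.Add("objectClass", "user")
    $obj.Add("screen_name", $account.screen_name)
    $obj.Add("name", $account.name)
    $obj.Add("description", $account.description)
    $obj.Add("location", $account.location)
    $obj.Add("followers_count", $account.followers_count)
    $obj.Add("friends_count", $account.friends_count)
    $obj.Add("statuses_count", $account.statuses_count)
    $obj.Add("verified", $account.verified)
    $obj
}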

Password Script

Required by the PSMA but not used, as detailed earlier.
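
A comment-only placeholder is enough, for example (the same file can serve as the Export.ps1 placeholder);

# Password.ps1 (and likewise Export.ps1) - intentionally empty.
# The PSMA configuration requires these files to exist, but no logic is needed
# because password management and exports aren't enabled on this MA.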

Export Script

Required by the PSMA but not used, as detailed earlier. The same comment-only placeholder shown for the Password script is sufficient here.

Creating the Management Agent

Path to the Schema Script in 8.3 format as detailed earlier.

[Image: Management Agent configuration - Schema script path]

Path to the Import, Export and Password scripts also in 8.3 format.

[Image: Management Agent configuration - Import, Export and Password script paths]

Select the attributes you want to bring in to the connector space.

[Image: Management Agent configuration - attribute selection]

Any Join logic, and a Projection Rule.

[Image: Management Agent configuration - Join and Projection rules]

Import Flow Rules to bring in the Tweeters.

[Image: Management Agent configuration - Import Flow Rules]

Create your Run Profiles, perform a Stage and Full Sync and BAM; Tweeters in the Metaverse. Real World Identity Data Ahoy. Exception testing commences now.

Follow Darren on Twitter @darrenjrobinson