With every new project comes another development environment: another installation of Visual Studio Code and the inevitable loss of productivity whilst you get all the necessary extensions installed and configured. If only there were a quick and painless way to sync your VSCode configuration between environments.
A quick search today for a method to sync configuration between VSCode environments revealed a VSCode Extension from Shan Ali Khan.
The VSCode Settings Sync Extension does everything the name implies, leveraging GitHub Gists as the configuration store. The process is super simple too. Simply;
Install the Settings Sync Extension
Generate a GitHub Access Token to allow it to create a configuration Gist (one time task)
Upload your VSCode Configuration from your VSCode instance that has your Gold Config
Install the Settings Sync Extension on the target machine
Download your config and watch as it retrieves your configuration and installs all your extensions
The last two steps can then be repeated in each new environment. If you enable Auto Upload (disabled by default) on your Master Configuration machine, the configuration stored in the associated Gist will always be the latest. Likewise, you can enable Auto Download (also disabled by default) to keep target machines current.
The documentation for Settings Sync is super easy to follow and will have your environments in sync in no time.
The screenshot below shows Settings Sync downloading the 36 extensions I have configured that I use for the different projects I write into a new environment.
Thanks for your work Shan Ali Khan. I’ve donated to your great extension.
If you spend any time in an IT Support/DevOps type role and you check out that second link above, you’ll be mightily keen for this new Terminal.
Tabs in a Terminal Window? YES (heck, I remember paying for a product that provided that in a browser 15+ years ago). A Terminal Window that is a standard command prompt with Unicode support? YES. A Terminal Window for cross-platform CMD, PowerShell, PowerShell Core and Windows Subsystem for Linux? DAMN YES. And of course you don’t want to have to wait for this; you want it now.
So did I, so I built the Preview Alpha Release. This post details how I did it.
There are a few hoops you need to jump through to get on this right now, as it isn’t available as a download. It will be coming to Windows 10 in a few months, but let’s get it now.
Become a Windows Insider by registering for a Windows Insider Account here
Have a Windows 10 v 1903 build (via registering for Windows Insiders above)
the process to do this is shown below
Inside your Windows 10 machine you will then need;
I built a Windows 10 1709 Virtual Machine in Azure from the Azure MarketPlace. Having connected to it, I needed to enable the Windows Insider Program on it. To do that select;
Windows => Settings => Update & Security => Windows Insider Program => Get Started
Select Link an account and provide the account you used to sign up for Windows Insiders.
If, when you attempt to link an account, you get a blank login window/page when being prompted for your Windows Insider Account, you may need to make a couple of changes to the Windows 10 Local Security Policy Security Options. Below is the configuration of my test Windows Insider Windows 10 Virtual Machine. I’ve highlighted a few options I needed to update.
Select the Skip Ahead to the next Windows release to update Windows 10.
If you are doing this like I am on a Windows 10 Virtual Machine in Azure, you’ll first go from build 1709 to 1803.
After Windows 10 has updated to 1803, log back in, go back to Windows Insider Program and choose Skip ahead to the next Windows release.
Under Settings => Update & Security => Windows Update, select Check for Updates and you will see Windows 10 version 1903 become available.
Under Windows => Settings => Update & Security => For developers, enable Developer Mode
With the other dependencies detailed in the prerequisites above (Windows 10 1903 SDK, Visual Studio etc) downloaded and installed on your Windows 10 machine we can get on to the fun bit of building the new Terminal. Create a folder where you want to put the source to build the terminal and from a command prompt change directory into it and run the following commands;
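The commands referenced here are along these lines, a sketch of the clone step: the public Microsoft Terminal repository on GitHub, with its submodules pulled in (the build requires them):

```shell
# Clone the Windows Terminal source and initialise its submodules
git clone https://github.com/microsoft/terminal.git
cd terminal
git submodule update --init --recursive
```

With the source on disk the Visual Studio solution can then be opened.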
Then in Visual Studio select Open a project or solution and open the Terminal Visual Studio Solution. Select SDK Version 10.0.18362.0 and No Upgrade for the Platform Toolset.
Select Release and x64 and then from the Build menu Build Solution.
Finally, right-click on CascadiaPackage and select Deploy.
Terminal (Dev) will then be available through the Start Menu.
Opening the Windows Terminal will give you a single Terminal Window. Press Ctrl + T to open an additional tab.
Use the drop down menu to select Settings and you will be presented with the JSON configuration document. See below under Icons for mine, which enables CMD, PWSH, PowerShell, WSL – Ubuntu and WSL – Suse.
To have icons for your terminal tabs obtain some 32×32 pixel icons for your different terminals and drop them into the RoamingState directory under the Windows Terminal App. For me that is
Then update your profiles.json configuration file located in the same directory and add the name of the appropriate icon for each terminal.
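As an illustration only (the profile name, command line and icon filename here are placeholders for your own), a profile entry with an icon property looks something like this:

```json
{
    "name": "WSL - Ubuntu",
    "commandline": "wsl.exe -d Ubuntu",
    "icon": "ms-appdata:///roaming/ubuntu-32.png"
}
```

The ms-appdata:///roaming/ prefix resolves to the RoamingState directory mentioned above.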
As much as we use nice UIs for a lot of what we do as Devs/IT Pros, there are still numerous tasks we perform using terminal shells. A tabbed experience for these, complete with customisation, brings them into the 21st century. Now to wait another month or two for it to be delivered as part of the next Windows 10 build.
Today (27 April 2019) is Global Azure Bootcamp day. It is the 6th year for the free event that is run in communities all over the world to teach Azure Cloud technologies. This year is my second. Last year I presented on Creating the Internet of YOUR Things and today I’m presenting Building Apps in Azure with only Scripting Skills.
In this session I gave a beginner’s guide to building an Azure Web App using VSCode and NodeJS. In the demos we only needed to visit the Azure Portal for a small Azure Function for our Web App to integrate with.
Here is the GitHub Repo for the small Web App that I used for the demos, a Bastard Operator from Hell Excuse Generator (that I also used in my AutoRest post here), and the PowerPoint presentation.
Infrastructure-as-a-Service has provided the ability to quickly deploy hosts in Cloud environments such as Azure. However, a freshly deployed host doesn’t come with an SSL certificate ready for Web Services. I hadn’t had to do this for quite some time, and I realised again that whilst there are a few guides available, they cover different scenarios than what I require. In my development environments I’m looking for an SSL Certificate that;
can be verified through HTTP verification methods (not DNS as we obviously don’t own the *.cloudapp.azure.com namespace)
This blog post details the process to generate free SSL Web Server Certificates using ZeroSSL and Let’s Encrypt, along with the process to implement them on IIS on Windows. Specifically;
Generating the Certificate using ZeroSSL
Converting the certificate to PFX format for IIS
Installing the Certificate on IIS
Testing a website configured with the SSL Cert
Generating the Certificate
Start by navigating to ZeroSSL and select Start SSL Certificate Wizard
Enter the address of the host you are generating the certificate for. This can be found in the Overview section of your Virtual Machine in the Azure Portal under DNS Name. Select Accept ZeroSSL TOS and Accept Let’s Encrypt SA (pdf) and select Next.
The Certificate Signing Request will be generated. Select Next.
The Account Key will be generated. You can save the CSR and Account Keys from this page. Select Next.
Create the folder structure in IIS as described in the ZeroSSL Verification page under your Windows Server IIS webroot. This by default will be c:\inetpub\wwwroot.
You will need to create the .well-known folder using a Windows command prompt, as Explorer won’t let you create a folder whose name starts with a period.
Create a web.config file in the acme-challenge directory so that IIS will provide access to the file created.
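A minimal web.config for the acme-challenge directory might look like the following. This is a commonly used approach (mapping the extensionless validation file to a MIME type) rather than anything ZeroSSL-specific; adjust to your environment:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <staticContent>
      <!-- Serve the extensionless ACME validation file as plain text -->
      <mimeMap fileExtension="." mimeType="text/plain" />
    </staticContent>
  </system.webServer>
</configuration>
```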
Ensure the permissions for the new directory and file will also allow access.
Click on the file link to make sure that ZeroSSL can access your IIS Website with the file in the path required. Depending on your Azure Networking, you may require an inbound NSG Rule for port 80 as shown below.
Once you have validated the file is accessible, select Next to allow ZeroSSL to verify you have created the file with the appropriate content to validate you own the host. You will then be provided with your certificate.
The top certificate file contains both your domain certificate and the issuer’s certificate.
Save it to your local Windows 10 workstation with the .txt extension. e.g. ServerCerts.txt
The second file contains your domain key.
Save the domain key to another file with the .txt extension e.g. ServerDomainKey.txt
Converting the Certificate
In order to be able to import the certificate in IIS we need it in PFX format. OpenSSL allows us to convert the .txt files into PFX format. OpenSSL is available in Windows Subsystem for Linux by default. Jumping into a WSL window, run OpenSSL with the following syntax, referencing the two files you created above.
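Assuming the two files were saved as ServerCerts.txt and ServerDomainKey.txt (substitute your own filenames, and choose your own export password), the conversion looks like this:

```shell
# Combine the certificate chain and private key into a PFX for IIS.
# You will be prompted for the export password unless you supply one.
openssl pkcs12 -export \
    -in ServerCerts.txt \
    -inkey ServerDomainKey.txt \
    -out ServerCert.pfx \
    -passout pass:YourExportPassword
```

The resulting ServerCert.pfx (and the export password) is what you import on the IIS host.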
Now that I have documented the end-to-end process it will be much quicker next time I generate a certificate for a new host. With the certificate details saved I can also easily update the certificate when it expires in three months’ time.
Today I was configuring an Integration Module for SailPoint IdentityNow. As part of that integration I needed the ID of an IdentityNow Virtual Appliance Cluster. It seemed I hadn’t previously documented doing this and I couldn’t find my previous script. So here’s a quick post for the next time I need to get info about the SailPoint IdentityNow Virtual Appliance Clusters in my environment.
The following script uses v3 Authentication as detailed in this post. Update;
line 2 with your IdentityNow Orgname
line 5 with your Admin Account Name
line 6 with your Admin Password
line 16 with your IdentityNow v3 API Client ID
line 17 with your IdentityNow v3 API Client Secret
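As a rough sketch only of the script's shape: the token endpoint and cluster-list path below are assumptions based on the pattern in the linked v3 Authentication post, and the line numbers in the list above refer to the original script, not this sketch. Verify the endpoints (and the password digest the v3 flow requires) against that post.

```powershell
# SKETCH ONLY - URIs and the password handling are assumptions;
# see the linked v3 Authentication post for the exact details.
$orgName      = "yourOrg"
$adminUser    = "yourAdminAccount"
$adminPwd     = "yourAdminPassword"   # v3 auth expects a digest of this - see the linked post
$clientID     = "yourV3ClientID"
$clientSecret = "yourV3ClientSecret"

# Basic auth header built from the v3 API client credentials
$basic = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$($clientID):$($clientSecret)"))

# Obtain a token, then query the cluster list (paths assumed)
$token = Invoke-RestMethod -Method Post `
    -Uri "https://$orgName.api.identitynow.com/oauth/token?grant_type=password&username=$adminUser&password=$adminPwd" `
    -Headers @{ Authorization = "Basic $basic" }

$clusters = Invoke-RestMethod -Method Get `
    -Uri "https://$orgName.api.identitynow.com/cc/api/cluster/list" `
    -Headers @{ Authorization = "Bearer $($token.access_token)" }

$clusters | Select-Object id, name
```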
Last week I wrote this post that detailed how to use the Azure Table Storage output binding in an Azure PowerShell Function. As part of the same solution I’m working on, I also need to get data/events into Power BI for reporting dashboards. An Azure Function (PowerShell) has the ability to obtain the data but the path to Power BI requires a number of steps that start with using the Azure Function Event Hub output binding.
The end to end process is;
Azure Function obtains information from an API (Brewery DB)
processes the data and;
Sends the data to an Azure Event Hub
A Stream Analytics Job picks up the data and puts it into a Power BI Dataset.
This post details how to configure this pipeline using the same Beer Styles example from the Azure Table Storage Azure Output Binding post here, and PowerShell, as the documentation doesn’t give a working example for PowerShell.
The inputs and Azure Function are the same. Just the output from the Azure Function to Azure Event Hub is added. You can have multiple output bindings, so this example will add the Event Hub output whilst keeping the Azure Table Service output as well.
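With both bindings in place, the function's function.json ends up with two entries in its bindings array. Something like the following sketch - the binding names, trigger, table, hub and connection names are placeholders, and property names vary between Functions runtime versions:

```json
{
  "bindings": [
    { "type": "timerTrigger", "name": "myTimer", "direction": "in", "schedule": "0 */5 * * * *" },
    { "type": "table", "name": "outputTable", "tableName": "BeerStyles", "connection": "AzureWebJobsStorage", "direction": "out" },
    { "type": "eventHub", "name": "outputEventHubMessage", "path": "myeventhub", "connection": "EventHubConnection", "direction": "out" }
  ]
}
```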
Azure Event Hub
Before we can add and configure an Azure Event Hub Output Binding on an Azure Function we need to create an Azure Event Hub Namespace.
From the Azure Portal, create a resource (Event Hub).
Once the Event Hub Namespace has been created, we need to create an Event Hub. Select the Event Hub Namespace => Event Hubs => + Event Hub and give it a name.
From the Azure Portal create a resource Power BI Embedded and create a Workspace if you don’t already have one.
Azure Stream Analytics
Now we need to create the Stream Analytics Job that will take the event data from the Event Hub and put it into a Power BI Dataset.
From the Azure Portal create a resource Stream Analytics Job. Give it a name and select create.
Once created select your Stream Analytics Job => Inputs => Add stream input => Event Hub. Provide a job Alias, select your Azure Subscription, the Event Hub Namespace and Event Hub created earlier and select Create.
Select Outputs from your Stream Analytics Job => + Add => Power BI => Authorize to authorise access to Power BI. Provide an output Alias, select your Group workspace and provide a Dataset name and Table name => Save.
Select Query and update the query to copy the input to the output. Select Save. If you weren’t filtering what you wanted into the Power BI Dataset on the Azure Function (or other input to the Event Hub) you could filter it with a query here.
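A pass-through query simply selects everything from the input alias into the output alias (substitute the aliases you created above):

```sql
SELECT
    *
INTO
    [powerbi-output]
FROM
    [eventhub-input]
```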
Select the Job Overview and select Start => Now.
Azure Function App Event Hub Output
With all the wiring in place to get our data out to Power BI we can now configure our Azure Function App to output to our Event Hub.
Select your Azure Function => Integrate => Outputs => New Output => Event Hub => New Connection => Event Hub and select your Event Hub Namespace, Event Hub => Select.
Update the Event Hub name to the name of your Event Hub and select Save.
Taking our Azure Function from the previous blog post, adding in the following line will also copy our output object to our Event Hub Output Binding.
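In the v1 PowerShell runtime, output bindings are written to a file whose path sits in a variable named after the binding. Assuming the Event Hub output binding was named outputEventHubMessage (use whatever name you configured), the added line looks like:

```powershell
# Write the same JSON payload to the Event Hub output binding
# (binding name 'outputEventHubMessage' is an assumption - use yours)
$outputObject | ConvertTo-Json | Out-File -Encoding UTF8 $outputEventHubMessage
```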
Recently I wrote this post on using PowerShell to bulk load data into Azure Table Service. Whilst this method works great, it does rely on the AzureRM PowerShell module to provide the ability to batch ingest data into Azure Table Service.
I’m working on a solution that requires levels of automation to obtain data from events from Microsoft Graph and ingest that data into Azure Table Service. That doesn’t work with the AzureRM PowerShell Module.
Azure Function Configuration
If you’re creating a new Azure Function App in 2019 and wanting to use PowerShell, after creating your Azure Function App you need to configure the Function app settings for Runtime version 1. This can only be configured prior to creating a Function.
Using Azure Storage Explorer select your Storage Account associated with the Azure Function Plan and under Tables create the table you will be putting data into.
After creating your Azure PowerShell Function select Integrate and under Outputs add Azure Table Storage. Provide the Azure Storage Account Table that you created above.
It then creates a PowerShell Object for each style and adds it to an Array of Beer Styles. The array is then converted to JSON and passed to the Azure Table Service Output Binding outputTable configured earlier.
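In the v1 PowerShell runtime that hand-off is a one-liner; the binding name outputTable matches the output binding configured above, while the array variable name here is illustrative:

```powershell
# Convert the array of Beer Style objects to JSON and write it to the
# file path the Azure Table Storage output binding exposes as $outputTable
$beerStyles | ConvertTo-Json | Out-File -Encoding UTF8 $outputTable
```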
As this is just an example there is no error handling etc, but a working example of obtaining data, transforming it and sending it to Azure Table Service.
Looking at the Azure Table Service Table with Azure Storage Explorer after executing the Azure Function all the Beer Styles have been added.
The Azure Table Service Output Binding for Azure Functions provides a quick and simple method to allow data to be ingested into Azure Table Service. An added benefit over my previous integration is that the data doesn’t need to be broken into batches of 100 records.
This post is more a note-to-self for future me in case I’m in this scenario again. Today I encountered the error Attempted to access an unloaded AppDomain.
I have a custom Forefront/Microsoft Identity Manager Management Agent that requires multiple credentials for the Web Service it is integrating with. In order to secure the parts of the credentials that cannot be provided via the Connectivity configuration tab in the Management Agent Properties, I have generated them and exported them using Export-Clixml as detailed in this post here.
Today I was migrating a Management Agent from one environment to another and was sure I’d regenerated the credentials correctly. However the Management Agent wasn’t working as expected. Looking into the Applications and Services Logs => Forefront Identity Manager Management Agent Log, I found the following error:
Unhandled exception, the CLR will not terminate: System.AppDomainUnloadedException: Attempted to access an unloaded AppDomain
Retracing my steps, I had logged on to the Synchronisation Server with the incorrect credentials to generate my protected credentials XML file.
When the Synchronisation Server ran the Management Agent and attempted Import-Clixml “credentialFilename”, the account that had exported the credentials did not match the service account the Synchronisation Server was running as, and the error listed above was thrown.
Import-Clixml and Export-Clixml do exactly what they are supposed to and respect the context under which the credentials were exported; the credentials will only be accessible when imported under that same context. The error doesn’t really tell you that, but it does hint at it with Attempted to access an unloaded AppDomain if you know what you are trying to do.
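A minimal illustration of the round trip (the file path is arbitrary). The key point is that the export must be performed while logged on as the same account the Synchronisation Server service runs as, since the credential is encrypted with DPAPI under the exporting user's context:

```powershell
# Export credentials - encrypted with DPAPI under the CURRENT user context
Get-Credential | Export-Clixml -Path C:\Scripts\webServiceCred.xml

# Import only succeeds under the same user context that exported them
$cred = Import-Clixml -Path C:\Scripts\webServiceCred.xml
```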
Over the last year I’ve been utilising Docker more and more. Today I needed to fire up an old virtual machine that I use a couple of times a year. The VM is in VirtualBox. It wouldn’t start. The error was;
Raw-mode is unavailable courtesy of Hyper-V. (VERR_SUPDRV_NO_RAW_MODE_HYPER_V_ROOT)
Docker Desktop for Windows uses Microsoft Hyper-V for virtualization, and Hyper-V is not compatible with Oracle VirtualBox. Therefore, you cannot run the two solutions simultaneously.
Now I wasn’t going to go to the trouble of migrating my VM from VirtualBox to Hyper-V for the one or two times a year I need to use it. And I wasn’t going to migrate it to Azure for the little use it gets either.
After some Googling I found the answer. Temporarily disable Hyper-V. In an elevated PowerShell session run;
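One common way to do this (a sketch; note both commands require a reboot to take effect) is to turn the hypervisor launch off at boot, then turn it back on when you need Docker/Hyper-V again:

```powershell
# Temporarily stop Hyper-V loading at boot so VirtualBox can use VT-x
bcdedit /set hypervisorlaunchtype off

# When finished with VirtualBox, re-enable Hyper-V (and reboot again)
bcdedit /set hypervisorlaunchtype auto
```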