Microsoft Flow, Microsoft Forms and Azure AD – what can we do?

Recently, I have been playing around more and more with Microsoft Flow – a tool designed to automate processes and tasks. In a previous post I used a tool called Stringify to automate a number of Smart Home actions; Microsoft Flow provides a similar environment, allowing integration between processes and tasks.

From the first time I played with Microsoft Flow, I was impressed by how powerful the tool could be – and I immediately set to work creating a simple flow, which I am going to demonstrate here.

The scenario: onboarding a new Azure AD user by filling in a simple online form. The Flow will:

  1. Create the AD User Account based on the username we specify
  2. Set the user’s password based on a password we specify
  3. Add the user to either the “Staff” or “Student” Department
  4. Send an email notification with the account details ready for use

The key here is that this process can now be carried out, with commonality and uniformity, by someone with no technical knowledge of Azure AD at all.
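
Behind the scenes, the Azure AD connector in Flow is essentially a wrapper around the Microsoft Graph API. To give a feel for what the user creation step is doing, here is a minimal Python sketch of the equivalent Graph call – note that the token acquisition, tenant domain, and form values are placeholders and assumptions, not anything Flow exposes directly:

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
token = "<access-token>"  # assumes an OAuth2 token with the User.ReadWrite.All permission

# Values that would come from the Form response (hypothetical examples)
new_user = {
    "accountEnabled": True,
    "displayName": "Jane Example",
    "mailNickname": "jane.example",
    "userPrincipalName": "jane.example@contoso.onmicrosoft.com",
    "department": "Staff",  # "Staff" or "Student", as chosen on the form
    "passwordProfile": {
        "password": "<password-from-form>",
        "forceChangePasswordNextSignIn": True,
    },
}

# POST /users creates the Azure AD account
resp = requests.post(
    f"{GRAPH}/users",
    headers={"Authorization": f"Bearer {token}"},
    json=new_user,
)
resp.raise_for_status()
print("Created user with object id:", resp.json()["id"])
```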

To start this process – we need a Form. I’ve created a simple form in Microsoft Forms to capture all of the information above, ready to integrate into Microsoft Flow. Creating forms is super simple in Microsoft Forms – I’ve created a basic new user data gathering form for our Flow below:


Now – when this form is filled in, we have the information captured that we will need for our Flow. The next step is to start building the Flow. To do this, log into Microsoft Flow and click on “Flows” and then “Create from Blank”:

Next up, we need to add a Trigger – this is an event that will cause our Flow to run. In our case – it will be when a new submission to our Form is received. Just search for Forms and then you’ll see the options required:

When we have selected “When a new response is submitted”, we then see the first step in our Flow has been added:

We need to tell Flow which Form to use now – because I am signed into my Microsoft Account, any forms I have in my account are shown in the drop down list:

Now we have our trigger, we need an action to follow – in this case, we need to get the Form response details. To do this, just click on “New Step” (shown above) and then search for Forms:

You’ll see now that we need to select our Form again, so that the correct Form is associated with this step in the Flow:

When we add this Form you’ll notice we also have to specify a “Response Id”. In our case, this needs to be Dynamic Content – so that each response is processed by our Flow. When we click into the Response Id area, a new window will open where we can select Dynamic Content – click “See more” and we can then select “List of response notifications…”:

Upon selecting this – Flow will recognize that we want to carry out an action for each response we get – and an “Apply to each” section will be automatically created:

Now we can start creating the Azure AD elements of our Flow. To do this, click on “Add an action” above, and then search for “Azure AD” – we will start by creating the user:

Once this element has been added we can start adding Dynamic Content from our Form to the new user section of the Flow – you’ll notice that when you click into areas that support Dynamic Content, the Dynamic Content window will show as below:

Once completed, we have the following in place for our user creation step:

Next I am going to configure a simple email notification – to let me know what’s been created. We can do this with the “Add an action” option, and then search for “email”:

We can then use Dynamic Content, as we did before, to create an email based on the response to our Form:

Important: sending the username and password in the same email is obviously bad security practice – it is done here just to give an idea of the capabilities of Flow. In production, something like the Office 365 “Send an email” action is a better choice, as it supports sending to different lists or addresses – for example, the username to the new user, and the password to the manager (without the username).
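
To illustrate the idea of splitting the details across separate messages, here is a hedged Python sketch using the Microsoft Graph sendMail action – the recipient addresses and credentials are hypothetical:

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
token = "<access-token>"  # assumes an OAuth2 token with the Mail.Send permission

def send_mail(to_address: str, subject: str, body: str) -> None:
    """Send a plain-text email as the signed-in user via Microsoft Graph."""
    payload = {
        "message": {
            "subject": subject,
            "body": {"contentType": "Text", "content": body},
            "toRecipients": [{"emailAddress": {"address": to_address}}],
        },
        "saveToSentItems": False,
    }
    resp = requests.post(
        f"{GRAPH}/me/sendMail",
        headers={"Authorization": f"Bearer {token}"},
        json=payload,
    )
    resp.raise_for_status()

# Username and password go to different recipients (hypothetical addresses)
send_mail("newstarter@contoso.com", "Your new account", "Username: jane.example")
send_mail("manager@contoso.com", "New starter password", "Password: <from-form>")
```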

Finally – we can test our Flow. To do this, it’s just a case of filling in the Form created earlier:

We can check that our Flow has run from the Flow web interface:

And also – drill down into exactly what was run by clicking on “Succeeded” (or “Failed”) in Run History – below you can see some of the variables my Form data contained:

The next step is to check that an Azure AD user has actually been created:

Bingo – everything looking good here… and below we have the user with the details from our Form correctly filled in:

Finally – we can check to confirm we have been emailed the confirmation message with the details of the user account:

As you can see – the Flow has worked as expected, and we now have an Azure AD Account and email notification to go with it. Whilst this is really just scratching the surface of what we can do here – it gives an idea of where we could take this type of automation. A few things I can immediately think of for this type of new user scenario:

  • Add to numerous AD Groups – based on checkboxes in a form
  • Create an O365 Mailbox – based on username/names
  • Provide a welcome email to the mailbox
  • Notify a Slack or Teams channel that a new user has been created, for example “Please welcome [Username] to the department!” – see the sketch after this list
  • Interact with one of the many 3rd party systems supported in Flow – for example adding the user to a CRM system, or SaaS application
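
As a flavour of the Teams idea above – a channel can be notified without a dedicated connector by using an incoming webhook. A minimal Python sketch, assuming an incoming webhook has already been created on the channel (the URL is a placeholder):

```python
import requests

# Assumption: an incoming webhook has been configured on the target Teams channel
WEBHOOK_URL = "https://outlook.office.com/webhook/<your-webhook-id>"

def announce_new_user(username: str) -> None:
    """Post a simple text message announcing the new user to the channel."""
    resp = requests.post(
        WEBHOOK_URL,
        json={"text": f"Please welcome {username} to the department!"},
    )
    resp.raise_for_status()

announce_new_user("jane.example")
```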

Hopefully this has been interesting – and congrats for making it to the end of this post!

Azure Storage Sync – the easiest branch office file sync solution?

Azure Storage Sync provides the means to synchronise files from various locations into an Azure Storage account, and out to endpoints running the Azure Storage Sync agent. In this post I will give a quick overview of how it can be set up to service branch office requirements where no VPN connectivity exists. For further information see here: https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-planning. Here’s the environment I will be testing with:

Both of my “Branch Offices” are actually VMs running in my home lab – but on isolated networks, so they can only communicate outbound to the internet, with no LAN access.

Essentially – the goal for this article is to show how to configure Azure Storage Sync to replicate files between Branch Office 1 and Branch Office 2, using Azure Storage as the intermediary.

There are a number of key components to this deployment:

  • An Azure Storage Account – this is where our file share will be hosted
  • An Azure File Share – this is where our replicated data will reside
  • An Azure Storage Sync Group – this is where the synchronisation will be configured and managed
  • Two installations of the Azure Storage Sync Agent that will sit on our Branch Office servers

To start, I have created a folder on Branch Office 1 called “HeadOfficeDocs” – this is the folder that I want to replicate, and it contains a couple of folders and files of “business data” that we need replicated across to Branch Office 2:

Next – we need to setup an Azure Storage account, and an Azure File Share to host the data. First up, I will create a storage account:

Once this has been deployed we can create a File Share – this is where our replicated data will reside:

Next we create a file share and give it a quota:
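
If you would rather script these steps than click through the portal, a minimal sketch using the azure-mgmt-storage Python SDK might look like the following – the resource group, account name, share name, and quota are all assumptions:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import FileShare, Sku, StorageAccountCreateParameters

SUBSCRIPTION_ID = "<subscription-id>"
RG, ACCOUNT, SHARE = "rg-filesync", "branchsyncdata01", "headofficedocs"  # hypothetical names

client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Create the storage account (a long-running operation, so we wait on the poller)
client.storage_accounts.begin_create(
    RG,
    ACCOUNT,
    StorageAccountCreateParameters(
        sku=Sku(name="Standard_LRS"),
        kind="StorageV2",
        location="uksouth",
    ),
).result()

# Create the file share with a 100 GiB quota
share = client.file_shares.create(RG, ACCOUNT, SHARE, FileShare(share_quota=100))
print(f"Share '{share.name}' created with a quota of {share.share_quota} GiB")
```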

Once this has been created, we can set up the synchronisation! To do this, we need to create a new Azure File Sync resource:

And fill in a few details – note that the location should be the same as where your Azure File Share is hosted:

Once this has been created, we can access the Storage Sync Resource and create a Sync Group:

A Sync Group defines the sync topology that will replicate our files – so if you require different sets of data to be replicated, you will need separate Sync Groups. For example, we could sync completely different sets of data, on the same or different file servers, in completely different locations, using the same Storage Sync Service but separate Sync Groups. This provides plenty of flexibility, including the option to use the local file servers (in our Branch sites) as a cache for the Azure File Share via cloud tiering, which reduces the amount of data held on each server: https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-cloud-tiering

Creating the Sync Group is easy – just a few basic details to fill in; the Storage Account we have created, and the File Share that will host the data:
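
The Sync Group can also be created against the ARM REST API if you want to automate the topology build – a rough sketch is below. The resource names are hypothetical, and the api-version should be checked against the current Microsoft.StorageSync documentation:

```python
import requests

SUB, RG = "<subscription-id>", "rg-filesync"                    # hypothetical names
SYNC_SERVICE, SYNC_GROUP = "branch-sync", "headofficedocs-sync"
API_VERSION = "2020-03-01"  # assumption: verify the current Microsoft.StorageSync api-version
token = "<arm-access-token>"

# PUT creates (or updates) the Sync Group under the Storage Sync Service
url = (
    f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
    f"/providers/Microsoft.StorageSync/storageSyncServices/{SYNC_SERVICE}"
    f"/syncGroups/{SYNC_GROUP}?api-version={API_VERSION}"
)
resp = requests.put(url, headers={"Authorization": f"Bearer {token}"}, json={"properties": {}})
resp.raise_for_status()
print("Sync Group created:", resp.json()["name"])
```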

Once this is completed – we have the basics for our topology in place, and we need to register our first server into the topology. The registered servers pane will be blank at this point, as we have not yet installed the Azure File Sync Agent onto any Servers:

To start the process – log onto the first server and download the Azure File Sync Agent from this URL: https://www.microsoft.com/en-us/download/details.aspx?id=57159. Installation of the File Sync Agent is straightforward – just keep clicking Next (you can input proxy settings if you require)… the only configuration is the account we wish to associate the server with, which is completed after setup:
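
If you are rolling the agent out to several branch servers, the MSI can also be installed silently. A short sketch, assuming the standard Windows Installer switches apply and the installer has already been downloaded (the path is a placeholder):

```python
import subprocess

# Assumption: StorageSyncAgent.msi has already been downloaded to this path
MSI_PATH = r"C:\Temp\StorageSyncAgent.msi"

# Standard msiexec quiet install; server registration still happens afterwards
subprocess.run(["msiexec", "/i", MSI_PATH, "/quiet", "/norestart"], check=True)
```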

Just click “Sign in” and then follow the instructions – you’ll need to sign into your Azure account, and will then be prompted to select the Azure Subscription, Resource Group, and Storage Sync Service we are registering this server to:

Once this is done – just click “Register”. (You may get prompted to sign in again here – I did.) The registration process is then complete:

Our newly registered server now shows up in the “Registered Servers” pane:

Next we need to configure the Sync Group – so that this server is added and starts to replicate data into our topology. To do this, browse to the Sync Group settings:

From here, we can add the server:

This is just a case of selecting the server from a drop down, and then entering the path where our data exists. Note – I am not using Cloud Tiering here as I want all data on all replication points within the topology:

Once this is done, our server is added to the Sync Group – initially it will show as provisioning, then pending, and finally as healthy:

If we now look in the Azure File Share – we can see that the data from the server has been replicated into the Cloud:

So – we now have a single server in a replication topology between itself and an Azure File Share. If I add a new folder to the Branch Server – we see this replicated onto the Azure File Share after a short time:

Next – I will deploy the Agent to a new server (in another Branch) to test out the replication and initial sync. To do this it’s just a case of installing the Agent exactly as we did before – ensuring that we register this server to the same Storage Sync Service. Once registration is complete, we add the server to the Sync Group (exactly as before – but with a different path if required) and we will start to see data being synchronised:

Note that I have used a different path – you can change this on a per-server basis if you require, so there is no need to have all servers set up with an identical disk arrangement. If we look in the Sync Group pane after a short while, once the sync has taken place, we can see both servers are set up:

We can also see that the data has been replicated to the 2nd Server I have added:

Bingo – we now have a working topology that will keep data in sync between our offices using Azure File Sync. No VPNs required, no complicated configuration – just two Agent installations and some basic Azure configuration. This provides a simple and effective way to keep branch site data in sync, and opens up a number of use cases that would otherwise require complicated setups.

Until next time, Cheers!

Azure Lab Services – creating an effective and reliable testing environment

Azure Lab Services (formerly DevTest Labs) is designed to allow for the rapid creation of Virtual Machines for testing environments. A variety of purposes and use cases can be serviced using DevTest Labs, for example, Development Teams, Classrooms, and various Testing environments.

The basic idea is that the owner of the Lab creates VMs or provides a means to create VMs, which are driven by settings and policy, all of which is configurable via the Azure Portal.

The key capabilities of Azure Lab Services are:

  • Fast & Flexible Lab Setup – Lab Services can be quickly setup, but also provides a high level of customization if required. The service also provides built in scaling and resiliency, which is automatically managed by the Labs Service.
  • Simplified Lab Experience for Users – Users can access the labs in methods that are suitable, for example with a registration code in a classroom lab. Within DevTest Labs an owner can assign permissions to create Virtual Machines, manage and reuse data disks and setup reusable secrets.
  • Cost Optimization and Analysis – A lab owner can define schedules to start up and shut down Virtual Machines, and also set time schedules for machine availability. The ability to set usage policies on a per-user or per-lab basis to optimize costs. Analysis allows usage and trends to be investigated.
  • Embedded Security – Labs can be setup with private VNETs and Subnets, and also shared Public IPs can be used. Lab users can access resources in existing VNETs using ExpressRoute or S2S VPNs so that private resources can be accessed if required. (Note – this is currently in DevTest Labs only).
  • Integration into your workflows and tools – Azure Lab Services provides integration into other tools and management systems. Environments can automatically be provisioned using continuous integration/deployment tools in Azure DevTest Labs.

You can read more about Lab Services here: https://docs.microsoft.com/en-us/azure/lab-services/lab-services-overview

I’m going to run through the setup of the DevTest Labs environment, cover a few key elements, and look at the use cases for each:

Creating the Environment:

This can be done from the Azure Portal – just search for “DevTest Labs” and then we can create the Lab Account. Note – I have left Auto-shutdown enabled (this is on by default at 1900 with no notification):

Once this has been deployed, we are able to view the DevTest Labs Resource overview:

From here we can start to build out the environment and create the various policies and settings that we require.
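
As an aside – the lab itself is just an ARM resource, so its creation can be scripted rather than clicked through. A rough sketch against the management REST API is below; the resource names and api-version are assumptions to verify:

```python
import requests

SUB, RG, LAB = "<subscription-id>", "rg-labs", "testing-lab"  # hypothetical names
API_VERSION = "2018-09-15"  # assumption: verify the current Microsoft.DevTestLab api-version
token = "<arm-access-token>"

# PUT creates (or updates) the DevTest Lab resource
url = (
    f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
    f"/providers/Microsoft.DevTestLab/labs/{LAB}?api-version={API_VERSION}"
)
resp = requests.put(
    url,
    headers={"Authorization": f"Bearer {token}"},
    json={"location": "uksouth", "properties": {}},
)
resp.raise_for_status()
print("Provisioning state:", resp.json()["properties"]["provisioningState"])
```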

Configuring the Environment

The first port of call is the “Configuration and Policies” pane at the bottom of the above screenshot:

I’m going to start with some basic configuration – specifically to limit the number of Virtual Machines that are allowed in the lab (total VMs per Lab) and also per user (VMs per user). At this point I will also be setting the allowed VM sizes. These are important configuration parameters, as with these settings in place we effectively limit our maximum compute cost:

[total number of VMs allowed in the lab] x [maximum/most expensive VM size permitted] = the maximum compute cost of the lab environment
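
As a quick worked example of that formula (the hourly rate below is purely hypothetical – check current pricing for your region and VM size):

```python
max_vms_per_lab = 5     # the per-lab limit configured below
hourly_rate = 0.05      # hypothetical price of the largest allowed size (Standard_B2s), per hour
hours_per_month = 730   # average hours in a month

max_monthly_compute = max_vms_per_lab * hourly_rate * hours_per_month
print(f"Worst-case compute cost: {max_monthly_compute:.2f} per month")  # 182.50
```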

This is done using the panes below:

First up, setting the allowed VM sizes. For this you need to enable the setting and then select any sizes you wish to be available in the Lab. I have limited mine to just Standard_B2s VMs:

Once we have set this up as required we just need to click “Save” and then we can move onto the “Virtual Machines per user” setting. I am going to limit my users to 2 Virtual Machines each at any time:

You’ll notice you can also limit the number of Virtual Machines using Premium OS disks if required. Once again – just click “Save” and then we can move onto the maximum number of Virtual Machines per Lab:

As you can see I have limited the number of VMs in the Lab to a maximum of 5. Once we have clicked “Save” we have now configured a few of the basic elements for our Lab.
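
For repeatable lab builds, these limits can also be set as policies under the lab’s default policy set via the DevTest Labs REST API. The sketch below follows the documented policy shape, but the fact names, threshold encodings, and api-version are assumptions to verify before use:

```python
import json
import requests

SUB, RG, LAB = "<subscription-id>", "rg-labs", "testing-lab"  # hypothetical names
API_VERSION = "2018-09-15"  # assumption: verify the current Microsoft.DevTestLab api-version
token = "<arm-access-token>"

def set_policy(name: str, fact_name: str, evaluator: str, threshold: str) -> None:
    """PUT a lab policy under the default policy set."""
    url = (
        f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
        f"/providers/Microsoft.DevTestLab/labs/{LAB}/policysets/default/policies/{name}"
        f"?api-version={API_VERSION}"
    )
    body = {"properties": {"factName": fact_name, "evaluatorType": evaluator,
                           "threshold": threshold, "status": "Enabled"}}
    requests.put(url, headers={"Authorization": f"Bearer {token}"}, json=body).raise_for_status()

set_policy("MaxVmsAllowedPerUser", "UserOwnedLabVmCount", "MaxValuePolicy", "2")
set_policy("MaxVmsAllowedPerLab", "LabVmCount", "MaxValuePolicy", "5")
set_policy("AllowedVmSizesInLab", "LabVmSize", "AllowedValuesPolicy", json.dumps(["Standard_B2s"]))
```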

Defining VM Images for our Lab

Next up – it’s time to configure some images that we want to use in our Lab. We have three options here – which provide a number of different configurations that suit different Lab requirements:

  • Marketplace images – these are images from the Azure Marketplace, much like we are used to selecting when creating Virtual Machines in the Portal
  • Custom images – these are custom images uploaded into the Lab, for example containing bespoke software or settings not available via the Marketplace or a Formula
  • Formulas – these allow for the creation of a customised deployment based on a number of values. These values can be used as-is or adjusted to change the machine deployed. Formulas provide scope for customisation within defined variables and can be based on both Marketplace and Custom images

For more information on choosing between custom images and formulas this is well worth a read: https://docs.microsoft.com/en-us/azure/lab-services/devtest-lab-comparing-vm-base-image-types

I’ve defined a single Marketplace Image from the Portal – and provided only Windows 10 Pro 1803:

Next, I am going to create a Formula based on this image, but also with a number of customisations. This is done by clicking on “Formulas” and then “Add”:

Next, we can configure the various settings of our Formula – but first we need to set up the Administrator username and password in the “My secrets” section of the lab. This data is stored in a Key Vault created as part of the Lab setup, so that it can be securely used in Formulas:

Next, I am going to create a Windows 10 Formula with a number of applications installed as part of the Formula, to simulate a client PC build – useful, for example, for testing applications against PCs deployed in a corporate environment. When we click “Formulas” and then “Add” we are presented with the Marketplace Images we defined as available in the earlier step:

Marketplace Image selection as part of the Formula creation:

Once the base has been selected we are presented with the Formula options:

There are a couple of things to note here:

  • The user name can be entered, but the password is what we previously defined in the “My secrets” section
  • The Virtual Machine size must adhere to the sizes defined as available for the lab

Further down the options pane we can define the artifacts and advanced settings:

Artifacts are configuration and software items that are applied to the machines when built – for example, applications, runtimes, Domain Join options etc. I’m going to select a few software installation options to simulate a client machine build:

There are a few very useful options within other artifacts, which I feel deserve a mention here:

  • Domain Join – this requires credentials and a VNET connected to a Domain Controller
  • Download a File from a URI – for example if we need to download some custom items from a specific location
  • Installation of Azure PowerShell Modules
  • Adding a specified Domain User to the Local Admins group – very useful if we need all testing to be done using Domain Accounts and don’t want to give out Local Administrator credentials
  • Create an AD Domain – if we need a Lab domain spun up on a Windows Server Machine. Useful if an AD Domain is required temporarily for testing
  • Create a shortcut to a URL on the Public Desktop – useful for testing a Web Application across client builds. For example, we could test a specified Website against a number of different client builds.
  • Setup basic Windows Firewall configuration – for example to enable RDP or to enable/disable the Firewall

It is also worth noting that we can define “Mandatory Artifacts” within the Configuration and Policies section – these are artifacts that are applied to all Windows or Linux VMs created within the Lab:

After artifact selection we can specify the advanced settings for the Lab:

It is worth noting here that we can specify an existing VNET if required – this is particularly useful if we need to integrate the Lab VMs into existing environments – for example an existing Active Directory Domain. Here we can also configure the IP address allocation, automatic delete settings, machine claim settings, and the number of instances to be created when the formula is run.

Once the Formula is created we can see the status:

Granting access to the Lab

We can now provide access to end users – this is done from the Lab Configuration and Policies pane of the Portal:

We can then add users from our Azure Active Directory to the Lab Environment:

Visit this URL for an overview of the DevTest Lab Permissions: https://docs.microsoft.com/en-us/azure/lab-services/devtest-lab-add-devtest-user
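
Granting access can be scripted too – lab users are given the “DevTest Labs User” role at the lab scope. A rough sketch via the ARM role assignment API is below; the role definition GUID and the user’s object ID are placeholders you would look up first:

```python
import uuid
import requests

SUB = "<subscription-id>"
LAB_SCOPE = (f"/subscriptions/{SUB}/resourceGroups/rg-labs"
             f"/providers/Microsoft.DevTestLab/labs/testing-lab")  # hypothetical lab
ROLE_DEF = (f"/subscriptions/{SUB}/providers/Microsoft.Authorization"
            f"/roleDefinitions/<devtest-labs-user-role-guid>")     # look up the GUID first
PRINCIPAL_ID = "<user-object-id>"  # the lab user's Azure AD object ID
token = "<arm-access-token>"

# Role assignments are created with a new GUID as the assignment name
url = (f"https://management.azure.com{LAB_SCOPE}/providers/Microsoft.Authorization"
       f"/roleAssignments/{uuid.uuid4()}?api-version=2015-07-01")
body = {"properties": {"roleDefinitionId": ROLE_DEF, "principalId": PRINCIPAL_ID}}
requests.put(url, headers={"Authorization": f"Bearer {token}"}, json=body).raise_for_status()
```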

Now we can start testing the Lab environment logged in as a Lab User.

Testing the Lab Environment

We can now start testing out the Lab Environment – to do this, head over to the Azure Portal and log in as a Lab User – in this case I am going to log in as “Labuser1”. Once logged into the Portal we can see the Lab is assigned to this user:

The first thing I am going to do is define a local username and password using the “My secrets” section – I won’t show this here, but you need to follow the same process as I did earlier in this post.

Once we have accessed the Lab, we can then create a Virtual Machine using the “Add” button:

This presents the Lab user with a selection of Base Images – both Marketplace (as we previously defined) and Formulas (that we have previously setup):

I’m going to make my life easy – I’m a lab user who just wants to get testing and doesn’t have time to install any software… so a Formula is the way to go! After clicking on the “Windows10-1803_ClientMachine” Formula I just need to fill out a few basic details and the VM is then ready to provision. Note that the 5 artifacts we set up as part of the Formula, and the VM size, are already configured:

Once we have clicked “Create” the VM is built, and we can see the status is set to “Creating”:

After some time the VM will show as Running:

Once the VM has been created we can connect via RDP and start testing. When creating this VM I left all of the advanced settings at their defaults – which means that, as part of the first VM deployment, a Public IP and Load Balancer (so that the IP can be shared across multiple Lab VMs) have been created. When we now look at the VM overview window, we can just click “Connect” as we normally would with an Azure VM:

Once we have authenticated, we can then use the VM as we would any other VM – note in the screenshot below, both Chrome and 7Zip (previously specified artifacts) are visible and have been installed (along with other items) for us before we access the VM:

When we have finished our testing or work on this VM – we have a number of options we can use:

  • Delete the VM – fairly self-explanatory, this one… the VM gets deleted
  • Unclaim the VM – the VM is placed into the pool of claimable VMs so that other Lab users can claim and use it. This is useful if you wish to simply have a pool of VMs that people use and then return to the pool – for example, a development team testing different OS versions or browsers
  • Stop the VM – this is the same as deallocating any Azure VM – we only pay for storage while it is stopped

Hopefully this has been a useful overview of the DevTest Labs offering within Azure… congratulations if you made it all the way to the end of the post! Any questions/comments feel free to reach out to me via my contact form or @jakewalsh90 🙂