Category: Azure

Azure Front Door – Azure Advent Calendar 2019

Overview

This year I have had the pleasure of taking part in the Azure Advent Calendar, a community-driven event that runs throughout December 2019. The calendar is brought to us by MVPs Gregor Suttie and Richard Hooper. A huge shout out to Anthony Mashford for letting me know about it too!

The idea is simple – each day throughout December, Blog Posts and Videos about different Azure topics are released by people who work with these technologies. For me, I wanted to choose something I had not had a huge amount of exposure to – so not only do I learn something in the process, but through my own discovery I can (hopefully!) help others learn along the way.

Azure Front Door

Azure Front Door is a service that offers features similar to those found in many types of Application Delivery Controller, delivered from Microsoft datacenters worldwide. However, as with everything in Microsoft Azure, it brings a wealth of additional features and benefits. The key features and benefits of the service are:

  • Application Performance Acceleration
  • Increased availability via Smart Health Probes (a conceptual sketch follows this list)
  • URL Based Routing
  • Multiple Site Hosting
  • Session Affinity
  • SSL Termination
  • Custom Domains and Certificate Management
  • Application Layer Security
  • URL Redirection
  • URL Rewrite
  • Native IPv6 and HTTP/2 support
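
To make the “Smart Health Probes” item a little more concrete, here is a minimal Python sketch of what a health probe does conceptually – polling each backend and marking it healthy or unhealthy. The hostnames and probe path are placeholders for illustration, not Front Door’s actual implementation:

```python
import requests

# Placeholder backend pool - swap in your own Web App hostnames.
BACKENDS = [
    "https://mydemo-app-1.azurewebsites.net",
    "https://mydemo-app-2.azurewebsites.net",
]

def probe(url, path="/", timeout=5):
    """Treat any response below HTTP 400 as a healthy backend."""
    try:
        return requests.get(url + path, timeout=timeout).status_code < 400
    except requests.RequestException:
        return False

for backend in BACKENDS:
    print(backend, "->", "healthy" if probe(backend) else "unhealthy")
```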

Who would benefit from the Service?

In my opinion, anyone currently utilizing Azure to host any web service would benefit from the Front Door Service – the wealth of features makes it an ideal companion, providing optimizations that fit around existing services and improve them vastly. Anyone wishing to accelerate the performance of a Web Application hosted elsewhere could also benefit – even if that application is not hosted in Azure.

Okay – I’m convinced, how do I try it?

The great news is that Azure Front Door can be tried and tested both rapidly and very cost-effectively. I’d recommend starting out as I have: with two identical Web Apps, practicing setting up Front Door and creating the various configuration items, as I have done in my demo. From there you can try out more complex settings and build up from this foundation. The basic setup and creating a rule are shown in my video below – this will give you a basic setup that you can then modify and tweak to learn more and move forward with.
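
Once the basic setup is in place, a quick way to see Front Door distributing traffic is to hit the frontend repeatedly and count which backend answers. This is a rough sketch that assumes each of the two demo Web Apps returns a page containing its own name (e.g. “webapp-1” / “webapp-2”); the .azurefd.net hostname is a placeholder:

```python
import collections
import requests

# Placeholder Front Door frontend host - replace with your own *.azurefd.net name.
FRONTDOOR_URL = "https://mydemo.azurefd.net/"

hits = collections.Counter()
for _ in range(10):
    body = requests.get(FRONTDOOR_URL, timeout=10).text
    # Assumes each demo Web App identifies itself in its homepage content.
    if "webapp-1" in body:
        hits["webapp-1"] += 1
    elif "webapp-2" in body:
        hits["webapp-2"] += 1
    else:
        hits["unknown"] += 1

print(dict(hits))
```

With Session Affinity enabled you would expect all ten requests to land on the same backend; with it disabled, requests can be spread across both.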

Azure Advent Calendar Video

Happy Christmas!

Thanks for reading my post and watching my video – please do feel free to reach out if you’d like any more information, I’m active on Twitter over at @jakewalsh90 🙂

Microsoft Flow, Microsoft Forms and Azure AD – what can we do?

Recently, I have been playing around more and more with Microsoft Flow – a tool designed to automate processes and tasks. In a previous post I used a tool called Stringify to automate a number of Smart Home actions; Microsoft Flow provides a similar environment and allows integration between processes and tasks.

From the first time I played with Microsoft Flow, I was impressed by how powerful the tool could be – and I immediately set to work creating a simple flow, which I am going to demonstrate here.

The example: onboarding a new Azure AD User by filling in a simple online form – the Flow will:

  1. Create the AD User Account based on the username we specify
  2. Set the user’s password based on a password we specify
  3. Add the user to either the “Staff” or “Student” Department
  4. Send an email notification with the account details ready for use

The key here is that this process can now be carried out – consistently and uniformly – by someone with no technical knowledge of Azure AD at all.
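
For context, the Azure AD “Create user” action in Flow is doing something roughly equivalent to the Microsoft Graph call below. This is a hedged sketch rather than the connector’s actual code – it assumes a Graph access token with User.ReadWrite.All, and all of the values are placeholders standing in for the Form responses:

```python
import requests

GRAPH_USERS_URL = "https://graph.microsoft.com/v1.0/users"
ACCESS_TOKEN = "<Graph token with User.ReadWrite.All>"  # placeholder

# Placeholder values - in the Flow these come from the Form submission.
new_user = {
    "accountEnabled": True,
    "displayName": "Jane Example",
    "mailNickname": "jane.example",
    "userPrincipalName": "jane.example@contoso.onmicrosoft.com",
    "department": "Staff",  # or "Student", per the form choice
    "passwordProfile": {
        "password": "<password from the form>",
        "forceChangePasswordNextSignIn": True,
    },
}

response = requests.post(
    GRAPH_USERS_URL,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=new_user,
)
response.raise_for_status()
print("Created user with id:", response.json()["id"])
```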

To start this process we need a Form. I’ve created a simple form in Microsoft Forms to capture all of the information above, so it can be integrated into Microsoft Flow. Creating forms is super simple in Microsoft Forms – I’ve created a basic new user data gathering form for our Flow below:


Now – when this form is filled in, we have the information captured that we will need for our Flow. The next step is to start building the Flow. To do this, log into Microsoft Flow and click on “Flows” and then “Create from Blank”:

Next up, we need to add a Trigger – this is an event that will cause our Flow to run. In our case – it will be when a new submission to our Form is received. Just search for Forms and then you’ll see the options required:

When we have selected “When a new response is submitted”, we then see the first step in our Flow has been added:

We need to tell Flow which Form to use now – because I am signed into my Microsoft Account, any forms I have in my account are shown in the drop down list:

Now we have our trigger, we need an action to follow – in this case, we need to get the Form response details. To do this, just click on “New Step” (shown above) and then search for Forms:

You’ll see now that we need to select our Form again, so that the correct Form is associated with this step in the Flow:

When we add this Form you’ll notice we also have to specify a “Response Id”. In our case, this needs to be Dynamic Content – so that each response is processed by our Flow. When we click into the Response Id area – a new Window will open where we can select Dynamic Content, and then click “See more” – we can then select “List of response notifications…”:

Upon selecting this – Flow will recognize that we want to carry out an action for each response we get – and an “Apply to each” section will be automatically created:

Now we can start creating the Azure AD elements of our Flow, to do this, click on “Add an action” above, and then search for “Azure AD” – we will start by creating the user:

Once this element has been added we can start adding Dynamic Content from our Form to the new user section of the Flow – you’ll notice that when you click into areas that support Dynamic Content, the Dynamic Content window will show as below:

Once completed, we have the following in place for our user creation step:

Next I am going to configure a simple email notification – to let me know what’s been created. We can do this with the “Add an action” option, and then search for “email”:

We can then use Dynamic Content, as we did before, to create an email based on the response to our Form:

Important: Obviously this method is bad security practice (username and password in the same email) – it is used here just to give an idea of the capabilities of the Flow. In production, something like the Office 365 “Send an Email” action is better, as it supports sending to different addresses – so the username and password can go to different lists or addresses; for example, the username to the new user, and the password to the manager (without the username).
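
As a rough illustration of that idea, here is a hedged sketch that sends the username and the password in two separate messages via the Microsoft Graph sendMail endpoint – it assumes a token with Mail.Send permission, and the addresses are placeholders:

```python
import requests

SENDMAIL_URL = "https://graph.microsoft.com/v1.0/me/sendMail"
ACCESS_TOKEN = "<Graph token with Mail.Send>"  # placeholder

def send_mail(to_address, subject, body_text):
    """Send a simple plain-text mail via Microsoft Graph."""
    payload = {
        "message": {
            "subject": subject,
            "body": {"contentType": "Text", "content": body_text},
            "toRecipients": [{"emailAddress": {"address": to_address}}],
        }
    }
    requests.post(
        SENDMAIL_URL,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json=payload,
    ).raise_for_status()

# Username to the new starter, password to their manager - never both in one mail.
send_mail("jane.example@contoso.com", "Your new account", "Username: jane.example")
send_mail("manager@contoso.com", "New starter credentials", "Initial password: <from the form>")
```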

Finally – we can test our Flow. To do this, it’s just a case of filling in the Form created earlier:

We can check that our Flow has run from the Flow web interface:

And also – drill down into exactly what was run by clicking on “Succeeded” (or “Failed”) in Run History – below you can see some of the variables my Form data contained:

The next step is to check that an Azure AD user has actually been created:

Bingo – everything looking good here… and below we have the user with the details from our Form correctly filled in:

Finally – we can check to confirm we have been emailed the confirmation message with the details of the user account:

As you can see – the Flow has worked as expected, and we now have an Azure AD Account and email notification to go with it. Whilst this is really just scratching the surface of what we can do here – it gives an idea of where we could take this type of automation. A few things I can immediately think of for this type of new user scenario:

  • Add to numerous AD Groups – based on checkboxes in a form
  • Create an O365 Mailbox – based on username/names
  • Provide a welcome email to the mailbox
  • Notify a Slack or Teams channel that a new user has been created, for example “Please welcome [Username] to the department!” (see the webhook sketch after this list)
  • Interact with one of the many 3rd party systems supported in Flow – for example adding the user to a CRM system or SaaS application
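
To give a feel for the Teams notification idea, here is a small sketch posting to a (placeholder) Teams Incoming Webhook – the URL is whatever Teams generates when you add an Incoming Webhook connector to the channel, and the username would come from the Form response in a real Flow:

```python
import requests

# Placeholder - paste the URL Teams generates for the channel's Incoming Webhook.
WEBHOOK_URL = "<incoming webhook URL from the Teams channel>"

username = "jane.example"  # would be Dynamic Content from the Form in Flow
payload = {"text": f"Please welcome {username} to the department!"}

requests.post(WEBHOOK_URL, json=payload, timeout=10).raise_for_status()
```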

Hopefully this has been interesting – and congrats for making it to the end of this post!

Azure Storage Sync – the easiest branch office file sync solution?

Azure Storage Sync provides the means to synchronise files from various locations into an Azure Storage account and out to endpoints running the Azure Storage Sync agent. In this post I will give a quick overview of how it can be set up to serve branch office requirements where no VPN connectivity exists. For further information see here: https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-planning. Here’s my environment, which I will be testing with:

Both of my “Branch offices” are actually VMs running on my home lab – but on isolated networks, so they can only communicate outbound to the internet with no LAN access.

Essentially – the goal for this article is to show how to configure Azure Storage Sync to replicate files between Branch Office 1 and Branch Office 2, using Azure Storage as the intermediary.

There are a number of key components to this deployment:

  • An Azure Storage Account – this is where our file share will be hosted
  • An Azure File Share – this is where our replicated data will reside
  • An Azure Storage Sync Group – this is where the synchronisation will be configured and managed
  • Two installations of the Azure Storage Sync Agent that will sit on our Branch Office servers

To start, I have created a folder on Branch Office 1 called “HeadOfficeDocs” – this is the folder that I want to replicate, and it contains a couple of folders and files of “business data” that we need replicated across to Branch Office 2:

Next – we need to set up an Azure Storage account, and an Azure File Share to host the data. First up, I will create a storage account:

Once this has been deployed we can create a File Share – this is where our replicated data will reside:

Next we create a file share and give it a quota:
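
If you prefer to script this step rather than use the portal, the same share can be created with the azure-storage-file-share Python SDK – a hedged sketch, assuming you have the storage account’s connection string to hand (the names and quota are placeholders):

```python
from azure.storage.fileshare import ShareClient

# Placeholder - copy this from the storage account's Access keys blade.
CONNECTION_STRING = "<storage account connection string>"

share = ShareClient.from_connection_string(CONNECTION_STRING, share_name="headofficedocs")
share.create_share(quota=100)  # quota is specified in GiB
print("Created file share:", share.share_name)
```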

Once this has been created, we can set up the synchronisation! To do this, we need to create a new Azure File Sync resource:

And fill in a few details – note that the location should be the same as where your Azure File Share is hosted:

Once this has been created, we can access the Storage Sync Resource and create a Sync Group:

A Sync Group defines the sync topology that will replicate our files – so if you require different sets of data to be replicated, you will need to use different Sync Groups. For example, we could sync completely different sets of data, on the same or different file servers, in completely different locations, using the same Storage Sync Resource but separate Sync Groups. This provides plenty of flexibility, plus the option to use the local file servers (in our Branch sites) as a cache for the Azure File Share via cloud tiering, reducing the amount of data held on each local server: https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-cloud-tiering

Creating the Sync Group is easy – there are just a few basic details to fill in: the Storage Account we have created, and the File Share that will host the data:

Once this is completed – we have the basics for our topology in place, and we need to register our first server into the topology. The registered servers pane will be blank at this point, as we have not yet installed the Azure File Sync Agent onto any Servers:

To start the process, log onto the first server and download the Azure File Sync Agent from this URL: https://www.microsoft.com/en-us/download/details.aspx?id=57159. Installation of the File Sync Agent is straightforward – just keep clicking Next (you can input proxy settings if you require). The only configuration is the account we wish to associate the server with, which is completed after setup:

Just click “Sign in” and then follow the instructions – you’ll need to sign into your Azure account, and will then be prompted to select the Azure Subscription, Resource Group, and Storage Sync Service we are registering this server to:

Once this is done, just click “Register”. (You may get prompted to sign in again here – I did.) Once this finishes, the server registration is complete:

Our newly registered server now shows up in the “Registered Servers” pane:

Next we need to configure the Sync Group – so that this server is added and starts to replicate data into our topology. To do this, browse to the Sync Group settings:

From here, we can add the server:

This is just a case of selecting the server from a drop down, and then entering the path where our data exists. Note – I am not using Cloud Tiering here as I want all data on all replication points within the topology:

Once this is done, our server is added to the Sync Group; initially it will show as provisioning, then pending, and then healthy:

If we now look in the Azure File Share – we can see that the data from the server has been replicated into the Cloud:

So – we now have a single server in a replication topology between itself and an Azure File Share. If I add a new folder to the Branch Server – we see this replicated onto the Azure File Share after a short time:
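
A quick way to confirm what has landed in the share without the portal is to list its top level with the same SDK – again a hedged sketch with a placeholder connection string and share name:

```python
from azure.storage.fileshare import ShareClient

CONNECTION_STRING = "<storage account connection string>"  # placeholder

share = ShareClient.from_connection_string(CONNECTION_STRING, share_name="headofficedocs")

# Print everything at the top level of the share - files and folders alike.
for item in share.list_directories_and_files():
    print(item["name"])
```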

Next, I will deploy the Agent to a new server (in another Branch) to test out the replication and initial sync. To do this, it’s just a case of installing the Agent exactly as we did before, and ensuring that we register this server to the same Storage Sync Service. Once this has been completed, we add the server to the Sync Group (exactly as before – but with a different path if required) and then we will start seeing data being synchronised:

Note that I have used a different path – you can change this on a per-server basis if you require, so there is no need to have all servers set up with an identical disk arrangement. If we look in the Sync Group pane after a short while (to allow the sync to take place), we can see both servers are set up:

We can also see that the data has been replicated to the 2nd Server I have added:

Bingo – we now have a working topology that will keep data in sync between our offices using Azure File Sync. No VPNs required, no complicated configuration – just two Agent installations and some basic Azure configuration. This provides a simple and effective way to keep branch site data in sync, covering a number of use cases where complicated setups would otherwise be required.

Until next time, Cheers!