Deploy Azure Bicep (ARM Templates) to Multiple Environments using Azure DevOps Pipelines

This post looks at deploying resources defined using Azure Bicep to multiple environments using Azure DevOps Pipelines.

Previously I’ve posted on developing Azure Bicep code using Visual Studio Code and on how to use an Azure DevOps Pipeline to deploy bicep code to Azure. In this post we’re going to go one step further and look at deploying resources defined using Azure Bicep to multiple environments. Our goal is to fully automate this process, so we’re going to leverage Azure DevOps Pipelines to define a build and release process.

Let’s summarise the goals:

  • Azure resources to be defined using Azure Bicep
  • Build and Release process to be defined in yaml in Azure DevOps Pipelines
  • Build process to be triggered whenever the bicep code changes
  • Build process should output the compiled ARM template as an artifact
  • Release process needs to support multiple environments
  • Each environment should use a different resource group
  • Release process should gate the release to particular environments based on a mix of branch and approval gates

You might think these goals are too hard to achieve using a single build and release process but the good news is that most of the heavy lifting is already done by Azure DevOps.

Here’s a quick summary of what we need to set up:

  1. Azure Bicep file – this will define the resource(s) that we’re going to deploy to each environment
  2. Azure DevOps Environments – this will define the different environments we’re going to deploy to. At this stage this is limited to defining the approvals and checks that will be done before code/resources are deployed to an environment
  3. Azure DevOps Variable Groups – this will define variables that are used across all build and deployment steps, as well as variables that are specific to individual environments
  4. Azure DevOps Pipeline – this will define the actual build and release process for the bicep code.

Defining Resources Using Azure Bicep

Let’s start by defining a very simple Azure Bicep file (services.bicep) that defines a storage account.

/* 
******************  
    Literals
****************** 
*/

// Resource type prefixes
var storageAccountPrefix = 'st'

/* 
******************  
    Parameters
****************** 
*/

// ** General
param applicationName string 
param location string 
param env string 

/* 
******************  
    Resources
****************** 
*/

var appNameEnvLocationSuffix  = '${applicationName}${env}'

// Storage Account

var storageAccountName  = '${storageAccountPrefix}${appNameEnvLocationSuffix}' 

resource attachmentStorage 'Microsoft.Storage/storageAccounts@2019-06-01' = {
    name: storageAccountName
    location: location
    sku: {
        name: 'Standard_LRS'
        tier: 'Standard'
    }
    kind: 'StorageV2'
    properties: {
        accessTier: 'Hot'
        minimumTlsVersion: 'TLS1_2'
        supportsHttpsTrafficOnly: true
        allowBlobPublicAccess: true
        networkAcls: {
            bypass: 'AzureServices'
            defaultAction: 'Allow'
            ipRules: []
        }
    }
}

A few things to note about this bicep file:

  • It defines a constant, storageAccountPrefix, that is used to build the full name of the generated resource. This prefix comes from the recommended resource-type abbreviations in the Microsoft documentation.
  • There are three parameters defined: applicationName, location and env. Location is used to define the region where the resource will be created (alternatively you could use resourceGroup().location to use the same location as the resource group that the resource is being created in). The applicationName and env parameters are combined with the storageAccountPrefix to define a unique name for the resource being created.
  • All three parameters are required, meaning that they will need to be supplied when deploying the generated ARM template. To simplify the process you may want to define default values for applicationName and location (see the sketch below). It’s important that you supply the env value during deployment so that the resource names are unique for each environment in your subscription.
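
To illustrate, here’s a minimal sketch of what that might look like – the default values shown are purely illustrative placeholders, and env is deliberately left without a default so it must be supplied for each deployment:

// Illustrative defaults only - pick values that match your own naming and region
param applicationName string = 'inspect'
param location string = 'westus'
param env string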

Setting Up Multiple Environments

Next we’ll define the different environments in Azure DevOps. We’ll keep things relatively simple and define three environments:

  1. Development – from the develop branch
  2. Testing – from the release branch
  3. Production – from the release branch but requiring approval

To create these environments, click on the Environments node under Pipelines in the navigation tree on the left side of the Azure DevOps portal for the project. Click the New environment button and enter a Name and Description for each environment.

Branch Control

For each environment we need to limit deployments so that only code from the appropriate branch can be deployed to that environment. To set up a branch control check for an environment you first need to open the environment by selecting it from the list of Environments. From the dropdown menu in the top right corner, select Approvals and checks. Next, click the Add check (+) button. Select Branch control and then click the Next button.

In the Branch control dialog we need to supply the name of the branch that deployments are allowed to come from. For example, for the Development environment we would restrict the Allowed branches to the develop branch (specified here as refs/heads/develop).

Note here that we’ve also included refs/tags/* in the list of Allowed branches. This is required so that we can use the bicep template from the Pipeline Templates repository. I’d love to know if there’s a way to restrict this check to only a specific repository, since adding refs/tags to the Allowed branches will mean that any tagged branch in my repository will also be approved. Let me know in the comments if you know of a workaround for this.

The Testing and Production environments are both going to be restricted to the Release branch. However, we’re also going to enforce a check on the branch to Verify branch protection.

What this means is that the Release branch will be checked to ensure branch policies are in place. For example the Release branch requires at least one reviewer and a linked work item.

Approvals

The only difference between the Testing and Production environments is that deployment to the Production environment requires a manual approval. This makes sense in most cases considering you may need to co-ordinate this with a marketing announcement or a notification to existing customers.

Setting up an approval gate starts similar to adding branch control. Open the environment and go to Approvals and checks. Click the Add check button and select Approvals. In the Approvals dialog, enter the list of users that can approve the deployment, adjust any of the other properties and then click Create.

We typically set a very short approval timeout period to avoid the scenario where multiple deployments get queued up behind each other. If we create another deployment before the first has been approved, we prefer to only have the latter deployment pushed to the Production environment. Of course, this is something your team should discuss so you can agree on the strategy you want to employ.

Environment Variable Groups

Since we’re going to be deploying the same set of resources to each of the environments, we’re going to need a way to specify build and release variables, some that are common across all stages of the pipeline, and some that are specific to each environment. To make it easy to manage the variables used in the pipeline we’re going to use variable groups which can be defined within the Library tab within Azure DevOps Pipelines.

Common Build Variables

We’ll create a variable group called Common Build Variables and we’ll add two variables, ResourceGroupLocation and AzureSubscriptionConnectionName.

As you can probably deduce, ResourceGroupLocation is the region where all the resources will be created. For simplicity this is assumed to be the same across all environments. We’ll come back to AzureSubscriptionConnectionName but, needless to say, it’s the same connection for all stages in the pipeline.
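
If you’d rather script the variable group than create it through the portal, here’s a rough sketch using the Azure DevOps CLI extension (this assumes the azure-devops extension is installed and your organisation and project defaults are configured; the values are placeholders):

# create the shared variable group used by all stages
az pipelines variable-group create --name "Common Build Variables" --variables ResourceGroupLocation=westus AzureSubscriptionConnectionName=Azure-Subscription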

Environment Specific Variables

For each environment we’re going to define a variable group named Common.[environment]. These variable groups will contain variables that are specific to each environment. In this case, we’re going to define the EnvironmentName, the EnvironmentCode and the ResourceGroupName.

We’ll show these variables in action shortly but it’s important to remember that any variable that needs to vary, based on which environment it’s being deployed to, should be defined in the appropriate variable group.
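
As a concrete example, the variable group for the Development environment could be created like this (again just a sketch using the Azure DevOps CLI extension – the values are illustrative and should follow your own naming conventions):

# environment specific variables for the Development environment
az pipelines variable-group create --name "Common.Development" --variables EnvironmentName=Development EnvironmentCode=dev ResourceGroupName=rg-inspect-dev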

Build and Release Process

The build and release process is going to be defined as a yaml pipeline in Azure DevOps.

Service Connections

Before we can jump in and write some yaml, we need to set up a couple of service connections.

  • Pipeline-Templates – this is a GitHub service connection so that the build process can download the appropriate pipeline template to assist with the compilation of the bicep file.
  • Azure-Subscription – this is a link to the Azure subscription where the resource groups, and subsequently the resources, will be created.

I’m not going to step through the process of creating these connections, since you can simply follow the prompts in the Azure DevOps portal. However, it’s important to take note of the names of the service connections.

Build and Deploy Process

Here’s the full build and release pipeline, which we’ll step through in more detail below.

trigger:
  branches:
    include:
    - '*'  # must quote since "*" is a YAML reserved character; we want a string
  paths:
    include:
    - azure/services.bicep
    - pipelines/azure-services.yml
  
resources:
  repositories:
    - repository: pipelinetemplates
      type: github
      name: builttoroam/pipeline_templates
      ref: refs/tags/v0.7.0
      endpoint: Pipeline-Templates
  
pool:
  vmImage: 'windows-latest'
  
variables:
  - group: 'Common Build Variables'
  - name: application_name
    value: inspect
  - name: bicep_filepath
    value: 'azure/services.bicep'
  - name: arm_template_filepath
    value: '$(Pipeline.Workspace)/BicepArtifacts/services.json'
  
stages:
- stage: Compile_Bicep
  pool:
    vmImage: 'windows-latest'

  jobs:
  - job: Bicep
    steps:
      - template: azure/steps/bicep/bicep.yml@pipelinetemplates
        parameters:
          name: Bicep
          bicep_file_path: '$(System.DefaultWorkingDirectory)/$(bicep_filepath)'
          arm_path_variable: ArmFilePath

      - task: CopyFiles@2
        displayName: 'Copying bicep file to artifacts folder'
        inputs:
          contents: '$(Bicep.ArmFilePath)'
          targetFolder: '$(build.artifactStagingDirectory)'
          flattenFolders: true
          overWrite: true

      - task: PublishBuildArtifacts@1
        displayName: 'Publish artifacts'
        inputs:
          pathtoPublish: '$(build.artifactStagingDirectory)' 
          artifactName: 'BicepArtifacts' 
          publishLocation: Container


- template:  templates/deploy-arm.yml
  parameters:
    stage_name: 'Deploy_Development'
    depends_on: 'Compile_Bicep'
    deploy_environment: 'Development'

- template:  templates/deploy-arm.yml
  parameters:
    stage_name: 'Deploy_Testing'
    depends_on: 'Deploy_Development'
    deploy_environment: 'Testing'
  
- template:  templates/deploy-arm.yml
  parameters:
    stage_name: 'Deploy_Production'
    depends_on: 'Deploy_Testing'
    deploy_environment: 'Production'

Trigger – we’ve set up this process to kick off whenever the bicep file, or the pipeline definition itself, changes on any branch.

Resources – this process leverages the templates from Pipeline Templates, which requires the definition of a resource pointing to the appropriate tagged release in the pipeline templates GitHub repository.

Stages – there’s one build stage, followed by three release (aka deploy) stages. The steps in the build stage leverage the bicep template from Pipeline Templates in order to generate the ARM template. The ARM template is then deployed to the appropriate resource group in each of the environments.

The three deploy stages all use the same template, which we’ll see in a minute, coupled with an environment-specific parameter value. For example, the first stage passes in ‘Development’ as the deploy_environment parameter.

Deploy Stage Template

As I mentioned, the three deployment stages are identical, except for some parameter values that are defined for each environment. The template referenced by each stage creates the resource group and then deploys the ARM template to it.

parameters:
- name: stage_name
  type: string
  default: 'Deploy_ARM_Resources'

- name: depends_on
  type: string
  default: ''

  # deploy_environment - Environment code
- name: deploy_environment
  type: string

stages:
- stage: ${{ parameters.stage_name }}
  dependsOn: ${{ parameters.depends_on }}
  variables:
  - group: 'Common.${{ parameters.deploy_environment }}'
  
  pool:
    vmImage: 'windows-latest'

  jobs:
  - deployment: 'Deploy${{ parameters.stage_name }}'
    displayName: 'Deploy ARM Resources to ${{ parameters.deploy_environment }}' 
    environment: ${{ parameters.deploy_environment }}
    strategy:
      runOnce:
        deploy:
          steps:
          - task: PowerShell@2
            name: ${{ parameters.stage_name }}
            inputs:
              targetType: 'inline'
              workingDirectory: $(Pipeline.Workspace)
              script: |
                  $envParam = '${{ parameters.deploy_environment }}'
                  Write-Host "Deployment deploy environment parameter: $envParam"

                  $envName = '$(EnvironmentName)'
                  Write-Host "Deployment environment name variable: $envName"

          - task: AzureCLI@2
            displayName: 'Create resource group - $(ResourceGroupName)'
            inputs:
              azureSubscription: $(AzureSubscriptionConnectionName)
              scriptType: ps
              scriptLocation: inlineScript
              inlineScript: |
                Write-Host "Creating RG: $(ResourceGroupName)"
                az group create -n $(ResourceGroupName) -l $(ResourceGroupLocation)
                Write-Host "Created RG: $(ResourceGroupName)"

          - task: AzureResourceGroupDeployment@2
            displayName: 'Deploying ARM template to $(ResourceGroupName)'
            inputs:
              azureSubscription: $(AzureSubscriptionConnectionName)
              action: 'Create Or Update Resource Group' 
              resourceGroupName: $(ResourceGroupName)
              location: $(ResourceGroupLocation) 
              templateLocation: 'Linked artifact'
              csmFile: '$(arm_template_filepath)' # Required when  TemplateLocation == Linked Artifact        
              overrideParameters: '-location $(ResourceGroupLocation) -env $(EnvironmentCode) -applicationName $(application_name)'

Throughout this template, various variables are referenced. The important thing to note is that each variable either needs to exist for every environment, or needs to be environment independent.

The Common Build Variables group was imported in the build and release process yaml file. The ResourceGroupLocation is the only variable from this group that’s used within this template.

The variable group for each environment is imported within the deploy stage template. The environment name, which is passed in as the deploy_environment parameter, is concatenated with “Common.” in order to import the correct variable group. The imported variables can be referenced the same way as other locally defined variables.

The deploy stage template has three steps: the first simply outputs variables so that it’s clear which environment is being deployed to; then there’s a task for creating the resource group; and lastly a task for deploying the ARM template.

Running the Pipeline End to End

In this post we’ve defined three different environments and configured Azure DevOps to have a different variable group for each of the environments. Here you can see an execution of the pipeline with the build stage, followed by the three deployment stages.

In this case, note that the Testing deployment failed because the resources were being deployed from the wrong branch (develop instead of release). Unfortunately, because this one stage failed, the entire pipeline was marked as failed.

Building a Bicep from an ARM Template with Parameters

As I start to work with Azure Bicep I figured I’d share some of my experiences here. First up is just a quick recap of how to generate bicep code from an ARM template.

Actually, the problem I initially started with was how to start writing bicep code, since I wasn’t really familiar with either the bicep syntax, or how to define each of the resources I wanted to create. I had skimmed through the docs that the team have put together, which helped me understand the basic syntax. Then I took a look through the various examples that the team had posted. In fact, they even have an interactive playground where you can enter bicep code and have it generate the corresponding ARM template. In the top right corner, they have a list of sample templates, along with the corresponding bicep code.

Now that I’d brushed up on the syntax and had trawled through a dozen or so of the examples, I figured I was ready to write my first bicep file. Hold up… before I got into writing any bicep code, I actually decided to make sure I could build and deploy my bicep code locally (if you want to build and run your bicep code in an Azure DevOps pipeline, check out this post).

Build and Deploy Bicep Code

Here’s a quick summary of working with Visual Studio Code to write, compile and deploy your Bicep code.

  • Install the latest Bicep version (v0.1.37-alpha at time of writing) – I’d recommend using the installer as this will correctly set up path variables etc
  • Install the Visual Studio Code extension – this isn’t available via the extensions marketplace, so you’ll need to download it from the releases page on GitHub. Make sure you follow the instructions to install the extension by selecting the downloaded VSIX from within Visual Studio Code (rather than double-clicking the downloaded file)
  • Launch Visual Studio Code and create your first, empty, bicep file, eg myfirstbicep.bicep. Note that Visual Studio Code will recognise the file type and show “Bicep” as the language in the bottom right corner of the window.
  • Make sure you have the Azure CLI Tools extension installed for Visual Studio Code
  • Create an azcli file, eg myfirstbicep.azcli, that will be used to run Azure CLI commands. Alternatively you can just enter the commands into a PowerShell terminal (assuming you have the Azure CLI installed).
  • Add the following code to the azcli file – comments should explain what each line does.
# generates myfirstbicep.json (ie compiles the bicep file to ARM template)
bicep build myfirstbicep.bicep 

# create a resource group to deploy the bicep code to
# this is ignored if resource group already exists
az group create -n rgbicepexample -l westus 

# deploy the ARM template that was generated from bicep file
az deployment group create -f myfirstbicep.json -g rgbicepexample

# delete the resource group
# this will prompt to confirm deletion
az group delete -n rgbicepexample
  • Execute each line in the azcli file by right-clicking and selecting Run Line in Terminal

If you run each of the lines you should see the output of each command displayed in the terminal.

In this scenario I ran the final command to clean up the resource group. If you’re in the process of writing your bicep code, chances are that you’ll simply create the resource group once and then keep compiling the bicep code to the ARM template (first command in the azcli file) and deploying the updated ARM template.

First Bicep Resource

Now that we have the tooling setup so that we can compile and deploy our bicep file, it’s time to dig in and start to create resources. Despite having some examples to follow, I was still a bit bewildered by not knowing what options I needed to include for any given resource. I decided to take a rather pragmatic approach by using the Azure portal to generate resources, export the ARM template and then convert that to bicep code. The last step, as you’ll see, is very manual and repetitive – hopefully this will get easier when this GitHub issue is addressed, providing us with a tool to at least automatically generate bicep code from an ARM template.

Let’s step through this process and I’ll point out a couple of things along the way. I’ll create a Storage Account that can be deployed to a resource group. This should be enough to give you a flavour for the approach.

  • Start by invoking the “az group create” command in Visual Studio Code to make sure you have a resource group to work with. Alternatively you can work with any existing resource group if you’d prefer (I tend to avoid doing this to ensure I don’t accidentally overwrite, or delete, other resources I might have).
  • Head to the Azure Portal and open the resource group. Click the Add button to launch the New resource wizard.
  • Search for Storage Account and provide the necessary details to create a Storage Account.
  • I’m not going to go through the various settings here but once you get to the Review + Create tab, you’ll see that there is a link at the bottom to Download a template for automation.
  • When you click through to the template, you’ll see that it’s nicely presented with a tree view to allow for easy navigation around the ARM template.
  • Rather than simply copying the entire ARM template into our bicep file (and then having to change the syntax from ARM to Bicep), we’re going to do this in steps. We’re going to start with the parameters but instead of using the Parameters in the ARM template, we’re going to grab the json from the Parameters tab.
{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "location": {
            "value": "westus2"
        },
        "storageAccountName": {
            "value": "stbicepexample"
        },
        "accountType": {
            "value": "Standard_RAGRS"
        },
        "kind": {
            "value": "StorageV2"
        },
        "accessTier": {
            "value": "Hot"
        },
        "minimumTlsVersion": {
            "value": "TLS1_2"
        },
        "supportsHttpsTrafficOnly": {
            "value": true
        },
        "allowBlobPublicAccess": {
            "value": true
        },
        "networkAclsBypass": {
            "value": "AzureServices"
        },
        "networkAclsDefaultAction": {
            "value": "Allow"
        }
    }
}
  • Copy the json for the parameters into the bicep file and then start trimming the bits we don’t need – remember the bicep syntax is more succinct, so in a lot of cases the change to syntax is simply removing braces and the verbose ARM template code.

From the parameters json, the main things we need are the parameters and their default values; everything else can go. You’ll need to insert the keyword “param” and the type for each parameter – most are obvious from the default value but if in any doubt, you can go back to the ARM template and look in the parameters list. You’ll also need to change all strings from double quotes (") to single quotes ('). What we’re left with is a very compact list of parameters, with their default values.

param location string = 'westus2'
param storageAccountName string = 'stbicepexample'
param accountType string = 'Standard_RAGRS'
param kind string = 'StorageV2'
param accessTier string = 'Hot'
param minimumTlsVersion string = 'TLS1_2'
param supportsHttpsTrafficOnly bool = true
param allowBlobPublicAccess bool = true
param networkAclsBypass string = 'AzureServices'
param networkAclsDefaultAction string = 'Allow'

By providing default values for each of the parameters, we’re making them optional – if a parameter value is provided as part of running the deployment, it will be used, otherwise the default value will be used. If you want to require a parameter to be provided as part of the deployment, simply remove the default value (eg change param location string = 'westus2' to just param location string).

  • Next up is to copy across the json from the ARM template for the resource itself. I simply grab the resources block of json
    "resources": [
        {
            "name": "[parameters('storageAccountName')]",
            "type": "Microsoft.Storage/storageAccounts",
            "apiVersion": "2019-06-01",
            "location": "[parameters('location')]",
            "properties": {
                "accessTier": "[parameters('accessTier')]",
                "minimumTlsVersion": "[parameters('minimumTlsVersion')]",
                "supportsHttpsTrafficOnly": "[parameters('supportsHttpsTrafficOnly')]",
                "allowBlobPublicAccess": "[parameters('allowBlobPublicAccess')]",
                "networkAcls": {
                    "bypass": "[parameters('networkAclsBypass')]",
                    "defaultAction": "[parameters('networkAclsDefaultAction')]",
                    "ipRules": []
                }
            },
            "dependsOn": [],
            "sku": {
                "name": "[parameters('accountType')]"
            },
            "kind": "[parameters('kind')]",
            "tags": {}
        }
    ],
  • Converting this code is a little trickier but essentially, you need to start with declaring the resource which will look like the following, where the [type] and [apiVersion] need to be replaced by the values from the ARM template.
resource myStorage '[type]@[apiVersion]' = {
    properties: {
    }
}

Essentially the resource declaration is just a key-value pair object graph. To convert the ARM template to the corresponding object graph you mainly just need to remove the quotation marks and trailing commas. You’ll also need to replace the parameter references with a simple reference to one of the param variables we declared earlier (eg parameters('kind') becomes just kind). The converted resource (including the param lines we converted earlier) then looks like this:

param location string = 'westus2'
param storageAccountName string = 'stbicepexample'
param accountType string = 'Standard_RAGRS'
param kind string = 'StorageV2'
param accessTier string = 'Hot'
param minimumTlsVersion string = 'TLS1_2'
param supportsHttpsTrafficOnly bool = true
param allowBlobPublicAccess bool = true
param networkAclsBypass string = 'AzureServices'
param networkAclsDefaultAction string = 'Allow'


resource myStorage 'Microsoft.Storage/storageAccounts@2019-06-01' = {
    name: storageAccountName
    location: location
    sku: {
        name: accountType
    }
    kind: kind
    properties: {
        accessTier: accessTier
        minimumTlsVersion: minimumTlsVersion
        supportsHttpsTrafficOnly: supportsHttpsTrafficOnly
        allowBlobPublicAccess: allowBlobPublicAccess
        networkAcls: {
            bypass: networkAclsBypass
            defaultAction: networkAclsDefaultAction
        }
    }
}

This bicep code is good to go – you can compile and deploy this to your resource group. Don’t forget, once your code is finished and you’ve tested it locally, make sure you commit it to Azure DevOps and deploy it to Azure as part of a CI/CD process.

Important Note: A lot of Azure resources are pay per use, so you won’t rack up the dollars just by creating resources. However, there are some resources that will start to cost you as soon as they’re created. I would highly recommend deleting your development resource group whenever you’re done for the day; that way you can be sure you’re not going to continue to be charged.

Thinking Out Loud: Events, Messaging and Mvvm Navigation with XAML Frameworks

This post explores mvvm navigation further, employing the latest c# 9 code generator to reduce the boilerplate code that developers have to write.

In my previous post on this topic, Thinking Out Loud: Mvvm Navigation for XAML Frameworks such as Xamarin.Forms, UWP/WinUI, WPF and Uno, I explored using events emitted by a ViewModel to drive page navigation. This post will explore this concept further, employing the latest c# 9 code generator to reduce the boilerplate code that developers have to write.

Before we go on, let’s just recap where we got to previously:

  • ViewModels are independent, not knowing what’s before or after them in the navigation flow of the application
  • Use events to signify when a ViewModel is complete
  • ViewModel events are converted to navigation methods at an application level

The upshot is that you can have a simple ViewModel that simply raises an event to indicate that it’s complete (for example when the user clicks a submit button on a form).

public class MainViewModel
{
    public event EventHandler ViewModelDone;
    public async Task<int> DoSomething()
    {
        var rnd = new Random().Next(1000);
        await Task.Delay(rnd);
        if (rnd % 2 == 0)
        {
            ViewModelDone?.Invoke(this, EventArgs.Empty);
        }
        ... 
    }
}

There were a couple of pieces of feedback following my previous post:

  • Use an Observable instead of an event
  • Bypass the ViewModel event completely and simply raise a message that could be handled by the application. For example a developer could attach a Behavior to a Button that would send a message to an application wide dispatcher that could determine where to navigate to based on the message type, or perhaps the parameter set.

Whilst I like the first idea of a ViewModel exposing an Observable, I think that this is an idea we’ll explore sometime in the future. Using an event is incredibly simple and gives us the clear separation we’re after. The only downside is that the add/remove handler code required for events is somewhat nasty.

The idea of using messages, and having a central dispatcher for messages, is a great idea and one that I wanted to explore. I didn’t want to change the ViewModel to have to emit a message, since again this just adds additional complexity to the ViewModel. This means that there needs to be some sort of conversion between events and messages. As you can imagine, this is simply adding more code that developers need to write in order to get everything to work.

I’ve just pointed out two areas where developers will have to write unnecessary code: add/remove event handlers and converting between events and messages. I’ll be using the c# 9 code generators to help eliminate this excess code.
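
To make that concrete, here’s roughly the kind of wiring that has to be written by hand for every ViewModel if there’s no code generation. This is only an illustrative sketch – the IMessageDispatcher interface is a hypothetical stand-in for an application wide dispatcher, not an actual framework type:

using System;

// hypothetical application-wide dispatcher - a stand-in for illustration only
public interface IMessageDispatcher
{
    void Send(object message);
}

public class MainViewModelConnector
{
    private readonly MainViewModel viewModel;
    private readonly IMessageDispatcher dispatcher;

    public MainViewModelConnector(MainViewModel viewModel, IMessageDispatcher dispatcher)
    {
        this.viewModel = viewModel;
        this.dispatcher = dispatcher;
    }

    // subscribe to the ViewModel event and convert it into a message
    public void Attach() => viewModel.ViewModelDone += OnViewModelDone;

    // remember to unsubscribe, otherwise the subscription keeps the handler alive
    public void Detach() => viewModel.ViewModelDone -= OnViewModelDone;

    private void OnViewModelDone(object sender, EventArgs e) =>
        dispatcher.Send(new CompletedMessage(sender));
}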

TL;DR

In this post I’m not going to walk through the complexities of events, messaging and code generation because that would make for a long post. Instead, let me walk through a scenario where we’re going to add a new page, FifthPage, to our existing application (which, as you can probably guess, already has four pages). Here’s what we need:

  • Add FifthPage
  • Add a Button to MainPage that navigates to FifthPage when clicked
  • Add Button to FifthPage that navigates back to MainPage when clicked
  • Add FifthViewModel that will be the DataContext for FifthPage
  • Add string property, Title, to FifthViewModel that returns a page title
  • Add TextBlock to FifthPage that is bound to the Title property on the FifthViewModel.

Create Page and ViewModel

The first step is to create the FifthPage and FifthViewModel. I’ve separated out the application code into different projects, so I have my view models in a project called MvvmNavigation.Core. My pages are still in the MvvmNavigation.Shared project that was created by the Uno solution template.

Whilst we’re creating these classes, we’ll add some of the basics that we’re going to need. On the FifthPage, we’ll add a TextBlock, for the Title, and a Button, to trigger navigation back to the MainPage.

<Page
    x:Class="MvvmNavigation.FifthPage"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
    xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
    mc:Ignorable="d"
    Background="{ThemeResource ApplicationPageBackgroundThemeBrush}">

    <StackPanel VerticalAlignment="Center"
                HorizontalAlignment="Center">
        <TextBlock Text="page title" />
        <Button Content="Go Back" />
    </StackPanel>
</Page>

In the FifthViewModel we’ll add a Title property and a simple event, FifthDone, that will indicate to the application that the FifthViewModel is done. The application will be responsible for determining how to handle this; which according to our specification, should navigate back to MainPage. We’ll also create a RaiseFifthDone method that can be called to invoke the FifthDone event.

public class FifthViewModel
{
    public event EventHandler FifthDone;

    public string Title => "Page 5";

    public void RaiseFifthDone()
    {
        FifthDone?.Invoke(this, EventArgs.Empty);
    }
}

Navigation to FifthPage

To navigate to the FifthPage we need a new Button on the MainPage that, when clicked, will raise a message, PleadTheFifthMessage, that the application will handle in order to navigate to the FifthPage. Let’s unpack this into the steps:

Add Button

Add a Button to the MainPage XAML, along with the NavigationMessageAction behavior which will raise the PleadTheFifthMessage

<Button Content="Go To Page 5">
    <Interactivity:Interaction.Behaviors>
        <Interactions:EventTriggerBehavior EventName="Click">
            <builditbehaviors:NavigationMessageAction MessageType="localmessages:PleadTheFifthMessage" />
        </Interactions:EventTriggerBehavior>
    </Interactivity:Interaction.Behaviors>
</Button>

PleadTheFifthMessage

Add a class, PleadTheFifthMessage, that inherits from CompletedMessage.

public class PleadTheFifthMessage : CompletedMessage
{
    public PleadTheFifthMessage() : base() { }
    public PleadTheFifthMessage(object sender) : base(sender) { }
}

Map PleadTheFifthMessage to FifthViewModel

As part of the MvvmApplicationService, we need to register a navigation to the FifthViewModel for the PleadTheFifthMessage. In this case, we’re only handling the PleadTheFifthMessage for when it’s raised by the MainViewModel.

serviceRegistrations.AddSingleton<INavigationMessageRoutes>(sp =>
{
    var routes = new NavigationMessageRoutes()
        .RegisterNavigate<MainViewModel, PleadTheFifthMessage, FifthViewModel>()
        .RegisterNavigate<MainViewModel, CompletedMessage, SecondViewModel>()
	// ... omitted for brevity
        .RegisterGoBack<CloseMessage>();

    return routes;
});

FifthPage – FifthViewModel Mapping

Whilst we’ve defined a mapping from the PleadTheFifthMessage to the FifthViewModel, there needs to be a way for the application to connect the FifthViewModel to the FifthPage. Rather than rely on naming convention, we’re going to apply an Attribute to the FifthPage.

[ViewModel(typeof(FifthViewModel))]
public sealed partial class FifthPage 
{
    public FifthPage()
    {
        this.InitializeComponent();
    }
}

ViewModel Binding

So far we’ve wired up the navigation from MainPage to FifthPage. However, when arriving at FifthPage it’s clear that neither the Title, nor the Button event handler, has been wired up. The title could be easily wired up by simply creating an instance of the FifthViewModel in XAML and setting it as the DataContext.
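
For reference, that naive approach would look something like the following in the FifthPage XAML – this assumes an xmlns prefix (here vm) mapped to the namespace that contains FifthViewModel:

<Page.DataContext>
    <!-- creates the view model directly, with no way to inject services -->
    <vm:FifthViewModel />
</Page.DataContext>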

However, this doesn’t scale well for real world applications where a view model may depend on any number of services (eg for fetching and/or saving data). It’s preferable to have some sort of dependency injection framework that can be used to instantiate the view model.

ViewModel Instantiation

To this end, we’re going to add a ViewModel property to our FifthPage, along with a partial method, InitViewModel (note also that we’ve added a second parameter to the ViewModel attribute). The implementation of the partial method will be provided by our code generator, which will generate the code necessary to instantiate the FifthViewModel, along with any dependent services.

[ViewModel(typeof(FifthViewModel), nameof(InitViewModel))]
public sealed partial class FifthPage 
{
    partial void InitViewModel();
    public FifthViewModel ViewModel => this.ViewModel(() => DataContext as FifthViewModel, () => InitViewModel());

    public FifthPage()
    {
        this.InitializeComponent();
    }
}

FifthViewModel Registration

Despite providing the mapping between FifthPage and FifthViewModel, there’s currently no way for the dependency injection container to create an instance of FifthViewModel, since we haven’t registered the FifthViewModel type. Rather than have the developer work out where to add the code to register the FifthViewModel type, we can simply attribute the FifthViewModel:

[Register]
public class FifthViewModel
{
    // ... 
}

Bind FifthViewModel With x:Bind

We can then update the XAML of our FifthPage to data bind both the Title property and the RaiseFifthDone method on the FifthViewModel.

<Page x:Class="MvvmNavigation.FifthPage"
      xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
      xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
      xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
      xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
      mc:Ignorable="d"
      Background="{ThemeResource ApplicationPageBackgroundThemeBrush}">

    <StackPanel VerticalAlignment="Center"
                HorizontalAlignment="Center">
        <TextBlock Text="{x:Bind ViewModel.Title}" />
        <Button Content="Go Back"
                Click="{x:Bind ViewModel.RaiseFifthDone}" />
    </StackPanel>
</Page>

Navigation Back to MainPage

When we created the FifthViewModel we already created the FifthDone event which will be invoked by the RaiseFifthDone method. However, clicking the Button on the FifthPage currently does nothing – actually it does indeed raise the FifthDone event but currently nothing is listening to that event.

Let’s add the EventMessage attribute to the FifthDone event. In this case the attribute references the existing CloseMessage which is the message that will be dispatched when the FifthDone event is raised.

[Register]
public class FifthViewModel
{
    [EventMessage(typeof(CloseMessage))]
    public event EventHandler FifthDone;

    // ...
}

We already have a handler for the CloseMessage for all pages that will simply close the current page.

That completes all the steps necessary to add a new page, along with navigation to and from the page. It seems like a lot of steps, but actually there’s only minimal code required and it facilitates a high degree of separation between the elements of the application.

If you want to check out the code for this example app, feel free to check out the MvvmNavigation GitHub repo. Note that this is a work in progress and that there will most likely be quite a bit of refactoring over the coming weeks, after which I’ll post about more of the details on how the various mappings work and the code generation that’s used behind the scenes.

How to Debug C# 9 Source Code Generators

Build and debug a source code generator with C# 9 and Visual Studio.

I’ve been working on a follow up to my previous post on a different take on MVVM (Thinking Out Loud: Mvvm Navigation for XAML Frameworks such as Xamarin.Forms, UWP/WinUI, WPF and Uno) and I’ve got the code to a point that I’m mostly happy with. However, there are a couple of things that are just tedious for developers to have to do, and that’s something that’s perfect for source code generation. More about that in a later post; in this post I want to cover how you can start to generate code and how you can debug the source code generation process.

Before we get started, for a more detailed set of instructions on how to get started with code generation, the dotnet team has a great post on this from earlier this year – Introducing C# Source Generators.

Let’s jump in and create a simple code generator (remember the point of this post isn’t to talk in detail about code generation, it’s to show you how to debug the generation process).

We’ll start with just a vanilla .NET Standard 2.0 class library, GenerationSample – this is the assembly where we’ll be doing the code generation. I’ve specifically picked a .NET Standard 2.0 class library to point out that even though source generators are part of the C# 9 feature set, you can still use them to inject code into existing libraries. We’ll come back to this library in a minute, once we’ve created our code generator.

Next up, we need to create a new project that will house our code generator. Again, I’ll create a .NET Standard 2.0 library, called Generators. Creating a generator only requires us to reference the appropriate NuGet packages and then create a class that implements ISourceGenerator (and has the Generator attribute).

Here’s the updated csproj file with both the NuGet package references we need, as well as LangVersion set to preview

<Project Sdk="Microsoft.NET.Sdk">
	<PropertyGroup>
		<TargetFramework>netstandard2.0</TargetFramework>
		<LangVersion>preview</LangVersion>
	</PropertyGroup>
	<ItemGroup>
		<PackageReference Include="Microsoft.CodeAnalysis.CSharp" Version="3.8.0-3.final" PrivateAssets="all" />
		<PackageReference Include="Microsoft.CodeAnalysis.Analyzers" Version="3.3.0" PrivateAssets="all" />
	</ItemGroup>
</Project>

To create the code generator we’ll add a class called DebuggableGenerator that implements the ISourceGenerator interface. Note: If you copy the code from the introductory post from April 2020 by the dotnet team, you’ll see a bunch of errors relating to the ISourceGenerator interface.

These errors can easily be fixed by clicking on the dropdown next to the error and selecting Implement Interface, and then removing the old methods – it looks like the ISourceGenerator interface changed since the post back in April. Our code generator should look similar to the following code at this point.

using Microsoft.CodeAnalysis;
using System;
using System.Diagnostics;

namespace Generators
{
    [Generator]
    public class DebuggableGenerator : ISourceGenerator
    {
        public void Execute(GeneratorExecutionContext context)
        {
            Debug.WriteLine("Execute code generator");
        }
        public void Initialize(GeneratorInitializationContext context)
        {
            Debug.WriteLine("Initalize code generator");
        }
    }
}
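
Although this post is about debugging rather than generation itself, it’s worth knowing that Execute is where generated code gets added to the compilation. As a rough illustration (not part of the debugging setup), a generator might emit a source file like this:

public void Execute(GeneratorExecutionContext context)
{
    // adds a generated file to the consuming project's compilation
    context.AddSource("HelloGenerated.cs", @"
namespace GenerationSample
{
    public static class HelloGenerated
    {
        public static string Message => ""Hello from the generator"";
    }
}");
}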

The last thing to do, to complete the setup, is to add a reference to the Generators library into the GenerationSample library. The csproj should now look similar to the following.

<Project Sdk="Microsoft.NET.Sdk">
	<PropertyGroup>
		<TargetFramework>netstandard2.0</TargetFramework>
		<LangVersion>preview</LangVersion>
	</PropertyGroup>
	<ItemGroup>
		<ProjectReference 
			Include="..\Generators\Generators.csproj"
			OutputItemType="Analyzer"
			ReferenceOutputAssembly="false" />
	</ItemGroup>
</Project>

A couple of things to note: We’ve set the LangVersion to preview and added a couple of additional attributes to the ProjectReference – these are required to link the code generator into the compilation process.

At this point, you’re good to start implementing your code generation. However, as pointed out by the dotnet team in the FAQ section of their post, there’s currently no built-in support for debugging. You can set a breakpoint in Visual Studio but it won’t be hit. You can write Debug.WriteLine or Console.WriteLine statements but they won’t appear in the Output tool window (not even if you set the verbosity to Diagnostic).

Luckily, there’s quite a simple hack that gives you a mostly full-featured debugging experience. At the start of the Initialize method (you can think of this as the entry point for code generation) add code to launch the debugger.

using Microsoft.CodeAnalysis;
using System.Diagnostics;

namespace Generators
{
    [Generator]
    public class DebuggableGenerator : ISourceGenerator
    {
        public void Execute(GeneratorExecutionContext context)
        {
            Debug.WriteLine("Execute code generator");
        }
        public void Initialize(GeneratorInitializationContext context)
        {
#if DEBUG
            if (!Debugger.IsAttached)
            {
                Debugger.Launch();
            }
#endif 
            Debug.WriteLine("Initalize code generator");
        }
    }
}

To ensure this code doesn’t accidentally end up in my Release code I’ve wrapped it using conditional compilation. The code also tests to see if the Debugger is already attached, otherwise you’ll find that it will continually prompt to launch a new debugging session.

Now if we force a rebuild of our GenerationSample library, we’ll see a prompt asking to specify where the code should be debugged.

From this dialog you can either select the existing instance of Visual Studio, or open a new instance. My preference is to open up a new instance of Visual Studio – it feels a bit too much like inception to debug the code generation in the same instance of Visual Studio, but this comes down to whatever works for you.

Once the debugger is attached, you can step through the code, view variables, set breakpoints etc. Unfortunately it appears that Edit and Continue doesn’t work, so it does mean that in order to make changes you need to stop debugging, make changes and then rebuild the project in order to trigger the code generator to run.

And there you have it – the ability to debug your code generation library using Visual Studio.

Pipeline Templates: Using Project Bicep with an Azure DevOps Pipeline to Build and Deploy ARM Templates

This post covers how you can use a .bicep file in your Azure DevOps pipeline to deploy resources to Azure via an ARM template

It’s great to see that Microsoft listens to the pain that developers and devops professionals go through in working with Azure – specifically that ARM templates, whilst very powerful, are insanely verbose and cumbersome to work with. Hence Project Bicep, which is at its core a DSL for generating ARM templates. In this post I’m not going to go into much detail on how to work with .bicep files but rather on how you can use a .bicep file in your Azure DevOps pipeline to deploy resources to Azure.

If you are interested in learning more about Project Bicep and the .bicep file format, the earlier post Building a Bicep from an ARM Template with Parameters provides some introductory material.

In order to deploy a .bicep file to Azure, you first need to use the bicep command line to generate the corresponding ARM template, which you can then deploy to Azure. Rather than having to add multiple steps to your build pipeline, wouldn’t it be nice to have a single step that simply takes a .bicep file and some parameters, and deploys it to Azure? Enter the bicep-run template that is part of the 0.7.0 release of Pipeline Templates.

If you haven’t worked with any of the templates from the Pipeline Templates project, here’s the quick getting started:

Add the following to the top of your pipeline – this defines an external repository called all_templates that can be referenced in your pipeline.

resources:
  repositories:
    - repository: all_templates
      type: github
      name: builttoroam/pipeline_templates
      ref: refs/tags/v0.7.0
      endpoint: github_connection

Next, we’re going to use the bicep-run template to deploy our .bicep file to Azure.

      - template: ../../steps/bicep/bicep-run.yml
        parameters:
          name: BicepRun
          az_service_connection: $(service_connection)
          az_resource_group_name: $(resource_group_name)
          az_resource_location: $(resource_location)
          bicep_file_path: '$(bicep_filepath)'
          arm_parameters: '-parameter1name $(parameter1value) -parameter2name $(parameter2value)'

The bicep-run template wraps the following:

  • Downloads and caches the Project Bicep command line. It currently references the v0.1.37 release but you can override this by specifying the bicep_download_url – make sure you provide the url to the windows executable, not the setup file (see the example below).
  • Runs the Bicep command line to convert the specified .bicep file (bicep_file_path parameter) into the corresponding ARM template
  • Uses the Azure Resource Group Deployment task to deploy the ARM template into Azure. The arm_parameters are forwarded to the overrideParameters parameter on the Azure Resource Group Deployment task.
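
As an example of that override, the step might look like the following – the download url shown is illustrative, so substitute the bicep-win-x64.exe asset from the release you want to pin to:

      - template: ../../steps/bicep/bicep-run.yml
        parameters:
          name: BicepRun
          az_service_connection: $(service_connection)
          az_resource_group_name: $(resource_group_name)
          az_resource_location: $(resource_location)
          bicep_file_path: '$(bicep_filepath)'
          # illustrative only - point this at the windows executable of the release you want
          bicep_download_url: 'https://github.com/Azure/bicep/releases/download/<release-tag>/bicep-win-x64.exe'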

Would love feedback from anyone that takes this template for a spin – what features would you like to see added? What limitations do you currently see for Project Bicep and the ability to run it using this task?

Note: The bicep-run template is designed to run on a windows image.