Using Azure DevOps Stage Dependency Variables with Conditional Stage and Job Execution

I have been doing some work with Azure DevOps multi-stage YAML pipelines using stage dependency variables and conditions. They can get confusing quickly, as you need one syntax in one place and a different one elsewhere.

So, here are a few things I have learnt…

What are stage dependency variables?

Stage dependencies are the way you define which stage follows another in a multi-stage YAML pipeline, as opposed to just relying on the default order in which they appear in the YAML file. Hence, they are critical to creating complex pipelines.
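As a minimal sketch (stage names are illustrative, not from my sample), a dependsOn entry overrides the default file order:

```yaml
stages:
  - stage: Build
  - stage: DeployTest
    dependsOn: Build
  - stage: DeployDocs
    dependsOn: Build # runs in parallel with DeployTest, not after it
```

Without the dependsOn entries the three stages would simply run one after another in file order.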

Stage dependency variables are the way you pass variables from one stage to another. Special handling is required: you can't just use ordinary output variables (which are in effect environment variables on the agent) as you might within a job, because there is no guarantee the stages and jobs are running on the same agent.

For stage dependency variables, the difference is not in how you create output variables, which does not differ from the standard manner, but in how you retrieve them.

In my sample, I used a Bash script to set the output variable based on a parameter passed into the pipeline, but you can create output variables using scripts or tasks.

  - stage: SetupStage
    displayName: 'Setup Stage'
    jobs:
      - job: SetupJob
        displayName: 'Setup Job'
        steps:
          - checkout: none
          - bash: |
              set -e # need to avoid a trailing " being added to the variable
              echo "##vso[task.setvariable variable=MyVar;isOutput=true]${{parameters.value}}"
            name: SetupStep
            displayName: 'Setup Step'

Possible ways to access a stage dependency variable

There are two basic ways to access stage dependency variables, both using array objects: the dependencies array and the stageDependencies array.

Which one you use, in which place, and whether it is via a local alias, is the complexity.

How to access a stage dependency in a script?

To access a stage dependency variable in a script, or a task, there are two key requirements:

  • The stage containing the consuming job, and hence the script/task, must be set as dependent on the stage that created the output variable
  • You have to declare a local alias for the value in the stageDependencies array within the consuming stage. This local alias will be used as the local name by scripts and tasks

Once this is configured you can access the variable like any other local YAML variable.

  - stage: Show_With_Dependancy
    displayName: 'Show Stage With dependancy'
    dependsOn:
      - SetupStage
    variables:
      localMyVarViaStageDependancies: $[stageDependencies.SetupStage.SetupJob.outputs['SetupStep.MyVar']]
    jobs:
      - job: Job
        displayName: 'Show Job With dependancy'
        steps:
          - bash: |
              echo "localMyVarViaStageDependancies - $(localMyVarViaStageDependancies)"

Tip: If you are having a problem with the value not being set for a stage dependency variable, look in the pipeline execution log at the job level and check the 'Job preparation parameters' section to see what is being evaluated. This will show if you are using the wrong array object, or have a typo, as any incorrect declarations evaluate as null.

How to use a stage dependency as a stage condition

You can use stage dependency variables as controlling conditions for running a stage. In this use-case you use the dependencies array, not the stageDependencies array used when aliasing variables.

  - stage: Show_With_Dependancy_Condition
    condition: and(succeeded(), eq(dependencies.SetupStage.outputs['SetupJob.SetupStep.MyVar'], 'True'))
    displayName: 'Show Stage With dependancy Condition'

From my experiments with this use-case, you don't seem to need a dependsOn entry to declare the stage that exposed the output variable for this to work. So, this is very useful for complex pipelines where you want to skip a later stage based on a much earlier stage with which there is no direct dependency.

A side effect of using a stage condition is that many subsequent stages have to have their execution conditions edited, as you can no longer rely on the default completion state of succeeded. This is because the prior stages could now be either succeeded or skipped. Hence all following stages need to use the condition

condition: and( not(failed()), not(canceled()))
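For example, a stage that follows the conditionally-run stage might be declared like this (the later stage name is illustrative):

```yaml
  - stage: Later_Stage
    dependsOn: Show_With_Dependancy_Condition
    # cannot use the default succeeded() check, as the prior
    # stage may have been skipped rather than succeeded
    condition: and(not(failed()), not(canceled()))
```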

How to use a stage dependency as a job condition

To avoid the need to alter all the subsequent stages' execution conditions, you can set a condition at the job or task level. Unlike setting the condition at the stage level, you have to create a local alias (see above) and check the condition on that.

  - stage: Show_With_Dependancy_Condition_Job
    displayName: 'Show Stage With dependancy Condition'
    dependsOn:
      - SetupStage
    variables:
      localMyVarViaStageDependancies: $[stageDependencies.SetupStage.SetupJob.outputs['SetupStep.MyVar']]
    jobs:
      - job: Job
        condition: and(succeeded(), eq(variables.localMyVarViaStageDependancies, 'True'))
        displayName: 'Show Job With dependancy'

This technique will work for both agent-based and agent-less (server) jobs.

A warning though: if your job makes use of an environment with a manual approval, the environment approval check is evaluated before the job condition. This is probably not what you are after, so if using conditions with environments that use manual approvals, the condition is probably best set at the stage level, with the knock-on issues for the states of subsequent stages mentioned above.

An alternative, if you are just using the environment for manual approval, is to look at using an agent-less job with a manual approval. Agent-less job manual approvals are evaluated after the job condition, so do not suffer the same problem.
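A sketch of such an agent-less approval job, using the ManualValidation task on the server pool (the job name and notified user are illustrative):

```yaml
  - job: ApprovalJob
    pool: server # agent-less (server) job
    # evaluated before the manual approval, unlike an environment check
    condition: eq(variables.localMyVarViaStageDependancies, 'True')
    steps:
      - task: ManualValidation@0
        timeoutInMinutes: 1440
        inputs:
          notifyUsers: 'someone@example.com'
          instructions: 'Please approve this deployment'
```

If the condition evaluates to false the job, and hence the approval prompt, is skipped entirely.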

If you need to use a stage dependency variable in a later stage, as a job condition or script variable, but do not wish to add a direct dependency between the stages, you could consider 'republishing' the variable as an output of the intermediate stage(s)

  - stage: Intermediate_Stage
    dependsOn:
      - SetupStage
    variables:
      localMyVarViaStageDependancies: $[stageDependencies.SetupStage.SetupJob.outputs['SetupStep.MyVar']]
    jobs:
      - job: RepublishMyVar
        steps:
          - checkout: none
          - bash: |
              set -e # need to avoid a trailing " being added to the variable
              echo "##vso[task.setvariable variable=MyVar;isOutput=true]$(localMyVarViaStageDependancies)"
            name: RepublishStep

Summing Up

So I hope this post will help you, and the future me, navigate the complexities of stage dependency variables.

You can find the YAML for the test harness I have been using in this GitHub Gist.

Porting my Visual Studio Parameters.xml Generator tool to Visual Studio 2022 Preview

As I am sure you are all aware, the preview of Visual Studio 2022 has just dropped, so it is time for me to update my Parameters.xml Generator Tool to support this new version of Visual Studio.

But what does my extension do?

As the Marketplace description says…

A tool to generate parameters.xml files for MSdeploy from the existing web.config file or from an app.config file for use with your own bespoke configuration transformation system.

Once the VSIX package is installed, to use it, right-click on a web.config, or app.config, file in Solution Explorer and the parameters.xml file will be generated using the current entries for both configuration/applicationSettings and configuration/appSettings. The value attributes will contain TAG style entries suitable for replacement at deployment time.

If the parameters.xml already exists in the folder (even if it is not a file in the project) you will be prompted before it is overwritten.

Currently the version of the Parameters.xml Generator Tool in the Marketplace supports Visual Studio 2015, 2017 & 2019.

Adding Visual Studio 2022 Support

The process to add 2022 support is more complicated than for past new versions, where all that was usually required was an update to the manifest. This is due to the move to 64-bit.

Luckily the process is fairly well documented, but of course I still had a few problems.

MSB4062: The “CompareBuildTaskVersion” task could not be loaded from the assembly

When I tried to build the existing solution, without any changes, in Visual Studio 2022 I got the error

MSB4062: The “CompareBuildTaskVersion” task could not be loaded from the assembly D:\myproject\packages\Microsoft.VSSDK.BuildTools.15.8.3253\tools\VSSDK\Microsoft.VisualStudio.Sdk.BuildTasks.15.0.dll. Could not load file or assembly.

This was fixed by updating the package Microsoft.VSSDK.BuildTools from 15.1.192 to 16.9.1050.

Modernizing the Existing VSIX project

I did not modernize the existing VSIX project before I started the migration. When I clicked Migrate packages.config to PackageReference… it said my project was not a suitable version, so I just moved to the next step.

Adding Link Files

After creating the shared code project that contains the bulk of the files, I needed to add links to some of the resources i.e. the license file, the package icon and the .VSCT file.

When I tried to add the link, I got an error in the form

 Cannot add another link for the same file in another project

I tried exiting Visual Studio and cleaning the solution; nothing helped. The solution was to edit the .CSPROJ file manually in a text editor e.g.

    <Content Include="Resources\License.txt">
    <Content Include="..\ParametersXmlAddinShared\Resources\Package.ico">
    <Content Include="Resources\Package.ico">
    <Content Include="..\ParametersXmlAddinShared\Resources\License.txt">
    <EmbeddedResource Include="Resources\ParametersUppercaseTransform.xslt" />
    <VSCTCompile Include="..\ParametersXmlAddinShared\ParametersXmlAddin.vsct">

Publishing the new Extension

Once I had completed the migration steps, I had a pair of VSIX files. The previously existing one that supported Visual Studio 2015, 2017 & 2019 and the new Visual Studio 2022 version.

The migration notes say that in the future we will be able to upload both VSIX files to a single Marketplace entry and the Marketplace will sort out delivering the correct version.

Unfortunately, that feature is not available at present. So for now the new Visual Studio 2022 VSIX is published separately from the old one with a preview flag.

As soon as I can, I will merge the new VSIX into the old Marketplace entry and remove the preview 2022 version of the VSIX.

Getting confused over Azure DevOps Pipeline variable evaluation


The use of variables is important in Azure DevOps pipelines, especially when using YML templates. They allow a single pipeline to be used for multiple branches/configurations etc.

The most common forms of variables you see are the predefined built-in variables e.g. $(Build.BuildNumber) and your own custom ones e.g. $(var). Usually the values of these variables are set before/as the build is run, as an input condition.

But this is not the only way variables can be used. As noted in the documentation there are different ways to access a variable…

In a pipeline, template expression variables ${{ variables.var }} get processed at compile time, before runtime starts. Macro syntax variables $(var) get processed during runtime before a task runs. Runtime expressions $[variables.var] also get processed during runtime but were designed for use with conditions and expressions.

Azure DevOps Documentation
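A small sketch contrasting the three forms (the variable name and values are illustrative):

```yaml
variables:
  var: 'example'

steps:
  # template expression: replaced at compile time, before the run starts
  - script: echo "${{ variables.var }}"
  # macro syntax: expanded at runtime, just before the task executes
  - script: echo "$(var)"
  # runtime expression: designed for conditions and variable definitions
  - script: echo "conditionally run"
    condition: eq(variables['var'], 'example')
```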

99% of the time I have been fine using just the $(var) syntax, but I was recently working on a case where this would not work for me.

The Issue

I had a pipeline that made heavy use of YML templates and conditional task insertion to include sets of tasks based upon manually entered and pre-defined variables.

The problem was that one of the tasks, used in a template, set a boolean output variable $(outvar) by calling

echo '##vso[task.setvariable variable=outvar;isOutput=true]true'

This task created an output variable that could be accessed by other tasks as the variable $(mytask.outvar), but as it was set at runtime it was not available at the time of the YML compilation.

This caused me a problem as it meant that it could not be used in the template's conditional task insertion blocks, because it was not present at compile time when this code is evaluated e.g.

- ${{ if eq(mytask.outvar, 'true') }}:
  # the task to run if the condition is met
  - task: Some.Task@1 

I tried referencing the variable using every form of $ followed by brackets syntax I could think of, but it did not help.

The lesson here is that you cannot make a runtime value a compile time value by wishing it to change.

The only solution I could find was to make use of the runtime variable in a place where it can be resolved. If you wish to enable or disable a task based on the variable value, then the only option is to use the condition parameter

  # the task to run if the condition is met
  - task: Some.Task@1 
    condition: and(succeeded(), eq(variables['mytask.outvar'], 'true'))

The only downside of this way of working, as opposed to conditional insertion, is that:

  • With conditional insertion, non-required tasks are never shown in the pipeline, as they are not compiled into it
  • When using the condition property to exclude a task, it will still appear in the log, but it can be seen that it has not been run.
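By contrast, conditional insertion works fine with values that are known at compile time, such as pipeline parameters (the parameter name here is illustrative):

```yaml
parameters:
  - name: runExtraTask
    type: boolean
    default: false

steps:
  # evaluated at compile time: when false, the task is never
  # compiled into the pipeline at all
  - ${{ if eq(parameters.runExtraTask, true) }}:
      - script: echo "runExtraTask was set"
```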

So I got there in the end. It was just not as neat as I had hoped, but I do have a clearer understanding of compile and runtime variables in Azure DevOps YML.

Using the Post Build Cleanup Task from the Marketplace in YAML based Azure DevOps Pipelines

Disks filling up on our private Azure DevOps agents is a constant battle. We have maintenance jobs setup on the agent pools, to clean out old build working folders nightly, but these don’t run often enough. We need a clean out more than once a day due to the number and size of our builds.

To address this, with UI based builds, we successfully used the Post Build Cleanup extension. However, since we moved many of our builds to YAML we found it did not work so well. It turned out the problem was due to the way we got the source code.

The Post Build Cleanup task is intelligent; it does not just delete folders on demand. It checks what the Get Sources 'Clean' setting was when the repo was cloned and bases what it deletes on this value e.g. nothing, sources, or everything. This behaviour is not that obvious.

In UI based builds it is easy to check this setting, as you are always in the UI when editing the build. However, in YAML it is easy to forget, as it is one of those few values that cannot be set in YAML.

To make the post build cleanup task actually delete folders in a YAML pipeline you need to

  1. Edit the pipeline
  2. Click the ellipsis menu top right
  3. Pick Triggers
  4. Pick YAML and select the 'Get Sources' block
  5. Make sure the 'Clean' setting is set to 'true' and the right set of items to delete is selected – if this is not done the post build cleanup task does nothing
  6. You can then add the Post Build Cleanup task at the end of the steps
  - script: echo This is where you do stuff
  - task: mspremier.PostBuildCleanup.PostBuildCleanup-task.PostBuildCleanup@3
    displayName: 'Clean Agent Directories'
    condition: always()

Once this is done it behaves as expected

Bringing Stage based release notes in Multi-Stage YAML to my Cross Platform Release Notes Extension

I have just released Version 3.1.7 of my Azure DevOps Pipeline XplatGenerateReleaseNotes Extension.

This new version allows you to build release notes within a Multi-Stage YAML build covering everything since the last successful release to the current (or named) stage in the pipeline, as opposed to just the last fully successful build.

This gives more feature parity with the older UI based Releases functionality.

To enable this new feature you need to set the checkStage: true flag, and potentially overrideStageName: AnotherStage if you wish to compare against a stage other than the current one.

- task: XplatGenerateReleaseNotes@3
  inputs:
    outputfile: '$(Build.ArtifactStagingDirectory)'
    outputVariableName: 'outputvar'
    templateLocation: 'InLine'
    checkStage: true
    inlinetemplate: |
      # Notes for build 
      **Build Number**: {{}}

Getting started with Aggregator CLI for Azure DevOps Work Item Roll-up

Updated 30/Sep/21 to reflect changes in the Aggregator CLI setup process


Back in the day I wrote a tool, TFS Alerts DSL, to do Work Item roll-up for TFS. Over time I updated this to support VSTS (as Azure DevOps was then called); its final version is still available in the Azure DevOps Marketplace as the Azure DevOps Service Hooks DSL. So when I recently had a need for Work Item roll-up I did consider using my own tool, but just for a short while.

However, I quickly realised a much better option was to use the Aggregator CLI. This is the successor to the TFS Aggregator Plug-in and is a far more mature project than my tool, actively under development, allowing hosting as an Azure Function or a Docker container.

However, I have found the Aggregator CLI a little hard to get started with. The best 'getting started' documentation seems to be in the command examples, but it is not that easy to find. So I thought this blog post was a good idea, so I don't forget the details in the future.


In this latest version of the Aggregator the functionality is delivered using Azure Functions, one per rule.

Note: A Docker container is another option, but one I have not explored.

These Azure Functions are linked to Azure DevOps service hook events. The command line tool configures all of the parts required, setting up Azure resources and Azure DevOps events, and managing rules.


  • Download the latest release, picking the version for the operating system you are planning to use to set up the tool.
  • Next you need to set up an Azure Service Principal App registration for the Aggregator and connect it to a subscription
    1. Login to Azure

      az login

    2. Pick the correct subscription

      az account set --subscription <ID>

    3. Create the service principal

      az ad sp create-for-rbac --name AggregatorServicePrincipal

    4. From the root of the Azure Portal pick the Subscription you wish to create the Azure Functions in.
    5. In the Access control (IAM) section grant the 'Contributor' role for the subscription to the newly created service principal

Using the Aggregator CLI

At a command prompt we now need to start using the tool to link up the Azure services and Azure DevOps

  • First we log the CLI tool into Azure. You can find the values required in the Azure Portal, in the Subscription overview and the App Registration overview. You create a password in the 'Certificates & secrets' section of the App Registration.

    .\aggregator-cli.exe logon.azure -s <sub-id> -c <client-id> -t <tenant-id> -p <pwd>
  • Next login to Azure DevOps, create the PAT as detailed in the documentation

    .\aggregator-cli.exe logon.ado -u <org> -mode PAT -t <pat>
  • Now we can create the instance of the Aggregator in Azure

    Note: I had long delays and timeout problems here due to what turned out to be a poor WiFi link. The strange thing was it was not obviously failing WiFi, it was just unstable enough to cause issues. As soon as I swapped to Ethernet the problems went away.

    The basic form of the install command is as follows. This will create a new resource group in Azure and then the required Web App, Storage, Application Insights etc. As this is done using an ARM template it is idempotent i.e. it can be re-run as many times as you wish, and it will just update the Azure services if they already exist.

    .\aggregator-cli.exe install.instance -verbose -n yourinstancename -l westeurope

    If you do get problems, go to the Azure Portal, find the resource group and look at the deployment logs

  • When this completes, you can see the new resources in the Azure Portal, or check them with command line

    .\aggregator-cli.exe list.instances

  • You next need to register your rules. You can register as many as you wish. A few samples are provided in the test folder in the downloaded ZIP; these are good for quick tests, though you will usually create your own for production use.

    When you add a rule, behind the scenes this creates an Azure Function with the same name as the rule.

    .\aggregator-cli.exe add.rule -v -i yourinstancename -n test1 -file test\test1.rule

  • Finally you map a rule to an event in your Azure DevOps instance

    .\aggregator-cli.exe map.rule -v -p yourproject -e workitem.updated -i yourinstancename -r test1

Once all this is done you should have a working system. If you are using the test rules, the quickest option to see that it is working is to

  1. Go into the Azure Portal
  2. Find the created Resource Group
  3. Pick the App Service for the Azure Functions
  4. Pick the Function for the rule under test
  5. Pick the Monitor
  6. Pick Logs
  7. Open Live Metrics
  8. You should see log entries when you perform the event on a work item you mapped to the function.

An alternative is to look in the Application Insights logs or live telemetry.

So I hope this helps my future self remember how to get this tool set up quickly.

How to do local template development for my Cross platform Release notes task

The testing cycle for release notes templates can be slow, requiring a build and release cycle. To try to speed up this process for users I have created a local test harness that allows the same calls to be made from a development machine as would be made within a build or release.

However, running this is not as simple as you might expect, so please read the instructions before proceeding.

Setup and Build

  1. Clone the repo containing the Azure DevOps Extension.
  2. Change to the folder

    <repo root>\Extensions\XplatGenerateReleaseNotesV2\testconsole

  3. Build the tool using npm (this does assume Node.js is already installed)

    npm install
    npm run build

Running the Tool

The task the testconsole runs takes many parameters, and reads runtime Azure DevOps environment variables. These have to be passed into the local tester. Given the number, and the fact that most probably won't need to be altered, they are provided in a settings JSON file. Samples are provided for a build and a release. For details on these parameters see the task documentation.

The only values not stored in the JSON files are the PATs required to access the REST API. This reduces the chance of them being committed to source control by mistake.

Two PATs are potentially used.

  • Azure DevOps PAT (required) – within a build or release this is automatically picked up. For this tool it must be provided
  • GitHub PAT (optional) – you only need to provide this if working with private GitHub repos as your code store, so usually it can be ignored

Test Template Generation for a Build

To run the tool against a build

  1. In the settings file make sure the TeamFoundationCollectionUri, TeamProject and BuildID are set to the build you wish to run against, and that the ReleaseID is empty.
  2. Run the command

    node .\GenerateReleaseNotesConsoleTester.js build-settings.json <your-Azure-DevOps-PAT> <Optional: your GitHub PAT>

  3. Assuming you are using the sample settings you should get a file with your release notes.

Test Template Generation for a Release

Running the tool against a release is a bit more complex. This is because the logic looks back to find the most recent successful run, so if your release ran to completion you will get no notes, as there have been no changes since the last successful release.

You have two options

  • Allow a release to trigger, but cancel it. You can then use its ReleaseID to compare with the last release
  • Add a stage to your release that is skipped, only run on a manual request, and use this as the comparison stage to look for differences

To run the tool

  1. In the settings file make sure the TeamFoundationCollectionUri, TeamProject, BuildID, EnvironmentName (a stage in your process), ReleaseID and releaseDefinitionId are set for the release you wish to run against.
  2. Run the command

    node .\GenerateReleaseNotesConsoleTester.js release-settings.json <your-Azure-DevOps-PAT> <Optional: your GitHub PAT>

  3. Assuming you are using the sample settings you should get a file with your release notes.

Hope you find it useful.

New feature for Cross Platform Release notes – get parent and child work items

I have added another new feature to my Cross Platform release note generator. Now, when using Handlebars based templates, you can optionally get the parent or child work items for any work item associated with the build/release.

To enable the feature, as it is off by default, you need to set the getParentsAndChildren: true parameter for the task, either in YAML or in the handlebars section of the configuration.

This will add an extra array, relatedWorkItems, that the template can access. This contains all the work items associated with the build/release plus their direct parents and children. This can then be accessed in the template

{{#forEach this.workItems}}

{{#if isFirst}}### WorkItems {{/if}}

* **{{}}**  {{lookup this.fields 'System.Title'}}

- **WIT** {{lookup this.fields 'System.WorkItemType'}}

- **Tags** {{lookup this.fields 'System.Tags'}}

- **Assigned** {{#with (lookup this.fields 'System.AssignedTo')}} {{displayName}} {{/with}}

- **Description** {{{lookup this.fields 'System.Description'}}}

- **Parents**

{{#forEach this.relations}}

{{#if (contains 'Parent')}}

{{#with (lookup_a_work_item ../../relatedWorkItems  this.url)}}

      - {{}} - {{lookup this.fields 'System.Title'}}

{{/with}}

{{/if}}

{{/forEach}}
- **Children**

{{#forEach this.relations}}

{{#if (contains 'Child')}}

{{#with (lookup_a_work_item ../../relatedWorkItems  this.url)}}

      - {{}} - {{lookup this.fields 'System.Title'}}

{{/with}}

{{/if}}

{{/forEach}}

{{/forEach}}

This is a complex way to present the extra work items, but very flexible.

Hope people find the new feature useful.

And another new feature for my Cross Platform Release Notes Azure DevOps Task – commit/changeset file details

The addition of Handlebars based templating to my Cross Platform Release Notes Task has certainly made it much easier to release new features. The legacy templating model, it seems, is what had been holding development back.

In the past month or so I have added support for generating release notes based on PRs and Tests. I am now happy to say I have just added support for the actual files associated with a commit or changeset.

Enriching the commit/changeset data with the details of the files edited has been a repeated request over the years. The basic commit/changeset object only detailed the commit message and the author. With this new release of my task there is now a .changes property on the commit objects that exposes the details of the actual files in the commit/changeset.

This is used in a Handlebars based template as follows

# Global list of CS ({{commits.length}})
{{#forEach commits}}
{{#if isFirst}}### Associated commits{{/if}}
* ** ID{{}}** 
   -  **Message:** {{this.message}}
   -  **Committed by:** {{}} 
   -  **FileCount:** {{this.changes.length}} 
{{#forEach this.changes}}
      -  **File path (use this for TFVC or TfsGit):** {{this.item.path}}  
      -  **File filename (use this for GitHub):** {{this.filename}}  
      -  **This will show all the properties available for the file:** {{json this}}  
{{/forEach}}
{{/forEach}}

Another feature for my Cross Platform Release Notes Azure DevOps Extension–access to test results

Over the weekend I got another new feature for my Cross Platform Release Notes Azure DevOps Extension working. The test results associated with build artefacts or releases are now exposed to Handlebars based templates.

The new objects you can access are:

  • In builds
    • tests – all the tests run as part of the current build
  • In releases
    • tests – all the tests run as part of any current build artefacts, or prior to the running of the release notes task within a release environment
    • releaseTests – all the tests run within a release environment
    • builds.tests – all the tests run as part of any build artefacts, grouped by build artefact

These can be used as follows in a release template

# Builds with associated WI/CS/Tests ({{builds.length}})

{{#forEach builds}}

{{#if isFirst}}## Builds {{/if}}

##  Build {{}}

{{#forEach this.commits}}

{{#if isFirst}}### Commits {{/if}}

- CS {{}}

{{/forEach}}

{{#forEach this.workitems}}

{{#if isFirst}}### Workitems {{/if}}

- WI {{}}

{{/forEach}}

{{#forEach this.tests}}

{{#if isFirst}}### Tests {{/if}}

- Test {{}}

-  Name: {{}}

-  Outcome: {{this.outcome}}

{{/forEach}}

{{/forEach}}

# Global list of tests ({{tests.length}})

{{#forEach tests}}

{{#if isFirst}}### Tests {{/if}}

* ** ID{{}}**

-  Name: {{}}

-  Outcome: {{this.outcome}}

{{/forEach}}

For more details see the documentation in the wiki.