Porting my Visual Studio Parameters.xml Generator tool to Visual Studio 2022 Preview

As I am sure you are all aware, the preview of Visual Studio 2022 has just dropped, so it is time for me to update my Parameters.xml Generator Tool to support this new version of Visual Studio.

But what does my extension do?

As the Marketplace description says…

A tool to generate parameters.xml files for MSDeploy from an existing web.config or app.config file, for use with your own bespoke configuration transformation system.

Once the VSIX package is installed, right-click on a web.config or app.config file in Solution Explorer and the parameters.xml file will be generated using the entries from both the configuration/applicationSettings and configuration/appSettings sections. The value attributes will contain TAG style entries suitable for replacement at deployment time.

If the parameters.xml file already exists in the folder (even if it is not a file in the project) you will be prompted before it is overwritten.
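For context, the generated file looks something like the following sketch, assuming a web.config with an appSettings entry called MySetting (the exact attributes the tool emits may differ slightly):

 <parameters>
   <parameter name="MySetting" description="MySetting" defaultvalue="__MYSETTING__" tags="">
     <parameterentry kind="XmlFile" scope="\\web.config$" match="/configuration/appSettings/add[@key='MySetting']/@value" />
   </parameter>
 </parameters>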

Currently the version of the Parameters.xml Generator Tool in the Marketplace supports Visual Studio 2015, 2017 & 2019.

Adding Visual Studio 2022 Support

The process to add 2022 support is more complicated than for previous versions, where usually all that was required was an update to the manifest. This is due to Visual Studio 2022’s move to 64-bit.

Luckily the process is fairly well documented, but of course I still had a few problems.

MSB4062: The “CompareBuildTaskVersion” task could not be loaded from the assembly

When I tried to build the existing solution, without any changes, in Visual Studio 2022 I got the error

MSB4062: The “CompareBuildTaskVersion” task could not be loaded from the assembly D:\myproject\packages\Microsoft.VSSDK.BuildTools.15.8.3253\tools\VSSDK\Microsoft.VisualStudio.Sdk.BuildTasks.15.0.dll. Could not load file or assembly.

This was fixed by updating the package Microsoft.VSSDK.BuildTools from 15.1.192 to 16.9.1050.
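For anyone following along, this is just a NuGet package version bump. In a packages.config based project the updated entry would look something like this (the targetFramework shown is illustrative):

 <package id="Microsoft.VSSDK.BuildTools" version="16.9.1050" targetFramework="net472" />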

Modernizing the Existing VSIX project

I did not modernize the existing VSIX project before I started the migration. When I clicked Migrate packages.config to PackageReference…, it said my project was not a suitable version, so I just moved on to the next step.

Adding Link Files

After creating the shared code project, which contains the bulk of the files, I needed to add links to some of the resources i.e. the license file, the package icon and the .VSCT file.

When I tried to add a link, I got an error of the form

 Cannot add another link for the same file in another project

I tried exiting Visual Studio and cleaning the solution, but nothing helped. The solution was to edit the .CSPROJ file manually in a text editor e.g.

 <ItemGroup>
    <Content Include="Resources\License.txt">
      <CopyToOutputDirectory>Always</CopyToOutputDirectory>
    </Content>
    <Content Include="..\ParametersXmlAddinShared\Resources\Package.ico">
      <Link>Package.ico</Link>
      <IncludeInVSIX>true</IncludeInVSIX>
    </Content>
    <Content Include="Resources\Package.ico">
      <CopyToOutputDirectory>Always</CopyToOutputDirectory>
    </Content>
    <Content Include="..\ParametersXmlAddinShared\Resources\License.txt">
      <Link>License.txt</Link>
      <IncludeInVSIX>true</IncludeInVSIX>
    </Content>
    <EmbeddedResource Include="Resources\ParametersUppercaseTransform.xslt" />
    <VSCTCompile Include="..\ParametersXmlAddinShared\ParametersXmlAddin.vsct">
      <Link>ParametersXmlAddin.vsct</Link>
      <ResourceName>Menus.ctmenu</ResourceName>
    </VSCTCompile>
  </ItemGroup>

Publishing the new Extension

Once I had completed the migration steps, I had a pair of VSIX files: the previously existing one that supports Visual Studio 2015, 2017 & 2019, and the new Visual Studio 2022 version.

The migration notes say that in the future we will be able to upload both VSIX files to a single Marketplace entry and the Marketplace will sort out delivering the correct version.

Unfortunately, that feature is not available at present, so for now the new Visual Studio 2022 VSIX is published separately from the old one, with a preview flag.

As soon as I can, I will merge the new VSIX into the old Marketplace entry and remove the preview 2022 version of the VSIX.

Getting confused over Azure DevOps Pipeline variable evaluation

Introduction

The use of variables is important in Azure DevOps pipelines, especially when using YML templates. They allow a single pipeline to be used for multiple branches/configurations etc.

The most common forms of variables you see are the predefined built-in variables e.g. $(Build.BuildNumber) and your own custom ones e.g. $(var). Usually the values of these variables are set before/as the build is run, as an input condition.

But this is not the only way variables can be used. As noted in the documentation there are different ways to access a variable…

In a pipeline, template expression variables ${{ variables.var }} get processed at compile time, before runtime starts. Macro syntax variables $(var) get processed during runtime before a task runs. Runtime expressions $[variables.var] also get processed during runtime but were designed for use with conditions and expressions.

Azure DevOps Documentation
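To make the difference concrete, here is a minimal sketch contrasting the three syntaxes (the variable name is illustrative):

variables:
  var: 'hello'

steps:
  # ${{ variables.var }} is substituted when the YAML is expanded at compile time;
  # $(var) is substituted just before the task runs
  - script: echo "compile time ${{ variables.var }} - runtime $(var)"
    displayName: 'Compare variable syntaxes'
  # runtime expressions are designed for conditions and variable definitions
  - script: echo "only runs when var is hello"
    condition: eq(variables.var, 'hello')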

99% of the time I have been fine using just the $(var) syntax, but recently I was working on a case where this would not work for me.

The Issue

I had a pipeline that made heavy use of YML templates and conditional task insertion to include sets of tasks based upon the manually entered and pre-defined variables.

The problem was that one of the tasks, used in a template, set a Boolean output variable $(outvar) by calling

echo '##vso[task.setvariable variable=outvar;isOutput=true]true'

This task created an output variable that could be accessed by other tasks as the variable $(mytask.outvar), but as it was set at runtime it was not available at the time of the YML compilation.
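For reference, the producing step looks something like the sketch below; it is the step’s name property that provides the mytask prefix:

steps:
  # the step must be named for its output variable to be addressable
  - script: echo '##vso[task.setvariable variable=outvar;isOutput=true]true'
    name: mytask
  # later tasks in the same job can then read it at runtime
  - script: echo "outvar is $(mytask.outvar)"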

This caused me a problem, as it meant the variable could not be used in the template’s conditional task inclusion blocks; it was not present at compile time, which is when this code is evaluated e.g.

- ${{ if eq(variables['mytask.outvar'], 'true') }}:
  # the task to run if the condition is met
  - task: Some.Task@1 
    ....

I tried referencing the variable using every form of the $(), ${{}} and $[] syntax I could think of, but nothing helped.

The lesson here is that you cannot make a runtime value a compile time value by wishing it to change.

The only solution I could find was to make use of the runtime variable in a place where it could be resolved. If you wish to enable or disable a task based on the variable value, the only option is to use the condition parameter

  # the task to run if the condition is met
  - task: Some.Task@1 
condition: and(succeeded(), eq(variables['mytask.outvar'], 'true'))
    ....

The only downside of this way of working, as opposed to conditional insertion, is that

  • With conditional insertion, non-required tasks are never shown in the pipeline, as they are not compiled into it
  • When using the condition property to exclude a task, it will still appear in the log, but it can be seen that it has not been run

So I got there in the end; it was just not as neat as I had hoped, but I do have a clearer understanding of compile time and runtime variables in Azure DevOps YML.

Using the Post Build Cleanup Task from the Marketplace in YAML based Azure DevOps Pipelines

Disks filling up on our private Azure DevOps agents is a constant battle. We have maintenance jobs set up on the agent pools to clean out old build working folders nightly, but these don’t run often enough; we need a clean out more than once a day due to the number and size of our builds.

To address this with UI based builds, we successfully used the Post Build Cleanup Extension. However, since we moved many of our builds to YAML, we found it was not working so well. It turned out the problem was due to the way the source code was cloned.

The Post Build Cleanup task is intelligent; it does not just delete folders on demand. It checks what the Get Sources ‘Clean’ setting was when the repo was cloned and bases what it deletes on this value e.g. nothing, source, or everything. This behaviour is not that obvious.

In a UI based build it is easy to check this setting, as you are always in the UI when editing the build. However, in YAML it is easy to forget, as it is one of those few values that cannot be set in the YAML file itself.

To make the Post Build Cleanup task actually delete folders in a YAML pipeline you need to

  1. Edit the pipeline
  2. Click the ellipsis menu, top right
  3. Pick Triggers
  4. Pick YAML and select the ‘Get Sources’ block
  5. Make sure the ‘Clean’ setting is set to ‘true’ and the right set of items to delete is selected – if this is not done the Post Build Cleanup task does nothing
  6. You can then add the Post Build Cleanup task at the end of the steps
steps:
  - script: echo This where you do stuff
  - task: mspremier.PostBuildCleanup.PostBuildCleanup-task.PostBuildCleanup@3
    displayName: 'Clean Agent Directories'
    condition: always()

Once this is done, it behaves as expected.

Bringing Stage based release notes in Multi-Stage YAML to my Cross Platform Release Notes Extension

I have just released Version 3.1.7 of my Azure DevOps Pipeline XplatGenerateReleaseNotes Extension.

This new version allows you to build release notes within a Multi-Stage YAML build based on changes since the last successful release to the current (or a named) stage in the pipeline, as opposed to just the last fully successful build.

This gives more feature parity with the older UI based Releases functionality.

To enable this new feature you need to set the checkStage: true flag, and potentially overrideStageName: AnotherStage if you wish the comparison to be made against a stage other than the current one.

- task: XplatGenerateReleaseNotes@3
  inputs:
    outputfile: '$(Build.ArtifactStagingDirectory)\releasenotes.md'
    outputVariableName: 'outputvar'
    templateLocation: 'InLine'
    checkStage: true
    inlinetemplate: |
      # Notes for build 
      **Build Number**: {{buildDetails.id}}
      ...

Getting started with Aggregator CLI for Azure DevOps Work Item Roll-up

Updated 30/Sep/21 to reflect changes in the Aggregator CLI setup process

Background

Back in the day I wrote a tool, TFS Alerts DSL, to do Work Item roll-up for TFS. Over time I updated this to support VSTS (as Azure DevOps was then called); its final version is still available in the Azure DevOps Marketplace as the Azure DevOps Service Hooks DSL. So when I recently had a need for Work Item roll-up I did consider using my own tool, but just for a short while.

However, I quickly realised a much better option was to use the Aggregator CLI. This is the successor to the TFS Aggregator Plug-in and is a far more mature project than my tool, actively under development, allowing hosting as an Azure Function or a Docker container.

However, I have found the Aggregator CLI a little hard to get started with. The best ‘getting started’ documentation seems to be in the command examples, but it is not that easy to find. So I thought this blog post was a good idea, so I don’t forget the details in the future.

Architecture

In this latest version of the Aggregator the functionality is delivered using Azure Functions, one per rule.

Note: A Docker container is another option, but one I have not explored.

These Azure Functions are linked to Azure DevOps Service Hook events. The command line tool’s setup process configures all of the required parts: setting up Azure resources and Azure DevOps events, and managing rules.

Preparation

  • Download the latest release from https://github.com/tfsaggregator/aggregator-cli/releases, picking the version for the operating system you are planning to use to set up the tool.
  • Next you need to set up an Azure Service Principal App registration for the Aggregator and connect it to a Subscription
    1. Login to Azure

      az login

    2. Pick the correct subscription

      az account set --subscription <ID>

    3. Create the service principal

      az ad sp create-for-rbac --name AggregatorServicePrincipal

    4. From the root of the Azure Portal pick the Subscription you wish to create the Azure Functions in.
    5. In the Access control (IAM) section grant the ‘Contributor’ role for the subscription to the newly created Service Principal
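If you prefer to stay at the command line, the same grant can be made with the CLI, using the appId returned by the create-for-rbac command, along these lines:

      az role assignment create --assignee <appId> --role Contributor --scope /subscriptions/<sub-id>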

Using the Aggregator CLI

At a command prompt, we now need to start using the tool to link up the Azure services and Azure DevOps.

  • First we log the CLI tool into Azure. You can find the values required in the Azure Portal, in the Subscription overview and the App Registration overview. You create a password in the ‘Certificates & secrets’ section of the App Registration.

    .\aggregator-cli.exe logon.azure -s <sub-id> -c <client-id> -t <tenant-id> -p <pwd>
  • Next, log in to Azure DevOps, creating the PAT as detailed in the documentation

    .\aggregator-cli.exe logon.ado -u https://dev.azure.com/<org> -mode PAT -t <pat>
  • Now we can create the instance of the Aggregator in Azure

    Note: I had long delays and timeout problems here due to what turned out to be a poor WiFi link. The strange thing was that the WiFi was not obviously failing, just unstable enough to cause issues. As soon as I swapped to Ethernet the problems went away.

    The basic form of the install command is as follows. This will create a new resource group in Azure and then the required Web App, Storage, Application Insights etc. As this is done using an ARM template it is idempotent i.e. it can be re-run as many times as you wish and it will just update the Azure services if they already exist.

    .\aggregator-cli.exe install.instance -verbose -n yourinstancename -l westeurope

    If you do get problems, go to the Azure Portal, find the resource group and look at the deployment logs.

  • When this completes, you can see the new resources in the Azure Portal, or check them with the command line

    .\aggregator-cli.exe list.instances

  • You next need to register your rules; you can register as many as you wish. A few samples are provided in the \test folder of the downloaded ZIP. These are good for a quick test, though you will usually create your own for production use.

    When you add a rule, behind the scenes this creates an Azure Function with the same name as the rule. (A rough sketch of what a roll-up rule might look like follows this list.)

    .\aggregator-cli.exe add.rule -v -i yourinstancename -n test1 -file test\test1.rule

  • Finally, you map a rule to an event in your Azure DevOps instance

    .\aggregator-cli.exe map.rule -v -p yourproject -e workitem.updated -i yourinstancename -r test1
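As promised above, here is a rough sketch of the kind of roll-up rule you might write. The rule language is C# script; self.Parent, self.Children and the indexer-style field access here are my reading of the rule API rather than verified syntax, so check the samples shipped in the ZIP before relying on it.

    // Sketch only: roll up the remaining work of a parent's child tasks
    if (self.WorkItemType == "Task" && self.Parent != null)
    {
        var parent = self.Parent;
        double total = 0;
        foreach (var child in parent.Children)
        {
            // field names are the standard Scheduling fields
            total += (double?)child["Microsoft.VSTS.Scheduling.RemainingWork"] ?? 0;
        }
        parent["Microsoft.VSTS.Scheduling.RemainingWork"] = total;
    }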

Once all this is done you should have a working system. If you are using the test rules, the quickest way to see that it is working is to

  1. Go into the Azure Portal
  2. Find the created Resource Group
  3. Pick the App Service for the Azure Functions
  4. Pick the Function for the rule under test
  5. Pick the Monitor
  6. Pick Logs
  7. Open Live Metric
  8. You should see log entries when you perform the event on a work item you mapped to the function.

An alternative is to look in the AppInsights Logs or live telemetry.

So I hope this helps my future self remember how to get this tool set up quickly.

How to do local template development for my Cross platform Release notes task

The testing cycle for Release Notes Templates can be slow, requiring a build and release cycle. To speed up this process for users, I have created a local test harness that allows the same calls to be made from a development machine as would be made within a build or release.

However, running this is not as simple as you might expect, so please read the instructions before proceeding.

Setup and Build

  1. Clone the repo containing the Azure DevOps Extension.
  2. Change to the folder

    <repo root>\Extensions\XplatGenerateReleaseNotes\V2\testconsole

  3. Build the tool using NPM (this assumes Node is already installed)

    npm install
    npm run build

Running the Tool

The task the testconsole runs takes many parameters, and reads runtime Azure DevOps environment variables. These have to be passed into the local tester. Given the number of them, and the fact that most probably won’t need to be altered, they are provided in a settings JSON file. Samples are provided for a build and a release. For details on these parameters see the task documentation.
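To give a rough idea of the shape, a build settings file contains entries like the following sketch; only fields discussed in this post are shown, and the samples in the repo are the authoritative reference.

    {
      "TeamFoundationCollectionUri": "https://dev.azure.com/yourorg/",
      "TeamProject": "YourProject",
      "BuildID": "1234",
      "ReleaseID": ""
    }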

The only values not stored in the JSON files are the PATs required to access the REST API. This reduces the chance of them being checked into source control by mistake.

Two PATs are potentially used.

  • Azure DevOps PAT (Required) – within a build or release this is automatically picked up; for this tool it must be provided
  • GitHub PAT – this is an optional parameter for the task; you only need to provide it if working with private GitHub repos as your code store, so usually it can be ignored

Test Template Generation for a Build

To run the tool against a build

  1. In the settings file make sure the TeamFoundationCollectionUri, TeamProject and BuildID are set to the build you wish to run against, and that the ReleaseID is empty.
  2. Run the command

    node .\GenerateReleaseNotesConsoleTester.js build-settings.json <your-Azure-DevOps-PAT> <Optional: your GitHub PAT>

  3. Assuming you are using the sample settings you should get an output.md file with your release notes.

Test Template Generation for a Release

Running the tool against a release is a bit more complex. This is because the logic looks back to find the most recent successful run, so if your release ran to completion you will get no release notes, as there have been no changes since the last successful release.

You have two options

  • Allow a release to trigger, but cancel it. You can then use its ReleaseID to compare with the last release
  • Add a stage to your release that is skipped, only run on a manual request, and use this as the comparison stage to look for differences

To run the tool

  1. In the settings file make sure the TeamFoundationCollectionUri, TeamProject, BuildID, EnvironmentName (a stage in your process), ReleaseID and releaseDefinitionId are set for the release you wish to run against.
  2. Run the command

    node .\GenerateReleaseNotesConsoleTester.js release-settings.json <your-Azure-DevOps-PAT> <Optional: your GitHub PAT>

  3. Assuming you are using the sample settings you should get an output.md file with your release notes.

Hope you find it useful

New feature for Cross Platform Release notes – get parent and child work items

I have added another new feature to my Cross Platform release note generator. Now, when using Handlebars based templates, you can optionally get the parent or child work items for any work item associated with the build/release.

To enable the feature, as it is off by default, you need to set the getParentsAndChildren: true parameter for the task, either in YAML or in the handlebars section of the configuration.
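In YAML that means adding the parameter to the task’s inputs, mirroring the task usage shown in my earlier posts (other inputs elided):

- task: XplatGenerateReleaseNotes@3
  inputs:
    outputfile: '$(Build.ArtifactStagingDirectory)\releasenotes.md'
    templateLocation: 'InLine'
    getParentsAndChildren: true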

This will add an extra array that the template can access, relatedWorkItems. This contains all the work items associated with the build/release, plus their direct parents and children. These can then be accessed in the template

{{#forEach this.workItems}}
{{#if isFirst}}### WorkItems {{/if}}
* **{{this.id}}**  {{lookup this.fields 'System.Title'}}
- **WIT** {{lookup this.fields 'System.WorkItemType'}}
- **Tags** {{lookup this.fields 'System.Tags'}}
- **Assigned** {{#with (lookup this.fields 'System.AssignedTo')}} {{displayName}} {{/with}}
- **Description** {{{lookup this.fields 'System.Description'}}}
- **Parents**
{{#forEach this.relations}}
{{#if (contains this.attributes.name 'Parent')}}
{{#with (lookup_a_work_item ../../relatedWorkItems  this.url)}}
      - {{this.id}} - {{lookup this.fields 'System.Title'}}
{{/with}}
{{/if}}
{{/forEach}}
- **Children**
{{#forEach this.relations}}
{{#if (contains this.attributes.name 'Child')}}
{{#with (lookup_a_work_item ../../relatedWorkItems  this.url)}}
      - {{this.id}} - {{lookup this.fields 'System.Title'}}
{{/with}}
{{/if}}
{{/forEach}}
{{/forEach}}

This is a complex way to present the extra work items, but it is very flexible.

Hope people find the new feature useful.

And another new feature for my Cross Platform Release Notes Azure DevOps Task – commit/changeset file details

The addition of Handlebars based templating to my Cross Platform Release Notes Task has certainly made it much easier to release new features. The legacy templating model, it seems, is what had been holding development back.

In the past month or so I have added support for generating release notes based on PRs and Tests. I am now happy to say I have just added support for the actual files associated with a commit or changeset.

Enriching the commit/changeset data with the details of the files edited has been a repeated request over the years. The basic commit/changeset object only detailed the commit message and the author. With this new release of my task there is now a .changes property on the commit objects that exposes the details of the actual files in the commit/changeset.

This can be used in a Handlebars based template as follows

# Global list of CS ({{commits.length}})
{{#forEach commits}}
{{#if isFirst}}### Associated commits{{/if}}
* **ID {{this.id}}**
   -  **Message:** {{this.message}}
   -  **Committed by:** {{this.author.displayName}}
   -  **FileCount:** {{this.changes.length}}
{{#forEach this.changes}}
      -  **File path (use this for TFVC or TfsGit):** {{this.item.path}}
      -  **File filename (use this for GitHub):** {{this.filename}}
      -  **This will show all the properties available for the file:** {{json this}}
{{/forEach}}
{{/forEach}}

Another feature for my Cross Platform Release Notes Azure DevOps Extension – access to test results

Over the weekend I got another new feature for my Cross Platform Release Notes Azure DevOps Extension working. The test results associated with build artefacts or releases are now exposed to Handlebars based templates.

The new objects you can access are:

  • In builds
    • tests – all the tests run as part of the current build
  • In releases
    • tests – all the tests run as part of any of the current build artefacts, or run within the release environment prior to the release notes task
    • releaseTests – all the tests run within a release environment
    • builds.tests – all the tests run as part of any build artefacts, grouped by build artefact

These can be used as follows in a release template

# Builds with associated WI/CS/Tests ({{builds.length}})
{{#forEach builds}}
{{#if isFirst}}## Builds {{/if}}
## Build {{this.build.buildNumber}}
{{#forEach this.commits}}
{{#if isFirst}}### Commits {{/if}}
- CS {{this.id}}
{{/forEach}}
{{#forEach this.workitems}}
{{#if isFirst}}### Workitems {{/if}}
- WI {{this.id}}
{{/forEach}}
{{#forEach this.tests}}
{{#if isFirst}}### Tests {{/if}}
- Test {{this.id}}
-  Name: {{this.testCase.name}}
-  Outcome: {{this.outcome}}
{{/forEach}}
{{/forEach}}


# Global list of tests ({{tests.length}})
{{#forEach tests}}
{{#if isFirst}}### Tests {{/if}}
* **ID {{this.id}}**
-  Name: {{this.testCase.name}}
-  Outcome: {{this.outcome}}
{{/forEach}}


For more details see the documentation in the WIKI

Announcing the deprecation of my Azure DevOps Pester Extension as it has been migrated to the Pester Project and republished under a new ID

Back in early 2016 I wrote an Azure DevOps Extension to wrap Pester, the PowerShell unit testing tool. Over the years I updated it, and then passed its support over to someone who knows much more about PowerShell and Pester than I do, Chris Gardner, who has continued to develop it.

With the advent of cross-platform PowerShell Core, we realized that the current extension implementation had a fundamental limitation: Azure DevOps tasks can only be executed by the agent using the Windows version of PowerShell or Node, and there is no option for execution by PowerShell Core, and probably never will be. As Pester is now supported on PowerShell Core, this was a serious limitation.

To get around this problem I wrote a Node wrapper that allows the existing PowerShell task to be executed using Node, by running a Node script that shells out to PowerShell or PowerShell Core, a technique I have since used to make other extensions of mine cross-platform.
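The wrapper is only a few lines of Node. The following is not the extension’s actual code, just a minimal TypeScript sketch of the technique: try PowerShell Core (pwsh) first, then fall back to Windows PowerShell; the script name is illustrative.

import { spawnSync } from "child_process";

// Shell out to the first PowerShell host found on the agent
function runPowerShellScript(scriptPath: string): number {
  for (const shell of ["pwsh", "powershell"]) {
    const result = spawnSync(shell, ["-NoProfile", "-File", scriptPath], {
      stdio: "inherit",
    });
    // result.error is only set if the host executable could not be started
    if (!result.error) {
      return result.status ?? 1;
    }
  }
  console.error("No PowerShell host found on the agent");
  return 1;
}

process.exit(runPowerShellScript("RunPester.ps1")); // illustrative script name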

Around this time we started to discuss whether my GitHub repo was really the best home for this Pester extension, and decided that this major update to provide cross-platform support was a good point to move it to a new home under the ownership of the Pester Project.

So, given all that history, I am really pleased to say that I am deprecating my Pester Extension. Though my extension is not going away, and will continue to work as it currently does, it will not be updated again, and all users should consider swapping over to the new cross-platform version of the extension. This is the next generation of the same code base, but now owned and maintained by the Pester project (well, still Chris in reality).

Unfortunately, Azure DevOps provides no way to migrate ownership of an extension, so swapping to the new version will require some work. If you are using YAML, the conversion is only a case of changing the task name/ID. If you are using UI based builds or releases, you need to add the new task and do some copy typing of parameters. The good news is that all the parameter options remain the same, so it should be a quick job.
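For a YAML pipeline the change looks something like this; the task IDs and input shown below are placeholders rather than the real fully-qualified names, so check the two Marketplace entries for the exact values.

steps:
  # before: the deprecated extension's task (placeholder ID)
  # - task: OldPublisher.PesterExtension.Pester-task.Pester@9
  # after: the Pester project's version - same inputs, new ID (placeholder)
  - task: PesterProject.PesterExtension.Pester-task.Pester@9
    inputs:
      scriptFolder: '$(System.DefaultWorkingDirectory)\Tests'  # inputs unchanged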

Also, please note that any outstanding issues not fixed in the new release have been migrated over to the extension’s new home; they have not been forgotten.

So I hope you all like the new enhanced version of the Pester Extension, and thanks to Chris for sorting out the migration and for all his work supporting it.