New feature for Cross Platform Release notes – get parent and child work items

I have added another new feature to my Cross Platform release note generator. Now, when using Handlebars based templates, you can optionally get the parent or child work items for any work item associated with a build/release.

To enable the feature, as it is off by default, you need to set the getParentsAndChildren: true parameter for the task, either in YAML or in the handlebars section of the configuration.
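
For example, in a YAML pipeline the switch might be set something like this (a minimal sketch; the task name, version and the other inputs shown are assumptions based on the marketplace listing, not taken from this post):

- task: XplatGenerateReleaseNotes@3    # task name/version assumed
  inputs:
    outputfile: '$(Build.ArtifactStagingDirectory)/releasenotes.md'   # assumed input
    templateLocation: 'File'                                          # assumed input
    templatefile: 'release-note-template.md'                          # assumed input
    getParentsAndChildren: true   # the new switch, off by default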

This will add an extra array, relatedWorkItems, that the template can access. It contains all the work items associated with the build/release plus their direct parents and children, which can then be accessed in the template:

{{#forEach this.workItems}}
{{#if isFirst}}### WorkItems {{/if}}
* **{{this.id}}** {{lookup this.fields 'System.Title'}}
- **WIT** {{lookup this.fields 'System.WorkItemType'}}
- **Tags** {{lookup this.fields 'System.Tags'}}
- **Assigned** {{#with (lookup this.fields 'System.AssignedTo')}} {{displayName}} {{/with}}
- **Description** {{{lookup this.fields 'System.Description'}}}
- **Parents**
{{#forEach this.relations}}
{{#if (contains this.attributes.name 'Parent')}}
{{#with (lookup_a_work_item ../../relatedWorkItems this.url)}}
      - {{this.id}} - {{lookup this.fields 'System.Title'}}
{{/with}}
{{/if}}
{{/forEach}}
- **Children**
{{#forEach this.relations}}
{{#if (contains this.attributes.name 'Child')}}
{{#with (lookup_a_work_item ../../relatedWorkItems this.url)}}
      - {{this.id}} - {{lookup this.fields 'System.Title'}}
{{/with}}
{{/if}}
{{/forEach}}
{{/forEach}}

This is a complex way to present the extra work items, but it is very flexible.

Hope people find the new feature useful.

And another new feature for my Cross Platform Release Notes Azure DevOps Task – commit/changeset file details

The addition of Handlebars based templating for my Cross Platform Release Notes Task has certainly made it much easier to release new features. The legacy templating model, it seems, is what had been holding development back.

In the past month or so I have added support for generating release notes based on PRs and Tests. I am now happy to say I have just added support for the actual files associated with a commit or changeset.

Enriching the commit/changeset data with the details of the files edited has been a repeated request over the years. The basic commit/changeset object only detailed the commit message and the author. With this new release of my task there is now a .changes property on the commit objects that exposes the details of the actual files in the commit/changeset.

This is used in a Handlebars based template as follows:

# Global list of CS ({{commits.length}})
{{#forEach commits}}
{{#if isFirst}}### Associated commits{{/if}}
* **ID {{this.id}}**
   - **Message:** {{this.message}}
   - **Committed by:** {{this.author.displayName}}
   - **FileCount:** {{this.changes.length}}
{{#forEach this.changes}}
      - **File path (use this for TFVC or TfsGit):** {{this.item.path}}
      - **File filename (use this for GitHub):** {{this.filename}}
      - **All properties available for the file:** {{json this}}
{{/forEach}}
{{/forEach}}

Another feature for my Cross Platform Release Notes Azure DevOps Extension – access to test results

Over the weekend I got another new feature for my Cross Platform Release Notes Azure DevOps Extension working. The test results associated with build artefacts or releases are now exposed to Handlebars based templates.

The new objects you can access are:

  • In builds
    • tests – all the tests run as part of the current build
  • In releases
    • tests – all the tests run as part of any current build artefacts, or run prior to the release notes task within the release environment
    • releaseTests – all the tests run within a release environment
    • builds.tests – all the tests run as part of any build artefacts, grouped by build artefact

These can be used as follows in a release template:

# Builds with associated WI/CS/Tests ({{builds.length}})
{{#forEach builds}}
{{#if isFirst}}## Builds {{/if}}
## Build {{this.build.buildNumber}}
{{#forEach this.commits}}
{{#if isFirst}}### Commits {{/if}}
- CS {{this.id}}
{{/forEach}}
{{#forEach this.workitems}}
{{#if isFirst}}### Workitems {{/if}}
- WI {{this.id}}
{{/forEach}}
{{#forEach this.tests}}
{{#if isFirst}}### Tests {{/if}}
- Test {{this.id}}
- Name: {{this.testCase.name}}
- Outcome: {{this.outcome}}
{{/forEach}}
{{/forEach}}


# Global list of tests ({{tests.length}})
{{#forEach tests}}
{{#if isFirst}}### Tests {{/if}}
* **ID {{this.id}}**
- Name: {{this.testCase.name}}
- Outcome: {{this.outcome}}
{{/forEach}}


For more details see the documentation in the WIKI

Deploying Files to the Logged-In User’s Profile Using Configuration Manager

I’ve had a number of cases where it would have been useful to deploy files to a logged-in user’s profile on Windows using Configuration Manager (SCCM). However, because the ‘installation’ runs in the local system context, using parameters such as %AppData% in a batch file doesn’t work, and this has always been an issue.

We recently wanted to push some custom backgrounds out to Microsoft Teams and I thought I’d spend a while trying to solve this issue. The following may not be the most elegant solution, but it seems to work reliably for me!

  1. We’re going to use PowerShell to detect the presence of a file. Copy the following into a file for the moment; we’ll need this PowerShell snippet shortly:
    Function CurrentUser {
         $LoggedInUser = get-wmiobject win32_computersystem | select username
         $LoggedInUser = [string]$LoggedInUser
         $LoggedInUser = $LoggedInUser.split("=")
         $LoggedInUser = $LoggedInUser[1]
         $LoggedInUser = $LoggedInUser.split("}")
         $LoggedInUser = $LoggedInUser[0]
         $LoggedInUser = $LoggedInUser.split("\")
         $LoggedInUser = $LoggedInUser[1]
         Return $LoggedInUser
    }

  2. Add the following detection code to the same file, after the function:

    $user = CurrentUser
    start-sleep 30

    $AppPath = "C:\Users\" + $user + "\AppData\Roaming\Microsoft\Teams\Backgrounds\Uploads\CustomBackground.png"
    If (Test-Path $AppPath) {
         Write-Host "The application is installed"
    }

    We need the sleep in the file because the detection runs very soon after the deployment, and we want to be sure that the file(s) we’re going to copy are in place before attempting the detection. The script extracts the username (including domain) of the currently logged-in user, then separates the actual username from the returned value, allowing it to be inserted into the path used to detect the file.
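
    To make the string handling clearer, here is a worked example with made-up domain and username values:

    # What the function sees, step by step (illustrative values only):
    # [string](get-wmiobject win32_computersystem | select username)
    #                  -> '@{username=DOMAIN\jbloggs}'
    # .split("=")[1]   -> 'DOMAIN\jbloggs}'
    # .split("}")[0]   -> 'DOMAIN\jbloggs'
    # .split("\")[1]   -> 'jbloggs'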

  3. Create a PowerShell file similar to the detection script to perform the file deployment:
    Function CurrentUser {
         $LoggedInUser = get-wmiobject win32_computersystem | select username
         $LoggedInUser = [string]$LoggedInUser
         $LoggedInUser = $LoggedInUser.split("=")
         $LoggedInUser = $LoggedInUser[1]
         $LoggedInUser = $LoggedInUser.split("}")
         $LoggedInUser = $LoggedInUser[0]
         $LoggedInUser = $LoggedInUser.split("\")
         $LoggedInUser = $LoggedInUser[1]
         Return $LoggedInUser
    }

  4. Add the following deployment code to the same file, after the function:

    $user = CurrentUser

    $AppPath = "C:\Users\" + $user + "\AppData\Roaming\Microsoft\Teams\Backgrounds\Uploads"
    cp ".\Backgrounds\*.png" -Destination $AppPath -Confirm:$false -Force

    I saved mine as ‘Deploy-TeamsBackgrounds.ps1’. This script uses the same logic as the detection script above to determine the username, allowing us to use this in the path to copy the file(s) to.

  5. Create a batch file to call the PowerShell file that does the actual file deployment:
    Powershell -NoProfile -ExecutionPolicy Bypass -file %~dp0Deploy-TeamsBackgrounds.ps1

    I saved this as ‘Deploy-TeamsBackgrounds.bat’. This batch file runs PowerShell and passes in the filename of the script used to deploy the files to the user’s profile location. Ensure that the batch file and the deployment PowerShell script are in the same location, somewhere suitable for SCCM to use as the application source (usually a share on the SCCM server). I then have a folder called ‘Backgrounds’ in the same location that contains the actual image files to be copied.

  6. Create a new application in Configuration Manager, selecting ‘Manually specify the application information’. Enter information on the name, publisher and version of the application, then add any required information for the Software Center entry.
     Click ‘Add’ to add a deployment type and select ‘Script installer’ from the drop-down; this will automatically select the ‘Manually specify the deployment type information’ option. Provide a name for the deployment type, then specify the location that contains the files created earlier and the batch file as the command used to install the content.
     For the detection method, select ‘Use a custom script to detect the presence of this deployment type’, then click the ‘Edit…’ button. Select PowerShell from the Script type drop-down, paste in the detection script created earlier, and click OK to close the script editor window.
     Define the user experience (note that I saw a warning shown at this point), then define any requirements (e.g. only deploy on Windows 10) and dependencies, and click through to generate the application.
  7. Distribute the content, then configure a deployment. I used the ‘All Staff’ user collection to deploy to.

Once the application appears in the Software Center, an end-user can click to install the custom backgrounds and the ‘application’ is downloaded to the Configuration Manager cache, then the batch file is triggered, which in turn executes the PowerShell script to copy the custom background images to the correct location in the user’s profile. The detection script then runs and detects the presence of the specified file.

Running SonarQube for a .NET Core project in Azure DevOps YAML multi-stage pipelines

We have been looking at migrating some of our common .NET Core libraries into new NuGet packages and have taken the chance to change our build process to use Azure DevOps multi-stage pipelines. Whilst doing this I hit a problem getting SonarQube analysis working; the documentation I found was a little confusing.

The Problem

As part of the YAML pipeline re-design we were moving away from building Visual Studio .sln solution files and swapping to the .NET Core command line for building and testing .csproj files. Historically we had used the SonarQube build tasks, which can be found in the Azure DevOps Marketplace, to control SonarQube analysis. However, when we used these tasks in the new YAML pipeline we quickly found that the SonarQube analysis failed, saying it could find no projects:

##[error]No analysable projects were found. SonarQube analysis will not be performed. Check the build summary report for details.

So I next swapped to using the SonarScanner for .NET Core, assuming the issue was down to not using .NET Core commands. This gave YAML as follows:

- task: DotNetCoreCLI@2
  displayName: 'Install Sonarscanner'
  inputs:
    command: 'custom'
    custom: 'tool'
    arguments: 'install --global dotnet-sonarscanner --version 4.9.0'

- task: DotNetCoreCLI@2
  displayName: 'Begin Sonarscanner'
  inputs:
    command: 'custom'
    custom: 'sonarscanner'
    arguments: 'begin /key:"$(SonarQubeProjectKey)" /name:"$(SonarQubeName)" /d:sonar.host.url="$(SonarQubeUrl)" /d:sonar.login="$(SonarQubeProjectAPIKey)" /version:$(Major).$(Minor)'

…. Build and test the project

- task: DotNetCoreCLI@2
  displayName: 'End Sonarscanner'
  inputs:
    command: 'custom'
    custom: 'sonarscanner'
    arguments: 'end /key:"$(SonarQubeProjectKey)"'

However, this gave exactly the same problem.

The Solution

The solution, it turns out, was nothing to do with either of the ways of triggering SonarQube analysis; it was down to the fact that the .NET Core .csproj files did not have unique GUIDs. Historically this had not been an issue, as GUIDs are automatically injected if you trigger SonarQube analysis via a Visual Studio solution. The move to building using the .NET Core command line was the problem, but the fix was simple: just add a unique GUID to each .csproj file.

<Project Sdk="MSBuild.Sdk.Extras">
  <PropertyGroup>
     <TargetFramework>netstandard2.0</TargetFramework>
     <PublishRepositoryUrl>true</PublishRepositoryUrl>
     <EmbedUntrackedSources>true</EmbedUntrackedSources>
     <ProjectGuid>e2bb4d3a-879c-4472-8ddc-94b2705abcde</ProjectGuid>
  </PropertyGroup>
  …
</Project>
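
If you have many projects to update, a quick PowerShell sketch along these lines (not part of the original fix, and assuming SDK-style projects laid out like the above) can stamp a GUID into any .csproj that lacks one:

# Sketch: add a ProjectGuid to every .csproj under the current folder that lacks one
Get-ChildItem -Recurse -Filter *.csproj | ForEach-Object {
    [xml]$proj = Get-Content $_.FullName
    if (-not $proj.Project.PropertyGroup.ProjectGuid) {
        $guid = $proj.CreateElement('ProjectGuid')
        $guid.InnerText = [guid]::NewGuid().ToString()
        # add the new element to the first PropertyGroup
        $proj.Project.PropertyGroup[0].AppendChild($guid) | Out-Null
        $proj.Save($_.FullName)
    }
}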

Once this was done, either way of running SonarQube worked.

After a bit of thought, I decided to stay with the same tasks I have used historically to trigger analysis. This was for a few reasons:

  • I can use a central Service Connector to manage credentials to access SonarQube
  • The tasks manage the installation and update of the SonarQube tools on the agent
  • I need to pass fewer parameters around due to the use of the service connector
  • I can more easily include the SonarQube analysis report in the build

So my YAML now looks like this:

- task: SonarSource.sonarqube.15B84CA1-B62F-4A2A-A403-89B7A063157.SonarQubePrepare@4
  displayName: 'Prepare analysis on SonarQube'
  inputs:
    SonarQube: Sonarqube
    projectKey: '$(sonarqubeKey)'
    projectName: '$(sonarqubeName)'
    projectVersion: '$(Major).$(Minor)'
    extraProperties: |
      # Additional properties that will be passed to the scanner,
      # Put one key=value per line, example:
      # sonar.exclusions=**/*.bin
      sonar.dependencyCheck.reportPath=$(Build.SourcesDirectory)/dependency-check-report.xml
      sonar.dependencyCheck.htmlReportPath=$(Build.SourcesDirectory)/dependency-check-report.html
      sonar.cpd.exclusions=**/AssemblyInfo.cs,**/*.g.cs
      sonar.cs.vscoveragexml.reportsPaths=$(System.DefaultWorkingDirectory)/**/*.coveragexml
      sonar.cs.vstest.reportsPaths=$(System.DefaultWorkingDirectory)/**/*.trx

… Build & Test with DotNetCoreCLI

- task: SonarSource.sonarqube.6D01813A-9589-4B15-8491-8164AEB38055.SonarQubeAnalyze@4
  displayName: 'Run Code Analysis'

- task: SonarSource.sonarqube.291ed61f-1ee4-45d3-b1b0-bf822d9095ef.SonarQubePublish@4
  displayName: 'Publish Quality Gate Result'

Addendum

Even though I don’t use it in the YAML, I still found a use for the .NET Core SonarScanner commands.

We use the Developer edition of SonarQube, which understands Git branches and PRs. This edition requires that you perform an analysis on the master branch before any analysis of other branches can be done, because branch analysis is measured relative to the quality of the master branch. I have found the easiest way to establish this baseline, even if it is of an empty project, is to run SonarScanner from the command line on my PC, just to set up the base for any PR to be measured against.
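
The command sequence is along these lines (the project key, server URL and token shown are placeholders):

dotnet tool install --global dotnet-sonarscanner
dotnet sonarscanner begin /k:"MyProjectKey" /d:sonar.host.url="https://my.sonarqube.server" /d:sonar.login="<token>"
dotnet build MySolution.sln
dotnet sonarscanner end /d:sonar.login="<token>"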

Announcing the deprecation of my Azure DevOps Pester Extension as it has been migrated to the Pester Project and republished under a new ID

Back in early 2016 I wrote an Azure DevOps Extension to wrap Pester, the PowerShell unit testing tool. Over the years I updated it, and then passed the support of it over to someone who knows much more about PowerShell and Pester than I do, Chris Gardner, who continued to develop it.

With the advent of cross-platform PowerShell Core we realised that the current extension implementation had a fundamental limitation: Azure DevOps tasks can only be executed by the agent using the Windows version of PowerShell or Node. There is no option for execution by PowerShell Core, and probably never will be. As Pester is now supported by PowerShell Core, this was a serious limitation.

To get around this problem I wrote a Node wrapper that allows the existing PowerShell task to be executed using Node, by running a Node script that shells out to PowerShell or PowerShell Core, a technique I have since used to make other extensions of mine cross-platform.
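
The idea, in a much simplified sketch (this is not the actual wrapper code from the extension), is just to probe for PowerShell Core and fall back to Windows PowerShell:

// Simplified sketch of the wrapper idea, not the real extension code
const { spawnSync } = require('child_process');

function findPowerShell() {
    // 'pwsh' is PowerShell Core, 'powershell' is Windows PowerShell
    for (const exe of ['pwsh', 'powershell']) {
        const probe = spawnSync(exe, ['-NoProfile', '-Command', '$PSVersionTable.PSVersion']);
        if (probe.status === 0) return exe;
    }
    throw new Error('No PowerShell found on this agent');
}

// Shell out to whichever PowerShell is available to run the task's original script
const result = spawnSync(findPowerShell(), ['-NoProfile', '-File', 'Pester.ps1'], { stdio: 'inherit' });
process.exit(result.status === null ? 1 : result.status);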

Around this time we started to discuss whether my GitHub repo was really the best home for this Pester extension, and decided that this major update to provide cross-platform support was a good point to move it to a new home under the ownership of the Pester Project.

So, given all that history, I am really pleased to say that I am deprecating my Pester Extension and adding instructions to it explaining that, though my extension is not going away and will continue to work as it currently does, it will not be updated again, and all users should consider swapping over to the new cross-platform version of the extension. This is the next generation of the same code base, now owned and maintained by the Pester Project (well, still Chris in reality).

Unfortunately, Azure DevOps provides no way to migrate ownership of an extension, so swapping to the new version will require some work. If you are using YAML, the conversion is only a case of changing the task name/ID. If you are using UI based builds or releases, you need to add the new task and do some copy typing of parameters. The good news is that all the parameter options remain the same, so it should be a quick job.
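
In YAML the swap is just a one-line change, roughly as below (both task IDs are placeholders, so check the marketplace listings for the real names; the inputs shown are illustrative):

# Old deprecated task (placeholder ID)
# - task: OldPublisher.PesterExtension.PesterTask.Pester@9
# New cross-platform task (placeholder ID); the parameters are unchanged
- task: NewPublisher.PesterExtension.PesterTask.Pester@9
  inputs:
    scriptFolder: '$(System.DefaultWorkingDirectory)\Tests\*'
    resultsFile: '$(System.DefaultWorkingDirectory)\Test-Pester.XML'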

Also please note that any outstanding issues not fixed in the new release have been migrated over to the extension’s new home; they have not been forgotten.

So I hope you all like the new enhanced version of the Pester Extension, and thanks to Chris for sorting the migration and for all his work supporting it.

Fix for ‘System.BadImageFormatException’ when running x64 based tests inside an Azure DevOps Release

This is one of those blog posts I write to remind my future self how I fixed a problem.

The Problem

I have a release that installs VSTest and runs some integration tests that target .NET 4.6 x64. All these tests worked fine in Visual Studio. However, when they were run in a release, I got the following errors for all tests:

2020-04-23T09:30:38.7544708Z vstest.console.exe "C:\agent\_work\r1\a\PaymentServices\drop\testartifacts\PaymentService.IntegrationTests.dll"
2020-04-23T09:30:38.7545688Z /Settings:"C:\agent\_work\_temp\uxykzf03ik2.tmp.runsettings"
2020-04-23T09:30:38.7545808Z /Logger:"trx"
2020-04-23T09:30:38.7545937Z /TestAdapterPath:"C:\agent\_work\r1\a\PaymentServices\drop\testartifacts"
2020-04-23T09:30:39.2634578Z Starting test execution, please wait...
2020-04-23T09:30:39.4783658Z A total of 1 test files matched the specified pattern.
2020-04-23T09:30:40.8660112Z   X Can_Get_MIDs [521ms]
2020-04-23T09:30:40.8684249Z   Error Message:
2020-04-23T09:30:40.8684441Z    Test method PaymentServices.IntegrationTests.ControllerMIDTests.Can_Get_MIDs threw exception:
2020-04-23T09:30:40.8684574Z System.BadImageFormatException: Could not load file or assembly 'PaymentServices, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null' or one of its dependencies. An attempt was made to load a program with an incorrect format.
2020-04-23T09:30:40.8684766Z   Stack Trace:
2020-04-23T09:30:40.8684881Z       at PaymentServices.IntegrationTests.ControllerMIDTests.Can_Get_MIDs()
2020-04-23T09:30:40.9038788Z Results File: C:\agent\_work\_temp\TestResults\svc-devops_SVRHQAPP027_2020-04-23_10_30_40.trx
2020-04-23T09:30:40.9080344Z Total tests: 22
2020-04-23T09:30:40.9082348Z      Failed: 22
2020-04-23T09:30:40.9134858Z ##[error]Test Run Failed.

Solution

I needed to tell vstest.console.exe to run the tests as x64, as opposed to its default of x86. This can be done with the command line override /Platform:x64.

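In the VSTest task this goes in the ‘Other console options’ setting; in YAML the equivalent is roughly this (a sketch, with an assumed test assembly pattern):

- task: VSTest@2
  inputs:
    testAssemblyVer2: '**\*.IntegrationTests.dll'   # assumed pattern
    otherConsoleOptions: '/Platform:x64'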

And more enriching the data available in my Azure DevOps Pipelines Cross Platform Release Notes Task

I have today released another enrichment to the dataset available in my Cross Platform Release Notes Azure Pipeline Task. It now returns an extra array of data that links work items and commits to build artifacts.

So your reporting objects are:

Array Objects

  • workItems – the array of all work items associated with the release
  • commits – the array of all commits associated with the release
  • pullRequests – the array of all PRs referenced by the commits in the release
  • The new one – builds – the array of the build artifacts that CS and WI are associated with. Note that each entry is an object with three properties
    • build – the build details
    • commits – the commits associated with this build
    • workitems – the work items associated with the build

Release objects (only available in a release)

  • releaseDetails – the release details of the release that the task was triggered for.
  • compareReleaseDetails – the previous successful release that comparisons are being made against

Build objects

  • buildDetails – if running in a build, the build details of the build that the task is running in. If running in a release it is the build that triggered the release.

Note: To dump all possible values use the form {{json propertyToDump}}; this runs a custom Handlebars extension to do the expansion.
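
For example, to see every property on the build details object you could put this line in a template:

{{json buildDetails}}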

It is important to realise that these arrays are only available when using the Handlebars form of templating. You can find samples here.

Further enriching the data available in my Azure DevOps Pipelines Cross Platform Release Notes Task

I recently posted about enriching the data available in my Azure DevOps Pipelines Cross Platform Release Notes Task by adding Pull Request information. Well, that first release was fairly limited, only working for PR validation builds, so I have made more improvements and shipped a newer version.

The task will now, as well as checking for a PR build trigger, try to associate the commits of a build/release pipeline with any completed PRs in the repo. This is done using the Last Merge Commit ID and, from my tests, seems to work for the various types of PR e.g. squash, merge, rebase and semi-linear.

The resultant set of PRs is made available to the release notes template processor as an array. However, there is a difference between the existing arrays for Work Items and Commits and the new one for Pull Requests: the new one is only available if you are using the new Handlebars based templating mode.

You would add a block in the general form….

{{#forEach pullRequests}}
{{#if isFirst}}### Associated Pull Requests (only shown if PR) {{/if}}
*  **PR {{this.id}}**  {{this.title}}
{{/forEach}}

The reason I have chosen to only support Handlebars is that it makes development so much easier and provides a more flexible solution, given all the Handlebars helpers available. I think this might be the first tentative step towards deprecating my legacy templating solution in favour of only shipping Handlebars support.