Swapping my Azure DevOps Pipeline Extensions release process to use Multistage YAML pipelines

In the past I have documented the build and release process I use for my Azure DevOps Pipeline Extensions and also detailed how I have started to move the build phases to YAML.

Well now I consider that multistage YAML pipelines are mature enough to allow me to do my whole release pipeline in YAML, hence this post.


My pipeline is made up of a number of stages; you can find a sample pipeline here. Note that I have made every effort to extract values into variable groups to aid reuse of the pipeline definition, and I have added documentation as to where variables are stored and what they are used for.

The stages are as follows

Build

The build phase does the following (a YAML sketch of this stage follows the list)

  • Updates all the TASK.JSON files so that the help text has the correct version number
  • Calls a YAML template (build-Node-task) that performs all the tasks needed to transpile a TypeScript based task – if an extension contains multiple tasks this template is called a number of times
    • Get NPM packages
    • Run Snyk to check for vulnerabilities – if any vulnerabilities are found the build fails
    • Lint and transpile the TypeScript – if any issues are found the build fails
    • Run any unit tests and publish the results – if any tests fail the build fails
    • Package up the task (remove dev dependencies)
  • Download the TFX client
  • Package up the extension VSIX package and publish it as a pipeline artifact.
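
As a rough sketch (the template file name, parameter name and agent pool here are assumptions, not the exact values from my repo), the Build stage is laid out something like this:

stages:
- stage: Build
  jobs:
  - job: Build
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    # update the task.json files so the help text carries the correct version number (details omitted)
    - script: echo Update task.json version numbers here
      displayName: 'Version stamp task.json files'
    # lint, test, transpile and package the TypeScript task via the shared template
    - template: yamltemplates/build-node-task.yml   # template file name assumed
      parameters:
        taskPath: 'Tasks/MyTask'                    # parameter name assumed
    # download the TFX client, then package and publish the VSIX
    - script: npm install -g tfx-cli
      displayName: 'Install tfx-cli'
    - script: tfx extension create --manifest-globs vss-extension.json --output-path $(Build.ArtifactStagingDirectory)
      displayName: 'Package the extension VSIX'
    - publish: $(Build.ArtifactStagingDirectory)
      artifact: vsix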

Private

The private phase does the following

  • Using another YAML template (publish-extension), publishes the extension to the Azure DevOps Marketplace, but with flags set so it is private and only accessible to my account for testing
    • Downloads the TFX client
    • Publishes the extension to the Marketplace

This phase is done as a deployment job and is linked to an environment. However, no special approval requirements are set on this environment, because I am happy for the release to go to the private instance as long as the build stage completes without error.
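
As a rough sketch (the environment, template and parameter names are assumptions), the Private stage is a deployment job like this:

- stage: Private
  dependsOn: Build
  jobs:
  - deployment: PublishPrivate
    environment: 'Private'   # no approvals or checks are configured on this environment
    pool:
      vmImage: 'ubuntu-latest'
    strategy:
      runOnce:
        deploy:
          steps:
          # shared template: downloads the TFX client and publishes the VSIX to the Marketplace
          - template: yamltemplates/publish-extension.yml   # template file name assumed
            parameters:
              extensionVisibility: 'private'                # parameter name assumed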

Test

This is where the pipeline gets interesting. The test phase does the following

  • Runs any integration tests. These could be anything, depending on the extension being deployed. Unfortunately there is no option at present in multistage pipelines for a manual task to say ‘do the manual tests’, but you could simulate something similar by sending an email or the like.

The clever bit here is that I don’t want this stage to run until the new private version of the extension has been published and is available; there can be a delay between TFX saying the extension is published and it actually being downloadable by an agent. This can cause a problem where you think you are testing the newly published version of the extension, but are in fact still running against the previous one. To get around this problem I have implemented a check on the environment that this stage’s deployment job is linked to. The check runs an Azure Function to verify the version of the extension in the Marketplace; this is exactly the same Azure Function I already use in my UI based pipelines to perform the same job.

The only issue here is that this Azure Function is used as an exit gate in my UI based pipelines, to not allow the pipeline to exit the private stage until the extension is published. I cannot do this in a multistage YAML pipeline, as environment checks are only done on entry to the environment. This means I have had to use an extra Test stage to associate the entry check with. This was set up as follows

  • Create a new environment
  • Click the ellipsis (…) and pick ‘Approvals and checks’
  • Add a new Azure Function check
  • Provide the details, documented in my previous post, to link to your Azure Function. Note that you can, in the ’Control options’ section of the configuration, link to a variable group. This is a good place to store all the values you need to provide:
    • URL of the Azure Function
    • Key to use the function
    • The function header
    • The body – this one is interesting. You need to provide the build number and the GUID of a task in the extension for my Azure Function. It would be really good if both of these could be picked up from the pipeline trying to use the environment, as this would allow a single ‘test’ environment to be created for use by all my extensions, in the same way there is only a single ‘private’ and a single ‘public’ environment. However, there is a problem: the build number is picked up OK, but as far as I can see I cannot access custom pipeline variables, so I cannot get the task GUID I need dynamically. I assume this is because the environment entry check is run outside of the pipeline. The only solution I can find is to place the task GUID as a hard-coded value in the check declaration (or, I suppose, in the variable group). The downside of this is that I have to have an environment dedicated to each extension, each with a different task GUID. Not perfect, but not too much of a problem.
    • In the ‘Advanced’ section, set the logic used to evaluate the check’s result
    • In ‘Control options’, link to the variable group containing any variables used.
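
With the environment and its Azure Function check in place, the Test stage itself is just another deployment job targeting that environment; remember the check lives on the environment, not in the YAML. A minimal sketch (names assumed):

- stage: Test
  dependsOn: Private
  jobs:
  - deployment: IntegrationTests
    # the Azure Function check configured on this environment blocks the job
    # until the new extension version is actually available in the Marketplace
    environment: 'MyExtension-Test'   # environment name assumed
    pool:
      vmImage: 'windows-latest'
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo Run the integration tests here
            displayName: 'Integration tests (placeholder)'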

Documentation

The documentation stage again uses a YAML template (generate-wiki-docs) to generate the wiki documentation for the extension.

Public

The public stage is also a deployment job and linked to an environment. This environment has an approval set so I have to approve any release of the public version of the extension.

This stage does the same as the private stage, except that the extension is published publicly to the Marketplace rather than privately.

Summary

It took a bit of trial and error to get this going, but I think I have a good solution now. The fact that the bulk of the work is done using shared templates means I should get good reuse of the work I have done. I am sure I will be able to improve the templates as time goes on, but it is a good start.

My Azure DevOps Pipeline is not triggering on a GitHub Pull request – fixed

I recently hit a problem where some of my Azure DevOps YAML pipelines, which I use to build my Azure DevOps Pipeline Extensions, were not triggering when a new PR was created on GitHub.

I did not get to the bottom of why this was happening, but I found a fix.

  • Check for, and make a note of, any UI-declared variables in the Azure DevOps YAML pipeline that is not triggering
  • Delete the pipeline
  • Re-add the pipeline, linking to the YAML file hosted on GitHub. You might be asked to re-authorise the link between Azure DevOps Pipelines and GitHub.
  • Re-enter any variables that are declared via the Pipelines UI and save the changes

Your pipeline should start to be triggered again

Enriching the data available in my Azure DevOps Pipelines Cross Platform Release Notes Task

A common request for my Generate Release Notes Tasks is to enrich the data available beyond the basic build, work item and commit/changeset details. I have resisted these requests as it felt like the start of a never-ending journey. However, I have now relented and added the option to see any available pull request information.

This feature is limited: you obviously have to be using artifacts that are linked to a Git repo, and the repo has to be hosted on Azure DevOps. This won’t meet everyone’s needs, but it is a start.

What was already available

It turns out there was already a means to get a limited set of PR details from a build. You use the form

**Build Trigger PR Number**: ${buildDetails.triggerInfo['pr.number']}

or in handlebars format

**Build Trigger PR Number**: {{lookup buildDetails.triggerInfo 'pr.number'}} 

The improvements

That said I have improved the options. There is now a new `prDetails` object available to the template.

If you use the dump option

${JSON.stringify(prDetails)}    

You can see the fields available

{
     "repository": {
         "id": "bebd0ae2-405d-4c0a-b9c5-36ea94c1bf59",
         "name": "VSTSBuildTaskValidation",
         "url": "https://richardfennell.visualstudio.com/670b3a60-2021-47ab-a88b-d76ebd888a2f/_apis/git/repositories/bebd0ae2-405d-4c0a-b9c5-36ea94c1bf59",
         "project": {
             "id": "670b3a60-2021-47ab-a88b-d76ebd888a2f",
             "name": "GitHub",
             "description": "A container for GitHub CI/CD processes",
             "url": "https://richardfennell.visualstudio.com/_apis/projects/670b3a60-2021-47ab-a88b-d76ebd888a2f",
             "state": "wellFormed",
             "revision": 411511726,
             "visibility": 2,
             "lastUpdateTime": "2019-10-10T20:35:51.85Z"
         },
         "size": 9373557,
         "remoteUrl": "https://richardfennell.visualstudio.com/DefaultCollection/GitHub/_git/VSTSBuildTaskValidation",
         "sshUrl": "richardfennell@vs-ssh.visualstudio.com:v3/richardfennell/GitHub/VSTSBuildTaskValidation",
         "webUrl": "https://richardfennell.visualstudio.com/DefaultCollection/GitHub/_git/VSTSBuildTaskValidation"
     },
     "pullRequestId": 4,
     "codeReviewId": 4,
     "status": 1,
     "createdBy": {
         "displayName": "Richard Fennell (Work MSA)",
         "url": "https://spsprodeus24.vssps.visualstudio.com/Ac0efb61e-a937-42a0-9658-649757d55d46/_apis/Identities/b1fce0e9-fbf4-4202-bc09-a290def3e98b",
         "_links": {
             "avatar": {
                 "href": "https://richardfennell.visualstudio.com/_apis/GraphProfile/MemberAvatars/aad.NzQzY2UyODUtN2Q0Ny03YjNkLTk0ZGUtN2Q0YjA1ZGE5NDdj"
             }
         },
         "id": "b1fce0e9-fbf4-4202-bc09-a290def3e98b",
         "uniqueName": "bm-richard.fennell@outlook.com",
         "imageUrl": "https://richardfennell.visualstudio.com/_api/_common/identityImage?id=b1fce0e9-fbf4-4202-bc09-a290def3e98b",
         "descriptor": "aad.NzQzY2UyODUtN2Q0Ny03YjNkLTk0ZGUtN2Q0YjA1ZGE5NDdj"
     },
     "creationDate": "2020-04-04T10:44:59.566Z",
     "title": "Added test.txt",
     "description": "Added test.txt",
     "sourceRefName": "refs/heads/branch2",
     "targetRefName": "refs/heads/master",
     "mergeStatus": 3,
     "isDraft": false,
     "mergeId": "f76a6556-8b4f-44eb-945a-9350124f067b",
     "lastMergeSourceCommit": {
         "commitId": "f43fa4de163c3ee0b4f17b72a659eac0d307deb8",
         "url": "https://richardfennell.visualstudio.com/670b3a60-2021-47ab-a88b-d76ebd888a2f/_apis/git/repositories/bebd0ae2-405d-4c0a-b9c5-36ea94c1bf59/commits/f43fa4de163c3ee0b4f17b72a659eac0d307deb8"
     },
     "lastMergeTargetCommit": {
         "commitId": "829ab2326201c7a5d439771eef5a57f58f94897d",
         "url": "https://richardfennell.visualstudio.com/670b3a60-2021-47ab-a88b-d76ebd888a2f/_apis/git/repositories/bebd0ae2-405d-4c0a-b9c5-36ea94c1bf59/commits/829ab2326201c7a5d439771eef5a57f58f94897d"
     },
     "lastMergeCommit": {
         "commitId": "53f393cae4ee3b901bb69858c4ee86cc8b466d6f",
         "author": {
             "name": "Richard Fennell (Work MSA)",
             "email": "bm-richard.fennell@outlook.com",
             "date": "2020-04-04T10:44:59.000Z"
         },
         "committer": {
             "name": "Richard Fennell (Work MSA)",
             "email": "bm-richard.fennell@outlook.com",
             "date": "2020-04-04T10:44:59.000Z"
         },
         "comment": "Merge pull request 4 from branch2 into master",
         "url": "https://richardfennell.visualstudio.com/670b3a60-2021-47ab-a88b-d76ebd888a2f/_apis/git/repositories/bebd0ae2-405d-4c0a-b9c5-36ea94c1bf59/commits/53f393cae4ee3b901bb69858c4ee86cc8b466d6f"
     },
     "reviewers": [],
     "url": "https://richardfennell.visualstudio.com/670b3a60-2021-47ab-a88b-d76ebd888a2f/_apis/git/repositories/bebd0ae2-405d-4c0a-b9c5-36ea94c1bf59/pullRequests/4",
     "supportsIterations": true,
     "artifactId": "vstfs:///Git/PullRequestId/670b3a60-2021-47ab-a88b-d76ebd888a2f%2fbebd0ae2-405d-4c0a-b9c5-36ea94c1bf59%2f4"
}

In templates this new object can be used as follows

**PR Title**: ${prDetails.title}

or in handlebars format.

**PR Details**: {{prDetails.title}}

It will be interesting to hear feedback from the real world as opposed to test harnesses.

Experiences setting up Azure Active Directory single sign-on (SSO) integration with GitHub Enterprise

Background

GitHub is a great system for individuals and OSS communities, for both public and private projects. However, corporate customers commonly want more control over their system than the standard GitHub offering provides. It is for this reason that GitHub offers GitHub Enterprise.

For most corporates, the essential feature that GitHub Enterprise offers is the use of Single Sign-On (SSO), i.e. allowing users to log in to GitHub using their corporate directory accounts.

I wanted to see how easy this was to setup when you are using Azure Active Directory (AAD).

Luckily there is a step-by-step tutorial from Microsoft on how to set this up. That said, though detailed, the tutorial has a strange structure in that its pictures show the default values, not the correct values. Hence, the tutorial requires close reading; don’t just look at the pictures!

Even with close reading, I still hit a problem, all of my own making, as I went through this tutorial.

The Issue – a stray / in a URL

I entered all the AAD URLs and certs as instructed (or so I thought) by the tutorial into the Security page of GitHub Enterprise.

When I pressed the ‘Validate’ button in GitHub, to test the SSO settings, I got an error

‘The client has not listed any permissions for ‘AAD Graph’ in the requested permissions in the client’s application registration’

This sent me down a rabbit hole looking at user permissions, which wasted a lot of time.

However, it turns out the issue was that I had a // in a URL where it should have been a /. This was because I had made a cut-and-paste error when editing the tutorial’s sample URL and adding my organisation details.

Once I fixed this typo the validation worked, I was able to complete the setup, and I could then invite my AAD users to my GitHub Enterprise organisation.

Summary

So the summary is: if you follow the tutorial, setting up SSO from AAD to GitHub Enterprise is easy enough to do; just be careful over the detail.

A major new feature for my Cross-platform Release Notes Azure DevOps Pipelines Extension–Handlebars Templating Support

I recently got a very interesting PR for my Cross-platform Release Notes Azure DevOps Pipelines Extension from Kenneth Scott. He had added a new templating engine to the task, Handlebars.

Prior to this PR, the templating in the task was done with a line-by-line evaluation of a template that used my own mark-up. This method worked but had limitations, mostly due to the line-by-line evaluation model. With Kenneth’s PR, the option was added to write your templates in Handlebars, or to stay with my previous templating engine.

Using Handlebars

If you use Handlebars, the template becomes something like

## Notes for release  {{releaseDetails.releaseDefinition.name}}    
**Release Number**  : {{releaseDetails.name}}
**Release completed** : {{releaseDetails.modifiedOn}}     
**Build Number**: {{buildDetails.id}}
**Compared Release Number**  : {{compareReleaseDetails.name}}    

### Associated Work Items ({{workItems.length}})
{{#each workItems}}
*  **{{this.id}}**  {{lookup this.fields 'System.Title'}}
   - **WIT** {{lookup this.fields 'System.WorkItemType'}} 
   - **Tags** {{lookup this.fields 'System.Tags'}}
{{/each}}

### Associated commits ({{commits.length}})
{{#each commits}}
* **ID {{this.id}}**
   -  **Message:** {{this.message}}
   -  **Committed by:** {{this.author.displayName}}
{{/each}}

The whole template is evaluated by the Handlebars engine using its own mark-up to provide a means for looping across arrays and the like.

This seemed a great enhancement to the task. However, we soon realised that it could be better. Handlebars is extensible, so why not allow the extensibility to be used?

Using Handlebars Extensions

I have added extensibility in two ways. Firstly, I have added support for the common handlebars-helpers extension, which adds over 150 helpers. These are just accessed in a template as follows

## To confirm the handlebars-helpers are working
The year is {{year}} 
We can capitalize "foo bar baz" {{capitalizeAll "foo bar baz"}}

Secondly, I have added the ability to provide a block of JavaScript as a task parameter that is loaded as a custom Handlebars extension. So, if you add the following block to the task’s customHandlebarsExtensionCode parameter

module.exports = {foo: function () {return 'Returns foo';}};

You can access it in your templates as

## To confirm our custom extension works
We can call our custom extension {{foo}}

It will be interesting to see how popular this alternative way of templating will be.

Where did all my test results go?

Problem

I recently tripped myself up whilst adding SonarQube analysis to a rather complex Azure DevOps build.

The build has two VsTest steps, both of which were using the same folder for their test result files. When the first VsTest task ran, it created the expected .TRX and .COVERAGE files and then published its results to Azure DevOps; but when the second VsTest task ran, it overwrote this folder, deleting the files already present, before generating and publishing its own results.

This meant that the build itself had all the test results published, but when SonarQube looked for the files for analysis only the second set of tests was present, so its analysis was incorrect.

Solution

The solution was easy: use a different folder for each set of test results.

This gave me a build (the key items are shown below) where one VsTest step does not overwrite the previous results before they can be processed by any third-party tasks such as SonarQube.

steps:
- task: SonarSource.sonarqube.15B84CA1-B62F-4A2A-A403-89B77A063157.SonarQubePrepare@4
  displayName: 'Prepare analysis on SonarQube'
  inputs:
    SonarQube: Sonarqube
    projectKey: 'Services'
    projectName: 'Services'
    projectVersion: '$(major).$(minor)'
    extraProperties: |
     # Additional properties that will be passed to the scanner
     sonar.cs.vscoveragexml.reportsPaths=$(System.DefaultWorkingDirectory)/**/*.coveragexml
     sonar.cs.vstest.reportsPaths=$(System.DefaultWorkingDirectory)/**/*.trx

# … other build steps

- task: VSTest@2
  displayName: 'VsTest – Internal Services'
  inputs:
    testAssemblyVer2: |
     **\*.unittests.dll
     !**\obj\**
    searchFolder: '$(System.DefaultWorkingDirectory)/src/Services'
    resultsFolder: '$(System.DefaultWorkingDirectory)\TestResultsServices'
    overrideTestrunParameters: '-DeploymentEnabled false'
    codeCoverageEnabled: true
    testRunTitle: 'Services Unit Tests'
    diagnosticsEnabled: True
  continueOnError: true

- task: VSTest@2
  displayName: 'VsTest - External'
  inputs:
    testAssemblyVer2: |
     **\*.unittests.dll
     !**\obj\**
    searchFolder: '$(System.DefaultWorkingDirectory)/src/ExternalServices'
    resultsFolder: '$(System.DefaultWorkingDirectory)\TestResultsExternalServices'
    vsTestVersion: 15.0
    codeCoverageEnabled: true
    testRunTitle: 'External Services Unit Tests'
    diagnosticsEnabled: True
  continueOnError: true

- task: BlackMarble.CodeCoverage-Format-Convertor-Private.CodeCoverageFormatConvertor.CodeCoverage-Format-Convertor@1
  displayName: 'CodeCoverage Format Convertor'
  inputs:
    ProjectDirectory: '$(System.DefaultWorkingDirectory)'

- task: SonarSource.sonarqube.6D01813A-9589-4B15-8491-8164AEB38055.SonarQubeAnalyze@4
  displayName: 'Run Code Analysis'

- task: SonarSource.sonarqube.291ed61f-1ee4-45d3-b1b0-bf822d9095ef.SonarQubePublish@4
  displayName: 'Publish Quality Gate Result'

You need to pass a GitHub PAT to create Azure DevOps Agent Images using Packer

I wrote recently about Creating Hyper-V hosted Azure DevOps Private Agents based on the same VM images as used by Microsoft for their Hosted Agent.

As discussed in that post, using this model you will recreate your build agent VMs on a regular basis, as opposed to patching them. When I came to do this recently I found that the Packer image generation was failing with errors related to accessing packages.

Initially, I did not read the error message too closely and just assumed it was an intermittent issue, as I had found you sometimes get random timeouts with this process. However, when the problem did not go away after repeated retries I realised I had a more fundamental problem, so I read the log properly!

It turns out the issue is that you now have to pass a GitHub PAT, with at least read access to packages, to allow Packer to authenticate with GitHub when reading packages.

The process to create the required PAT is as follows

  1. In a browser login to GitHub
  2. Click your profile (top right)
  3. Select Settings
  4. Pick Developer Settings
  5. Pick Personal Access Tokens and create a new one that has read:packages enabled


Once created, this PAT needs to be passed into Packer. If using the settings JSON file this is just another variable

{
  "client_id": "Azure Client ID",
  "client_secret": "Client Secret",
  "tenant_id": "Azure Tenant ID",
  "subscription_id": "Azure Sub ID",
  "object_id": "The object ID for the AAD SP",
  "location": "Azure location to use",
  "resource_group": "Name of resource group that contains Storage Account",
  "storage_account": "Name of the storage account",
  "ssh_password": "A password",
  "install_password": "A password",
  "commit_url": "A URL to be saved in a text file on the VHD, usually the URL of the commit the VHD is based on",
  "github_feed_token": "A PAT"
}

If you are running Packer within a build pipeline, as the other blog post discusses, then the PAT will be another build variable.
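
As a rough sketch (the variable name and file names here are assumptions, not from my actual pipeline), the step might pass the PAT to Packer on the command line from a secret pipeline variable rather than committing it to the settings file:

steps:
# 'GitHubFeedPAT' is an assumed secret pipeline variable holding the PAT;
# the settings file and template names are also placeholders.
- script: >
    packer build
    -var "github_feed_token=$(GitHubFeedPAT)"
    -var-file=packer-settings.json
    build-agent-image.json
  displayName: 'Run Packer, passing the GitHub PAT as a variable'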

Once this change was made I was able to get Packer to run to completion, as expected.

Registration is open for the Global DevOps Bootcamp 2020 @ Black Marble

 


I am really pleased to say that we at Black Marble are again hosting a venue for this year’s edition of the Global DevOps Bootcamp on Saturday May 30th 2020.

For those who have not been to a previous GDBC event at Black Marble, or any of the other 70+ venues across the world, what can you expect on the day?

  • A video keynote from an Industry Leader in the DevOps field
  • A local keynote developing the topics of the bootcamp
  • The remainder of the day is made up of team-based, hands-on exercises.

Last year’s content can be seen here; this year’s will be all new.

It is worth stressing that this event is not a competition. It is a day of learning for people of all levels of experience. We encourage the forming of teams that are cross skill and include all levels of experience. The key aims for the day are that everyone learns and has a great time.

Oh, and did I mention it is a FREE event and lunch will be provided.

For more details have a look at the central GDBC 2020 site.

We do have limited spaces, so if you are interested in booking your place please register here.

Visual Studio Online is back and it is an editor this time!

Visual Studio Online is back. Unlike the previous usage of this name, which was an incarnation of what is now Azure DevOps Services, this is actually an editor for code. Just like you might expect it to be!

The new VSO, which is currently in preview, is a service running in Azure that allows you to in effect run Visual Studio Code on a Linux VM. 

Once you have signed into VSO with an MSA and it has created the required Resource Group and VSO Plan in your Azure subscription, you create one or more ‘environments’ that define the size of the VM to use and which GitHub-hosted repo the environment will edit.


You then start your environment and get the editor experience you would expect from Visual Studio Code running on a Linux instance, but in your browser.


This certainly opens up more use-cases for editing code that is too complex for the GitHub in-browser editing experience, but for which you don’t want to maintain a full local development setup.

Only time will tell how much I use it, but it looks interesting

A technique for porting PowerShell based Azure DevOps Extensions to Node so they can be run cross-platform without a complete re-write

Background

I’ve written a good few extensions for Azure DevOps Services. Most of the early ones I wrote were written in PowerShell, but of late I have tended to use Typescript (targeting Node.JS) for the added cross-platform support.

This has led me to consider whether it is worth the effort to convert all my legacy extensions to support cross-platform usage.

This is, of course, assuming the tasks the extension contains are useful on a non-Windows platform. There is no point porting a Windows-only tool away from PowerShell.

Assuming a conversion is a useful thing, there are two obvious ways to go about it:

  • Completely re-write the task in TypeScript, but I would like to avoid this effort if possible.
  • To use PowerShell Core; this is the option I decided to experiment with.

A Solution

You might think the answer is to just alter the task’s manifest to run PSCore as opposed to PowerShell3. The problem is that the Azure DevOps agent does not provide support for PSCore, only Node or PowerShell3 execution of scripts.

However, there is a way around this limitation. You can shell out to a PSCore session from Node, as is done in the Microsoft PowerShell/PSCore script runner tasks.

I had previously experimented with this technique with my Pester Test Runner. The process I followed was

  1. Alter the PowerShell script to accept all the task parameters it previously got via the SDK calls as script parameters
  2. Alter the task manifest to run a Node script
  3. In the new Node wrapper script get all the Azure DevOps variables and then run the old script via a PSCore shell with the variables passed as parameters

This had worked surprisingly well; the only negative was that all log messages seemed to gain an extra line break, but I can live with that. Oh, and yes, before you ask, there is a new cross-platform version of the Pester test runner on the way, but it is moving home. More on that soon.

However, when I tried the same technique on another extension, specifically my Build Updating one, I hit a problem.

All the Pester task’s operations are against the file system; there is no communication back to the Azure DevOps server. This is not true for the Build tasks: they need to talk to the Azure DevOps API, and to do this they have to get the agent’s access token. This was previously done using the PowerShell Azure DevOps SDK, which in this new way of working is not loaded (the agent loaded it automatically when executing a script via PowerShell3).

After a bit of experimentation trying to load the PowerShell Azure DevOps SDK inside my PowerShell script inside a Node wrapper (a bad idea), I found the best option was to use the Azure DevOps Node SDK to get the token in the wrapper script and pass it into the PowerShell script as an extra parameter (it is then passed into all the functions as needed). This is more of an edit than I wanted, but not too much work – far easier than a complete rewrite.

You can see an example of a wrapper here

In Summary

So I now have a mechanism to port extensions for cross-platform usage without a complete re-write, hence adding value to what has already been created. I guess I have found some OSS work for 2020.