What to do when moving your Azure DevOps organisation from one region to another is delayed.

There are good reasons why you might wish to move an existing Azure DevOps organisation from one region to another. The most common ones are probably:

  • A new Azure DevOps region has become available since you created your organisation that is a ‘better home’ for your projects.
  • New or changing national regulations require your source code to be stored in a specific location.
  • You want your repositories as close to your workers as possible, to reduce network latency.

One of these reasons meant I recently had to move an Azure DevOps organisation, so I followed the documented process. This requires you to:

  1. Whilst logged in as the Azure DevOps organisation owner, open the Azure DevOps Virtual Support Agent
  2. Select the quick action ‘Change Organization Region’
  3. Follow the wizard to pick the new region and the date for the move.

You are warned that there could be a short loss of service during the move. Much of the move is done as a background process; it is only the final switch-over that can interrupt service, hence the interruption being short.

I followed this process, but after the planned move date I found my organisation had not moved. In the Virtual Support Agent, I found the message:

Please note that region move requests are currently delayed due to ongoing deployments. We may not be able to perform the change at your requested time and may ask you to reschedule. We apologize for the potential delay and appreciate your patience!

I received no other emails; I suspect overly aggressive spam filters were the cause of that. It meant I was unclear what to do next. Should I:

  1. Just wait, i.e. do not reschedule anything, even though the target date is now in the past
  2. Reschedule the existing move request to a date in the future using the virtual assistant wizard
  3. Cancel the old request and start the process again from scratch

After asking the question in the Visual Studio Developer Community Forums, I was told the correct action is to cancel the old request and request a new move date. It seems that once your requested date has passed, the move will not take place, no matter how long you wait.

Hence, I created a new request, which all went through exactly as planned.

Porting my Release Notes Azure DevOps Pipelines Extension to GitHub Actions

One of my most popular Azure DevOps Extensions is my Release Notes Pipeline task. This allows the creation of release notes using information obtained from the Azure DevOps API and formatted using a Handlebars Template.

Given the popularity of GitHub Actions, I got to wondering whether porting this extension was viable.

Well, the release of my new Generate Release Notes with a Handlebars Template action shows that it was.

The basic concept of this new action is the same as for the older task: get information on the pipeline/workflow run using the API and then format it using a Handlebars template. However, the information that can be returned is different, but this stands to reason as GitHub is not Azure DevOps. This is especially true when you consider the differences between the simplicity of GitHub Issues and the complexity, and variability of format, of Azure DevOps Work Items.

The new action is focused on the workflow run it is called from. It makes an API call to get the details of the run; this contains a lot of information about the run and URL links to associated items. Using these links, the associated information is retrieved using the API and the results added to the objects available in the Handlebars template. In this initial version of the action, the object tree available in a template includes the following (see the usage sketch after the list):

  • runDetails – the details of the current workflow run
    • pull_requests – the array of pull requests associated with the run
      • commits – the array of commits associated with the PR
      • comments – the array of comments associated with the PR
      • linkedIssues – the array of issues linked to the PR
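
To make that tree concrete, here is a minimal usage sketch of a workflow step. The action reference and the input names (template, outputfile) are assumptions for illustration rather than confirmed syntax, and the template field names follow typical GitHub API naming; check the action's README for the real inputs.

- name: Generate release notes
  # assumed action reference and input names, for illustration only
  uses: rfennell/generate-release-notes-action@v1
  with:
    template: |
      # Notes for run {{runDetails.run_number}}
      {{#each runDetails.pull_requests}}
      - PR {{this.number}}: {{this.title}}
      {{#each this.linkedIssues}}
        - Issue {{this.number}}: {{this.title}}
      {{/each}}
      {{/each}}
    outputfile: releasenotes.md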

As with my Azure DevOps extension, I have made use of the extensibility of Handlebars. My new action includes all the Handlebars Helpers, plus some action-specific helpers I have written, and you have the ability to add your own custom Handlebars helpers if needed.

So I hope people find this new action useful; I guess only time will tell.

Business Process Automation and Integration in the Cloud

Organisations are facing increased, and unprecedented, pressure from the market to transform digitally, and as a result they need to think about how they can become more attractive in their specific market space. The most common area to concentrate on is becoming more efficient, and this can be thought of in discrete parts: streamlining internal, often complex, business processes; making the organisation easier to work with for suppliers or customers; having better cross-business views of information; improving service delivery; and saving on manual processing. All these efficiencies are brought about by automation. Whether this is person-to-system data sharing, system-to-system, or business-to-business, the same principles apply, but different techniques may be used.

Today most organisations run their business on systems that span multiple heterogeneous environments, which poses interoperability issues and other such difficulties. To address this, businesses need to look at integrating their data solutions to provide easy access to standard data across internal and external systems, layering business process automation on top.

Business process automation is how an organisation defines its business processes and automates them, so they can be run repeatedly and reliably, at low cost, with a reduction in human error.

Running UWP Unit Tests as part of an Azure DevOps Pipeline

I was reminded recently of the hoops you have to jump through to run UWP unit tests within an Azure DevOps automated build.

The key steps you need to remember are as follows:

Desktop Interaction

The build agent should not be running as a service; it must be able to interact with the desktop.

If you did not set this mode during configuration, this post from Donovan Brown shows how to swap the agent over without a complete reconfiguration.

Test Assemblies

The UWP unit test projects are not built as a DLL, but as an EXE.

I stupidly just made my VSTest task look for the generated EXE and run the tests it contained. This does not work, generating the somewhat confusing error:

Test run will use DLL(s) built for framework .NETFramework,Version=v4.0 and platform X86. Following DLL(s) do not match framework/platform settings.
BlackMarble.Spectrum.FridgeManagement.Client.OneWire.UnitTests.exe is built for Framework .NETCore,Version=v5.0 and Platform X86.

What you should search for as the entry point for the tests is the .appxrecipe file. Once I used this, my tests ran.

So my pipeline YML to run all the tests in a built solution was:

- task: VisualStudioTestPlatformInstaller@1
  inputs:
    packageFeedSelector: 'nugetOrg'
    versionSelector: 'latestPreRelease'

- task: VSTest@2
  displayName: 'VSTest - testAssemblies'
  inputs:
    platform: 'x86'
    configuration: '$(BuildConfiguration)'
    testSelector: 'testAssemblies'
    testAssemblyVer2: | # Required when testSelector == TestAssemblies
      **\*unittests.dll
      **\*unittests.build.appxrecipe
      !**\*TestAdapter.dll
      !**\obj\**
    searchFolder: '$(Build.SourcesDirectory)/src'
    resultsFolder: '$(System.DefaultWorkingDirectory)\TestResults'
    runInParallel: false
    codeCoverageEnabled: true
    rerunFailedTests: false
    runTestsInIsolation: true
    runOnlyImpactedTests: false

- task: PublishTestResults@2
  displayName: 'Publish Test Results **/TEST-*.xml'
  condition: always()

Out of Memory running SonarQube Analysis on a large project

Whilst adding SonarQube analysis to a large project, I started getting memory errors during the analysis phase. The solution was to increase the memory available to the SonarQube Scanner on my build agent, not the memory on the SonarQube server as I had first thought. This is done with an environment variable, as per the documentation, but how best to do this within our Azure DevOps build systems?

The easiest way to set the environment variable `SONAR_SCANNER_OPTS` on every build agent is to set it via an Azure Pipeline variable. This works because the build agent makes all pipeline variables available as environment variables at runtime.

So, as I was using a YAML pipeline, I set a variable within the build job:

- job: build
  timeoutInMinutes: 240
  variables:
  - name: BuildConfiguration
    value: 'Release'
  - name: SONAR_SCANNER_OPTS
    value: -Xmx4096m
  steps:

I found I had to quadruple the memory allocated to the scanner. Once this was done, my analysis completed.

Sponsoring DDD 2020

A few weeks back, I wrote about how we aren’t asking for sponsors for our online Developer Day, as there aren’t any significant costs to cover. Instead, we were directing people towards making a donation to The National Museum of Computing, an organisation which does great things for our industry, but has been finding this year challenging. And many thanks to those of you who have already donated.

However, some organisations, who usually sponsor DDD, have been in touch about still sponsoring DDD in some way, as they want to show their support.

Therefore, if an organisation wishes to donate to TNMOC using the link above, we will still count that as sponsorship, and we are delighted for their ongoing support. Please contact ddd@blackmarble.com for more details.

Thank you again for your support, and we hope you can join us at DDD on the 12th December!


Getting confused over Azure DevOps Pipeline variable evaluation

Introduction

The use of variables is important in Azure DevOps pipelines, especially when using YML templates. They allow a single pipeline to be used for multiple branches/configurations etc.

The most common forms of variables you see are the predefined built-in variables, e.g. $(Build.BuildNumber), and your own custom ones, e.g. $(var). Usually the values of these variables are set before/as the build is run, as an input condition.

But this is not the only way variables can be used. As noted in the documentation there are different ways to access a variable…

In a pipeline, template expression variables ${{ variables.var }} get processed at compile time, before runtime starts. Macro syntax variables $(var) get processed during runtime before a task runs. Runtime expressions $[variables.var] also get processed during runtime but were designed for use with conditions and expressions.

Azure DevOps Documentation
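
As a quick illustration of the three syntaxes, here is a minimal sketch:

variables:
  var: 'value'
  # runtime expression; designed for use in conditions and expressions
  computed: $[variables.var]

steps:
# template expression; expanded at compile time, before the run starts
- script: echo ${{ variables.var }}
# macro syntax; expanded at runtime, just before the task runs
- script: echo $(var)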

99% of the time I have been fine using just the $(var) syntax, but I was recently working on a case where this would not work for me.

The Issue

I had a pipeline that made heavy use of YML templates and conditional task insertion to include sets of tasks based upon manually entered and pre-defined variables.

The problem was that one of the tasks, used in a template, set a boolean output variable, outvar, by calling

echo '##vso[task.setvariable variable=outvar;isOutput=true]true'
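
Note that for the output variable to be addressable by later tasks, the step that sets it must also be given a name; a minimal sketch using the names from this post:

# the step needs a name so the variable can later be read as $(mytask.outvar)
- script: echo '##vso[task.setvariable variable=outvar;isOutput=true]true'
  name: mytask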

This created an output variable that could be accessed by other tasks as $(mytask.outvar), but as it was set at runtime, it was not available at the time of the YML compilation.

This caused me a problem, as it meant the variable could not be used in the template's conditional task inclusion blocks; it was not present at compile time, when this code is evaluated, e.g.

- ${{ if eq(mytask.outvar, 'true') }} :
  # the task to run if the condition is met
  - task: Some.Task@1 
    ....

I tried referencing the variable using every form of '$ followed by brackets' syntax I could think of, but it did not help.

The lesson here is that you cannot make a runtime value a compile-time value just by wishing it so.

The only solution I could find was to make use of the runtime variable in a place where it can be resolved. If you wish to enable or disable a task based on the variable's value, the only option is to use the condition parameter:

  # the task to run if the condition is met
  - task: Some.Task@1 
    condition: and(succeeded(), eq(variables['mytask.outvar'], 'true'))
    ....

The only downside of this way of working, as opposed to conditional insertion, is cosmetic:

  • With conditional insertion, non-required tasks are never shown in the pipeline, as they are not compiled into it.
  • With the condition property, an excluded task will still appear in the log, but it can be seen that it has not been run.

So I got there in the end; it was just not as neat as I had hoped, but I do have a clearer understanding of compile-time and runtime variables in Azure DevOps YML.

Positively Impacting your Organisation with Collaborative Working

Collaboration has been, and will continue to be, one of the important business advantages that the Cloud can deliver to an organisation. Collaboration can be thought of as not just connecting people to one another and improving their day-to-day working practices, but also enabling and encouraging collaboration between people and data.

Black Marble can support your move to the cloud for collaboration services between people and data. We can help your organisation realise the full potential of people to people collaboration using services such as SharePoint and, in particular for this white paper, Microsoft Teams. Our approach will help you identify how Microsoft’s collaboration solutions can improve your ways of working, whilst helping you visualise your end goal.


SPWakeUp (SPWakeUp3) v1.2.0 Released

SPWakeUp version 1.2.0 has been released. This version adds the ability to import a list of additional URLs to be woken from a file, instead of providing each URL individually on the command line.

Use the ‘-IncludeFile:’ command line parameter to specify the full path to the file containing URLs to be imported.

The file specified should contain a list of URLs, one per line.
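
For example, assuming the tool is run from its install location (the file path and URLs below are made-up values for illustration):

SPWakeUp3.exe -IncludeFile:C:\Scripts\ExtraUrls.txt

where C:\Scripts\ExtraUrls.txt contains entries such as:

https://intranet.contoso.com/sites/hr
https://intranet.contoso.com/sites/finance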