Running UWP Unit Tests as part of an Azure DevOps Pipeline

I was reminded recently of the hoops you have to jump through to run UWP unit tests within an Azure DevOps automated build.

The key steps you need to remember are as follows:

Desktop Interaction

The build agent should not be running as a service; it must be able to interact with the desktop.

If you did not set this mode during configuration, this post from Donovan Brown shows how to swap the agent over without a complete reconfiguration.

Test Assemblies

UWP unit test projects are not built as DLLs, but as EXEs.

I stupidly just made my VSTest task look for the generated EXE and run the tests it contained. This does not work, generating the somewhat confusing error:

Test run will use DLL(s) built for framework .NETFramework,Version=v4.0 and platform X86. Following DLL(s) do not match framework/platform settings.
BlackMarble.Spectrum.FridgeManagement.Client.OneWire.UnitTests.exe is built for Framework .NETCore,Version=v5.0 and Platform X86.

What you should search for as the entry point for the tests is the .appxrecipe file. Once I used this, my tests ran.

So my pipeline YML to run all the tests in the built solution was:

- task: VisualStudioTestPlatformInstaller@1
  inputs:
    packageFeedSelector: 'nugetOrg'
    versionSelector: 'latestPreRelease'

- task: VSTest@2
  displayName: 'VSTest - testAssemblies'
  inputs:
    platform: 'x86'
    configuration: '$(BuildConfiguration)'
    testSelector: 'testAssemblies'
    testAssemblyVer2: | # Required when testSelector == TestAssemblies
      **\*unittests.dll
      **\*unittests.build.appxrecipe
      !**\*TestAdapter.dll
      !**\obj\**
    searchFolder: '$(Build.SourcesDirectory)/src'
    resultsFolder: '$(System.DefaultWorkingDirectory)\TestResults'
    runInParallel: false
    codeCoverageEnabled: true
    rerunFailedTests: false
    runTestsInIsolation: true
    runOnlyImpactedTests: false

- task: PublishTestResults@2
  displayName: 'Publish Test Results **/TEST-*.xml'
  condition: always()

Out of Memory running SonarQube Analysis on a large project

Whilst adding SonarQube analysis to a large project, I started getting memory errors during the analysis phase. The solution was to up the memory available to the SonarQube Scanner on my build agent, not the memory on the SonarQube server as I had first thought. This is done with an environment variable as per the documentation, but how best to do this within our Azure DevOps build systems?

The easiest way to set the environment variable `SONAR_SCANNER_OPTS` on every build agent is to just set it via an Azure Pipeline variable. This works because the build agent makes all pipeline variables available as environment variables at runtime.

So, as I was using a YML pipeline, I set a variable within the build job:

- job: build
  timeoutInMinutes: 240
  variables:
  - name: BuildConfiguration
    value: 'Release'
  - name: SONAR_SCANNER_OPTS
    value: -Xmx4096m
  steps:

I found I had to quadruple the memory allocated to the scanner. Once this was done, my analysis completed.

Sponsoring DDD 2020

A few weeks back, I wrote about how we aren’t asking for sponsors for our online Developer Day, as there aren’t any significant costs to cover. Instead, we were directing people towards making a donation to The National Museum of Computing, an organisation which does great things for our industry, but has been finding this year challenging. And many thanks to those of you who have already donated.

However, some organisations, who usually sponsor DDD, have been in touch about still sponsoring DDD in some way, as they want to show their support.

Therefore, if an organisation wishes to donate to TNMOC using the link above, we will still count that as sponsorship, and we are delighted by their ongoing support. Please contact ddd@blackmarble.com for more details.

Thank you again for your support, and we hope you can join us at DDD on the 12th December!


Getting confused over Azure DevOps Pipeline variable evaluation

Introduction

The use of variables is important in Azure DevOps pipelines, especially when using YML templates. They allow a single pipeline to be used for multiple branches/configurations etc.

The most common forms of variables you see are the predefined built-in variables, e.g. $(Build.BuildNumber), and your own custom ones, e.g. $(var). Usually the values of these variables are set before/as the build is run, as an input condition.

But this is not the only way variables can be used. As noted in the documentation there are different ways to access a variable…

In a pipeline, template expression variables ${{ variables.var }} get processed at compile time, before runtime starts. Macro syntax variables $(var) get processed during runtime before a task runs. Runtime expressions $[variables.var] also get processed during runtime but were designed for use with conditions and expressions.

Azure DevOps Documentation
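
As a rough illustration (a minimal sketch of my own, with example variable names, not code from the pipeline in question), the three syntaxes might be used like this:

variables:
  myVar: 'hello'
  # runtime expression: evaluated at runtime, designed for variable definitions and conditions
  isMain: $[ eq(variables['Build.SourceBranch'], 'refs/heads/main') ]

steps:
# template expression: the literal value is baked in when the YAML is compiled
- script: echo '${{ variables.myVar }}'
# macro syntax: substituted just before the task runs
- script: echo '$(myVar)'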

99% of the time I have been fine using just the $(var) syntax, but I recently was working on a case where this would not work for me.

The Issue

I had a pipeline that made heavy use of YML templates and conditional task insertion to include sets of tasks based upon manually entered and pre-defined variables.

The problem was that one of the tasks, used in a template, set a boolean output variable $(outvar) by calling

echo '##vso[task.setvariable variable=outvar;isOutput=true]true'

This task created an output variable that could be accessed by other tasks as $(mytask.outvar), but as it was set at runtime it was not available at the time of the YML compilation.
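
For context, a minimal sketch of how such an output variable might be set and then read at runtime (these bash steps are illustrative, not the actual template task):

steps:
# the step's name is what allows the output to be referenced as $(mytask.outvar)
- bash: echo '##vso[task.setvariable variable=outvar;isOutput=true]true'
  name: mytask

# macro syntax resolves at runtime, just before this task runs
- bash: echo 'outvar is $(mytask.outvar)'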

This caused me a problem, as it meant the variable could not be used in the template's conditional task inclusion blocks, because it was not present at compile time when this code is evaluated e.g.

- ${{ if eq(mytask.outvar, 'true') }} :
  # the task to run if the condition is met
  - task: Some.Task@1 
    ....

I tried referencing the variable using every form of $ followed by brackets syntax I could think of, but it did not help.

The lesson here is that you cannot make a runtime value a compile time value by wishing it to change.

The only solution I could find was to make use of the runtime variable in a place where it can be resolved. If you wish to enable or disable a task based on the variable value, then the only option is to use the condition parameter:

  # the task to run if the condition is met
  - task: Some.Task@1 
    condition: and(succeeded(), eq(variables['mytask.outvar'], 'true'))
    ....

The only downside of this way of working, as opposed to conditional insertion, is that:

  • With conditional insertion, non-required tasks are never shown in the pipeline, as they are not compiled into it.
  • When using the condition property to exclude a task, it will still appear in the log, but it can be seen that it has not been run.

So I got there in the end; it was just not as neat as I had hoped, but I do have a clearer understanding of compile time and runtime variables in Azure DevOps YML.

Positively Impacting your Organisation with Collaborative Working

Collaboration has been and will continue to be one of the important business advantages that the Cloud can deliver to an organisation. Collaboration can be thought of as not just connecting people to one another and improving their day to day working practices, but also enabling and encouraging collaboration between people and data.

Black Marble can support your move to the cloud for collaboration services between people and data. We can help your organisation realise the full potential of people to people collaboration using services such as SharePoint and, in particular for this white paper, Microsoft Teams. Our approach will help you identify how Microsoft’s collaboration solutions can improve your ways of working, whilst helping you visualise your end goal.


SPWakeUp (SPWakeUp3) v1.2.0 Released

SPWakeUp version 1.2.0 has been released. This version adds the ability to import a list of additional URLs to be woken from a file, instead of providing each URL individually on the command line.

Use the ‘-IncludeFile:’ command line parameter to specify the full path to the file containing URLs to be imported.

The file specified should contain a list of URLs one per line.
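
For example, a file passed via '-IncludeFile:C:\Temp\SPWakeUpUrls.txt' (the path and URLs here are purely illustrative) might contain:

https://intranet.contoso.com/sites/hr
https://intranet.contoso.com/sites/projects
https://extranet.contoso.com/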

How to export Azure DevOps Classic Builds and Release to YAML

This is another one of those posts so I can remember where some useful information is….

If you are migrating your Azure DevOps Classic Builds and Releases to Multi-Stage YAML, then an important step is to export all the existing builds, task groups and releases as YAML files.

You can do this by hand within the Pipeline UI, with a lot of cutting and pasting, but it is much easier to use the excellent Yamlizr – Azure DevOps Classic-to-YAML Pipelines CLI from Alex Vincent. A single CLI command exports everything within a Team Project into a neat folder structure of template-based YAML.

I cannot recommend the tool enough.

Successful Software Delivery with DevOps

With DevOps best practices and Microsoft’s DevOps tooling, Black Marble can deliver agile planning, source code control, package management, build, testing and release automation to continuously integrate, test, deliver and monitor your application.

It is crucial to not only have the right people in place for your cloud adoption journey, but also to use the right processes and the right tools. A typical DevOps approach consists of cross-functional teams provisioning their own infrastructure, with high degrees of automation using templates, codified rules for security controls and cloud-native architecture.

This is where the core aspects of continuous value delivery meet the demands currently driving companies; an integrated team approach including enterprise agile and cloud computing.


Delivering AI in Policing

Over the last few years, I've spent a great deal of time working with police forces in the UK, and regularly have conversations with them about AI and its potential impact on policing. More and more police forces are looking at AI in order to take advantage of the services available to drive insights from existing data and to produce the next generation of artificially intelligent applications. AI can be thought of as the pinnacle of data comprehension, analysis and insight: a destination and a focused target for police forces as they maintain their responsibility for the huge amounts of personal data which they hold. It needs quality, connected data and relies on solid governance and best practices to achieve reliable, good quality outcomes.

The new Police Digital and Data Strategy supports Policing Vision 2025 (the strategic direction of policing) and a key arm of that strategy includes how best to manage, analyse and share police data. AI offers unparalleled opportunities to police forces to make best use of their data.

For more information on Delivering AI in Policing, get in touch for a copy of my white paper.

Delivering AI in Policing White Paper.