But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Upgrading to SonarQube 5.2 in the land of Windows, MSBuild and TFS

SonarQube released version 5.2 a couple of weeks ago. This release enables some new features that really help if you are working with MSBuild, or just on a Windows platform in general. These are detailed in the release announcement posts.

The new ability to manage users with LDAP is good, but one of the most important changes for me is the way 5.2 eases the configuration of SQL in integrated security mode. This is mentioned in the upgrade notes; basically it boils down to the fact that you get better JDBC drivers, with better support for SQL clustering and security.

We found the upgrade process mostly straightforward:

  1. Downloaded SonarQube 5.2 and unzipped it
  2. Replaced the files on our SonarQube server
  3. Edited the sonar.properties file with the correct SQL connection details for an integrated security SQL connection (see the sketch after this list). As we wanted to move to integrated security we did not need to set the sonar.jdbc.username setting.
    Important: One thing that was not that clear: if you want to use integrated security you do need the sqljdbc_auth.dll file in a folder on the search path (C:\windows\system32 is an obvious place to keep it). You can find this file on MSDN
  4. Once the server was restarted we ran the http://localhost:9000/setup command and it upgraded our DBs
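For reference, the relevant part of conf\sonar.properties ends up looking something like the following — a minimal sketch assuming a local SQL Server default instance and a database named sonar; adjust the server and database names to suit:

    # integrated security - deliberately no sonar.jdbc.username/password entries
    sonar.jdbc.url=jdbc:sqlserver://localhost;databaseName=sonar;integratedSecurity=true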

And that was it for the upgrade. We could then use the standard SonarQube upgrade features to upgrade our plug-ins and to add new ones, like the LDAP one.

Once the LDAP plug-in was in place (and the server restarted) we were automatically logged into SonarQube with our Windows AD accounts, so that was easy.

However, we hit a problem with the new SonarQube 5.2 architecture and LDAP. With 5.2 there is no longer any requirement for the upgraded 1.0.2 SonarQube MSBuild runner to talk directly to the SonarQube DB; all communication is via the SonarQube server. Obviously the user account that makes the call to the SonarQube server needs to be granted suitable rights. That is fairly obvious; the point we tripped up on was ‘who is the runner running as?’. I had assumed it ran as the build agent account, but this was not the case. As the connection to SonarQube is a TFS managed service, it has its own security credentials. Prior to 5.2 these credentials (other than the SonarQube server URL) had not mattered, as the SonarQube runner made its own direct connection to the SonarQube DB. Post 5.2, with no DB connection and LDAP in use, these service credentials become important. Once we had set them correctly, to a user with suitable rights, we were able to do new SonarQube analysis runs.


One other item of note: the change in architecture with 5.2 means that more work is being done on the server, as opposed to the client runner. The net effect of this is that there is a short delay between runs completing and the results appearing on the dashboard. Once you expect it, it is not an issue, but it is a worry the first time.

Why you need to use vNext build tasks to share scripts between builds

Whilst doing a vNext build from a TFVC repository I needed to map both my production code branch and a common folder of scripts that I intended to use in a number of builds, so my build workspace was set to

  • Map – $/BM/mycode/main - my production code
  • Map – $/BM/BuildDefinations/vNextScripts - the shared PowerShell scripts I wish to run in different builds, e.g. assembly versioning

As I wanted this to be a CI build, I also set the trigger to $/BM/mycode/main

The problem I found was that, with my workspace set as above, the associated changes for the build included anything checked into $/BM and below. Also, the source branch was set as $/BM.



To fix this problem I had to remove the mapping to the scripts folder; once this was done the associated changes shown were only those for my production code area, and the source branch was listed correctly.

But what to do about running my script?

I don’t really want to have to copy a common script to each build; there is huge potential for error there, and versioning issues if I want a common script in all builds. The best solution I found was to take the PowerShell script, in my case the sample assembly versioning script provided for VSO, and package it as a vNext build task. It took no modification; it just required the addition of a manifest file. You can find the task on my vNextBuild GitHub repo.
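For anyone who has not built one before, the manifest is just a task.json file that sits alongside the script. The sketch below shows the general shape only; the GUID and names are placeholders rather than the values from my actual task, and ApplyVersionToAssemblies.ps1 is the VSO sample script mentioned above:

    {
      "id": "00000000-0000-0000-0000-000000000001",
      "name": "VersionAssemblies",
      "friendlyName": "Version Assemblies",
      "description": "Sets the assembly version from the build number",
      "author": "Richard Fennell",
      "category": "Build",
      "version": { "Major": 1, "Minor": 0, "Patch": 0 },
      "instanceNameFormat": "Version Assemblies",
      "execution": {
        "PowerShell": {
          "target": "$(currentDirectory)\\ApplyVersionToAssemblies.ps1",
          "workingDirectory": "$(currentDirectory)"
        }
      }
    }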

This custom task could then be uploaded to my TFS server and used in all my builds. As it picks up its variables from environment variables it requires no configuration, extracting the version number from the build number format.


If you wish to use this task, you need to follow the same instructions to set up your development environment as for the VSO tasks, then

  1. Clone the repo https://github.com/rfennell/vNextBuild.git
  2. In the root of the repo run gulp to build the task
  3. Use tfx to upload the task to your TFS instance (see the sketch after this list)
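In practice steps 2 and 3 look something like the following (the tfx-cli is installed via npm, the collection URL is an example, and an on-premises TFS may need basic authentication enabling before tfx login will accept credentials):

    npm install -g gulp tfx-cli
    gulp
    tfx login --service-url http://myserver:8080/tfs/DefaultCollection
    tfx build tasks upload --task-path ./Tasks/VersionAssemblies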

Why can’t I assign a VSO user as having ‘eligible MSDN’ using an AAD work account?

When accessing VSO you have two authentication options: either a LiveID (or an MSA, to use its newest name) or a Work Account ID (a domain account). The latter is used to provide extra security; a domain admin can easily control who has access to a whole set of systems. It does assume you have an Azure Active Directory (AAD) that is sync’d with your on-premises AD, and that this AAD is used to back your VSO instance. See my previous post on this subject.

If you are doing this, the issue you often see is that VSO does not pick up your MSDN subscription because it is linked to an MSA, not a work account. This is all solvable, but there are hoops to jump through; more than there should be sometimes.

Basic Process

First you need to link your MSDN account to a Work Account

  • Login to https://msdn.microsoft.com with the MSA that is associated with your MSDN account.
  • Click on the MSDN subscriptions menu option.
  • Click on the Link to work account option and enter your work ID. Note that this will also set your Microsoft Azure linked work account.



Assuming your work account is listed in your AD/AAD, over in VSO you should now be able to …

  • Login as the VSO administrator
  • Invite any user in the AAD to your VSO instance via the link https://[theaccount].visualstudio.com/_user . A user can be invited as
    • Basic – you get 5 for free
    • Stakeholder – what we fall back to if there is an issue
    • MSDN Subscription – the one we want (in screenshot below the green box shows a user where MSDN has been validated, the red box is a user who has not logged in yet with an account associated with a valid MSDN subscription)


  • Once invited, a user gets an email so they can login as shown below. Make sure you pick the work account login link (lower left). Note that this is mocked up in the screenshot below, as the login options shown appear in a context-sensitive way, only being shown the first time a user connects and if the VSO is AAD backed. If you pick the main login fields (the wrong ones) it will try to login assuming the ID is an MSA, which will not work. This is a particularly confusing issue if you used the same email address for your MSA as your Work Account; more on this in the troubleshooting section.


  • On later connections only the work ID login will be shown
  • Once a user has logged in for the first time with the correct ID, the VSO admin should be able to see the MSDN subscription is validated


We have seen a problem where, though the user is in the domain and correctly added to VSO, it will not register that the MSDN subscription is active. These steps can help.

  • Make sure in the https://msdn.microsoft.com portal you have actually linked your work ID. You still need to explicitly do this even if your MSA and Work ID use the same email address e.g. user@domain.com. Using the same email address for both IDs can get confusing, so I would recommend setting up your MSA email addresses so they do not clash with your work ID.
  • When you login to VSO MAKE SURE YOU USE THE WORK ID LOGIN LINK (LHS OF DIALOG UNDER THE VSO LOGO) TO LOGIN WITH A WORK ID AND NOT THE MAIN LIVEID FIELDS. I can’t stress this enough, especially if you use the same email address for both the MSA and work account.
  • If you still get issues with picking up the MSDN subscription
    • In VSO the admin should set the user to be a basic user
    • In https://msdn.microsoft.com the user should make sure they did not make any typos when linking the work account ID
    • The user should sign out of VSO and back in using their work ID, MAKING SURE THEY USE THE CORRECT WORK ID LOGIN DIALOG. They should see the features available to a basic user
    • The VSO admin should change the role assignment in VSO to be MSDN eligible and it should flip over without a problem. There seems to be no need to logout and back in again.

Note: if you assign a new MSA to an MSDN subscription it can take a little while to propagate; if activation emails don’t arrive, pause a while and try again later. You can’t do any of this until you can login to MSDN with your MSA.

Release Manager - New deployment is not allowed as an another deployment is in progress

Whilst working with a vNext Release Management pipeline I started seeing the error

New deployment is not allowed as an another deployment is in progress.
Retry the deployment after sometime.

The problem was I could not see any blocked or paused deployment releases. All Internet searches mentioned multiple pipelines that share components, but this was not the issue.

Eventually I found the issue: my release pipeline included a step that ran CodedUI tests via TCM, and a previous run of this template had triggered the tests via TCM, but they had stalled. I found this by looking in MTM.


Release Management was just saying the release was rejected with the above error message, with no clue about the unfinished test run. Not that helpful.

You might have expected Release Management to return only after the test had timed out, but that is down to whether you set the release pipeline to wait or not; I had set mine not to wait.

Once I stopped this test run via MTM all was OK.

Running nUnit and Jasmine.JS unit tests in TFS/VSO vNext build

This article was first published on the Microsoft’s UK Developers site as Running nUnit and Jasmine.JS unit tests in TFS/VSO vNext build

With the advent of vNext build in TFS 2015 and Visual Studio Online running unit tests that are not MSTest based within your build process is far more straightforward than it used to be. No longer do you have to use custom XAML build activities or tell all your TFS build controllers where the test runner assemblies are. The ‘out the box’ vNext build Visual Studio Test task will automatically load any test adaptors it finds in the path specified for test runners in its advanced properties, a path that can be populated via NuGet.

Running nUnit tests

All this means that to find and run MSTest and nUnit tests as part of your build, all you have to do is as follows:

  1. Create a solution that contains a project with MSTest and nUnit tests; in my sample this is an MVC web application project with its automatically created MSTest unit tests project.
  2. In the test project add some nUnit tests. Use NuGet to add the references to nUnit to the test project so it compiles.
  3. Historically, in your local Visual Studio instance you needed to install the nUnit Test Runner VSIX package from the Visual Studio Gallery – this allows Visual Studio to discover your nUnit tests, as well as any MSTest ones, and run them via the built-in Test Explorer


    IMPORTANT Change –
    However, installing this VSIX package is no longer required. If you use NuGet to add the nUnit test adapter to the solution, as well as the nUnit package itself, then Visual Studio can find the nUnit tests without the VSIX package (see the sketch after this list). This is useful but not world changing on your development PC, but on the build box it means the NuGet restore will make sure the nUnit test adapter assemblies are pulled down onto the local build box’s file system and used to find tests with no extra work.

    Note: If you still want to install the VSIX package on your local Visual Studio instance you can; it is just that you don’t have to.
  4. Check your solution into TFS/VSO source control. It does not matter if it is TFVC or Git based
  5. Create a new vNext build using the Visual Studio template
  6. You can leave most of the parameters on their default settings, but you do need to edit the Visual Studio Test task’s advanced settings to point at the NuGet packages folder for your solution (which will be populated via NuGet restore) so the custom nUnit test adaptor can be found, i.e. usually setting it to $(Build.SourcesDirectory)\packages

  7. The build should run and find your tests: the MSTest ones because they are built in, and the nUnit ones because it found the custom test adaptor due to the NuGet restore being done prior to the build. The test results can be found on the build summary page.
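As a minimal sketch of the NuGet additions in steps 2 and 3, run from the Visual Studio Package Manager Console (the project name here is hypothetical, and NUnitTestAdapter is the nUnit 2.x era adapter package id; check the gallery for the one matching your nUnit version):

    Install-Package NUnit -ProjectName MyWebApp.Tests
    Install-Package NUnitTestAdapter -ProjectName MyWebApp.Tests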



But what if you want to run Jasmine.JS tests?

If you want to run Jasmine JavaScript unit tests the process is basically the same. The only major difference is that you do still need to install the Chutzpah test runner in your local Visual Studio as a VSIX package to run the tests locally. There is a NuGet package for the Chutzpah test runner, so you can avoid having to manually unpack the VSIX and get it into source control to deploy it to the build host (unless you really want to follow this process), but this package does not currently enable Visual Studio to find the Jasmine tests without the VSIX extension being installed; or at least it didn’t for me.

Using the solution I used before:

  1. Use NuGet to add Jasmine.JS to the test project
  2. Add a test file to the test project e.g. mycode.tests.js (adding any JavaScript references needed to find any script code under test in the main WebApp project)
  3. Install the Chutzpah Test runner in your local Visual Studio as a VSIX extension, restart Visual Studio
  4. You should now be able to see and run the Jasmine test run in the test runner as well as the MSTest and nUnit tests.

  5. Add the NuGet package for the Chutzpah test runner to your solution; this is a solution-level package, so does not need to be associated with any project (see the sketch after this list).
  6. Check the revised code into source control
  7. In your vNext build add another Visual Studio Test task, set the test assembly to match your JavaScript test naming convention e.g. **\*.tests.js, and the path to the custom test adaptor to $(Build.SourcesDirectory)\packages (as before)

  8. Run the revised build.

  9. You should see the two test tasks run and a pair of test results in the summary for the build.
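For step 5, the solution-level install is a single Package Manager Console command; Chutzpah is the package id I would expect, but check the gallery listing:

    Install-Package Chutzpah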

So now, hopefully, you should find this a more straightforward way to add testing to your vNext builds, allowing easy use of both your own build boxes and the hosted build service for VSO with testing frameworks they do not support ‘out the box’.

PowerShell to help plot graphs of how long TFS upgrades take

When doing TFS upgrades it is useful to know roughly how long they will take. The upgrade programs give a number of steps, but not all steps are equal: some are quick, some are slow. I have found it useful to graph past upgrades so I can get a feel for how long an upgrade will take given it got to ‘step x in y minutes’. You can do this by hand, noting down the time as specific steps are reached, but for a long upgrade it usually means pulling data out of the TFS TPC upgrade logs.

To make this process easier I put together this script to find the step completion rows in the log file and format them so that they are easy to graph in Excel.

    $logfile = "TPC_ApplyPatch.log",
    $outfile = "out.csv"

# A function to covert the start and end times to a number of minutes
# Can't use simple timespan as we only have the time portion not the whole datetime
# Hence the hacky added a day-1 second
function CalcDuration

    $diff = [dateTime]$endTime - $startTime
    if ([dateTime]$endTime -lt $startTime)
       $diff += "23:59" # add a day as we past midnight

    [int]$diff.Hours *60 + $diff.Minutes

Write-Host "Importing $logfile for processing"
# pull out the lines we are interested in using a regular expression to extract the columns
# the (.{8} handle the fixed width, exact matches are used for the test
$lines = Get-Content -Path $logfile | Select-String "  Executing step:"  | Where{$_ -match "^(.)(.{8})(.{8})(Executing step:)(.{2})(.*)(')(.*)([(])(.*)([ ])([of])(.*)"} | ForEach{
        'Step' = $Matches[10]
        'TimeStamp' = $Matches[2]
        'Action' = $Matches[6]
# We assume the upgrade started at the timestamp of the 0th step
# Not true but very close
[DateTime]$start = $lines[0].TimeStamp

Write-Host "Writing results to $outfile"
# Work out the duration
$steps = $lines | ForEach{
        'Step' = $_.Step
        'TimeStamp' = $_.TimeStamp
        'EplasedTime' = CalcDuration -startTime $start -endTime $_.TimeStamp
        'Action' = $_.Action
$steps | export-csv $outfile -NoTypeInformation

# and list to screen
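To use it, save the script (the file name here is hypothetical), point it at a TPC upgrade log, and open the resulting CSV in Excel:

    .\Get-TfsUpgradeSteps.ps1 -logfile "TPC_ApplyPatch.log" -outfile "out.csv"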

Running Typemock Isolator based tests in TFS vNext build

Typemock Isolator provides a way to ‘mock the un-mockable’, such as sealed private classes in .NET, so it can be an invaluable tool in unit testing. To allow this mocking, Isolator interception has to be started before any unit tests are run and stopped when they have completed. For a developer this is done automatically within the Visual Studio IDE, but on build systems you have to run something to do this as part of your build process. Typemock provide documentation and tools for common build systems such as MSBuild, Jenkins, TeamCity and TFS XAML builds. However, they don’t provide tools or documentation on getting it working with TFS vNext build, so I had to write my own vNext build task to do the job, wrapping Tmockrunner.exe provided by Typemock, which handles the starting and stopping of mocking whilst calling any EXE of your choice.

tmockrunner <name of the test tool to run> <and parameters for the test tool>
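So, as a concrete (hypothetical) example of that form, wrapping the vstest console runner might look like:

    tmockrunner vstest.console.exe MyApp.Tests.dll /TestAdapterPath:.\packages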

Microsoft provide a vNext build task to run vstest.console.exe. This task generates all the command line parameters needed, depending on the arguments provided for the build task. The source for this can be found on any build VM (in the [build agent folder]\tasks folder after a build has run) or on Microsoft’s VSO agent GitHub repo. I decided to use this as my starting point, swapping the logic to generate the tmockrunner.exe command line as opposed to the one for vstest.console.exe. You can find my task on my GitHub repo. It has been developed in the same manner as the Microsoft-provided tasks, which means the process to build and use the task is

  1. Clone the repo https://github.com/rfennell/vNextBuild.git
  2. In the root of the repo use gulp to build the task
  3. Use tfx to upload the task to your TFS or VSO instance

See http://realalm.com/2015/07/31/uploading-a-custom-build-vnext-task/ and http://blog.devmatter.com/custom-build-tasks-in-vso/ for good walkthroughs of building tasks; the process is the same for mine and Microsoft’s tasks.

IMPORTANT NOTE: This task is only for on-premises TFS vNext build agents connected to either an on-premises TFS or VSO. At the time of writing this post, Typemock does not support VSO’s hosted build agents. This is because the registration of Typemock requires admin rights on the build agent, which you only get if you ‘own’ the build agent VM.

Once the task is installed on your TFS/VSO server you can use it in vNext builds. You will note that it takes all the same parameters as the standard VSTest task (it will usually be used as a replacement when there are Typemock Isolator based tests in a solution). The only additions to the parameters are the three parameters for Typemock licensing and deployment location.


Using the task allows tests that require Typemock Isolator to pass. So tests that, if run with the standard VSTest task, give


with the new task give


WebDeploy, parameters.xml transforms and nLog settings

I have been trying to parameterise the SQL DB connection string used by nLog when it is defined in the web.config file of a web site being deployed via Release Management and WebDeploy, i.e. I wanted to select and edit the connectionString attribute of the SQL target in my web.config file:

    <nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <targets async="true">
        <target xsi:type="Database" name="SQL" dbProvider="System.Data.SqlClient" connectionString="Data Source=myserver;Database=mydb;Persist Security Info=True;Pooling=False" keepConnection="true" commandText="INSERT INTO [Logs](ID, TimeStamp, Message, Level, Logger, Details, Application, MachineName, Username) VALUES(newid(), getdate(), @message, @level, @logger, @exception, @application, @machineName, @username)">
          <parameter layout="${message}" name="@message"></parameter>
          <!-- other parameters omitted -->
        </target>
      </targets>
    </nlog>

The problem I had was that the XPath query I was using was not returning the nlog node, because the nlog node has a namespace defined. This means we can’t just use a query in the form

<parameter name="NLogConnectionString" description="Description for NLogConnectionString" defaultvalue="__NLogConnectionString__" tags="">
  <parameterentry kind="XmlFile" scope="\\web.config$" match="/configuration/nlog/targets/target[@name='SQL']/@connectionString" />

I needed to use

<parameter name="NLogConnectionString" description="Description for NLogConnectionString" defaultvalue="__NLogConnectionString__" tags="">
  <parameterentry kind="XmlFile" scope="\\web.config$" match="/configuration/*[local-name() = 'nlog']/*[local-name() = 'targets']/*[local-name() = 'target' and @name='SQL']/@connectionString" />

So it is more complex, but it does work. Hopefully this will save others the time I wasted working it out today.
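As an aside, you can sanity-check this sort of namespace-agnostic XPath locally with PowerShell before trying it in a full deployment; a quick sketch, assuming web.config is in the current folder:

    Select-Xml -Path .\web.config -XPath "/configuration/*[local-name() = 'nlog']/*[local-name() = 'targets']/*[local-name() = 'target' and @name='SQL']/@connectionString" |
        ForEach-Object { $_.Node.Value }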

An alternative to setting a build quality on a TFS vNext build

TFS vNext builds do not have a concept of build quality, unlike the old XAML based builds. This is an issue for us, as we used the changing of the build quality as a signal to test a build, or to mark it as released to a client (this was all managed with my TFS Alerts DSL to make sure suitable emails and build retention were used).

So how to get around this problem with vNext?

I have used tags on builds, set using the same REST API style calls as detailed in my post on Release Management vNext templates. I also use the REST API to set the retention on the build, so I actually now don’t need to manage this via the alerts DSL.

The following script, if used to wrap the calling of integration tests via TCM, should set the tags and retention on a build.

param
(
    $Collection,
    $Teamproject,
    $BuildNumber,
    $PlanId,
    $SuiteId,
    $ConfigId,
    $PackageLocation,
    $TestEnvironment,
    $SettingsName,
    $TestUserUid,
    $TestUserPwd
)

function Get-BuildDetailsByNumber
{
    param
    (
        $tfsUri,
        $buildNumber,
        $username,
        $password
    )

    $uri = "$($tfsUri)/_apis/build/builds?api-version=2.0&buildnumber=$buildNumber"

    $wc = New-Object System.Net.WebClient
    if ($username -eq $null)
    {
        $wc.UseDefaultCredentials = $true
    } else
    {
        $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
    }
    write-verbose "Getting ID of $buildNumber from $tfsUri "

    $jsondata = $wc.DownloadString($uri) | ConvertFrom-Json
    # the API returns a list of builds, we want the single matching one
    $jsondata.value[0]
}

function Set-BuildTag
{
    param
    (
        $tfsUri,
        $buildID,
        $tag,
        $username,
        $password
    )

    $wc = New-Object System.Net.WebClient
    $wc.Headers["Content-Type"] = "application/json"
    if ($username -eq $null)
    {
        $wc.UseDefaultCredentials = $true
    } else
    {
        $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
    }
    write-verbose "Setting BuildID $buildID with Tag $tag via $tfsUri "

    $uri = "$($tfsUri)/_apis/build/builds/$($buildID)/tags/$($tag)?api-version=2.0"

    $data = @{value = $tag } | ConvertTo-Json

    $wc.UploadString($uri, "PUT", $data)
}

function Set-BuildRetension
{
    param
    (
        $tfsUri,
        $buildID,
        $keepForever,
        $username,
        $password
    )

    $wc = New-Object System.Net.WebClient
    $wc.Headers["Content-Type"] = "application/json"
    if ($username -eq $null)
    {
        $wc.UseDefaultCredentials = $true
    } else
    {
        $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
    }
    write-verbose "Setting BuildID $buildID with retension set to $keepForever via $tfsUri "

    $uri = "$($tfsUri)/_apis/build/builds/$($buildID)?api-version=2.0"
    $data = @{keepForever = $keepForever} | ConvertTo-Json
    $response = $wc.UploadString($uri, "PATCH", $data)
}

# Output execution parameters.
$VerbosePreference = 'Continue' # equiv to -verbose

$ErrorActionPreference = 'Continue' # this controls if any test failure causes the script to stop

$folder = Split-Path -Parent $MyInvocation.MyCommand.Definition

write-verbose "Running $folder\TcmExec.ps1"

& "$folder\TcmExec.ps1" -Collection $Collection -Teamproject $Teamproject -PlanId $PlanId -SuiteId $SuiteId -ConfigId $ConfigId -BuildDirectory $PackageLocation -TestEnvironment $TestEnvironment -SettingsName $SettingsName
write-verbose "TCM exited with code '$LASTEXITCODE'"

$newquality = "Test Passed" # kept from the old XAML world, vNext has no build quality to set
$tag = "Deployed to Lab"
$keep = $true
if ($LASTEXITCODE -gt 0)
{
    $newquality = "Test Failed"
    $tag = "Lab Deployed failed"
    $keep = $false
}
write-verbose "Setting build tag to '$tag' for build $BuildNumber"

$url = "$Collection/$Teamproject"
$jsondata = Get-BuildDetailsByNumber -tfsUri $url -buildNumber $BuildNumber #-username $TestUserUid -password $TestUserPwd
$buildId = $jsondata.id
write-verbose "The build $BuildNumber has ID of $buildId"
write-verbose "The build tag set to '$tag' and retention set to '$keep'"
Set-BuildTag -tfsUri $url -buildID $buildId -tag $tag #-username $TestUserUid -password $TestUserPwd
Set-BuildRetension -tfsUri $url -buildID $buildId -keepForever $keep #-username $TestUserUid -password $TestUserPwd

# now fail the stage after we have sorted the logging
if ($LASTEXITCODE -gt 0)
{
    Write-Error "Tests have failed"
}

If all the tests pass we see the tag being added and the retention being set; if they fail, just a tag should be set.



Cannot create an MSDeploy package for an Azure Web Job project as part of an automated build

I like Web Deploy as a means to package up websites for deployment. I like the way I only need to add MSBuild arguments along the lines of

    /p:DeployOnBuild=true /p:PublishProfile=Release

to get the package produced as part of an automated build. This opens up loads of deployment options.

I recently hit an issue packaging up a solution that contained an Azure Web Site and an Azure Web Job (to be hosted in the web site). It is easy to add the web job so that it is included in the Web Deploy package. Once this was done we could deploy from Visual Studio, or package to the local file system, and see the web job EXE in the app_data\jobs folder as expected.

The problems occurred when we tried to get TFS build to create the deployment package using the arguments shown above. I got the error

The value for PublishProfile is set to 'Release', expected to find the file at 'C:\vNextBuild\_work\4253ff91\BM\Src\MyWebJob\Properties\PublishProfiles\Release.pubxml' but it could not be found.

The issue is that there is a Publish target for the web job project type, but when run from Visual Studio it actually creates a ClickOnce package. This wizard provides no means to create an MSDeploy style package.

MSBuild is getting confused as it expects there to be this MSDeploy style package definition for the web job projects, even though it won’t actually use it, as the Web Job EXE will be copied into the web site deployment package.

The solution was to add a dummy PublishProfiles\Release.pubxml file into the properties folder of the web jobs project.

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <LastUsedPlatform>Any CPU</LastUsedPlatform>
    <SiteUrlToLaunchAfterPublish />
    <DesktopBuildPackageLocation />
    <DeployIisAppPath />

Note: I had to add this file to source control via the TFS Source Code Explorer, as Visual Studio does not allow you to add folders/files manually under the Properties folder.

Once this file was added my automated build worked OK, and I got my web site package including the web job.