But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Versioning a VSIX package as part of the TFS vNext build (when the source is on GitHub)

I have recently added a CI build to my GitHub-stored ParametersXmlAddin VSIX project. I did this using Visual Studio Online’s hosted build service; did you know that this can be used to build source from GitHub?

As part of this build I wanted to version stamp the assemblies and the resultant VSIX package. To do the former I used the script documented on MSDN; for the latter I used the same basic method of extracting the version from the build number as in the assembly versioning script. You can find my VSIX script stored in this repo.

I added both of these scripts in my ParametersXmlAddin project repo’s Script folder and just call them at the start of my build with a pair of PowerShell tasks. As they both get the build number from the environment variables there is no need to pass any arguments.
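Both scripts follow the same basic pattern. As a rough sketch only (the variable names and manifest-editing approach here are illustrative, not the exact scripts from the repo), the version extraction and VSIX manifest stamping look something like this:

```powershell
# Sketch only - the real scripts are in the repo mentioned above.
# vNext build exposes the build number as an environment variable.
$buildNumber = $Env:BUILD_BUILDNUMBER   # e.g. MyBuild_2015.9.1.3

if ($buildNumber -match "\d+\.\d+\.\d+\.\d+")
{
    $version = $Matches[0]
    Write-Verbose "Using version $version from build number $buildNumber" -Verbose

    # Stamp the version into any VSIX manifest found in the sources
    Get-ChildItem -Path $Env:BUILD_SOURCESDIRECTORY -Recurse -Filter "source.extension.vsixmanifest" |
        ForEach-Object {
            $xml = [xml](Get-Content $_.FullName)
            $xml.PackageManifest.Metadata.Identity.Version = $version
            $xml.Save($_.FullName)
        }
}
```

Because the build number is read from the environment, the task needs no arguments, which is why the PowerShell tasks in the build definition can be left with empty parameter lists.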


I only wanted to publish the VSIX package. This was done by setting the contents filter on the Publish Build Artifacts task to **\*.vsix


The final step was to enable the badge for the build; this is done on the General tab. Once enabled, I copied the provided URL for the badge graphic that shows the build status and added it as an image to the Readme.MD file on my repo’s wiki.


Why can’t I assign a VSO user as having ‘eligible MSDN’ using an AAD work account?

When accessing VSO you have two authentication options: either a LiveID (or an MSA, to use its newest name) or a Work Account ID (a domain account). The latter provides extra security, as a domain admin can easily control who has access to a whole set of systems. It does assume you have an Azure Active Directory (AAD) that is sync’d with your on-premises AD, and that this AAD is used to back your VSO instance. See my previous post on this subject.

If you are doing this, the issue you often see is that VSO does not pick up your MSDN subscription because it is linked to an MSA, not a work account. This is all solvable, but there are hoops to jump through; sometimes more than there should be.

Basic Process

First you need to link your MSDN account to a Work Account

  • Login to https://msdn.microsoft.com with the MSA that is associated with your MSDN account.
  • Click on the MSDN subscriptions menu option.
  • Click on the Link to work account option and enter your work ID. Note that this will also set your Microsoft Azure linked work account.

 



Assuming your work account is listed in your AD/AAD, over in VSO you should now be able to …

  • Login as the VSO administrator
  • Invite any user in the AAD to your VSO instance via the link https://[theaccount].visualstudio.com/_user. A user can be invited as
    • Basic – you get 5 for free
    • Stakeholder – what we fall back to if there is an issue
    • MSDN Subscription – the one we want (in screenshot below the green box shows a user where MSDN has been validated, the red box is a user who has not logged in yet with an account associated with a valid MSDN subscription)


  • Once invited, a user gets an email so they can login as shown below. Make sure you pick the work account login link (lower left). Note that this is mocked up in the screenshot below, as the login options shown appear in a context sensitive way, only being shown the first time a user connects and only if the VSO is AAD backed. If you pick the main login fields (the wrong ones) it will try to login assuming the ID is an MSA, which will not work. This is a particularly confusing issue if you used the same email address for your MSA as for your Work Account; more on this in the troubleshooting section.


  • On later connections only the work ID login will be shown
  • Once a user has logged in for the first time with the correct ID, the VSO admin should be able to see the MSDN subscription is validated

Troubleshooting

We have seen a problem where, though the user is in the domain and correctly added to VSO, it will not register that the MSDN subscription is active. These steps can help.

  • Make sure in the https://msdn.microsoft.com portal you have actually linked your work ID. You still need to do this explicitly even if your MSA and Work ID use the same email address e.g. user@domain.com. Using the same email address for both IDs can get confusing, so I would recommend setting up your MSA email addresses so they do not clash with your work ID.
  • When you login to VSO MAKE SURE YOU USE THE WORK ID LOGIN LINK (LHS OF DIALOG UNDER VSO LOGO) TO LOGIN WITH A WORK ID AND NOT THE MAIN LIVEID FIELDS. I can’t stress this enough, especially if you use the same email address for both the MSA and work account.
  • If you still get issues with picking up the MSDN subscription
    • In VSO the admin should set the user to be a basic user
    • In https://msdn.microsoft.com the user should make sure they did not make any typos when linking the work account ID
    • The user should sign out of VSO and back in using their work ID, MAKING SURE THEY USE THE CORRECT WORK ID LOGIN DIALOG. They should see the features available to a basic user
    • The VSO admin should change the role assignment in VSO to be MSDN eligible and it should flip over without a problem. There seems to be no need to logout and back in again.

Note that if you assign a new MSA to an MSDN subscription it can take a little while to propagate; if activation emails don’t arrive, pause a while and try again later. You can’t do any of this until you can login to MSDN with your MSA.

Running nUnit and Jasmine.JS unit tests in TFS/VSO vNext build

This article was first published on the Microsoft UK Developers site as Running nUnit and Jasmine.JS unit tests in TFS/VSO vNext build

With the advent of vNext build in TFS 2015 and Visual Studio Online running unit tests that are not MSTest based within your build process is far more straightforward than it used to be. No longer do you have to use custom XAML build activities or tell all your TFS build controllers where the test runner assemblies are. The ‘out the box’ vNext build Visual Studio Test task will automatically load any test adaptors it finds in the path specified for test runners in its advanced properties, a path that can be populated via NuGet.

Running nUnit tests

All this means that to find and run MSTest and nUnit tests as part of your build all you have to do is as follows

  1. Create a solution that contains a project with MSTest and nUnit tests; in my sample this is an MVC web application project with its automatically created MSTest unit test project.
  2. In the test project add some nUnit tests. Use NuGet to add the references to nUnit to the test project so it compiles.
  3. Historically in your local Visual Studio instance you needed to install the nUnit Test Runner VSIX package from Visual Studio Gallery – this allows Visual Studio to discover your nUnit tests, as well as any MSTest ones, and run them via the built in Test Explorer


    IMPORTANT Change –
    However installing this VSIX package is no longer required. If you use NuGet to add the nUnit Test Runner to the solution, as well as the nUnit package itself, then Visual Studio can find the nUnit tests without the VSIX package. This is useful but not world changing on your development PC; on the build box, however, it means the NuGet restore will make sure the nUnit test adapter assemblies are pulled down onto the local build box’s file system and used to find tests with no extra work.

    Note
    : If you still want to install the VSIX package on your local Visual Studio instance you can, it is just you don’t have to.
  4. Check your solution into TFS/VSO source control. It does not matter if it is TFVC or Git based
  5. Create a new vNext build using the Visual Studio template
  6. You can leave most of the parameters on default setting. But you do need to edit the Visual Studio Test task’s advanced settings to point at the NuGet packages folder for your solution (which will be populated via NuGet restore) so the custom nUnit test adaptor can be found i.e. usually setting it to  $(Build.SourcesDirectory)\packages

  7. The build should run and find your tests, the MSTest ones because they are built in and the nUnit ones because the custom test adaptor was found due to the NuGet restore being done prior to the build. The test results can be found on the build summary page
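The NuGet steps above (adding nUnit and its test adapter to the test project) can be sketched from the Package Manager Console. The project name here is illustrative; the adapter package is the NUnit 2.x era NUnitTestAdapter:

```powershell
# Run in the Visual Studio Package Manager Console.
# NUnit is the test framework itself; NUnitTestAdapter is the runner
# that lets the build's Visual Studio Test task discover the tests.
Install-Package NUnit -ProjectName MvcApp.Tests
Install-Package NUnitTestAdapter -ProjectName MvcApp.Tests
```

Because both end up in the solution’s packages folder, the NuGet restore on the build box is all that is needed for test discovery.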


 

But what if you want to run Jasmine.JS tests?

If you want to run Jasmine JavaScript unit tests the process is basically the same. The only major difference is that you do still need to install the Chutzpah test runner in your local Visual Studio as a VSIX package to run the tests locally. There is a NuGet package for the Chutzpah test runner, so you can avoid having to manually unpack the VSIX and get it into source control to deploy it to the build host (unless you really want to follow this process), but this package does not currently enable Visual Studio to find the Jasmine tests without the VSIX extension being installed; or at least it didn’t for me.

Using the solution I used before

  1. Use NuGet to add Jasmine.JS to the test project
  2. Add a test file to the test project e.g. mycode.tests.js (adding any JavaScript references needed to find any script code under test in the main WebApp project)
  3. Install the Chutzpah Test runner in your local Visual Studio as a VSIX extension, restart Visual Studio
  4. You should now be able to see and run the Jasmine test run in the test runner as well as the MSTest and nUnit tests.

  5. Add the NuGet package for the Chutzpah test runner to your solution; this is a solution level package, so it does not need to be associated with any project.
  6. Check the revised code into source control
  7. In your vNext build add another Visual Studio Test task, set the test assembly to match your JavaScript test naming convention e.g. **\*.tests.js and the path to the custom test adaptor to $(Build.SourcesDirectory)\packages (as before)

  8. Run the revised build.

  9. You should see the two test tasks run and a pair of test results in the summary for the build.

So hopefully you will now find this a more straightforward way to add testing to your vNext builds, allowing easy use of both your own build boxes and the hosted build service for VSO with testing frameworks they do not support ‘out the box’.

Powershell to help plot graphs of how long TFS upgrades take

When doing TFS upgrades it is useful to know roughly how long they will take. The upgrade programs give a number of steps, but not all steps are equal. Some are quick, some are slow. I have found it useful to graph past updates so I can get a feel of how long an update will take given it got to ‘step x in y minutes’. You can do this by hand, noting down time as specific steps are reached. However for a long upgrade it usually means pulling data out of the TFS TPC upgrade logs.

To make this process easier I put together this script to find the step completion rows in the log file and format them out such that they are easy to graph in Excel

param
(
    $logfile = "TPC_ApplyPatch.log",
    $outfile = "out.csv"
)


# A function to convert the start and end times to a number of minutes
# Can't use a simple timespan as we only have the time portion, not the whole datetime
# Hence the hacky addition of a day if we crossed midnight
function CalcDuration
{
    param
    (
        $startTime,
        $endTime
    )

    $diff = [dateTime]$endTime - $startTime
    if ([dateTime]$endTime -lt $startTime)
    {
       $diff += New-TimeSpan -Days 1 # add a day as we passed midnight
    }

    [int]$diff.Hours * 60 + $diff.Minutes
}

Write-Host "Importing $logfile for processing"
# pull out the lines we are interested in, using a regular expression to extract the columns
# the (.{8}) groups handle the fixed-width columns; exact matches are used for the fixed text
$lines = Get-Content -Path $logfile | Select-String "  Executing step:"  | Where{$_ -match "^(.)(.{8})(.{8})(Executing step:)(.{2})(.*)(')(.*)([(])(.*)([ ])([of])(.*)"} | ForEach{
    [PSCustomObject]@{
        'Step' = $Matches[10]
        'TimeStamp' = $Matches[2]
        'Action' = $Matches[6]
    }
}
 
# We assume the upgrade started at the timestamp of the 0th step
# Not true but very close
[DateTime]$start = $lines[0].TimeStamp

Write-Host "Writing results to $outfile"
# Work out the duration
$steps = $lines | ForEach{
    [PSCustomObject]@{
        'Step' = $_.Step
        'TimeStamp' = $_.TimeStamp
        'ElapsedTime' = CalcDuration -startTime $start -endTime $_.TimeStamp
        'Action' = $_.Action
       
    }
}
$steps | export-csv $outfile -NoTypeInformation

# and list to screen
$steps
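Assuming the script is saved as, say, Get-TfsUpgradeTimes.ps1 (the file name is mine, not fixed by anything above), a typical run against a collection upgrade log would be:

```powershell
# Parse the TPC upgrade log and write the step timings to a CSV
.\Get-TfsUpgradeTimes.ps1 -logfile "TPC_ApplyPatch.log" -outfile "out.csv"
# out.csv can then be opened in Excel and the elapsed-time column
# plotted against the step number
```

Both parameters default to the values shown, so running the script from the folder containing the log needs no arguments at all.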

An alternative to setting a build quality on a TFS vNext build

TFS vNext builds do not have a concept of build quality, unlike the old XAML based builds. This is an issue for us as we used the changing of the build quality as a signal to test a build, or to mark it as released to a client (this was all managed with my TFS Alerts DSL to make sure suitable emails and build retention were used).

So how to get around this problem with vNext?

I have used Tags on builds, set using the same REST API style calls as detailed in my post on Release Management vNext templates. I also use the REST API to set the retention on the build, so I actually no longer need to manage this via the alerts DSL.

The following script, if used to wrap the calling of integration tests via TCM, should set the tags and retention on a build


function Get-BuildDetailsByNumber
{
    param
    (
        $tfsUri ,
        $buildNumber,
        $username,
        $password

    )

    $uri = "$($tfsUri)/_apis/build/builds?api-version=2.0&buildnumber=$buildNumber"

    $wc = New-Object System.Net.WebClient
    if ($username -eq $null)
    {
        $wc.UseDefaultCredentials = $true
    } else
    {
        $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
    }
    write-verbose "Getting ID of $buildNumber from $tfsUri "

    $jsondata = $wc.DownloadString($uri) | ConvertFrom-Json
    $jsondata.value[0]
 
}

function Set-BuildTag
{
    param
    (
        $tfsUri ,
        $buildID,
        $tag,
        $username,
        $password

    )

 
    $wc = New-Object System.Net.WebClient
    $wc.Headers["Content-Type"] = "application/json"
    if ($username -eq $null)
    {
        $wc.UseDefaultCredentials = $true
    } else
    {
        $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
    }
   
    write-verbose "Setting BuildID $buildID with Tag $tag via $tfsUri "

    $uri = "$($tfsUri)/_apis/build/builds/$($buildID)/tags/$($tag)?api-version=2.0"

    $data = @{value = $tag } | ConvertTo-Json

    $wc.UploadString($uri,"PUT", $data)
   
}

function Set-BuildRetension
{
    param
    (
        $tfsUri ,
        $buildID,
        $keepForever,
        $username,
        $password

    )

 
    $wc = New-Object System.Net.WebClient
    $wc.Headers["Content-Type"] = "application/json"
    if ($username -eq $null)
    {
        $wc.UseDefaultCredentials = $true
    } else
    {
        $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
    }
   
    write-verbose "Setting BuildID $buildID with retension set to $keepForever via $tfsUri "

    $uri = "$($tfsUri)/_apis/build/builds/$($buildID)?api-version=2.0"
    $data = @{keepForever = $keepForever} | ConvertTo-Json
    $response = $wc.UploadString($uri,"PATCH", $data)
   
}


# Output execution parameters.
$VerbosePreference ='Continue' # equiv to -verbose

$ErrorActionPreference = 'Continue' # this controls if any test failure cause the script to stop

 

$folder = Split-Path -Parent $MyInvocation.MyCommand.Definition

write-verbose "Running $folder\TcmExec.ps1"

 

& "$folder\TcmExec.ps1" -Collection $Collection -Teamproject $Teamproject -PlanId $PlanId  -SuiteId $SuiteId -ConfigId $ConfigId -BuildDirectory $PackageLocation -TestEnvironment $TestEnvironment -SettingsName $SettingsName
write-verbose "TCM exited with code '$LASTEXITCODE'"
$newquality = "Test Passed"
$tag = "Deployed to Lab"
$keep = $true
if ($LASTEXITCODE -gt 0 )
{
    $newquality = "Test Failed"
    $tag = "Lab Deployed failed"
    $keep = $false
}
write-verbose "Setting build tag to '$tag' for build $BuildNumber"


$url = "$Collection/$Teamproject"
$jsondata = Get-BuildDetailsByNumber -tfsUri $url -buildNumber $BuildNumber #-username $TestUserUid -password $TestUserPwd
$buildId = $jsondata.id
write-verbose "The build $BuildNumber has ID of $buildId"
 
write-verbose "The build tag set to '$tag' and retention set to '$keep'"
Set-BuildTag -tfsUri $url  -buildID $buildId -tag $tag #-username $TestUserUid -password $TestUserPwd
Set-BuildRetension -tfsUri $url  -buildID $buildId  -keepForever $keep #-username $TestUserUid -password $TestUserPwd

# now fail the stage after we have sorted the logging
if ($LASTEXITCODE -gt 0 )
{
    Write-Error "Tests have failed"
}

If all the tests pass we see the Tag being added and the retention being set; if they fail, just a tag should be set.


Using Release Management vNext templates when you don’t want to use DSC scripts – A better script

A couple of months ago I wrote a post on using PowerShell scripts to deploy web sites in Release Management vNext templates as opposed to DSC. In that post I provided a script to help with the translation of Release Management configuration variables to entries in the MSDeploy SetParameters.xml file for web sites.

The code I provided in that post required you to hard code the variables to translate. This quickly becomes a maintenance problem. However, there is a simple solution.

If we use a naming convention for our RM configuration variables that maps to web.config entries (I chose __NAME__ to be consistent with the old RM Agent based deployment standards) we can let PowerShell do the work.

So the revised script is

$VerbosePreference ='Continue' # equiv to -verbose

function Update-ParametersFile
{
    param
    (
        $paramFilePath,
        $paramsToReplace
    )

    write-verbose "Updating parameters file '$paramFilePath'" -verbose
    $content = get-content $paramFilePath
    $paramsToReplace.GetEnumerator() | % {
        Write-Verbose "Replacing value for key '$($_.Name)'" -Verbose
        $content = $content.Replace($_.Name, $_.Value)
    }
    set-content -Path $paramFilePath -Value $content

}

# the script folder
$folder = Split-Path -parent $MyInvocation.MyCommand.Definition
write-verbose "Deploying Website '$package' using script in '$folder'"

# work out the variables to replace using a naming convention

# we make sure that the result is stored in an array even if it is a single item
$parameters = @(Get-Variable -include "__*__" )
write-verbose "Discovered replacement parameters that match the convention '__*__': $($parameters | Out-string)"
Update-ParametersFile -paramFilePath "$ApplicationPath\$packagePath\$package.SetParameters.xml" -paramsToReplace $parameters

write-verbose "Calling '$ApplicationPath\$packagePath\$package.deploy.cmd'"
& "$ApplicationPath\$packagePath\$package.deploy.cmd" /Y  /m:"$PublishUrl" -allowUntrusted /u:"$PublishUser" /p:"$PublishPassword" /a:Basic | Write-Verbose

Note: This script allows deployment to a remote IIS server, so it is useful for Azure Web Sites. If you are running it locally on an IIS server just trim everything after the /Y on the last line.

So now I provide

  • $PackagePath – path to our deployment on the deployment VM (relative to the $ApplicationPath local working folder)
  • $Package – name of the MSdeploy package
  • The publish settings you can get from the Azure Portal
  • $__PARAM1__ –  a value to swap in the web.config
  • $__PARAM2__ –  another value to swap in the web.config
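As a minimal sketch of the convention at work (the variable names and values here are illustrative), RM surfaces each configuration variable as a PowerShell variable, so the discovery step amounts to:

```powershell
# Simulate two RM-provided configuration variables following the convention
$__PARAM1__ = "Server=myserver;Database=mydb"
$__PARAM2__ = "https://myservice.example.com"

# Find every variable matching the naming convention, as the script above does
$parameters = @(Get-Variable -Include "__*__")
$parameters | ForEach-Object { "{0} -> {1}" -f $_.Name, $_.Value }
```

Each discovered Name (e.g. __PARAM1__) is then replaced in the SetParameters.xml file by its Value, which is why no hard coded list is needed.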

In RM it will look like this.


So now you can use a single script for all your web deployments.

TF30063 Errors accessing a TFS 2015 server via the C# API after upgrade from 2013

Background

We upgraded our production TFS 2013.4 server to TFS 2015 RTM this week. Rather than an in-place upgrade we chose to make a few changes on the way; so whilst leaving our DBs on our SQL 2012 cluster

  • We moved to a new VM for our AT (to upgrade from Windows 2008R2 to 2012R2)
  • Split the SSRS instance off the AT to a separate VM with a new SSAS server (again to move to 2012R2 and to ease management, getting all the reporting bits in one place)

But we did not touch

  • Our XAML build systems, leaving them at 2013, as we intend to migrate to vNext build ASAP
  • Our Test Controller/Release Management/Lab Environment, leaving it at 2013 for now, as we have other projects on the go to update the hardware/cloud solutions underpinning these.

All went well with no surprises; the running of the upgrade tool took about an hour.

The Problem

The only problem we had was with my TFS Alerts DSL processor, which listens for TFS alerts and runs custom scripts. I host this on the TFS AT, and I would expect it to set build retention and send emails when a TFS XAML build quality changes. This did not occur; in the Windows event log I was seeing

2015-08-12 21:04:02.4195 ERROR TFSEventsProcessor.DslScriptService: TF30063: You are not authorized to access https://tfs.blackmarble.co.uk/tfs/DefaultCollection.

After much fiddling, including writing a small command line test client, I confirmed that the issue was specific to the production server. The tool ran fine on other PCs, but on the live server a Windows authentication dialog was shown which would not accept any valid credentials.

It was not, as I had feared, a change in the TFS API; in fact there is no reason my 2012 or 2013 API targeted version of the TFS Alerts DSL should not be able to talk to a TFS 2015 server, as long as the correct version of the TFS API is installed on the machine hosting the DSL.

The Solution

The issue was due to Windows loopback protection. This had been disabled on our old TFS AT, but not on the new one. As we wanted to avoid changing the global loopback protection setting, we set the following via Regedit to allow it for a single CName

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0
    ValueName - BackConnectionHostNames
    Type - multistring
    Data  - tfs.blackmarble.co.uk
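If you prefer to script the change rather than use Regedit, the equivalent PowerShell (run from an elevated prompt) would be along these lines:

```powershell
# Allow loopback connections for a single host name (needs admin rights)
$path = "HKLM:\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0"
New-ItemProperty -Path $path -Name "BackConnectionHostNames" `
    -PropertyType MultiString -Value @("tfs.blackmarble.co.uk") -Force
```

The value is a multistring, so further host names can be added to the array if more CNames need the exception.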

Once this was done (and without a reboot) my alerts processing worked without any problems.

Running Microsoft Test Manager Test Suites as part of a vNext Release pipeline - Part 2

In my last post I discussed how you could wire TCM tests into a Release Management vNext pipeline. The problem with the script I provided, as I noted, was that the deployment was triggered synchronously by the build i.e. the build/release process was:

  1. TFS Build
    1. Gets the source
    2. Compiles the code
    3. Runs the unit tests
    4. Triggers the RM pipeline
    5. Waits while the RM pipeline completes
  2. RM then
    1. Deploys the code
    2. Runs the integration tests
  3. When RM completes, the TFS build completes

This process raised a couple of problems

  • You cannot associate the integration tests with the build, as TCM only allows association with completed successful builds. When TCM finishes in this model, the build is still in progress.
  • You have to target only the first automated stage of the pipeline, else the build will be held as ‘in progress’ until all the release stages have completed, which may be days if there are manual approvals involved

The script InitiateReleaseFromBuild

These problems can all be fixed by altering the PowerShell that triggers the RM pipeline so that it does not wait for the deployment to complete, so the TFS build completes as soon as possible.

This is done by passing in an extra parameter which is set in TFS build

param(
    [string]$rmserver = $Args[0],
    [string]$port = $Args[1], 
    [string]$teamProject = $Args[2],  
    [string]$targetStageName = $Args[3],
    [string]$waitForCompletion = $Args[4]
)

cls
$teamFoundationServerUrl = $env:TF_BUILD_COLLECTIONURI
$buildDefinition = $env:TF_BUILD_BUILDDEFINITIONNAME
$buildNumber = $env:TF_BUILD_BUILDNUMBER


"Executing with the following parameters:`n"
"  RMserver Name: $rmserver"
"  Port number: $port"
"  Team Foundation Server URL: $teamFoundationServerUrl"
"  Team Project: $teamProject"
"  Build Definition: $buildDefinition"
"  Build Number: $buildNumber"
"  Target Stage Name: $targetStageName`n"
"  Wait for RM completion: $waitForCompletion`n"

$wait = [System.Convert]::ToBoolean($waitForCompletion)
$exitCode = 0

trap
{
  $e = $error[0].Exception
  $e.Message
  $e.StackTrace
  if ($exitCode -eq 0) { $exitCode = 1 }
}

$scriptName = $MyInvocation.MyCommand.Name
$scriptPath = Split-Path -Parent (Get-Variable MyInvocation -Scope Script).Value.MyCommand.Path

Push-Location $scriptPath   

$server = [System.Uri]::EscapeDataString($teamFoundationServerUrl)
$project = [System.Uri]::EscapeDataString($teamProject)
$definition = [System.Uri]::EscapeDataString($buildDefinition)
$build = [System.Uri]::EscapeDataString($buildNumber)
$targetStage = [System.Uri]::EscapeDataString($targetStageName)

$serverName = $rmserver + ":" + $port
$orchestratorService = "http://$serverName/account/releaseManagementService/_apis/releaseManagement/OrchestratorService"

$status = @{
    "2" = "InProgress";
    "3" = "Released";
    "4" = "Stopped";
    "5" = "Rejected";
    "6" = "Abandoned";
}

$uri = "$orchestratorService/InitiateReleaseFromBuild?teamFoundationServerUrl=$server&teamProject=$project&buildDefinition=$definition&buildNumber=$build&targetStageName=$targetStage"
"Executing the following API call:`n`n$uri"

$wc = New-Object System.Net.WebClient
$wc.UseDefaultCredentials = $true
# rmuser should be in the RM users list and should have permission to trigger the release.

#$wc.Credentials = new-object System.Net.NetworkCredential("rmuser", "rmuserpassword", "rmuserdomain")

try
{
    $releaseId = $wc.DownloadString($uri)

    $url = "$orchestratorService/ReleaseStatus?releaseId=$releaseId"

    $releaseStatus = $wc.DownloadString($url)


    if ($wait -eq $true)
    {
        Write-Host -NoNewline "`nReleasing ..."

        while($status[$releaseStatus] -eq "InProgress")
        {
            Start-Sleep -s 5
            $releaseStatus = $wc.DownloadString($url)
            Write-Host -NoNewline "."
        }

        " done.`n`nRelease completed with {0} status." -f $status[$releaseStatus]
    } else {

        Write-Host -NoNewline "`nTriggering Release and exiting"
    }

}
catch [System.Exception]
{
    if ($exitCode -eq 0) { $exitCode = 1 }
    Write-Host "`n$_`n" -ForegroundColor Red
}

if ($exitCode -eq 0)
{
    if ($wait -eq $true)
    {
        if ($releaseStatus -eq 3)
        {
          "`nThe script completed successfully. Product deployed without error`n"
        } else {
            Write-Host "`nThe script completed successfully. Product failed to deploy`n" -ForegroundColor Red
            $exitCode = -1 # reset the code to show the error
        }
    } else {
        "`nThe script completed successfully. Product deploying`n"
    }
}
else
{
  $err = "Exiting with error: " + $exitCode + "`n"
  Write-Host $err -ForegroundColor Red
}

Pop-Location

exit $exitCode

The Script TcmExecWrapper

A change is also required in the wrapper script I use to trigger the TCM test run. We need to check the exit code from the inner TCM PowerShell script and update the TFS build quality appropriately.

To do this I use the new REST API in TFS 2015, as this is far easier than using the older .NET client API; there are no DLLs to distribute.

It is worth noting that

  • I pass credentials into the script from RM that are used to talk to the TFS server. This is because I am running my tests in a network isolated TFS Lab Environment, which means I am in the wrong domain to see the TFS server without providing login details. If you are not working cross domain you could just use Default Credentials.
  • RM only passes the BuildNumber into the script e.g. MyBuild_1.2.3.4, but the REST API needs the build id to set the quality. Hence the need for the function Get-BuildDetailsByNumber to get the id from the name

# Output execution parameters.
$VerbosePreference ='Continue' # equiv to -verbose
function Get-BuildDetailsByNumber
{
    param
    (
        $tfsUri ,
        $buildNumber,
        $username,
        $password
    )
    $uri = "$($tfsUri)/_apis/build/builds?api-version=2.0&buildnumber=$buildNumber"
    $wc = New-Object System.Net.WebClient
    #$wc.UseDefaultCredentials = $true
    $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
   
    write-verbose "Getting ID of $buildNumber from $tfsUri "
    $jsondata = $wc.DownloadString($uri) | ConvertFrom-Json
    $jsondata.value[0]
 
}
function Set-BuildQuality
{
    param
    (
        $tfsUri ,
        $buildID,
        $quality,
        $username,
        $password
    )
    $uri = "$($tfsUri)/_apis/build/builds/$($buildID)?api-version=1.0"
    $data = @{quality = $quality} | ConvertTo-Json
    $wc = New-Object System.Net.WebClient
    $wc.Headers["Content-Type"] = "application/json"
    #$wc.UseDefaultCredentials = $true
    $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
   
    write-verbose "Setting BuildID $buildID to quality $quality via $tfsUri "
    $wc.UploadString($uri,"PATCH", $data)
   
}
$folder = Split-Path -Parent $MyInvocation.MyCommand.Definition
write-verbose "Running $folder\TcmExecWithLogin.ps1"
& "$folder\TcmExecWithLogin.ps1" -Collection $Collection -Teamproject $Teamproject -PlanId $PlanId  -SuiteId $SuiteId -ConfigId $ConfigId -BuildDirectory $PackageLocation -TestEnvironment $TestEnvironment -LoginCreds "$TestUserUid,$TestUserPwd" -SettingsName $SettingsName -BuildNumber $BuildNumber -BuildDefinition $BuildDefinition
write-verbose "Got the exit code from the TCM run of $LASTEXITCODE"
$url = "$Collection/$Teamproject"
$jsondata = Get-BuildDetailsByNumber -tfsUri $url -buildNumber $BuildNumber -username $TestUserUid -password $TestUserPwd
$buildId = $jsondata.id
write-verbose "The build ID is $buildId"
$newquality = "Test Passed"
if ($LASTEXITCODE -gt 0 )
{
    $newquality = "Test Failed"
}
 
write-verbose "The build quality is $newquality"
Set-BuildQuality -tfsUri $url  -buildID $buildId -quality $newquality -username $TestUserUid -password $TestUserPwd

Note: TcmExecWithLogin.ps1 is the same as in my last post.

Summary

So with these changes the process is now

  1. TFS Build
    1. Gets the source
    2. Compiles the code
    3. Runs the unit tests
    4. Triggers the RM pipeline
    5. Build ends
  2. RM then
    1. Deploys the code
    2. Runs the integration tests
    3. When the tests complete, sets the TFS build quality

This means we can associate both unit and integration tests with a build, and target our release at any stage in the pipeline, with it pausing at the points where manual approval is required without blocking the initiating build.

Running Microsoft Test Manager Test Suites as part of a vNext Release pipeline

Also see Part 2 on how to address gotchas in this process

When using Release Management there is a good chance you will want to run test suites as part of your automated deployment pipeline. If you are using a vNext PowerShell based pipeline you need a way to trigger the tests via PowerShell as there is no out the box agent to do the job.

Step 1 - Install a Test Agent

The first step is to make sure that the Visual Studio Test Agent is installed on the box you wish to run the tests on. If you don’t already have an MTM environment in place with a test agent then this can be done by creating a standard environment in Microsoft Test Manager. Remember you only need this environment to include the VM you want to run the tests on, unless you also want to gather logs and events from other machines in the system. The complexity is up to you.

In my case I was using a network isolated environment so all this was already set up.

Step 2 - Setup the Test Suite

Once you have an environment you can set up your test plan and test suite in MTM to include the tests you wish to run. These can be unit-test-style integration tests or Coded UI tests; it is up to you.

If you have a lot of unit tests to associate for automation, remember the TCM.EXE command line can make your life a lot easier.
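As a sketch, the bulk association can be done from a Visual Studio command prompt with something like the following (the collection URL, project name, assembly name and suite ID are all placeholders; check tcm testcase /? on your version for the exact switches):

```powershell
# Import automated tests from a test assembly and sync them into an existing suite
# (all values here are illustrative)
& tcm testcase /import /collection:"http://tfsserver:8080/tfs/DefaultCollection" /teamproject:"MyProject" /storage:"MyIntegrationTests.dll" /syncsuite:123
```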

This post does not aim to be a tutorial on setting up test plans, have a look at the ALM Rangers guides for more details.

Step 3 - The Release Management environment

This is where it gets a bit confusing: you have already set up a Lab Management environment, but you still need to set up the Release Management vNext environment. As I was using a network isolated Lab Management environment this gets even more complex, but RM provides some tools to help.

Again this is not a detailed tutorial. The key steps if you are using network isolation are

  1. Make sure that PowerShell on the VM is set up for remote access by running winrm quickconfig
  2. In RM create a vNext environment
  3. Add each VM as a new server, using its corporate LAN name from Lab Management with the PowerShell remote access port e.g. VSLM-1002-e7858e28-77cf-4163-b6ba-1df2e91bfcab.lab.blackmarble.co.uk:5985
  4. Make sure the server is set to use a shared UNC path for deployment.
  5. Remember you will log in to this VM with the credentials for the test domain.
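Before wiring up the RM environment it is worth checking that WinRM is actually answering on that name and port from the RM server; a quick sketch (using the example server name above) is:

```powershell
# Confirm PowerShell remoting is reachable on the Lab Management VM
Test-WSMan -ComputerName 'VSLM-1002-e7858e28-77cf-4163-b6ba-1df2e91bfcab.lab.blackmarble.co.uk' -Port 5985
```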


image

By this point you might be a bit confused as to what you have, so here is a diagram

image

Step 4 - Wiring the tests into the pipeline

The final step is to get the release pipeline to trigger the tests. This is done by calling the TCM.EXE command line to instruct the Test Controller to trigger the tests. The copy of TCM does not have to be in the Lab Management environment, but it does need to be on a VM known to the RM vNext environment. This will usually mean a VM with Microsoft Test Manager or Visual Studio Premium (or Enterprise for 2015) installed. In my case this was a dedicated test VM within the environment.

The key to the process is to run a script similar to the one used by the older RM agent-based system to trigger the tests. You can extract this PowerShell script from an old release pipeline, but for ease I show my modified version here. The key changes are that I pass in the login credentials required so that the call to the TFS server from TCM.EXE can be made from inside the network isolated environment, and that I do a little extra checking of the test results so I can fail the build if the tests fail. These edits might not be required if you trigger TCM from a VM that is in the same domain as your TFS server, or if you have different success criteria.

param
(
    [string]$BuildDirectory = $null,
    [string]$BuildDefinition = $null,
    [string]$BuildNumber = $null,
    [string]$TestEnvironment = $null,
    [string]$LoginCreds = $null,
    [string]$Collection = $(throw "The collection URL must be provided."),
    [string]$TeamProject = $(throw "The team project must be provided."),
    [Int]$PlanId = $(throw "The test plan ID must be provided."),
    [Int]$SuiteId = $(throw "The test suite ID must be provided."),
    [Int]$ConfigId = $(throw "The test configuration ID must be provided."),
    [string]$Title = 'Automated UI Tests',
    [string]$SettingsName = $null,
    [Switch]$InconclusiveFailsTests = $false,
    [Switch]$RemoveIncludeParameter = $false,
    [Int]$TestRunWaitDelay = 10
)

 

##################################################################################
# Output the logo.
write-verbose "Based on the Microsoft Release Management TcmExec PowerShell Script v12.0"
write-verbose "Copyright (c) 2013 Microsoft. All rights reserved.`n"


 

##################################################################################
# Initialize the default script exit code.
$exitCode = 1

##################################################################################
# Output execution parameters.
write-verbose "Executing with the following parameters:"
write-verbose "  Build Directory: $BuildDirectory"
write-verbose "  Build Definition: $BuildDefinition"
write-verbose "  Build Number: $BuildNumber"
write-verbose "  Test Environment: $TestEnvironment"
write-verbose "  Collection: $Collection"
write-verbose "  Team project: $TeamProject"
write-verbose "  Plan ID: $PlanId"
write-verbose "  Suite ID: $SuiteId"
write-verbose "  Configuration ID: $ConfigId"
write-verbose "  Title: $Title"
write-verbose "  Settings Name: $SettingsName"
write-verbose "  Inconclusive result fails tests: $InconclusiveFailsTests"
write-verbose "  Remove /include parameter from /create command: $RemoveIncludeParameter"
write-verbose "  Test run wait delay: $TestRunWaitDelay"

##################################################################################
# Define globally used variables and constants.
# Visual Studio 2013
$vscommtools = [System.Environment]::GetEnvironmentVariable("VS120COMNTOOLS")
if ($vscommtools -eq $null)
{
    # Visual Studio 2012
    $vscommtools = [System.Environment]::GetEnvironmentVariable("VS110COMNTOOLS")
}
if ($vscommtools -eq $null)
{
    # Visual Studio 2010
    $vscommtools = [System.Environment]::GetEnvironmentVariable("VS100COMNTOOLS")
    if ($vscommtools -ne $null)
    {
        if ([string]::IsNullOrEmpty($BuildDirectory))
        {
            $(throw "The build directory must be provided.")
        }
        if (![string]::IsNullOrEmpty($BuildDefinition) -or ![string]::IsNullOrEmpty($BuildNumber))
        {
            $(throw "The build definition and build number parameters may be used only under Visual Studio 2012/2013.")
        }
    }
}
else
{
    if ([string]::IsNullOrEmpty($BuildDefinition) -and [string]::IsNullOrEmpty($BuildNumber) -and [string]::IsNullOrEmpty($BuildDirectory))
    {
        $(throw "You must specify the build directory or the build definition and build number.")
    }
}
$tcmExe = [System.IO.Path]::GetFullPath($vscommtools + "..\IDE\TCM.exe")

##################################################################################
# Ensure TCM.EXE is available in the assumed path.
if ([System.IO.File]::Exists($tcmExe))
{
    ##################################################################################
    # Prepare optional parameters.
    $testEnvironmentParameter = "/testenvironment:$TestEnvironment"
    if ([string]::IsNullOrEmpty($TestEnvironment))
    {
        $testEnvironmentParameter = [string]::Empty
    }
    if ([string]::IsNullOrEmpty($BuildDirectory))
    {
        $buildDirectoryParameter = [string]::Empty
    } else
    {
        # make sure we remove any trailing slashes as they cause permission issues
        $BuildDirectory = $BuildDirectory.Trim()
        while ($BuildDirectory.EndsWith("\"))
        {
            $BuildDirectory = $BuildDirectory.Substring(0,$BuildDirectory.Length-1)
        }
        $buildDirectoryParameter = "/builddir:""$BuildDirectory"""
   
    }
    $buildDefinitionParameter = "/builddefinition:""$BuildDefinition"""
    if ([string]::IsNullOrEmpty($BuildDefinition))
    {
        $buildDefinitionParameter = [string]::Empty
    }
    $buildNumberParameter = "/build:""$BuildNumber"""
    if ([string]::IsNullOrEmpty($BuildNumber))
    {
        $buildNumberParameter = [string]::Empty
    }
    $includeParameter = '/include'
    if ($RemoveIncludeParameter)
    {
        $includeParameter = [string]::Empty
    }
    $settingsNameParameter = "/settingsname:""$SettingsName"""
    if ([string]::IsNullOrEmpty($SettingsName))
    {
        $settingsNameParameter = [string]::Empty
    }

    ##################################################################################
    # Create the test run.
    write-verbose "`nCreating test run ..."
    $testRunId = & "$tcmExe" run /create /title:"$Title" /login:$LoginCreds /planid:$PlanId /suiteid:$SuiteId /configid:$ConfigId /collection:"$Collection" /teamproject:"$TeamProject" $testEnvironmentParameter $buildDirectoryParameter $buildDefinitionParameter $buildNumberParameter $settingsNameParameter $includeParameter
    if ($testRunId -match '.+\:\s(?<TestRunId>\d+)\.')
    {
        # The test run ID is identified as a property in the match collection
        # so we can access it directly by using the group name from the regular
        # expression (i.e. TestRunId).
        $testRunId = $matches.TestRunId

        write-verbose "Waiting for test run $testRunId to complete ..."
        $waitingForTestRunCompletion = $true
        while ($waitingForTestRunCompletion)
        {
            Start-Sleep -s $TestRunWaitDelay
            $testRunStatus = & "$tcmExe" run /list  /collection:"$collection" /login:$LoginCreds /teamproject:"$TeamProject" /querytext:"SELECT * FROM TestRun WHERE TestRunId=$testRunId"
            if ($testRunStatus.Count -lt 3 -or ($testRunStatus.Count -gt 2 -and $testRunStatus.GetValue(2) -match '.+(?<DateCompleted>\d+[/]\d+[/]\d+)'))
            {
                $waitingForTestRunCompletion = $false
            }
        }

        write-verbose "Evaluating test run $testRunId results..."
        # We do a small pause since the results might not be published yet.
        Start-Sleep -s $TestRunWaitDelay

        $testRunResultsTrxFileName = "TestRunResults$testRunId.trx"
        & "$tcmExe" run /export /id:$testRunId  /collection:"$collection" /login:$LoginCreds /teamproject:"$TeamProject" /resultsfile:"$testRunResultsTrxFileName" | Out-Null
        if (Test-path($testRunResultsTrxFileName))
        {
            # Load the XML document contents.
            [xml]$testResultsXml = Get-Content "$testRunResultsTrxFileName"
           
            # Extract the results of the test run.
            $total = $testResultsXml.TestRun.ResultSummary.Counters.total
            $passed = $testResultsXml.TestRun.ResultSummary.Counters.passed
            $failed = $testResultsXml.TestRun.ResultSummary.Counters.failed
            $inconclusive = $testResultsXml.TestRun.ResultSummary.Counters.inconclusive

            # Output the results of the test run.
            write-verbose "`n========== Test: $total tests ran, $passed succeeded, $failed failed, $inconclusive inconclusive =========="

            # Determine if there were any failed tests during the test run execution.
            if ($failed -eq 0 -and (-not $InconclusiveFailsTests -or $inconclusive -eq 0))
            {
                # Update this script's exit code.
                $exitCode = 0
            }

            # Remove the test run results file.
            remove-item($testRunResultsTrxFileName) | Out-Null
        }
        else
        {
            write-error "`nERROR: Unable to export test run results file for analysis."
        }
    }
}
else
{
    write-error "`nERROR: Unable to locate $tcmExe"
}

##################################################################################
# Indicate the resulting exit code to the calling process.
if ($exitCode -gt 0)
{
    write-error "`nERROR: Operation failed with error code $exitCode."
}
write-verbose "`nDone."
exit $exitCode

 

Once this script is placed into source control in such a way that it ends up in the drops location for the build, you can call it as a standard script item in your pipeline, targeting the VM that has TCM installed. Remember, you get the test environment name and the various IDs required from MTM; check the TCM command-line help for more details.
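If you don’t want to dig the IDs out of the MTM UI, TCM can list them for you as well (the collection URL, project name and plan ID are placeholders):

```powershell
# List the test plans, then the suites within a given plan, to find the IDs needed
& tcm plans /list /collection:"http://tfsserver:8080/tfs/DefaultCollection" /teamproject:"MyProject"
& tcm suites /list /planid:42 /collection:"http://tfsserver:8080/tfs/DefaultCollection" /teamproject:"MyProject"
```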

image

 

However, we hit a problem: RM sets PowerShell variables, not script parameters. So I find it easiest to use a wrapper script, also stored in source control, that converts the variables to the needed parameters. This also gives the opportunity to use RM-set runtime variables to build more complex objects such as the login credentials.

 

# Output execution parameters.
$VerbosePreference ='Continue' # equiv to -verbose
$folder = Split-Path -Parent $MyInvocation.MyCommand.Definition

write-verbose "Running $folder\TcmExecWithLogin.ps1"

& "$folder\TcmExecWithLogin.ps1" -Collection $Collection -Teamproject $Teamproject -PlanId $PlanId  -SuiteId $SuiteId -ConfigId $ConfigId -BuildDirectory $PackageLocation -TestEnvironment $TestEnvironment -LoginCreds "$TestUserUid,$TestUserPwd" -SettingsName $SettingsName

 

Step 5 – Run it all

If you have everything in place you should now be able to trigger your deployment and have the tests run.

image

Finishing Up and One final gotcha

I had hoped that my integration test run would be associated with my build. Normally when triggering tests via TCM you do this by passing the build details, which the script above maps to TCM’s /build: and /builddefinition: switches

TcmExecWithLogin.ps1 [all the other params] -BuildNumber 'My.Build.CI_1.7.25.29773' -BuildDefinition 'My.Build.CI' 

However, this will not work in the scenario above. This is because you can only use these flags to associate a run with successful builds; at the time TCM is run in the pipeline the build has not finished, so it is not yet marked as successful. This does somewhat limit the end-to-end reporting. However, I think for now I can accept this limitation, as the deployment completing is a suitable marker that the tests passed.
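For reference, had the build already completed successfully, the association would just be a matter of passing the build details through to the script in place of the build directory (the build definition and number values here are illustrative):

```powershell
# Only valid once the build has finished successfully - hence not usable mid-pipeline
& "$folder\TcmExecWithLogin.ps1" -Collection $Collection -Teamproject $Teamproject `
    -PlanId $PlanId -SuiteId $SuiteId -ConfigId $ConfigId `
    -TestEnvironment $TestEnvironment -LoginCreds "$TestUserUid,$TestUserPwd" `
    -BuildDefinition 'My.Build.CI' -BuildNumber 'My.Build.CI_1.7.25.29773'
```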

The only workaround I can think of is not to trigger the release directly from the build, but to use the TFS events system to allow the build to finish first and then trigger the release. You could use my TFS DSL Alert processor for that.