# But it works on my PC!

### The random thoughts of Richard Fennell on technology and software development

This article was first published on Microsoft’s UK Developers site as “Running nUnit and Jasmine.JS unit tests in TFS/VSO vNext build”.

With the advent of vNext build in TFS 2015 and Visual Studio Online running unit tests that are not MSTest based within your build process is far more straightforward than it used to be. No longer do you have to use custom XAML build activities or tell all your TFS build controllers where the test runner assemblies are. The ‘out the box’ vNext build Visual Studio Test task will automatically load any test adaptors it finds in the path specified for test runners in its advanced properties, a path that can be populated via NuGet.

### Running nUnit tests

All this means that to find and run MSTest and nUnit tests as part of your build all you have to do is as follows

1. Create a solution that contains a project with MSTest and nUnit tests; in my sample this is an MVC web application project with its automatically created MSTest unit test project.
2. In the test project add some nUnit tests. Use NuGet to add the references to nUnit to the test project so it compiles.
3. Historically, in your local Visual Studio instance you needed to install the nUnit Test Runner VSIX package from the Visual Studio Gallery – this allows Visual Studio to discover your nUnit tests, as well as any MSTest ones, and run them via the built-in Test Explorer.

IMPORTANT Change –
However, installing this VSIX package is no longer required. If you use NuGet to add the nUnit Test Runner to the solution, as well as the nUnit package itself, then Visual Studio can find the nUnit tests without the VSIX package. This is useful but not world changing on your development PC; on the build box, however, it means the NuGet restore will make sure the nUnit test adapter assemblies are pulled down onto the local build box’s file system and used to find tests with no extra work.

Note: If you still want to install the VSIX package on your local Visual Studio instance you can; it is just that you don’t have to.
4. Check your solution into TFS/VSO source control. It does not matter whether it is TFVC or Git based.
5. Create a new vNext build using the Visual Studio template
6. You can leave most of the parameters on their default settings, but you do need to edit the Visual Studio Test task’s advanced settings to point at the NuGet packages folder for your solution (which will be populated via NuGet restore) so the custom nUnit test adaptor can be found, i.e. usually setting it to $(Build.SourcesDirectory)\packages
7. The build should run and find your tests: the MSTest ones because they are built in, and the nUnit ones because it found the custom test adaptor due to the NuGet restore being done prior to the build. The test results can be found on the build summary page.

### But what if you want to run Jasmine.JS tests?

If you want to run Jasmine JavaScript unit tests the process is basically the same. The only major difference is that you do still need to install the Chutzpah test runner on your local Visual Studio as a VSIX package to run the tests locally. There is a NuGet package for the Chutzpah test runner so you can avoid having to manually unpack the VSIX and get it into source control to deploy it to the build host (unless you really want to follow this process), but this package does not currently enable Visual Studio to find the Jasmine tests without the VSIX extension being installed, or at least it didn’t for me.

Using the solution I used before:

1. Use NuGet to add Jasmine.JS to the test project
2. Add a test file to the test project e.g. mycode.tests.js (adding any JavaScript references needed to find any script code under test in the main WebApp project)
3. Install the Chutzpah test runner in your local Visual Studio as a VSIX extension, then restart Visual Studio
4. You should now be able to see and run the Jasmine tests in the test runner as well as the MSTest and nUnit tests.
5. Add the NuGet package for the Chutzpah test runner to your solution; this is a solution level package, so it does not need to be associated with any project.
6. Check the revised code into source control
7. In your vNext build add another Visual Studio Test task, set the test assembly to match your JavaScript test naming convention e.g. **\*.tests.js and the path to the custom test adaptor to $(Build.SourcesDirectory)\packages (as before)

8. Run the revised build.

9. You should see the two test tasks run and a pair of test results in the summary for the build.
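As a quick sanity check that the NuGet restore really did pull the adapters onto the build box, you can list any test adapter assemblies under the packages folder. This is just a diagnostic sketch to run on the build agent, not part of the build definition; the environment variable and folder convention are assumptions based on the settings above:

```powershell
# List any test adapter assemblies that NuGet restore pulled into the packages folder
# (assumes the vNext build agent's BUILD_SOURCESDIRECTORY variable and the
#  $(Build.SourcesDirectory)\packages path configured in the test task above)
Get-ChildItem -Path "$env:BUILD_SOURCESDIRECTORY\packages" -Recurse -Filter "*TestAdapter*.dll" |
    Select-Object -ExpandProperty FullName
```

If the nUnit or Chutzpah adapter assemblies do not appear here, the Visual Studio Test task will not be able to discover the corresponding tests.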

So now, hopefully, you should find this a more straightforward way to add testing to your vNext builds, allowing easy use of both your own build boxes and the hosted build service for VSO with testing frameworks they do not support ‘out the box’.

When doing TFS upgrades it is useful to know roughly how long they will take. The upgrade programs give a number of steps, but not all steps are equal. Some are quick, some are slow. I have found it useful to graph past updates so I can get a feel of how long an update will take given it got to ‘step x in y minutes’. You can do this by hand, noting down time as specific steps are reached. However for a long upgrade it usually means pulling data out of the TFS TPC upgrade logs.

To make this process easier I put together this script to find the step completion rows in the log file and format them so that they are easy to graph in Excel:

```powershell
param(
    $logfile = "TPC_ApplyPatch.log",
    $outfile = "out.csv"
)

# A function to convert the start and end times to a number of minutes
# Can't use a simple timespan as we only have the time portion, not the whole datetime
# Hence the hacky 'add a day' handling below
function CalcDuration
{
    param
    (
        $startTime,
        $endTime
    )

    $diff = [dateTime]$endTime - $startTime
    if ([dateTime]$endTime -lt $startTime)
    {
        $diff += "23:59" # add a day as we passed midnight
    }
    ([int]$diff.Hours * 60) + $diff.Minutes
}

Write-Host "Importing $logfile for processing"
# pull out the lines we are interested in using a regular expression to extract the columns
# the (.{8}) groups handle the fixed width columns, exact matches are used for the rest
$lines = Get-Content -Path $logfile |
    Select-String " Executing step:" |
    Where { $_ -match "^(.)(.{8})(.{8})(Executing step:)(.{2})(.*)(')(.*)([(])(.*)([ ])([of])(.*)" } |
    ForEach {
        [PSCustomObject]@{
            'Step'      = $Matches[10]
            'TimeStamp' = $Matches[2]
            'Action'    = $Matches[6]
        }
    }

# We assume the upgrade started at the timestamp of the 0th step
# Not strictly true, but very close
[DateTime]$start = $lines[0].TimeStamp

Write-Host "Writing results to $outfile"
# Work out the duration of each step
$steps = $lines | ForEach {
    [PSCustomObject]@{
        'Step'        = $_.Step
        'TimeStamp'   = $_.TimeStamp
        'ElapsedTime' = CalcDuration -startTime $start -endTime $_.TimeStamp
        'Action'      = $_.Action
    }
}
$steps | Export-Csv $outfile -NoTypeInformation
# and list to screen
$steps
```
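Assuming the script above is saved as, say, Parse-TpcUpgradeLog.ps1 (the file name is my own choice, nothing mandated), a typical run against a collection upgrade log would look like this:

```powershell
# Parse the TPC upgrade log and emit a CSV ready to graph in Excel
.\Parse-TpcUpgradeLog.ps1 -logfile "TPC_ApplyPatch.log" -outfile "out.csv"
```

The resulting out.csv can then be opened in Excel and the ElapsedTime column plotted against Step to give the upgrade progress curve.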

TFS vNext builds do not have a concept of build quality, unlike the old XAML based builds. This is an issue for us as we used the changing of the build quality as a signal to test a build, or to mark it as released to a client (this was all managed with my TFS Alerts DSL to make sure suitable emails and build retention were used).

So how to get around this problem with vNext?

I have used tags on builds, set using the same REST API style calls as detailed in my post on Release Management vNext templates. I also use the REST API to set the retention on the build, so I actually no longer need to manage this via the alerts DSL.

The following script, if used to wrap the calling of integration tests via TCM, should set the tags and retention on a build:

```powershell
function Get-BuildDetailsByNumber
{
    param
    (
        $tfsUri,
        $buildNumber,
        $username,
        $password
    )

    $uri = "$($tfsUri)/_apis/build/builds?api-version=2.0&buildnumber=$buildNumber"

    $wc = New-Object System.Net.WebClient
    if ($username -eq $null)
    {
        $wc.UseDefaultCredentials = $true
    } else
    {
        $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
    }
    write-verbose "Getting ID of $buildNumber from $tfsUri"
    $jsondata = $wc.DownloadString($uri) | ConvertFrom-Json
    $jsondata.value[0]
}

function Set-BuildTag
{
    param
    (
        $tfsUri,
        $buildID,
        $tag,
        $username,
        $password
    )

    $wc = New-Object System.Net.WebClient
    $wc.Headers["Content-Type"] = "application/json"
    if ($username -eq $null)
    {
        $wc.UseDefaultCredentials = $true
    } else
    {
        $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
    }
    write-verbose "Setting BuildID $buildID with Tag $tag via $tfsUri"
    $uri = "$($tfsUri)/_apis/build/builds/$($buildID)/tags/$($tag)?api-version=2.0"
    $data = @{value = $tag } | ConvertTo-Json
    $wc.UploadString($uri, "PUT", $data)
}

function Set-BuildRetention
{
    param
    (
        $tfsUri,
        $buildID,
        $keepForever,
        $username,
        $password
    )

    $wc = New-Object System.Net.WebClient
    $wc.Headers["Content-Type"] = "application/json"
    if ($username -eq $null)
    {
        $wc.UseDefaultCredentials = $true
    } else
    {
        $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
    }
    write-verbose "Setting BuildID $buildID with retention set to $keepForever via $tfsUri"
    $uri = "$($tfsUri)/_apis/build/builds/$($buildID)?api-version=2.0"
    $data = @{keepForever = $keepForever} | ConvertTo-Json
    $response = $wc.UploadString($uri, "PATCH", $data)
}

# Output execution parameters.
$VerbosePreference = 'Continue' # equiv to -verbose
$ErrorActionPreference = 'Continue' # this controls if any test failure causes the script to stop

$folder = Split-Path -Parent $MyInvocation.MyCommand.Definition

write-verbose "Running $folder\TcmExec.ps1"
& "$folder\TcmExec.ps1" -Collection $Collection -Teamproject $Teamproject -PlanId $PlanId -SuiteId $SuiteId -ConfigId $ConfigId -BuildDirectory $PackageLocation -TestEnvironment $TestEnvironment -SettingsName $SettingsName

write-verbose "TCM exited with code '$LASTEXITCODE'"
$tag = "Deployed to Lab"
$keep = $true
if ($LASTEXITCODE -gt 0)
{
    $tag = "Lab Deployed failed"
    $keep = $false
}

write-verbose "Setting build tag to '$tag' for build $BuildNumber"
$url = "$Collection/$Teamproject"
$jsondata = Get-BuildDetailsByNumber -tfsUri $url -buildNumber $BuildNumber #-username $TestUserUid -password $TestUserPwd
$buildId = $jsondata.id
write-verbose "The build $BuildNumber has ID of $buildId"
write-verbose "The build tag set to '$tag' and retention set to '$keep'"
Set-BuildTag -tfsUri $url -buildID $buildId -tag $tag #-username $TestUserUid -password $TestUserPwd
Set-BuildRetention -tfsUri $url -buildID $buildId -keepForever $keep #-username $TestUserUid -password $TestUserPwd

# now fail the stage after we have sorted the logging
if ($LASTEXITCODE -gt 0)
{
    Write-Error "Tests have failed"
}
```

If all the tests pass we see the tag being added and the retention being set; if they fail, just a tag should be set.
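You can confirm the result afterwards with a quick REST query for the build’s tags. This is a sketch using the same 2.0 API and the $url and $buildId values from the script above, run as a user with default credentials that can see the collection:

```powershell
# Query the tags set on a build (TFS 2015 REST API, api-version=2.0)
# $url is "$Collection/$Teamproject", $buildId as used in the script above
$tags = Invoke-RestMethod -Uri "$url/_apis/build/builds/$buildId/tags?api-version=2.0" -UseDefaultCredentials
$tags.value   # should include 'Deployed to Lab' or 'Lab Deployed failed'
```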

A couple of months ago I wrote a post on using PowerShell scripts to deploy web sites in Release Management vNext templates as opposed to DSC. In that post I provided a script to help with the translation of Release Management configuration variables to entries in the [MSDEPLOY].setparameters.xml file for web sites. The code I provided in that post required you to hard code the variables to translate. This quickly became a problem for maintenance. However, there is a simple solution. If we use a naming convention for our RM configuration variables that maps to web.config entries (I chose __NAME__ to be consistent with the old RM agent based deployment standards) we can let PowerShell do the work.

So the revised script is

```powershell
$VerbosePreference = 'Continue' # equiv to -verbose

function Update-ParametersFile
{
    param
    (
        $paramFilePath,
        $paramsToReplace
    )

    write-verbose "Updating parameters file '$paramFilePath'" -verbose
    $content = get-content $paramFilePath
    $paramsToReplace.GetEnumerator() | % {
        Write-Verbose "Replacing value for key '$($_.Name)'" -Verbose
        $content = $content.Replace($_.Name, $_.Value)
    }
    set-content -Path $paramFilePath -Value $content
}

# the script folder
$folder = Split-Path -parent $MyInvocation.MyCommand.Definition

write-verbose "Deploying Website '$package' using script in '$folder'"

# work out the variables to replace using a naming convention
# we make sure that the value is stored in an array even if it is a single item
$parameters = @(Get-Variable -include "__*__")
write-verbose "Discovered replacement parameters that match the convention '__*__': $($parameters | Out-String)"

Update-ParametersFile -paramFilePath "$ApplicationPath\$packagePath\$package.SetParameters.xml" -paramsToReplace $parameters

write-verbose "Calling '$ApplicationPath\$packagePath\$package.deploy.cmd'"
& "$ApplicationPath\$packagePath\$package.deploy.cmd" /Y /m:"$PublishUrl" -allowUntrusted /u:"$PublishUser" /p:"$PublishPassword" /a:Basic | Write-Verbose
```

Note: This script allows deployment to a remote IIS server, so it is useful for Azure Web Sites. If you are running it locally on an IIS server, just trim everything after the /Y on the last line.
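For a local deployment on the IIS server itself, the last line of the script would therefore reduce to something like this sketch (same variables as in the script above, no publish URL or credentials needed):

```powershell
# Local IIS deployment - everything after /Y trimmed off
& "$ApplicationPath\$packagePath\$package.deploy.cmd" /Y | Write-Verbose
```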

So now I provide

• $PackagePath – path to our deployment on the deployment VM (relative to the $ApplicationPath local working folder)
• $Package – name of the MSDeploy package
• The publish settings you can get from the Azure Portal
• $__PARAM1__ – a value to swap in the web.config
• $__PARAM2__ – another value to swap in the web.config

In RM it will look like this. So now you can use a single script for all your web deployments.

### Background

We upgraded our production TFS 2013.4 server to TFS 2015 RTM this week. As opposed to an in-place upgrade we chose to make a few changes on the way; so whilst leaving our DBs on our SQL 2012 cluster:

• We moved to a new VM for our AT (to upgrade from Windows 2008R2 to 2012R2)
• Split the SSRS instance off the AT to a separate VM with a new SSAS server (again to move to 2012R2 and to ease management, getting all the reporting bits in one place)

But we did not touch:

• Our XAML build systems, leaving them at 2013 as we intend to migrate to vNext build ASAP
• Our Test Controller/Release Management/Lab Environment, leaving it at 2013 for now, as we have other projects on the go to update the hardware/cloud solutions underpinning these.

All went well, no surprises; the running of the upgrade tool took about 1 hour.

### The Problem

The only problem we have had was to do with my TFS Alerts DSL processor, which listens for TFS alerts and runs custom scripts. I host this on the TFS AT, and I would expect it to set build retention and send emails when a TFS XAML build quality changes. This did not occur; in the Windows error log I was seeing

2015-08-12 21:04:02.4195 ERROR TFSEventsProcessor.DslScriptService: TF30063: You are not authorized to access https://tfs.blackmarble.co.uk/tfs/DefaultCollection.

After much fiddling, including writing a small command line test client, I confirmed that the issue was specific to the production server. The tool ran fine on other PCs, but on the live server a Windows authentication dialog was shown which would not accept any valid credentials. It was not, as I had feared, a change in the TFS API; in fact there is no reason my 2012 or 2013 API targeted version of the TFS Alerts DSL should not be able to talk to a TFS 2015 server, as long as the correct version of the TFS API is installed on the machine hosting the DSL.

### The Solution

The issue was due to Windows loopback protection. This had been disabled on our old TFS AT, but not on the new one. As we wanted to avoid changing the global loopback protection setting, we set the following via Regedit to allow it for a single CName:

Key - HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0
ValueName - BackConnectionHostNames
Type - multistring
Data - tfs.blackmarble.co.uk

Once this was done (and without a reboot) my alerts processing worked without any problems.

In my last post I discussed how you could wire TCM tests into a Release Management vNext pipeline. The problem with the script I provided, as I noted, was that the deployment was triggered synchronously by the build, i.e. the build/release process was:

1. TFS Build
   1. Gets the source
   2. Compiles the code
   3. Runs the unit tests
   4. Triggers the RM pipeline
   5. Waits while the RM pipeline completes
2. RM then
   1. Deploys the code
   2. Runs the integration tests
3. When RM completes, the TFS build completes

This process raised a couple of problems:

• You cannot associate the integration tests with the build, as TCM only allows association with completed successful builds. When TCM finishes in this model the build is still in progress.
• You have to target only the first automated stage of the pipeline, else the build will be held as ‘in progress’ until all the release stages have completed, which may be days if there are manual approvals involved.

## The script InitiateReleaseFromBuild

These problems can all be fixed by altering the PowerShell that triggers the RM pipeline so that it does not wait for the deployment to complete, so the TFS build completes as soon as possible. This is done by passing in an extra parameter which is set in TFS build

```powershell
param(
    [string]$rmserver = $Args[0],
    [string]$port = $Args[1],
    [string]$teamProject = $Args[2],
    [string]$targetStageName = $Args[3],
    [string]$waitForCompletion = $Args[4]
)

cls

$teamFoundationServerUrl = $env:TF_BUILD_COLLECTIONURI
$buildDefinition = $env:TF_BUILD_BUILDDEFINITIONNAME
$buildNumber = $env:TF_BUILD_BUILDNUMBER

"Executing with the following parameters:`n"
"  RMserver Name: $rmserver"
"  Port number: $port"
"  Team Foundation Server URL: $teamFoundationServerUrl"
"  Team Project: $teamProject"
"  Build Definition: $buildDefinition"
"  Build Number: $buildNumber"
"  Target Stage Name: $targetStageName`n"
"  Wait for RM completion: $waitForCompletion`n"

$wait = [System.Convert]::ToBoolean($waitForCompletion)
$exitCode = 0

trap
{
    $e = $error[0].Exception
    $e.Message
    $e.StackTrace
    if ($exitCode -eq 0) { $exitCode = 1 }
}

$scriptName = $MyInvocation.MyCommand.Name
$scriptPath = Split-Path -Parent (Get-Variable MyInvocation -Scope Script).Value.MyCommand.Path

Push-Location $scriptPath

$server = [System.Uri]::EscapeDataString($teamFoundationServerUrl)
$project = [System.Uri]::EscapeDataString($teamProject)
$definition = [System.Uri]::EscapeDataString($buildDefinition)
$build = [System.Uri]::EscapeDataString($buildNumber)
$targetStage = [System.Uri]::EscapeDataString($targetStageName)

$serverName = $rmserver + ":" + $port
$orchestratorService = "http://$serverName/account/releaseManagementService/_apis/releaseManagement/OrchestratorService"

$status = @{
    "2" = "InProgress";
    "3" = "Released";
    "4" = "Stopped";
    "5" = "Rejected";
    "6" = "Abandoned";
}

$uri = "$orchestratorService/InitiateReleaseFromBuild?teamFoundationServerUrl=$server&teamProject=$project&buildDefinition=$definition&buildNumber=$build&targetStageName=$targetStage"
"Executing the following API call:`n`n$uri"

$wc = New-Object System.Net.WebClient
$wc.UseDefaultCredentials = $true
# rmuser should be part of the RM users list and should have permission to trigger the release.
# $wc.Credentials = new-object System.Net.NetworkCredential("rmuser", "rmuserpassword", "rmuserdomain")

try
{
    $releaseId = $wc.DownloadString($uri)

    $url = "$orchestratorService/ReleaseStatus?releaseId=$releaseId"
    $releaseStatus = $wc.DownloadString($url)

    if ($wait -eq $true)
    {
        Write-Host -NoNewline "`nReleasing ..."

        while ($status[$releaseStatus] -eq "InProgress")
        {
            Start-Sleep -s 5
            $releaseStatus = $wc.DownloadString($url)
            Write-Host -NoNewline "."
        }

        " done.`n`nRelease completed with {0} status." -f $status[$releaseStatus]
    } else
    {
        Write-Host -NoNewline "`nTriggering Release and exiting"
    }
}
catch [System.Exception]
{
    if ($exitCode -eq 0) { $exitCode = 1 }
    Write-Host "`n$_`n" -ForegroundColor Red
}

if ($exitCode -eq 0)
{
    if ($wait -eq $true)
    {
        if ($releaseStatus -eq 3)
        {
            "`nThe script completed successfully. Product deployed without error`n"
        }
        else
        {
            Write-Host "`nThe script completed successfully. Product failed to deploy`n" -ForegroundColor Red
            $exitCode = -1 # reset the code to show the error
        }
    } else
    {
        "`nThe script completed successfully. Product deploying`n"
    }
}
else
{
    $err = "Exiting with error: " + $exitCode + "`n"
    Write-Host $err -ForegroundColor Red
}

Pop-Location

exit $exitCode
```

## The Script TcmExecWrapper

A change is also required in the wrapper script I use to trigger the TCM test run. We need to check the exit code from the inner TCM PowerShell script and update the TFS build quality appropriately.

To do this I use the new REST API in TFS 2015, as this is far easier than using the older .NET client API – no DLLs to distribute.
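To illustrate why the REST route is so much lighter, fetching a build’s details needs nothing more than a URL and some JSON handling. This sketch uses Invoke-RestMethod against the same 2.0 endpoint, with the credential values RM passes in; treat the exact variable names as illustrative:

```powershell
# Minimal REST call: look up a build by number and read its id
# $url is "$Collection/$Teamproject"; $TestUserUid/$TestUserPwd as passed in from RM
$cred = New-Object System.Management.Automation.PSCredential(
    $TestUserUid, (ConvertTo-SecureString $TestUserPwd -AsPlainText -Force))
$build = Invoke-RestMethod -Uri "$url/_apis/build/builds?api-version=2.0&buildnumber=$BuildNumber" -Credential $cred
$build.value[0].id
```

No TFS client assemblies are involved; any box with PowerShell 3+ and network sight of the server can make the call.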

It is worth noting that:

• I pass the credentials into the script from RM that are used to talk to the TFS server. This is because I am running my tests in a network isolated TFS Lab Environment, which means I am in the wrong domain to see the TFS server without providing login details. If you are not working cross domain you could just use default credentials.
• RM only passes the BuildNumber into the script e.g. MyBuild_1.2.3.4, but the REST API needs the build id to set the quality. Hence the need for the Get-BuildDetailsByNumber function to get the id from the name.
```powershell
# Output execution parameters.
$VerbosePreference = 'Continue' # equiv to -verbose

function Get-BuildDetailsByNumber
{
    param
    (
        $tfsUri,
        $buildNumber,
        $username,
        $password
    )

    $uri = "$($tfsUri)/_apis/build/builds?api-version=2.0&buildnumber=$buildNumber"

    $wc = New-Object System.Net.WebClient
    # $wc.UseDefaultCredentials = $true
    $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
    write-verbose "Getting ID of $buildNumber from $tfsUri"
    $jsondata = $wc.DownloadString($uri) | ConvertFrom-Json
    $jsondata.value[0]
}

function Set-BuildQuality
{
    param
    (
        $tfsUri,
        $buildID,
        $quality,
        $username,
        $password
    )

    $uri = "$($tfsUri)/_apis/build/builds/$($buildID)?api-version=1.0"
    $data = @{quality = $quality} | ConvertTo-Json

    $wc = New-Object System.Net.WebClient
    $wc.Headers["Content-Type"] = "application/json"
    # $wc.UseDefaultCredentials = $true
    $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
    write-verbose "Setting BuildID $buildID to quality $quality via $tfsUri"
    $wc.UploadString($uri, "PATCH", $data)
}

$folder = Split-Path -Parent $MyInvocation.MyCommand.Definition

write-verbose "Running $folder\TcmExecWithLogin.ps1"
& "$folder\TcmExecWithLogin.ps1" -Collection $Collection -Teamproject $Teamproject -PlanId $PlanId -SuiteId $SuiteId -ConfigId $ConfigId -BuildDirectory $PackageLocation -TestEnvironment $TestEnvironment -LoginCreds "$TestUserUid,$TestUserPwd" -SettingsName $SettingsName -BuildNumber $BuildNumber -BuildDefinition $BuildDefinition

write-verbose "Got the exit code from the TCM run of $LASTEXITCODE"

$url = "$Collection/$Teamproject"
$jsondata = Get-BuildDetailsByNumber -tfsUri $url -buildNumber $BuildNumber -username $TestUserUid -password $TestUserPwd
$buildId = $jsondata.id
write-verbose "The build ID is $buildId"

$newquality = "Test Passed"
if ($LASTEXITCODE -gt 0)
{
    $newquality = "Test Failed"
}
write-verbose "The build quality is $newquality"
Set-BuildQuality -tfsUri $url -buildID $buildId -quality $newquality -username $TestUserUid -password $TestUserPwd
```

Note: TcmExecWithLogin.ps1 is the same script as shown in my last post.

## Summary

So with these changes the process is now:

1. TFS Build
   1. Gets the source
   2. Compiles the code
   3. Runs the unit tests
   4. Triggers the RM pipeline
   5. Build ends
2. RM then
   1. Deploys the code
   2. Runs the integration tests
3. When the tests complete we set the TFS build quality

This means we can associate both unit and integration tests with a build and target our release at any stage in the pipeline, it pausing at the points manual approval is required without blocking the initiating build.

Also see Part 2 on how to address gotchas in this process.

When using Release Management there is a good chance you will want to run test suites as part of your automated deployment pipeline. If you are using a vNext PowerShell based pipeline you need a way to trigger the tests via PowerShell, as there is no out the box agent to do the job.

## Step 1 - Install a Test Agent

The first step is to make sure that the Visual Studio Test Agent is installed on the box you wish to run the tests on. If you don’t already have an MTM environment in place with a test agent then this can be done by creating a standard environment in Microsoft Test Manager. Remember you only need this environment to include the VM you want to run the tests on, unless you want to also gather logs and events from other machines in the system. The complexity is up to you. In my case I was using a network isolated environment so all this was already set up.

## Step 2 - Setup the Test Suite

Once you have an environment you can setup your test suite and test plan in MTM to include the tests you wish to run. These can be unit test style integration tests or Coded UI; it is up to you.
If you have a lot of unit tests to associate for automation, remember the TCM.EXE command can make your life a lot easier. This post does not aim to be a tutorial on setting up test plans; have a look at the ALM Rangers guides for more details.

## Step 3 - The Release Management environment

This is where it gets a bit confusing: you have already set up a Lab Management environment, but you still need to set up the Release Management vNext environment. As I was using a network isolated Lab Management environment this gets even more complex, but RM provides some tools to help.

Again this is not a detailed tutorial. The key steps if you are using network isolation are:

1. Make sure that PowerShell on the VM is setup for remote access by running winrm quickconfig
2. In RM create a vNext environment
3. Add each server as a new server, using its corporate LAN name from Lab Management with the PowerShell remote access port e.g. VSLM-1002-e7858e28-77cf-4163-b6ba-1df2e91bfcab.lab.blackmarble.co.uk:5985
4. Make sure the server is set to use a shared UNC path for deployment.
5. Remember you will login to this VM with the credentials for the test domain.

By this point you might be a bit confused as to what you have; well, here is a diagram.

## Step 4 - Wiring the test into the pipeline

The final step is to get the release pipeline to trigger the tests. This is done by calling the TCM.EXE command line to instruct the Test Controller to trigger the tests. Now the copy of TCM does not have to be in the Lab Management environment, but it does need to be on a VM known to the RM vNext environment. This will usually mean a VM with Visual Studio Test Manager or Premium (or Enterprise for 2015) installed. In my case this was a dedicated test VM within the environment.

The key to the process is to run a script similar to the one used by the older RM agent based system to trigger the tests. You can extract this PowerShell script from an old release pipeline, but for ease I show my modified version here.
The key changes are that I pass in the login credentials required for the call to the TFS server from TCM.EXE to be made from inside the network isolated environment, and do a little extra checking of the test results so I can fail the build if the tests fail. These edits might not be required if you trigger TCM from a VM that is in the same domain as your TFS server, or have different success criteria.

```powershell
param(
    [string]$BuildDirectory = $null,
    [string]$BuildDefinition = $null,
    [string]$BuildNumber = $null,
    [string]$TestEnvironment = $null,
    [string]$LoginCreds = $null,
    [string]$Collection = $(throw "The collection URL must be provided."),
    [string]$TeamProject = $(throw "The team project must be provided."),
    [Int]$PlanId = $(throw "The test plan ID must be provided."),
    [Int]$SuiteId = $(throw "The test suite ID must be provided."),
    [Int]$ConfigId = $(throw "The test configuration ID must be provided."),
    [string]$Title = 'Automated UI Tests',
    [string]$SettingsName = $null,
    [Switch]$InconclusiveFailsTests = $false,
    [Switch]$RemoveIncludeParameter = $false,
    [Int]$TestRunWaitDelay = 10
)

##################################################################################
# Output the logo.
write-verbose "Based on the Microsoft Release Management TcmExec PowerShell Script v12.0"
write-verbose "Copyright (c) 2013 Microsoft. All rights reserved.`n"

##################################################################################
# Initialize the default script exit code.
$exitCode = 1

##################################################################################
# Output execution parameters.
write-verbose "Executing with the following parameters:"
write-verbose "  Build Directory: $BuildDirectory"
write-verbose "  Build Definition: $BuildDefinition"
write-verbose "  Build Number: $BuildNumber"
write-verbose "  Test Environment: $TestEnvironment"
write-verbose "  Collection: $Collection"
write-verbose "  Team project: $TeamProject"
write-verbose "  Plan ID: $PlanId"
write-verbose "  Suite ID: $SuiteId"
write-verbose "  Configuration ID: $ConfigId"
write-verbose "  Title: $Title"
write-verbose "  Settings Name: $SettingsName"
write-verbose "  Inconclusive result fails tests: $InconclusiveFailsTests"
write-verbose "  Remove /include parameter from /create command: $RemoveIncludeParameter"
write-verbose "  Test run wait delay: $TestRunWaitDelay"

##################################################################################
# Define globally used variables and constants.
# Visual Studio 2013
$vscommtools = [System.Environment]::GetEnvironmentVariable("VS120COMNTOOLS")
if ($vscommtools -eq $null)
{
    # Visual Studio 2012
    $vscommtools = [System.Environment]::GetEnvironmentVariable("VS110COMNTOOLS")
}
if ($vscommtools -eq $null)
{
    # Visual Studio 2010
    $vscommtools = [System.Environment]::GetEnvironmentVariable("VS100COMNTOOLS")
    if ($vscommtools -ne $null)
    {
        if ([string]::IsNullOrEmpty($BuildDirectory))
        {
            $(throw "The build directory must be provided.")
        }
        if (![string]::IsNullOrEmpty($BuildDefinition) -or ![string]::IsNullOrEmpty($BuildNumber))
        {
            $(throw "The build definition and build number parameters may be used only under Visual Studio 2012/2013.")
        }
    }
}
else
{
    if ([string]::IsNullOrEmpty($BuildDefinition) -and [string]::IsNullOrEmpty($BuildNumber) -and [string]::IsNullOrEmpty($BuildDirectory))
    {
        $(throw "You must specify the build directory or the build definition and build number.")
    }
}
$tcmExe = [System.IO.Path]::GetFullPath($vscommtools + "..\IDE\TCM.exe")

##################################################################################
# Ensure TCM.EXE is available in the assumed path.
if ([System.IO.File]::Exists($tcmExe))
{
    ##################################################################################
    # Prepare optional parameters.
    $testEnvironmentParameter = "/testenvironment:$TestEnvironment"
    if ([string]::IsNullOrEmpty($TestEnvironment))
    {
        $testEnvironmentParameter = [string]::Empty
    }

    if ([string]::IsNullOrEmpty($BuildDirectory))
    {
        $buildDirectoryParameter = [string]::Empty
    }
    else
    {
        # make sure we remove any trailing slashes as they cause permission issues
        $BuildDirectory = $BuildDirectory.Trim()
        while ($BuildDirectory.EndsWith("\"))
        {
            $BuildDirectory = $BuildDirectory.Substring(0, $BuildDirectory.Length - 1)
        }
        $buildDirectoryParameter = "/builddir:""$BuildDirectory"""
    }

    $buildDefinitionParameter = "/builddefinition:""$BuildDefinition"""
    if ([string]::IsNullOrEmpty($BuildDefinition))
    {
        $buildDefinitionParameter = [string]::Empty
    }

    $buildNumberParameter = "/build:""$BuildNumber"""
    if ([string]::IsNullOrEmpty($BuildNumber))
    {
        $buildNumberParameter = [string]::Empty
    }

    $includeParameter = '/include'
    if ($RemoveIncludeParameter)
    {
        $includeParameter = [string]::Empty
    }

    $settingsNameParameter = "/settingsname:""$SettingsName"""
    if ([string]::IsNullOrEmpty($SettingsName))
    {
        $settingsNameParameter = [string]::Empty
    }

    ##################################################################################
    # Create the test run.
    write-verbose "`nCreating test run ..."
    $testRunId = & "$tcmExe" run /create /title:"$Title" /login:$LoginCreds /planid:$PlanId /suiteid:$SuiteId /configid:$ConfigId /collection:"$Collection" /teamproject:"$TeamProject" $testEnvironmentParameter $buildDirectoryParameter $buildDefinitionParameter $buildNumberParameter $settingsNameParameter $includeParameter
    if ($testRunId -match '.+\:\s(?<TestRunId>\d+)\.')
    {
        # The test run ID is identified as a property in the match collection
        # so we can access it directly by using the group name from the regular
        # expression (i.e. TestRunId).
        $testRunId = $matches.TestRunId

        write-verbose "Waiting for test run $testRunId to complete ..."
        $waitingForTestRunCompletion = $true
        while ($waitingForTestRunCompletion)
        {
            Start-Sleep -s $TestRunWaitDelay
            $testRunStatus = & "$tcmExe" run /list /collection:"$collection" /login:$LoginCreds /teamproject:"$TeamProject" /querytext:"SELECT * FROM TestRun WHERE TestRunId=$testRunId"
            if ($testRunStatus.Count -lt 3 -or ($testRunStatus.Count -gt 2 -and $testRunStatus.GetValue(2) -match '.+(?<DateCompleted>\d+[/]\d+[/]\d+)'))
            {
                $waitingForTestRunCompletion = $false
            }
        }

        write-verbose "Evaluating test run $testRunId results..."
        # We do a small pause since the results might not be published yet.
        Start-Sleep -s $TestRunWaitDelay

        $testRunResultsTrxFileName = "TestRunResults$testRunId.trx"
        & "$tcmExe" run /export /id:$testRunId /collection:"$collection" /login:$LoginCreds /teamproject:"$TeamProject" /resultsfile:"$testRunResultsTrxFileName" | Out-Null
        if (Test-Path($testRunResultsTrxFileName))
        {
            # Load the XML document contents.
            [xml]$testResultsXml = Get-Content "$testRunResultsTrxFileName"

            # Extract the results of the test run.
            $total = $testResultsXml.TestRun.ResultSummary.Counters.total
            $passed = $testResultsXml.TestRun.ResultSummary.Counters.passed
            $failed = $testResultsXml.TestRun.ResultSummary.Counters.failed
            $inconclusive = $testResultsXml.TestRun.ResultSummary.Counters.inconclusive

            # Output the results of the test run.
            write-verbose "`n========== Test: $total tests ran, $passed succeeded, $failed failed, $inconclusive inconclusive =========="

            # Determine if there were any failed tests during the test run execution.
            if ($failed -eq 0 -and (-not $InconclusiveFailsTests -or $inconclusive -eq 0))
            {
                # Update this script's exit code.
                $exitCode = 0
            }

            # Remove the test run results file.
            remove-item($testRunResultsTrxFileName) | Out-Null
        }
        else
        {
            write-error "`nERROR: Unable to export test run results file for analysis."
        }
    }
}
else
{
    write-error "`nERROR: Unable to locate $tcmExe"
}

##################################################################################
# Indicate the resulting exit code to the calling process.
if ($exitCode -gt 0)
{
    write-error "`nERROR: Operation failed with error code $exitCode."
}
write-verbose "`nDone."
exit $exitCode
```

Once this script is placed into source control in such a way that it ends up in the drops location for the build, you can call it as a standard script item in your pipeline, targeting the VM that has TCM installed. Remember, you get the test environment name and the various IDs required from MTM; check the TCM command line documentation for more details.
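If you are unsure of the IDs, TCM itself can list them. The following is a sketch; the collection URL, team project name and plan ID shown are placeholders you would replace with your own values:

```powershell
# List the test plans, suites and configurations so you can find the
# /planid, /suiteid and /configid values the script needs.
& "$tcmExe" plans /list /collection:"http://tfsserver:8080/tfs/DefaultCollection" /teamproject:"MyProject"
& "$tcmExe" suites /list /planid:42 /collection:"http://tfsserver:8080/tfs/DefaultCollection" /teamproject:"MyProject"
& "$tcmExe" configs /list /collection:"http://tfsserver:8080/tfs/DefaultCollection" /teamproject:"MyProject"
```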

However, we hit a problem: RM sets PowerShell variables, not script parameters. So I find it easiest to use a wrapper script, also stored in source control, that converts the variables into the required parameters. This also gives us the opportunity to use RM-set runtime variables to build more complex objects, such as the credentials.

```powershell
# Output execution parameters.
$VerbosePreference = 'Continue' # equivalent to -verbose

$folder = Split-Path -Parent $MyInvocation.MyCommand.Definition
write-verbose "Running $folder\TcmExecWithLogin.ps1"

& "$folder\TcmExecWithLogin.ps1" -Collection $Collection -Teamproject $Teamproject -PlanId $PlanId -SuiteId $SuiteId -ConfigId $ConfigId -BuildDirectory $PackageLocation -TestEnvironment $TestEnvironment -LoginCreds "$TestUserUid,$TestUserPwd" -SettingsName $SettingsName
```

## Step 5 – Run it all

If you have everything in place you should now be able to trigger your deployment and have the tests run.

## Finishing Up and One final gotcha

I had hoped that my integration test run would be associated with my build. Normally when triggering tests via TCM you do this by adding the following parameters to the TCM command line:

```
TCM [all the other params] -BuildNumber 'My.Build.CI_1.7.25.29773' -BuildDefinition 'My.Build.CI'
```

However this will not work in the scenario above. This is because you can only use these flags to associate a run with a successful build, and at the time TCM is run in the pipeline the build has not finished, so it is not yet marked as successful. This does somewhat limit the end-to-end reporting. However, I think for now I can accept this limitation, as the deployment completing is a suitable marker that the tests passed.

The only workaround I can think of is not to trigger the release directly from the build, but to use the TFS events system to let the build finish first and then trigger the release. You could use my TFS DSL Alert processor for that.

If you are using basic PowerShell scripts, as opposed to DSC, with Release Management there are a few gotchas I have found.

## You cannot pass parameters

Let's look at a sample script that we would like to run via Release Management:

```powershell
param
(
    $param1
)

write-verbose -verbose "Start"
write-verbose -verbose "Got var1 [$var1]"
write-verbose -verbose "Got param1 [$param1]"
write-verbose -verbose "End"
```

In Release Manager we have the following vNext workflow

You can see we are setting two custom values which we intend to use within our script, one is a script parameter (Param1), the other one is just a global variable (Var1).

If we do a deployment we get the log

```
Copying recursively from \\store\drops\rm\4583e318-abb2-4f21-9289-9cb0264a3542\152 to C:\Windows\DtlDownloads\ISS vNext Drops succeeded.
Start
Got var1 [XXXvar1]
Got param1 []
End
```

You can see the problem: $var1 is set, but $param1 is not. It took me a while to get my head around this. The problem is that the RM activity's PSScriptPath is just that, a script path, not a command line that will be executed. Unlike the PowerShell activities in the vNext build tools, you don't get a pair of settings, one for the path to the script and another for the arguments. Here we have no way to set the command line arguments.

Note: The PSConfigurationPath is just for DSC configurations as discussed elsewhere.

So in effect Param1 is not set, because we never actually called

```powershell
test.ps1 -param1 "some value"
```

This means there is no point using parameters in a script you wish to use with RM vNext. But wait, I bet you are thinking 'I want to run my script externally to Release Management to test it, and using parameters with validation rules is best practice, I don't want to lose that advantage'.
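As an illustration of what you would be giving up, here is a sketch of the sample script's parameter with validation rules attached (the parameter name is taken from the sample above; the validation attributes are standard PowerShell):

```powershell
param
(
    # Validation like this only runs when the value arrives as a real
    # parameter, which is exactly what the RM activity cannot supply.
    [Parameter(Mandatory = $true)]
    [ValidateNotNullOrEmpty()]
    [string]$param1
)
```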

The best workaround I have found is to use a wrapper script that takes the variables and passes them on as parameters, something like this:

```powershell
$folder = Split-Path -Parent $MyInvocation.MyCommand.Definition
& $folder\test.ps1 -param1 $param1
```

Another gotcha: note that I need to find the path the wrapper script is running in and use it to build the path to my actual script. If I don't do this I get an error that the test.ps1 script can't be found.

After altering my pipeline to use the wrapper and rerunning the deployment I get the log file I wanted

```
Copying recursively from \\store\drops\rm\4583e318-abb2-4f21-9289-9cb0264a3542\160 to C:\Windows\DtlDownloads\ISS vNext Drops succeeded.
Start
Got var1 [XXXvar1]
Got param1 [XXXparam1]
End
```

This is all a bit ugly, but works.

Looking forward, this appears not to be too much of an issue. The next version of Release Management, as shown at Build, is based around the vNext TFS build tooling, which seems to always allow you to pass true PowerShell command line arguments. So this problem should go away in the not too distant future.

## Don’t write to the console

The other big problem is any script that writes to or reads from the console. Usually this means a write-host call in a script, which causes an error along the lines of:

```
A command that prompts the user failed because the host program or the command type does not support user interaction. Try a host program that supports user interaction, such as the Windows PowerShell Console or Windows PowerShell ISE, and remove prompt-related commands from command types that do not support user interaction, such as Windows PowerShell workflows.
At C:\Windows\DtlDownloads\ISS vNext Drops\scripts\test.ps1:7 char:1
+ Write-Host "hello 1" -ForegroundColor red
```

But also watch out for any cls calls; that one has caught me out. I have found it can be hard to track down the offending lines, especially if there are PowerShell modules loading other modules.
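A quick way to hunt for the offending lines is to search your scripts and modules for console-bound commands. This is a sketch, assuming everything sits under the current folder:

```powershell
# Find Write-Host, Clear-Host and cls calls that will fail under the
# RM deployment host, reporting file, line number and line content.
Get-ChildItem -Recurse -Include *.ps1, *.psm1 |
    Select-String -Pattern 'Write-Host|Clear-Host|\bcls\b' |
    Select-Object Path, LineNumber, Line
```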

The best recommendation is to just use write-verbose and write-error:

• write-error if your script has errored. This will let RM know the script has failed, thus failing the deployment – just what we want
• write-verbose for any logging

Any other form of PowerShell output will not be passed to RM, be warned!

You might also notice in my sample script that I am passing the -verbose argument to the write-verbose command; again, you have to have this maximal level of logging on for the messages to make it out to the RM logs. Probably a better solution, if you think you might vary the level of logging, is to change the script to set $VerbosePreference:

```powershell
param
(
    $param1
)

$VerbosePreference = 'Continue' # equivalent to -verbose

write-verbose "Start"
write-verbose "Got var1 [$var1]"
write-verbose "Got param1 [$param1]"
write-verbose "End"
```

So hopefully that gives you a few pointers to make your deployments a bit smoother.

With the release of Visual Studio 2015 there are some significant changes to Visual Studio and TFS licensing; you can find the details on Brian Harry's blog. These changes can make a serious difference to what you need to purchase for different roles, so it could well be worth a look.