But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

An alternative to setting a build quality on a TFS vNext build

TFS vNext builds do not have a concept of build quality, unlike the old XAML based builds. This is an issue for us as we used the changing of the build quality as a signal to test a build, or to mark it as released to a client (this was all managed with my TFS Alerts DSL to make sure suitable emails and build retention were used).

So how to get around this problem with vNext?

I have used Tags on builds, set using the same REST API style calls as detailed in my post on Release Management vNext templates. I also use the REST API to set the retention on the build, so I actually no longer need to manage this via the alerts DSL.

The following script, if used to wrap the calling of integration tests via TCM, sets the tags and retention on a build:


function Get-BuildDetailsByNumber
{
    param
    (
        $tfsUri ,
        $buildNumber,
        $username,
        $password

    )

    $uri = "$($tfsUri)/_apis/build/builds?api-version=2.0&buildnumber=$buildNumber"

    $wc = New-Object System.Net.WebClient
    if ($username -eq $null)
    {
        $wc.UseDefaultCredentials = $true
    } else
    {
        $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
    }
    write-verbose "Getting ID of $buildNumber from $tfsUri "

    $jsondata = $wc.DownloadString($uri) | ConvertFrom-Json
    $jsondata.value[0]
 
}

function Set-BuildTag
{
    param
    (
        $tfsUri ,
        $buildID,
        $tag,
        $username,
        $password

    )

 
    $wc = New-Object System.Net.WebClient
    $wc.Headers["Content-Type"] = "application/json"
    if ($username -eq $null)
    {
        $wc.UseDefaultCredentials = $true
    } else
    {
        $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
    }
   
    write-verbose "Setting BuildID $buildID with Tag $tag via $tfsUri "

    $uri = "$($tfsUri)/_apis/build/builds/$($buildID)/tags/$($tag)?api-version=2.0"

    $data = @{value = $tag } | ConvertTo-Json

    $wc.UploadString($uri,"PUT", $data)
   
}

function Set-BuildRetension
{
    param
    (
        $tfsUri ,
        $buildID,
        $keepForever,
        $username,
        $password

    )

 
    $wc = New-Object System.Net.WebClient
    $wc.Headers["Content-Type"] = "application/json"
    if ($username -eq $null)
    {
        $wc.UseDefaultCredentials = $true
    } else
    {
        $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
    }
   
    write-verbose "Setting BuildID $buildID with retention set to $keepForever via $tfsUri "

    $uri = "$($tfsUri)/_apis/build/builds/$($buildID)?api-version=2.0"
    $data = @{keepForever = $keepForever} | ConvertTo-Json
    $response = $wc.UploadString($uri,"PATCH", $data)
   
}


# Output execution parameters.
$VerbosePreference ='Continue' # equiv to -verbose

$ErrorActionPreference = 'Continue' # this controls whether a test failure causes the script to stop

 

$folder = Split-Path -Parent $MyInvocation.MyCommand.Definition

write-verbose "Running $folder\TcmExec.ps1"

 

& "$folder\TcmExec.ps1" -Collection $Collection -Teamproject $Teamproject -PlanId $PlanId  -SuiteId $SuiteId -ConfigId $ConfigId -BuildDirectory $PackageLocation -TestEnvironment $TestEnvironment -SettingsName $SettingsName
write-verbose "TCM exited with code '$LASTEXITCODE'"
$newquality = "Test Passed"
$tag = "Deployed to Lab"
$keep = $true
if ($LASTEXITCODE -gt 0 )
{
    $newquality = "Test Failed"
    $tag = "Lab Deployed failed"
    $keep = $false
}
write-verbose "Setting build tag to '$tag' for build $BuildNumber"


$url = "$Collection/$Teamproject"
$jsondata = Get-BuildDetailsByNumber -tfsUri $url -buildNumber $BuildNumber #-username $TestUserUid -password $TestUserPwd
$buildId = $jsondata.id
write-verbose "The build $BuildNumber has ID of $buildId"
 
write-verbose "The build tag set to '$tag' and retention set to '$keep'"
Set-BuildTag -tfsUri $url  -buildID $buildId -tag $tag #-username $TestUserUid -password $TestUserPwd
Set-BuildRetension -tfsUri $url  -buildID $buildId  -keepForever $keep #-username $TestUserUid -password $TestUserPwd

# now fail the stage after we have sorted the logging
if ($LASTEXITCODE -gt 0 )
{
    Write-Error "Tests have failed"
}

If all the tests pass we see the tag being added and the retention being set; if they fail, just a tag should be set.

image


Cannot create an MSDeploy package for an Azure Web Job project as part of an automated build

I like web deploy as a means to package up websites for deployment. I like the way I only need to add

/p:DeployOnBuild=True;PublishProfile=Release

as an MSBuild argument to get the package produced as part of an automated build. This opens up loads of deployment options.

I recently hit an issue packaging up a solution that contained an Azure WebSite and an Azure Web Job (to be hosted in the web site). It is easy to add the web job so that it is included in the Web Deploy package. Once this was done we could deploy from Visual Studio, or package to the local file system and see the web job EXE in the app_data\jobs folder as expected.

The problems occurred when we tried to get TFS build to create the deployment package using the arguments shown above. I got the error

The value for PublishProfile is set to 'Release', expected to find the file at 'C:\vNextBuild\_work\4253ff91\BM\Src\MyWebJob\Properties\PublishProfiles\Release.pubxml' but it could not be found.

The issue is that there is a Publish target for the web jobs project type, but when run from Visual Studio it actually creates a ClickOnce package, and the wizard provides no means to create an MSDeploy style package.

MSBuild is getting confused as it expects there to be this MSDeploy style package definition for the web job projects, even though it won’t actually use it as the Web Job EXE will be copied into the web site deployment package.

The solution was to add a dummy PublishProfiles\Release.pubxml file into the properties folder of the web jobs project.

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <WebPublishMethod>Package</WebPublishMethod>
    <LastUsedBuildConfiguration>Release</LastUsedBuildConfiguration>
    <LastUsedPlatform>Any CPU</LastUsedPlatform>
    <SiteUrlToLaunchAfterPublish />
    <LaunchSiteAfterPublish>True</LaunchSiteAfterPublish>
    <ExcludeApp_Data>False</ExcludeApp_Data>
    <DesktopBuildPackageLocation />
    <PackageAsSingleFile>true</PackageAsSingleFile>
    <DeployIisAppPath />
    <PublishDatabaseSettings />
  </PropertyGroup>
</Project>

Note: I had to add this file to source control via the TFS Source Code Explorer as Visual Studio does not allow you to add folders/files manually under the properties folder.

Once this file was added my automated build worked OK, and I got my web site package including the web job.

Using Release Management vNext templates when you don’t want to use DSC scripts – A better script

A couple of months ago I wrote a post on using PowerShell scripts to deploy web sites in Release Management vNext templates as opposed to DSC. In that post I provided a script to help with the translation of Release Management configuration variables to entries in the MSDeploy SetParameters.xml file for web sites.

The code I provided in that post required you to hard code the variables to translate. This quickly became a maintenance problem. However, there is a simple solution.

If we use a naming convention for our RM configuration variables that map to web.config entries (I chose __NAME__ to be consistent to the old RM Agent based deployment standards) we can let PowerShell do the work.
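To illustrate the convention (the file and parameter names here are hypothetical, not from a real project), a web site package's SetParameters.xml might contain the __NAME__ token as a value:

```xml
<!-- Hypothetical MyWebSite.SetParameters.xml as produced by the package step -->
<parameters>
  <setParameter name="IIS Web Application Name" value="Default Web Site/MyWebSite" />
  <setParameter name="DataConnection-Web.config Connection String" value="__DATACONNECTION__" />
</parameters>
```

An RM configuration variable named __DATACONNECTION__ then carries the real connection string, and the token in the file is swapped for the variable's value at deployment time.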

So the revised script is

$VerbosePreference ='Continue' # equiv to -verbose

function Update-ParametersFile
{
    param
    (
        $paramFilePath,
        $paramsToReplace
    )

    write-verbose "Updating parameters file '$paramFilePath'" -verbose
    $content = get-content $paramFilePath
    $paramsToReplace.GetEnumerator() | % {
        Write-Verbose "Replacing value for key '$($_.Name)'" -Verbose
        $content = $content.Replace($_.Name, $_.Value)
    }
    set-content -Path $paramFilePath -Value $content

}

# the script folder
$folder = Split-Path -parent $MyInvocation.MyCommand.Definition
write-verbose "Deploying Website '$package' using script in '$folder'"

# work out the variables to replace using a naming convention
$parameters = Get-Variable -include "__*__"
write-verbose "Discovered replacement parameters that match the convention '__*__': $($parameters | Out-string)"
Update-ParametersFile -paramFilePath "$ApplicationPath\$packagePath\$package.SetParameters.xml" -paramsToReplace $parameters

write-verbose "Calling '$ApplicationPath\$packagePath\$package.deploy.cmd'"
& "$ApplicationPath\$packagePath\$package.deploy.cmd" /Y  /m:"$PublishUrl" -allowUntrusted /u:"$PublishUser" /p:"$PublishPassword" /a:Basic | Write-Verbose

Note: This script allows deployment to a remote IIS server, so it is useful for Azure Web Sites. If you are running it locally on an IIS server, just trim everything after the /Y on the last line.
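For reference, the trimmed local variant of that last line would be just:

```powershell
# Local IIS deployment: no remote publish URL or credentials needed
& "$ApplicationPath\$packagePath\$package.deploy.cmd" /Y | Write-Verbose
```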

So now I provide

  • $PackagePath – path to our deployment on the deployment VM (relative to the $ApplicationPath local working folder)
  • $Package – name of the MSdeploy package
  • The publish settings you can get from the Azure Portal
  • $__PARAM1__ –  a value to swap in the web.config
  • $__PARAM2__ –  another value to swap in the web.config

In RM it will look like this.

image

So now you can use a single script for all your web deployments.

TF30063 Errors accessing a TFS 2015 server via the C# API after upgrade from 2013

Background

We upgraded our production TFS 2013.4 server to TFS 2015 RTM this week. As opposed to an in-place upgrade we chose to make a few changes along the way; so whilst leaving our DBs on our SQL 2012 cluster:

  • We moved to a new VM for our AT (to upgrade from Windows 2008R2 to 2012R2)
  • Split the SSRS instance off the AT to a separate VM with a new SSAS server (again to move to 2012R2 and to ease management, getting all the reporting bits in one place)

But we did not touch:

  • Our XAML Build systems, leaving them at 2013 as we intend to migrate to vNext build ASAP
  • Our Test Controller/Release Management/Lab Environment, leaving it at 2013 for now, as we have other projects on the go to update the hardware/cloud solutions underpinning these.

All went well with no surprises; the running of the upgrade tool took about 1 hour.

The Problem

The only problem we have had was with my TFS Alerts DSL Processor, which listens for TFS alerts and runs custom scripts. I host this on the TFS AT, and I would expect it to set build retention and send emails when a TFS XAML build quality changes. This did not occur; in the Windows event log I was seeing

2015-08-12 21:04:02.4195 ERROR TFSEventsProcessor.DslScriptService: TF30063: You are not authorized to access https://tfs.blackmarble.co.uk/tfs/DefaultCollection.

After much fiddling, including writing a small command line test client, I confirmed that the issue was specific to the production server. The tool ran fine on other PCs, but on the live server a Windows authentication dialog was shown which would not accept any valid credentials.

It was not, as I had feared, a change in the TFS API; in fact there is no reason my 2012 or 2013 API targeted version of the TFS Alerts DSL should not be able to talk to a TFS 2015 server, as long as the correct version of the TFS API is installed on the machine hosting the DSL.

The Solution

The issue was due to Windows loopback protection. This had been disabled on our old TFS AT, but not on the new one. As we wanted to avoid changing the global loopback protection setting, we set the following via Regedit to allow it for a single CName:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0
    ValueName - BackConnectionHostNames
    Type - multistring
    Data  - tfs.blackmarble.co.uk
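If you prefer to script the change rather than use Regedit, something like this (run elevated, with the same CName) should be equivalent; this is a sketch rather than the exact commands we ran:

```powershell
# Add the CName to the loopback protection exception list as a multistring value
New-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0' `
    -Name 'BackConnectionHostNames' -PropertyType MultiString `
    -Value @('tfs.blackmarble.co.uk')
```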

Once this was done (and without a reboot) my alerts processing worked without any problems.

Running Microsoft Test Manager Test Suites as part of a vNext Release pipeline - Part 2

In my last post I discussed how you could wire TCM tests into a Release Management vNext pipeline. The problem with the script I provided, as I noted, was that the deployment was triggered synchronously by the build, i.e. the build/release process was:

  1. TFS Build
    1. Gets the source
    2. Compiles the code
    3. Runs the unit tests
    4. Triggers the RM pipeline
    5. Waits while the RM pipeline completes
  2. RM then
    1. Deploys the code
    2. Runs the integration tests
  3. When RM completes, the TFS build completes

This process raised a couple of problems

  • You cannot associate the integration tests with the build, as TCM only allows association with completed successful builds. When TCM finishes in this model the build is still in progress.
  • You have to target only the first automated stage of the pipeline, else the build will be held as ‘in progress’ until all the release stages have completed, which may be days if there are manual approvals involved.

The script InitiateReleaseFromBuild

These problems can all be fixed by altering the PowerShell script that triggers the RM pipeline so that it does not wait for the deployment to complete, meaning the TFS build completes as soon as possible.

This is done by passing in an extra parameter which is set in TFS build

param(
    [string]$rmserver = $Args[0],
    [string]$port = $Args[1], 
    [string]$teamProject = $Args[2],  
    [string]$targetStageName = $Args[3],
    [string]$waitForCompletion = $Args[4]
)

cls
$teamFoundationServerUrl = $env:TF_BUILD_COLLECTIONURI
$buildDefinition = $env:TF_BUILD_BUILDDEFINITIONNAME
$buildNumber = $env:TF_BUILD_BUILDNUMBER


"Executing with the following parameters:`n"
"  RMserver Name: $rmserver"
"  Port number: $port"
"  Team Foundation Server URL: $teamFoundationServerUrl"
"  Team Project: $teamProject"
"  Build Definition: $buildDefinition"
"  Build Number: $buildNumber"
"  Target Stage Name: $targetStageName`n"
"  Wait for RM completion: $waitForCompletion`n"

$wait = [System.Convert]::ToBoolean($waitForCompletion)
$exitCode = 0

trap
{
  $e = $error[0].Exception
  $e.Message
  $e.StackTrace
  if ($exitCode -eq 0) { $exitCode = 1 }
}

$scriptName = $MyInvocation.MyCommand.Name
$scriptPath = Split-Path -Parent (Get-Variable MyInvocation -Scope Script).Value.MyCommand.Path

Push-Location $scriptPath   

$server = [System.Uri]::EscapeDataString($teamFoundationServerUrl)
$project = [System.Uri]::EscapeDataString($teamProject)
$definition = [System.Uri]::EscapeDataString($buildDefinition)
$build = [System.Uri]::EscapeDataString($buildNumber)
$targetStage = [System.Uri]::EscapeDataString($targetStageName)

$serverName = $rmserver + ":" + $port
$orchestratorService = "http://$serverName/account/releaseManagementService/_apis/releaseManagement/OrchestratorService"

$status = @{
    "2" = "InProgress";
    "3" = "Released";
    "4" = "Stopped";
    "5" = "Rejected";
    "6" = "Abandoned";
}

$uri = "$orchestratorService/InitiateReleaseFromBuild?teamFoundationServerUrl=$server&teamProject=$project&buildDefinition=$definition&buildNumber=$build&targetStageName=$targetStage"
"Executing the following API call:`n`n$uri"

$wc = New-Object System.Net.WebClient
$wc.UseDefaultCredentials = $true
# rmuser should be part of the RM users list and should have permission to trigger the release.

#$wc.Credentials = new-object System.Net.NetworkCredential("rmuser", "rmuserpassword", "rmuserdomain")

try
{
    $releaseId = $wc.DownloadString($uri)

    $url = "$orchestratorService/ReleaseStatus?releaseId=$releaseId"

    $releaseStatus = $wc.DownloadString($url)


    if ($wait -eq $true)
    {
        Write-Host -NoNewline "`nReleasing ..."

        while($status[$releaseStatus] -eq "InProgress")
        {
            Start-Sleep -s 5
            $releaseStatus = $wc.DownloadString($url)
            Write-Host -NoNewline "."
        }

        " done.`n`nRelease completed with {0} status." -f $status[$releaseStatus]
    } else {

        Write-Host -NoNewline "`nTriggering Release and exiting"
    }

}
catch [System.Exception]
{
    if ($exitCode -eq 0) { $exitCode = 1 }
    Write-Host "`n$_`n" -ForegroundColor Red
}

if ($exitCode -eq 0)
{
    if ($wait -eq $true)
    {
        if ($releaseStatus -eq 3)
        {
          "`nThe script completed successfully. Product deployed without error`n"
        } else {
            Write-Host "`nThe script completed successfully. Product failed to deploy`n" -ForegroundColor Red
            $exitCode = -1 # reset the code to show the error
        }
    } else {
        "`nThe script completed successfully. Product deploying`n"
    }
}
else
{
  $err = "Exiting with error: " + $exitCode + "`n"
  Write-Host $err -ForegroundColor Red
}

Pop-Location

exit $exitCode

The Script TcmExecWrapper

A change is also required in the wrapper script I use to trigger the TCM test run. We need to check the exit code from the inner TCM PowerShell script and update the TFS build quality appropriately.

To do this I use the new REST API in TFS 2015, as it is far easier than using the older .NET client API: no DLLs to distribute.

It is worth noting that:

  • I pass the credentials into the script from RM that are used to talk to the TFS server. This is because I am running my tests in a network isolated TFS Lab Environment, which means I am in the wrong domain to see the TFS server without providing login details. If you are not working cross domain you could just use default credentials.
  • RM only passes the BuildNumber into the script e.g. MyBuild_1.2.3.4, but the REST API needs the build id to set the quality. Hence the need for the function Get-BuildDetailsByNumber to get the id from the name.

# Output execution parameters.
$VerbosePreference ='Continue' # equiv to -verbose
function Get-BuildDetailsByNumber
{
    param
    (
        $tfsUri ,
        $buildNumber,
        $username,
        $password
    )
    $uri = "$($tfsUri)/_apis/build/builds?api-version=2.0&buildnumber=$buildNumber"
    $wc = New-Object System.Net.WebClient
    #$wc.UseDefaultCredentials = $true
    $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
   
    write-verbose "Getting ID of $buildNumber from $tfsUri "
    $jsondata = $wc.DownloadString($uri) | ConvertFrom-Json
    $jsondata.value[0]
 
}
function Set-BuildQuality
{
    param
    (
        $tfsUri ,
        $buildID,
        $quality,
        $username,
        $password
    )
    $uri = "$($tfsUri)/_apis/build/builds/$($buildID)?api-version=1.0"
    $data = @{quality = $quality} | ConvertTo-Json
    $wc = New-Object System.Net.WebClient
    $wc.Headers["Content-Type"] = "application/json"
    #$wc.UseDefaultCredentials = $true
    $wc.Credentials = new-object System.Net.NetworkCredential($username, $password)
   
    write-verbose "Setting BuildID $buildID to quality $quality via $tfsUri "
    $wc.UploadString($uri,"PATCH", $data)
   
}
$folder = Split-Path -Parent $MyInvocation.MyCommand.Definition
write-verbose "Running $folder\TcmExecWithLogin.ps1"
& "$folder\TcmExecWithLogin.ps1" -Collection $Collection -Teamproject $Teamproject -PlanId $PlanId  -SuiteId $SuiteId -ConfigId $ConfigId -BuildDirectory $PackageLocation -TestEnvironment $TestEnvironment -LoginCreds "$TestUserUid,$TestUserPwd" -SettingsName $SettingsName -BuildNumber $BuildNumber -BuildDefinition $BuildDefinition
write-verbose "Got the exit code from the TCM run of $LASTEXITCODE"
$url = "$Collection/$Teamproject"
$jsondata = Get-BuildDetailsByNumber -tfsUri $url -buildNumber $BuildNumber -username $TestUserUid -password $TestUserPwd
$buildId = $jsondata.id
write-verbose "The build ID is $buildId"
$newquality = "Test Passed"
if ($LASTEXITCODE -gt 0 )
{
    $newquality = "Test Failed"
}
 
write-verbose "The build quality is $newquality"
Set-BuildQuality -tfsUri $url  -buildID $buildId -quality $newquality -username $TestUserUid -password $TestUserPwd

Note: TcmExecWithLogin.ps1 is the same as in my last post.

Summary

So with these changes the process is now

  1. TFS Build
    1. Gets the source
    2. Compiles the code
    3. Runs the unit tests
    4. Triggers the RM pipeline
    5. Build ends
  2. RM then
    1. Deploys the code
    2. Runs the integration tests
    3. When the tests complete, the TFS build quality is set

This means we can associate both unit and integration tests with a build, and target our release at any stage in the pipeline, pausing at the points where manual approval is required without blocking the initiating build.

Running Microsoft Test Manager Test Suites as part of a vNext Release pipeline

Also see Part 2 on how to address gotchas in this process.

When using Release Management there is a good chance you will want to run test suites as part of your automated deployment pipeline. If you are using a vNext PowerShell based pipeline you need a way to trigger the tests via PowerShell, as there is no out-of-the-box agent to do the job.

Step 1 - Install a Test Agent

The first step is to make sure that the Visual Studio Test Agent is installed on the box you wish to run the tests on. If you don’t already have an MTM environment in place with a test agent, this can be done by creating a standard environment in Microsoft Test Manager. Remember, you only need this environment to include the VM you want to run the tests on, unless you also want to gather logs and events from other machines in the system. The complexity is up to you.

In my case I was using a network isolated environment so all this was already set up.

Step 2 - Setup the Test Suite

Once you have an environment you can set up your test suite and test plan in MTM to include the tests you wish to run. These can be unit test style integration tests or Coded UI tests; it is up to you.

If you have a lot of unit tests to associate for automation, remember the TCM.EXE command can make your life a lot easier.
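For example, a bulk import of automated tests from a test assembly into test cases looks something like this (the collection URL, project and assembly names are placeholders):

```
tcm testcase /import /collection:http://myserver:8080/tfs/DefaultCollection /teamproject:MyProject /storage:MyIntegrationTests.dll
```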

This post does not aim to be a tutorial on setting up test plans; have a look at the ALM Rangers guides for more details.

Step 3 -  The Release Management environment

This is where it gets a bit confusing: you have already set up a Lab Management environment, but you still need to set up the Release Management vNext environment. As I was using a network isolated Lab Management environment this gets even more complex, but RM provides some tools to help.

Again, this is not a detailed tutorial. The key steps, if you are using network isolation, are:

  1. Make sure that PowerShell on the VM is set up for remote access by running winrm quickconfig
  2. In RM create a vNext environment
  3. Add each server, using its corporate LAN name from Lab Management with the PowerShell remote access port e.g. VSLM-1002-e7858e28-77cf-4163-b6ba-1df2e91bfcab.lab.blackmarble.co.uk:5985
  4. Make sure the server is set to use a shared UNC path for deployment.
  5. Remember you will log in to this VM with the credentials for the test domain.
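As a sanity check that the RM server can actually reach the VM over PowerShell remoting, a quick test (using the machine name format from step 3) is:

```powershell
# Should return WSMan identity information if remoting is reachable on port 5985
Test-WSMan -ComputerName 'VSLM-1002-e7858e28-77cf-4163-b6ba-1df2e91bfcab.lab.blackmarble.co.uk' -Port 5985
```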


image

By this point you might be a bit confused as to what you have, so here is a diagram

image

Step 4  - Wiring the test into the pipeline

The final step is to get the release pipeline to trigger the tests. This is done by calling the TCM.EXE command line to instruct the Test Controller to trigger the tests. Now, the copy of TCM does not have to be in the Lab Management environment, but it does need to be on a VM known to the RM vNext environment. This will usually mean a VM with Microsoft Test Manager or Visual Studio Premium (or Enterprise for 2015) installed. In my case this was a dedicated test VM within the environment.

The key to the process is to run a script similar to the one used by the older RM agent based system to trigger the tests. You can extract this PowerShell script from an old release pipeline, but for ease I show my modified version here. The key changes are that I pass in the login credentials required for the call to the TFS server from TCM.EXE to be made from inside the network isolated environment and do a little extra checking of the test results so I can fail the build if the tests fail. These edits might not be required if you trigger TCM from a VM that is in the same domain as your TFS server, or have different success criteria.

param
(
    [string]$BuildDirectory = $null,
    [string]$BuildDefinition = $null,
    [string]$BuildNumber = $null,
    [string]$TestEnvironment = $null,
    [string]$LoginCreds = $null,
    [string]$Collection = $(throw "The collection URL must be provided."),
    [string]$TeamProject = $(throw "The team project must be provided."),
    [Int]$PlanId = $(throw "The test plan ID must be provided."),
    [Int]$SuiteId = $(throw "The test suite ID must be provided."),
    [Int]$ConfigId = $(throw "The test configuration ID must be provided."),
    [string]$Title = 'Automated UI Tests',
    [string]$SettingsName = $null,
    [Switch]$InconclusiveFailsTests = $false,
    [Switch]$RemoveIncludeParameter = $false,
    [Int]$TestRunWaitDelay = 10
)

 

##################################################################################
# Output the logo.
write-verbose "Based on the Microsoft Release Management TcmExec PowerShell Script v12.0"
write-verbose "Copyright (c) 2013 Microsoft. All rights reserved.`n"


 

##################################################################################
# Initialize the default script exit code.
$exitCode = 1

##################################################################################
# Output execution parameters.
write-verbose "Executing with the following parameters:"
write-verbose "  Build Directory: $BuildDirectory"
write-verbose "  Build Definition: $BuildDefinition"
write-verbose "  Build Number: $BuildNumber"
write-verbose "  Test Environment: $TestEnvironment"
write-verbose "  Collection: $Collection"
write-verbose "  Team project: $TeamProject"
write-verbose "  Plan ID: $PlanId"
write-verbose "  Suite ID: $SuiteId"
write-verbose "  Configuration ID: $ConfigId"
write-verbose "  Title: $Title"
write-verbose "  Settings Name: $SettingsName"
write-verbose "  Inconclusive result fails tests: $InconclusiveFailsTests"
write-verbose "  Remove /include parameter from /create command: $RemoveIncludeParameter"
write-verbose "  Test run wait delay: $TestRunWaitDelay"

##################################################################################
# Define globally used variables and constants.
# Visual Studio 2013
$vscommtools = [System.Environment]::GetEnvironmentVariable("VS120COMNTOOLS")
if ($vscommtools -eq $null)
{
    # Visual Studio 2012
    $vscommtools = [System.Environment]::GetEnvironmentVariable("VS110COMNTOOLS")
}
if ($vscommtools -eq $null)
{
    # Visual Studio 2010
    $vscommtools = [System.Environment]::GetEnvironmentVariable("VS100COMNTOOLS")
    if ($vscommtools -ne $null)
    {
        if ([string]::IsNullOrEmpty($BuildDirectory))
        {
            $(throw "The build directory must be provided.")
        }
        if (![string]::IsNullOrEmpty($BuildDefinition) -or ![string]::IsNullOrEmpty($BuildNumber))
        {
            $(throw "The build definition and build number parameters may be used only under Visual Studio 2012/2013.")
        }
    }
}
else
{
    if ([string]::IsNullOrEmpty($BuildDefinition) -and [string]::IsNullOrEmpty($BuildNumber) -and [string]::IsNullOrEmpty($BuildDirectory))
    {
        $(throw "You must specify the build directory or the build definition and build number.")
    }
}
$tcmExe = [System.IO.Path]::GetFullPath($vscommtools + "..\IDE\TCM.exe")

##################################################################################
# Ensure TCM.EXE is available in the assumed path.
if ([System.IO.File]::Exists($tcmExe))
{
    ##################################################################################
    # Prepare optional parameters.
    $testEnvironmentParameter = "/testenvironment:$TestEnvironment"
    if ([string]::IsNullOrEmpty($TestEnvironment))
    {
        $testEnvironmentParameter = [string]::Empty
    }
    if ([string]::IsNullOrEmpty($BuildDirectory))
    {
        $buildDirectoryParameter = [string]::Empty
    } else
    {
        # make sure we remove any trailing slashes as they cause permission issues
        $BuildDirectory = $BuildDirectory.Trim()
        while ($BuildDirectory.EndsWith("\"))
        {
            $BuildDirectory = $BuildDirectory.Substring(0,$BuildDirectory.Length-1)
        }
        $buildDirectoryParameter = "/builddir:""$BuildDirectory"""
   
    }
    $buildDefinitionParameter = "/builddefinition:""$BuildDefinition"""
    if ([string]::IsNullOrEmpty($BuildDefinition))
    {
        $buildDefinitionParameter = [string]::Empty
    }
    $buildNumberParameter = "/build:""$BuildNumber"""
    if ([string]::IsNullOrEmpty($BuildNumber))
    {
        $buildNumberParameter = [string]::Empty
    }
    $includeParameter = '/include'
    if ($RemoveIncludeParameter)
    {
        $includeParameter = [string]::Empty
    }
    $settingsNameParameter = "/settingsname:""$SettingsName"""
    if ([string]::IsNullOrEmpty($SettingsName))
    {
        $settingsNameParameter = [string]::Empty
    }

    ##################################################################################
    # Create the test run.
    write-verbose "`nCreating test run ..."
    $testRunId = & "$tcmExe" run /create /title:"$Title" /login:$LoginCreds /planid:$PlanId /suiteid:$SuiteId /configid:$ConfigId /collection:"$Collection" /teamproject:"$TeamProject" $testEnvironmentParameter $buildDirectoryParameter $buildDefinitionParameter $buildNumberParameter $settingsNameParameter $includeParameter
    if ($testRunId -match '.+\:\s(?<TestRunId>\d+)\.')
    {
        # The test run ID is identified as a property in the match collection
        # so we can access it directly by using the group name from the regular
        # expression (i.e. TestRunId).
        $testRunId = $matches.TestRunId

        write-verbose "Waiting for test run $testRunId to complete ..."
        $waitingForTestRunCompletion = $true
        while ($waitingForTestRunCompletion)
        {
            Start-Sleep -s $TestRunWaitDelay
            $testRunStatus = & "$tcmExe" run /list  /collection:"$collection" /login:$LoginCreds /teamproject:"$TeamProject" /querytext:"SELECT * FROM TestRun WHERE TestRunId=$testRunId"
            if ($testRunStatus.Count -lt 3 -or ($testRunStatus.Count -gt 2 -and $testRunStatus.GetValue(2) -match '.+(?<DateCompleted>\d+[/]\d+[/]\d+)'))
            {
                $waitingForTestRunCompletion = $false
            }
        }

        write-verbose "Evaluating test run $testRunId results..."
        # We do a small pause since the results might not be published yet.
        Start-Sleep -s $TestRunWaitDelay

        $testRunResultsTrxFileName = "TestRunResults$testRunId.trx"
        & "$tcmExe" run /export /id:$testRunId  /collection:"$collection" /login:$LoginCreds /teamproject:"$TeamProject" /resultsfile:"$testRunResultsTrxFileName" | Out-Null
        if (Test-path($testRunResultsTrxFileName))
        {
            # Load the XML document contents.
            [xml]$testResultsXml = Get-Content "$testRunResultsTrxFileName"
           
            # Extract the results of the test run.
            $total = $testResultsXml.TestRun.ResultSummary.Counters.total
            $passed = $testResultsXml.TestRun.ResultSummary.Counters.passed
            $failed = $testResultsXml.TestRun.ResultSummary.Counters.failed
            $inconclusive = $testResultsXml.TestRun.ResultSummary.Counters.inconclusive

            # Output the results of the test run.
            write-verbose "`n========== Test: $total tests ran, $passed succeeded, $failed failed, $inconclusive inconclusive =========="

            # Determine if there were any failed tests during the test run execution.
            if ($failed -eq 0 -and (-not $InconclusiveFailsTests -or $inconclusive -eq 0))
            {
                # Update this script's exit code.
                $exitCode = 0
            }

            # Remove the test run results file.
            remove-item($testRunResultsTrxFileName) | Out-Null
        }
        else
        {
            write-error "`nERROR: Unable to export test run results file for analysis."
        }
    }
}
else
{
    write-error "`nERROR: Unable to locate $tcmExe"
}

##################################################################################
# Indicate the resulting exit code to the calling process.
if ($exitCode -gt 0)
{
    write-error "`nERROR: Operation failed with error code $exitCode."
}
write-verbose "`nDone."
exit $exitCode

 

Once this script is placed in source control, in such a way that it ends up in the drops location for the build, you can call it as a standard script item in your pipeline, targeting the VM that has TCM installed. Remember, you get the test environment name and the various IDs required from MTM; check the TCM command-line help for more details.
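For example, a script step might invoke it along these lines. This is a sketch only — the collection URL, drop path, IDs and environment name are all placeholders; the real plan, suite and configuration IDs come from MTM:

```powershell
# All values below are hypothetical - substitute your own collection URL,
# plan/suite/config IDs (taken from MTM) and lab environment name.
& "\\server\drops\My.Build.CI\My.Build.CI_1.7.25\Scripts\TcmExecWithLogin.ps1" `
    -Collection "http://tfsserver:8080/tfs/DefaultCollection" `
    -TeamProject "MyProject" `
    -PlanId 42 `
    -SuiteId 99 `
    -ConfigId 7 `
    -BuildDirectory "\\server\drops\My.Build.CI\My.Build.CI_1.7.25" `
    -TestEnvironment "Integration Lab" `
    -SettingsName "Integration Test Settings" `
    -LoginCreds "mydomain\testuser,P@ssw0rd"
```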


 

However, we hit a problem: RM sets PowerShell variables, not script parameters. So I find it easiest to use a wrapper script, also stored in source control, that converts the variables to the required parameters. This also gives the opportunity to use RM-set runtime variables and to build more complex objects, such as the credentials.

 

# Enable verbose output (equivalent to -Verbose).
$VerbosePreference = 'Continue'
$folder = Split-Path -Parent $MyInvocation.MyCommand.Definition

write-verbose "Running $folder\TcmExecWithLogin.ps1"

& "$folder\TcmExecWithLogin.ps1" -Collection $Collection -Teamproject $Teamproject -PlanId $PlanId  -SuiteId $SuiteId -ConfigId $ConfigId -BuildDirectory $PackageLocation -TestEnvironment $TestEnvironment -LoginCreds "$TestUserUid,$TestUserPwd" -SettingsName $SettingsName
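Since the wrapper relies on RM having populated all of these variables, a small guard at the top can fail fast with a clear message if one is missing. A sketch (the variable names match those used in the wrapper above):

```powershell
# Fail fast if any of the RM-provided configuration variables are missing.
$required = 'Collection', 'Teamproject', 'PlanId', 'SuiteId', 'ConfigId',
            'PackageLocation', 'TestEnvironment', 'TestUserUid', 'TestUserPwd',
            'SettingsName'
foreach ($name in $required)
{
    $value = Get-Variable -Name $name -ValueOnly -ErrorAction SilentlyContinue
    if ([string]::IsNullOrEmpty($value))
    {
        throw "Required RM configuration variable '$name' is not set"
    }
}
```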

 

Step 5 – Run it all

If you have everything in place you should now be able to trigger your deployment and have the tests run.


Finishing Up and One final gotcha

I had hoped that my integration test run would be associated with my build. Normally, when triggering tests via TCM, you do this by adding the following parameters to the TCM command line:

TCM [all the other params] -BuildNumber 'My.Build.CI_1.7.25.29773' -BuildDefinition 'My.Build.CI' 

However, this will not work in the scenario above. You can only use these flags to associate a test run with a successful build, and at the time TCM runs in the pipeline the build has not finished, so it is not yet marked as successful. This does somewhat limit the end-to-end reporting. However, I think for now I can accept this limitation, as the deployment completing is a suitable marker that the tests passed.

The only workaround I can think of is not to trigger the release directly from the build, but to use the TFS events system to let the build finish first and then trigger the release. You could use my TFS DSL Alert processor for that.

Few issues a few days on with my Windows 10 upgrade

A few days in, and I have solved the few problems I have had.

Can't apply Security Update for Windows 10 for x64-based Systems (KB3074683)

My system tried to apply the KB3074683 patch a couple of times, rolling it back each time. A search of the forums found the answer to this one. As in the forum post, I have an Nvidia video card (in fact it caused problems during the upgrade itself), so the fix was to delete the UpdatusUser registry entry under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList.
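If you hit the same issue, the entry can be removed with a PowerShell snippet along these lines. A sketch only — run it elevated, and note that deleting the wrong ProfileList entry can break a real user profile, so check what it matches before removing anything:

```powershell
# Remove ProfileList entries whose profile path points at the Nvidia
# UpdatusUser account. Run from an elevated PowerShell prompt.
$profileList = 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList'
Get-ChildItem $profileList | ForEach-Object {
    $imagePath = (Get-ItemProperty $_.PSPath).ProfileImagePath
    if ($imagePath -like '*UpdatusUser*')
    {
        Write-Host "Removing profile entry $($_.PSChildName) ($imagePath)"
        Remove-Item $_.PSPath -Recurse
    }
}
```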

Once this was deleted, the update applied without any issues.

Windows Defender won’t start

Every time my PC started I got the error that Windows Defender would not start.


After much searching and fiddling with settings, it turned out this was a red herring. Defender was not starting because I had another AV product in place, System Center Endpoint Protection, just as the dialog said. Endpoint Protection is installed by our IT team as part of our standard setup. So the actual issue was that the Defender tooltray app was trying to autostart, giving the error when it failed to connect to the background services, which were not running. Strange, as this appeared not to be an issue on Windows 8.1.

The answer was to use Sysinternals Autoruns to disable the loading of the tooltray application.

Can't access a Data DeDup'd disk

On Windows 8.1 I used the Data DeDup hack on one of the disks I use for Hyper-V VMs; I got a 71% disk space saving, as there is so much common data between the various VMs. At the time of writing I could not find a matching set of DISM packages for Windows 10; they need to come from the equivalent release of Server 2016, which is still in CTP/preview.
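For reference, the 8.1-era hack boiled down to something like the following. A sketch only — the package file name is a placeholder, and the .cab files must be lifted from the matching Server media:

```powershell
# Install the deduplication packages taken from the matching Server media
# (file name is a placeholder), then enable the feature.
dism /online /add-package /packagepath:.\Microsoft-Windows-Dedup-Package.cab
dism /online /enable-feature /featurename:Dedup-Core /all

# Once the feature is installed, enable and run dedup on the VM disk.
Import-Module Deduplication
Enable-DedupVolume -Volume D:
Start-DedupJob -Volume D: -Type Optimization
Get-DedupStatus   # reports the space savings achieved
```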

After some fiddling with feature packs from preview builds, I decided to just stop using the Data DeDup feature for now. So I attached the disk to an 8.1 machine with DeDup enabled, copied the contents off, re-formatted the disk, replaced the data, then put the disk back in my laptop.

I do hope Microsoft choose to add Data DeDup to Windows 10 in the future, it is of great use to me and anyone else who uses plenty of local VMs.

 

 

So I think I am there now; let us see how reliable it is day to day.

Upgrade from Windows 8.1 to Windows 10 on my Lenovo W520

I have just done an in-place upgrade of my Lenovo W520 from Windows 8.1 to Windows 10, something I had not tried during the beta programme, having stuck to running Windows 10 in VMs (mostly on Azure).

I have to say the process was pretty smooth. I only hit one issue, and it was the usual NVidia Optimus problem I saw when installing Windows 8 and 8.1.

This is what happened:

  1. With Windows 8.1 running, I mounted the Windows 10 Enterprise ISO
  2. Ran the setup
  3. It did a few checks and eventually asked if I wanted to keep everything – I said yes
  4. It showed a percentage complete gauge
    1. It copied files OK (about 30%)
    2. It said it had found 5% of drivers (32% overall) and stopped – I left it a couple of hours, no disk or network activity

At this point I was a bit worried, but guessed it was the same problem I had seen on Windows 8.x: the installer needs to access the Intel GPU as well as the NVidia GPU, else it gets confused and hangs. A disabled GPU is not a removed GPU.

So I:

  1. Rebooted (via the power switch)
  2. Booted into the BIOS (pressing the ThinkVantage button)
    1. Selected Enable Nvidia Optimus in the graphics options
    2. Saved and rebooted
  3. The PC rolled back the Windows 10 update (very quickly, less than 5 minutes)
    Note: I had expected to be challenged for a Bitlocker code due to the BIOS setting change during the reboot but I wasn’t
  4. With Windows 8.1 running again I re-mounted the Windows 10 Enterprise ISO
  5. Ran the setup again
  6. It did the same few checks and eventually asked if I wanted to keep everything – I said yes again
  7. This time it completed without error, it took around an hour

So now I had an upgraded PC, and everything seemed OK, including my biometric login – which surprised me, as this had been a problem to set up in the past.

The only issue was with my external screen, so I went back into the BIOS to disable NVidia Optimus again. This time it did prompt me to re-enter the Bitlocker key. Once this was done I could use external screens with no issues, as before.

So, a smooth upgrade from our standard Windows 8.1 dev machine image; a good stopgap until our IT team builds a Windows 10 image in System Center.