But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Scroll bars in MTM Lab Center had me foxed – User too stupid error

I thought I had a problem with our TFS Lab Manager setup: 80% of our environments had disappeared. I wondered if it was a rights issue, was it just showing environments I owned? No, it was not that.

Turns out the issue was a UX/scrollbar issue.

I had MTM full screen in ‘Test Center’ mode, with a list of test suites so long that a scroll bar was needed, and I had scrolled to the bottom of the list.

I then switched to ‘Lab Center’ mode. This list was shorter, not needing a scrollbar, but the pane listing the environments (which had been showing the test suites) was still scrolled to the bottom. A scrollbar here was unexpected and I just missed it visually (in my defence it is light grey on white). Exiting and reloading MTM had no effect; the scroll position did not reset on a reload or a change of Test Plan/Team Project.

In fact I only realised the solution when another member of our team pointed it out after I asked if they were experiencing issues with Labs; the same had happened to them. Between us we wasted a fair bit of time on this issue!

Just goes to show how you can miss standard UX signals when you are not expecting them.

Updated Reprint - Migrating a TFS TFVC team project to a Git team project

This is a copy of a guest post published on the Microsoft UK web site on the 7th June 2016

This is a revised version of a post originally published in August 2014. In this revision I have updated version numbers and links for tools used and added a discussion of adapting the process to support VSTS.

The code for this post can be found in my GitHub Repo


In the past I've written on the theory behind migrating TFVC to Git with history. I've since used this process for real, as opposed to as a proof of concept, and this post documents my experiences. The requirement was to move an on-premises TFS 2013.2 Scrum Team Project using TFVC to another on-premises TFS 2013.2 Scrum Team Project, but this time using Git.

This process is equally applicable to any version of TFS that supports Git, and to VSTS.

Create new team project

On the target server create a new team project using the same (or as close as possible) process template as was used on the source TFS server. As we were using the same non-customised process template for both the source and the target, we did not have to worry about any work item customisation. However, if you were changing the process template, this is where you would do any customisation required.

Remember that if you are targeting VSTS your customisation options are limited. You can add custom fields to VSTS as of the time of writing (May 2016), but that is all.

Adding a field to all Work Item Types

We need to be able to associate the old work item ID with the new migrated one. For on-premises TFS servers, the TFS Integration Platform has a feature to do this automatically, but it suffers from a bug: it is meant to add a field for this purpose automatically, but the field actually needs to be added manually prior to the migration.

To make this edit we need to either:

  1. Edit the process templates in place using the Process Template Editor Power Tool
  2. Export the work item type definitions with WITADMIN.exe, edit them in Notepad and re-import them

In either case the field to add to ALL WORK ITEM TYPES is as follows:

<FIELD refname="TfsMigrationTool.ReflectedWorkItemId" name="ReflectedWorkItemId" type="String" />

Once the edit is made, the revised work item types need to be re-imported into the new team project.
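If you take the WITADMIN route, the round trip looks something like the following sketch, where the collection URL, team project and work item type names are placeholders for your own values:

witadmin exportwitd /collection:http://yourserver:8080/tfs/defaultcollection /p:YourProject /n:Task /f:Task.xml
witadmin importwitd /collection:http://yourserver:8080/tfs/defaultcollection /p:YourProject /f:Task.xml

Edit the exported XML to add the FIELD element between the export and the import, and repeat for every work item type in the team project.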

If you are using VSTS this way of adding the field is not an option, but we can add custom fields to a work item type in VSTS. If we do this you will need to use the TFS Integration Mapper tool (mentioned below) to make sure the required old work item ID ends up in your custom location. TFS Integration Platform will not do this by default, but I have documented this process in an associated post.

The Work Item Migration

The actual work item migration is done using the TFS Integration Platform. This tool says it only supports TFS 2012, but it will function with newer versions of TFS as well as VSTS. This will move over all work item types from the source team project to the target team project. The process is as follows:

  1. Install TFS Integration Platform.
  2. Load TFS Integration Platform, as it seems it must be loaded after the team project is created, else it gets confused!
  3. Select 'Create New'.
  4. Pick the 'Team Foundation Server\WorkItemTracking' template. As we are migrating with the same process template this is OK. If you need to change field mappings use the template for field matching and look at the TFS Integration Mapper tool.
  5. Provide a sensible name for the migration. Not really needed for a one-off migration, but if testing, it's easy to end up with many test runs all of the same name, which is confusing in the logs.
  6. Pick the source server and team project as the left server.
  7. Pick the target server and team project as the right server.
  8. Accept the defaults and save to database.
  9. On the left menu select Start. The UI of this tool is not great. Avoid watching the output tab, as this seems to slow the process. Also, altering the refresh time in the options to once a minute seems to help performance. All details of actions are placed in log files, so nothing is lost by these changes.
  10. The migration should complete without any issues, assuming there are no outstanding template issues that need to be resolved.

Article image

Add the New ID to the Changesets on the source server

The key to this migration process is to retain the links between the work items and source code checkins. This is done using the technique I outlined in the previous post, i.e. editing the comment field of each changeset on the source team project prior to migrating the source, adding #123 style references that point to the new work items on the target server.

To do this I used some PowerShell. This PowerShell was written before the new TFS REST API was available, hence it uses the older C# API. If I were writing it now I would use the REST API.
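Note that the function assumes the TFS client assemblies are already loaded into the PowerShell session. A minimal sketch of that loading (LoadWithPartialName is deprecated, but convenient for a script like this) is:

# load the TFS client API assemblies needed by the function below
[Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client") | Out-Null
[Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.VersionControl.Client") | Out-Null
[Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.WorkItemTracking.Client") | Out-Null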

function Update-TfsCommentWithMigratedId
{

<# 
.SYNOPSIS 
This function is used as part of the migration for TFVC to Git to help retain checkin associations to work items 
 
.DESCRIPTION 
This function takes two team project references and looks up changeset associations in the source team project, it then looks for 
the revised work item ID in the new team project and updates the source changeset 
 
.PARAMETER SourceCollectionUri 
Source TFS Collection URI 
 
.PARAMETER TargetCollectionUri 
Target TFS Collection URI 
 
.PARAMETER SourceTeamProject 
Source Team Project Name 
 
.EXAMPLE 
 
Update-TfsCommentWithMigratedId -SourceCollectionUri "http://server1:8080/tfs/defaultcollection" -TargetCollectionUri "http://server2:8080/tfs/defaultcollection" -SourceTeamProject "Scrumproject" 
 
#> 
 
    Param 
    ( 
    [Parameter(Mandatory=$true)] 
    [uri] $SourceCollectionUri,  
 
    [Parameter(Mandatory=$true)] 
    [uri] $TargetCollectionUri, 
 
    [Parameter(Mandatory=$true)] 
    [string] $SourceTeamProject 
 
    ) 
 
    # get the source TPC 
    $sourceTeamProjectCollection = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection($sourceCollectionUri) 
    # get the TFVC repository 
    $vcService = $sourceTeamProjectCollection.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer]) 
    # get the target TPC 
    $targetTeamProjectCollection = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection($targetCollectionUri) 
    #Get the work item store 
    $wiService = $targetTeamProjectCollection.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore]) 
 
    # Find all the changesets for the selected team project on the source server 
    foreach ($cs in $vcService.QueryHistory("$/$SourceTeamProject", [Microsoft.TeamFoundation.VersionControl.Client.RecursionType]::Full, [Int32]::MaxValue)) 
    { 
        if ($cs.WorkItems.Count -gt 0) 
        { 
            foreach ($wi in $cs.WorkItems) 
            { 
                "Changeset {0} linked to workitem {1}" -f $cs.ChangesetId, $wi.Id 
                # find new id for each changeset on the target server 
                foreach ($newwi in $wiService.Query("select id  FROM WorkItems WHERE [TfsMigrationTool.ReflectedWorkItemId] = '" + $wi.id + "'")) 
                { 
                    # if ID found update the source server if the tag has not already been added 
                    # we have to esc the [ as gets treated as a regular expression 
                    # we need the white space between the [ ] else the TFS job agent does not find the tags  
                    if ($cs.Comment -match "\[ Migrated ID #{0} \]" -f $newwi.Id) 
                    { 
                        Write-Output ("New Id {0} already associated with changeset {1}" -f $newwi.Id , $cs.ChangesetId) 
                    } else { 
                        Write-Output ("New Id {0} being associated with changeset {1}" -f $newwi.Id, $cs.ChangesetId ) 
                        $cs.Comment += "[ Migrated ID #{0} ]" -f $newwi.Id 
                    } 
                } 
            } 
            $cs.Update() 
        } 
    } 
}
     

With the usage:

Update-TfsCommentWithMigratedId -SourceCollectionUri "http://localhost:8080/tfs/defaultcollection" -TargetCollectionUri "http://localhost:8080/tfs/defaultcollection" -SourceTeamProject "Old team project"  

NOTE: This script is written so that it can be run multiple times, but it only adds the migration entries once for any given changeset. This means both it and TFS Integration Platform can be run repeatedly on the same migration to do a staged migration, e.g. get the bulk of the content over first whilst the team is still using the old team project, then do a smaller migration of the later changes when the actual swap over happens.

When this script is run expect to see output similar to:

Article image

You can see the impact of the script in Visual Studio Team Explorer or the TFS web client when looking at changesets in the old team project. Expect to see a changeset comment in the form shown below with new [ Migrated ID #123 ] blocks in the comment field, with 123 being the work item ID on the new team project. Also note the changeset is still associated with the old work item ID on the source server.

Article image

NOTE: The space after the #123 is vital. If it is not there, then the TFS job agent cannot find the tag to associate the commit to a work item after the migration.

Source code migration

The source code can now be migrated. This is done by cloning the TFVC code to a local Git repo and then pushing it up to the new TFS Git repo using Git TF. We clone the source to a local repo in the folder localrepo, with the --deep option used to retain history.

git tf clone http://typhoontfs:8080/tfs/defaultcollection '$/Scrum TFVC Source/Main' localrepo --deep

NOTE: I have seen problems with this command. On larger code bases we saw the error 'TF 400732 server cancelled error' as files were said to be missing or we had no permission - neither of which was true. This problem was repeated on a number of machines, including one that had in the past managed to do the clone. We suspected server connectivity, but no errors were logged.

As a workaround the Git-TFS tool was used. This community tool uses the .NET TFS API, unlike the Microsoft one which uses the Java TFS API. Unfortunately, it also gave TF400732 errors, but it did provide a suggested command line to continue, which resumed from where it had errored.

The command to do the clone was:

git tfs clone http://typhoontfs:8080/tfs/defaultcollection '$/Scrum TFVC Source/Main' localrepo

The command to continue after an error was (from within the repo folder):

git tfs fetch  

It should be noted that Git-TFS seems a good deal faster than Git TF, presumably due to being a native .NET client as opposed to using the Java VM. Also, Git-TFS has support for converting TFVC branches to Git branches, something Git TF is not able to do. So for some people, Git-TFS will be a better tool to use.
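For example, to bring TFVC branches over as Git branches during the clone, Git-TFS supports a branches switch. The line below is a sketch from memory, so check the Git-TFS documentation for the exact options in your version:

git tfs clone http://typhoontfs:8080/tfs/defaultcollection '$/Scrum TFVC Source/Main' localrepo --branches=all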

Once the clone is complete, we need to add the TFS Git repo as a remote target and then push the changes up to the new team project. The exact commands for this stage are shown on the target TFS server. Load the web client, go to the code section and you should see the commands needed:

git remote add origin http://typhoontfs:8080/tfs/DefaultCollection/_git/newproject 
git push -u origin --all  

Once this stage is complete the new TFS Git repo can be used. The Git commits should have the correct historic date and work item associations as shown below. Note now that the migration ID comments match the work item associations.

Article image

NOTE: There may be a lag before the associations are shown after the git push. This is because the associations are done by a background TFS job process which may take a while to catch up when there are a lot of commits. On one system I worked on this took days, not hours! Be patient.

Shared Test Steps

At this point all work items have been moved over and their various associations with source commits are retained, e.g. PBIs link to test cases and tasks. However, there is a problem: any test cases that have shared steps will still be pointing to the old shared step work items. As there is already an open source tool to fix this up, there was no immediate need to rewrite it as a PowerShell tool. To use the open source tool, use the command line:

UpdateSharedStep.exe http://localhost:8080/tfs/defaultcollection myproject

Test Plans and Suites

Historically in TFS, test plans and suites were not work items; they became work items in TFS 2013.3. This means that if you need these moved over too, you have to use the TFS API.

Though these scripts were written for TFS 2013.2, there is no reason why these same API calls should not work with newer versions of TFS or VSTS. Just remember to exclude the Test Plan and Test Suite work items from the migration performed by TFS Integration Platform so you don't move them twice.
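One way to do this exclusion is in the work item query you select in TFS Integration Platform. A sketch of suitable WIQL (adjust the team project name to your own) would be:

SELECT [System.Id] FROM WorkItems WHERE [System.TeamProject] = 'Scrum TFVC Source' AND [System.WorkItemType] <> 'Test Plan' AND [System.WorkItemType] <> 'Test Suite'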

This script moves the three test suite types as follows:

  1. Static - Creates a new suite, finds the migrated IDs of the test cases in the source suite and adds them to the new suite.
  2. Dynamic - Creates a new suite using the existing work item query. IMPORTANT - The query is NOT edited, so it may or may not work depending on what it actually contained. These suites will need to be checked manually by a tester in all cases and their queries 'tweaked'.
  3. Requirements - Creates a new suite based on the migrated ID of the requirement work item. This is the only test suite type where we edit the name, to make it consistent with the new requirement ID rather than the old.

The script is as follows: 

function Update-TestPlanAfterMigration
{
<# 
.SYNOPSIS 
This function migrates a test plan and all its child test suites to a different team project 
 
.DESCRIPTION 
This function migrates a test plan and all its child test suites to a different team project, reassign work item IDs as required 
 
.PARAMETER SourceCollectionUri 
Source TFS Collection URI 
 
.PARAMETER SourceTeamProjectName 
Source Team Project Name 
 
.PARAMETER TargetCollectionUri 
Target TFS Collection URI 
 
.PARAMETER TargetTeamProjectName 
Target Team Project Name 
 
 
.EXAMPLE 
 
Update-TestPlanAfterMigration -SourceCollectionUri "http://server1:8080/tfs/defaultcollection" -TargetCollectionUri "http://serrver2:8080/tfs/defaultcollection"  -SourceTeamProjectName "Old project" -TargetTeamProjectName "New project" 
 
#> 
    param( 
    [Parameter(Mandatory=$true)] 
    [uri] $SourceCollectionUri, 
 
    [Parameter(Mandatory=$true)] 
    [string] $SourceTeamProjectName, 
 
    [Parameter(Mandatory=$true)] 
    [uri] $TargetCollectionUri, 
 
    [Parameter(Mandatory=$true)] 
    [string] $TargetTeamProjectName 
 
    ) 
 
    # Get TFS connections 
    $sourcetfs = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($SourceCollectionUri) 
    try 
    { 
        $Sourcetfs.EnsureAuthenticated() 
    } 
    catch 
    { 
        Write-Error "Error occurred trying to connect to project collection: $_ " 
        exit 1 
    } 
    $targettfs = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($TargetCollectionUri) 
    try 
    { 
        $Targettfs.EnsureAuthenticated() 
    } 
    catch 
    { 
        Write-Error "Error occurred trying to connect to project collection: $_ " 
        exit 1 
    } 
 
    # get the actual services 
    $sourcetestService = $sourcetfs.GetService([Microsoft.TeamFoundation.TestManagement.Client.ITestManagementService]) 
    $targettestService = $targettfs.GetService([Microsoft.TeamFoundation.TestManagement.Client.ITestManagementService]) 
    $sourceteamproject = $sourcetestService.GetTeamProject($sourceteamprojectname) 
    $targetteamproject = $targettestService.GetTeamProject($targetteamprojectname) 
    # Get the work item store 
    $wiService = $targettfs.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore]) 
 
 
    # find all the plans in the source 
     foreach ($plan in $sourceteamproject.TestPlans.Query("Select * From TestPlan")) 
     { 
         if ($plan.RootSuite -ne $null -and $plan.RootSuite.Entries.Count -gt 0) 
         { 
            # copy the plan to the new tp 
            Write-Host("Migrating Test Plan - {0}" -f $plan.Name)  
            $newplan = $targetteamproject.TestPlans.Create(); 
            $newplan.Name = $plan.Name 
            $newplan.AreaPath = $plan.AreaPath 
            $newplan.Description = $plan.Description 
            $newplan.EndDate = $plan.EndDate 
            $newplan.StartDate = $plan.StartDate 
            $newplan.State = $plan.State 
            $newplan.Save(); 
            # we use a function as it can be recursive 
            MoveTestSuite -sourceSuite $plan.RootSuite -targetSuite $newplan.RootSuite -targetProject $targetteamproject -targetPlan $newplan -wiService $wiService 
            # and have to save the test plan again to persit the suites 
            $newplan.Save(); 
 
         } 
     } 
} 
 
# the - is deliberately missing in the name so this helper is not exposed when the module is loaded 
function MoveTestSuite 
{ 
<# 
.SYNOPSIS 
This function migrates a test suite and all its child test suites to a different team project 
 
.DESCRIPTION 
This function migrates a test suite and all its child test suites to a different team project; it is a helper for Update-TestPlanAfterMigration and will probably not be called directly from the command line 
 
.PARAMETER SourceSuite 
Source TFS test suite 
 
.PARAMETER TargetSuite 
Target TFS test suite 
 
.PARAMETER TargetPlan 
The new test plan the tests suite are being created in 
 
.PARAMETER targetProject 
The new team project test suite are being created in 
 
.PARAMETER WiService 
Work item service instance used for lookup 
 
 
.EXAMPLE 
 
MoveTestSuite -sourceSuite $plan.RootSuite -targetSuite $newplan.RootSuite -targetProject $targetteamproject -targetPlan $newplan -wiService $wiService 
 
#> 
    param  
    ( 
        [Parameter(Mandatory=$true)] 
        $sourceSuite, 
 
        [Parameter(Mandatory=$true)] 
        $targetSuite, 
 
        [Parameter(Mandatory=$true)] 
        $targetProject, 
 
        [Parameter(Mandatory=$true)] 
        $targetplan, 
 
        [Parameter(Mandatory=$true)] 
        $wiService 
    ) 
 
    foreach ($suite_entry in $sourceSuite.Entries) 
    { 
       # get the suite to a local variable to make it easier to pass around 
       $suite = $suite_entry.TestSuite 
       if ($suite -ne $null) 
       { 
           # we have to build a suite of the correct type 
           if ($suite.IsStaticTestSuite -eq $true) 
           { 
                Write-Host("    Migrating static test suite - {0}" -f $suite.Title)       
                $newsuite = $targetProject.TestSuites.CreateStatic() 
                $newsuite.Title = $suite.Title 
                $newsuite.Description = $suite.Description  
                $newsuite.State = $suite.State  
                # need to add the suite to the plan else you cannot add test cases 
                $targetSuite.Entries.Add($newSuite) > $null # sent to null as we get output 
                foreach ($test in $suite.TestCases) 
                { 
                    $migratedTestCaseIds = $targetProject.TestCases.Query("Select * from [WorkItems] where [TfsMigrationTool.ReflectedWorkItemId] = '{0}'" -f $Test.Id) 
                    # we assume we only get one match 
                    if ($migratedTestCaseIds[0] -ne $null) 
                    { 
                        Write-Host ("        Test {0} has been migrated to {1} and added to suite {2}" -f $Test.Id , $migratedTestCaseIds[0].Id, $newsuite.Title) 
                        $newsuite.Entries.Add($targetProject.TestCases.Find($migratedTestCaseIds[0].Id)) > $null # sent to null as we get output 
                    } 
                } 
           } 
 
    
           if ($suite.IsDynamicTestSuite -eq $true) 
           { 
               Write-Host("    Migrating query based test suite - {0} (Note - query may need editing)" -f $suite.Title)       
               $newsuite = $targetProject.TestSuites.CreateDynamic() 
               $newsuite.Title = $suite.Title 
               $newsuite.Description = $suite.Description  
               $newsuite.State = $suite.State  
               $newsuite.Query = $suite.Query 
 
               $targetSuite.Entries.Add($newSuite) > $null # sent to null as we get output 
               # we don't need to add tests as this is done dynamically 
   
           } 
 
           if ($suite.IsRequirementTestSuite -eq $true) 
           { 
               $newwis = $wiService.Query("select *  FROM WorkItems WHERE [TfsMigrationTool.ReflectedWorkItemId] = '{0}'" -f $suite.RequirementId)   
               if ($newwis[0] -ne $null) 
               { 
                    Write-Host("    Migrating requirement based test suite - {0} to new requirement ID {1}" -f $suite.Title, $newwis[0].Id )     
        
                    $newsuite = $targetProject.TestSuites.CreateRequirement($newwis[0]) 
                    $newsuite.Title = $suite.Title -replace $suite.RequirementId, $newwis[0].Id 
                    $newsuite.Description = $suite.Description  
                    $newsuite.State = $suite.State  
                     $targetSuite.Entries.Add($newSuite) > $null # sent to null as we get output 
                    # we don't need to add tests as this is done dynamically 
               } 
           } 
   
           # recurse into child test suites 
           if ($suite.Entries.Count -gt 0) 
           { 
                 MoveTestSuite -sourceSuite $suite -targetSuite $newsuite -targetProject $targetProject -targetPlan $targetplan -wiService $wiService 
           } 
        } 
    } 
}
     

NOTE: This script needs PowerShell 3.0 installed. This appears to be because some of the TFS assemblies are .NET 4.5, which is not supported by earlier PowerShell versions. If the version is wrong the test suite migration will fail, as the TestPlan (ITestPlanHelper) object will be null.

The command to run the migration of test plans is:

Update-TestPlanAfterMigration -SourceCollectionUri "http://typhoontfs:8080/tfs/defaultcollection" -TargetCollectionUri "http://typhoontfs:8080/tfs/defaultcollection" -SourceTeamProjectName "Scrum TFVC Source" -TargetTeamProjectName "NewProject"  

This will create the new set of test plans and suites in addition to any already in place on the target server. It should give an output similar to:

Article image

Summary

Once all this is done you should have migrated a TFVC team project to a new Git-based team project on either on-premises TFS or VSTS, retaining as much history as possible. I hope you find this of use!

This article was first published on Microsoft's UK Developers site as 'Migrating a TFS TFVC based team project to a Git team project - a practical example', originally published 15th August 2014 and updated 7th June 2016.


Running WebTests as part of a VSTS VNext Release pipeline

Background

Most projects will have a range of tests:

  • Unit tests (maybe using a mocking framework) running inside the build process
  • Integration/UX and load tests run as part of a release pipeline
  • and finally manual tests

In a recent project we were using WebTests to provide some integration tests (in addition to integration tests written using unit testing frameworks) as a means to test a REST/ODATA API, injecting data via the API, pausing while a backend Azure WebJob processed the injected data, then checking a second API to make sure the processed data was correctly presented. Basically mimicking user operations.

In past iterations we ran these tests via TFS Lab Management’s tooling, using the Test Agent that it deploys when an environment is created.

The problem was that we were migrating to VSTS/TFS 2015.2 Release Management. This uses the new Functional Testing Task, which uses the newer Test Agent that is deployed on demand as part of the release pipeline (not pre-installed), and this agent does not support running WebTests at present.

This meant my only option was to use MSTest if I wanted to continue using this form of webtest. However, there is no out of the box MSTest task for VSTS, so I needed to write a script to do the job that I could deploy as part of my build artifacts.

Now I could write a build/release task to make this nice and easy to use, but that is more work and I suspect I am not going to need this script too often in the future (I might be wrong here, only time will tell). Also I hope that Microsoft will at some point provide an out of the box task to do the job, either by providing an MSTest task or by adding webtest support to the functional test task.

This actually reflects my usual work practice for build tasks: get the script working locally first, use it as a PowerShell script in the build, and if I see enough reuse make it a task/extension.

So what did I actually need to do?

Preparation

  1. Install Visual Studio on the VM the tests will be run from. This is needed because, though MSTest was already present, it fails to run .webtest tests unless a suitable SKU of Visual Studio is installed
  2. Set the solution configuration so that the projects containing the webtests are not built; we only need the .webtest files copied to the drops location. If you build the project the files get duplicated into the bin folder, and we would then need to work out which copy to use.
  3. Make sure the solution contains a .testsettings file that switches on ‘Think Times’, and that this file is copied as a build artifact. This stalled me for ages; I could not work out why tests worked in Visual Studio but failed from the command line. Without this file there is no think time at all, so my background process never had time to run.

    image
  4. Write a script that finds all my .webtest files, and place it in source control such that it is copied to the build's drop location.
param
(
    $tool = "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\MSTest.exe",
    $path ,
    $include = "*.webtest",
    $results ,
    $testsettings
)

# find all the .webtest files under the supplied path
$web_tests = Get-ChildItem -Path $path -Recurse -Include $include

# build a /TestContainer argument for each .webtest file found
$testContainers = @()
foreach ($item in $web_tests) {
    $testContainers += "/TestContainer:$item"
}

# run MSTest with all the containers, writing the results to a .TRX file
& $tool $testContainers /resultsfile:$results /testsettings:$testsettings
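To check the script locally before wiring it into the pipeline, an invocation would look something like this (the paths here are illustrative only):

.\RunMSTest.ps1 -path C:\drop\Src -results C:\temp\webtests.trx -testsettings C:\drop\Src\webtest.testsettings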

 

Build

Once the script and other settings were in place I altered the build so that the .webtests (including their associated JSON test data sub folders), the script and the .testsettings file are all copied to the drops location

 

image

 

Release

In the release pipeline I need to call my script with suitable parameters so it finds the tests, uses the .testsettings and creates a .TRX results file. I then need to use the ‘Publish Test Results’ task to upload these MSTest format results

image

So for the PowerShell MSTest task I set the following

  • Script name is $(System.DefaultWorkingDirectory)/MyBuild\drop\Scripts\RunMSTest.ps1 
  • The argument is -path $(System.DefaultWorkingDirectory)\MyBuild\drop\Src\WebtestsProject -results $(System.DefaultWorkingDirectory)\webtests.trx -testsettings $(System.DefaultWorkingDirectory)\MyBuild\drop\src\webtest.testsettings

And for the publish test results task.

  • Format – VSTest
  • Arguments - $(System.DefaultWorkingDirectory)\webtests.trx
  • I also set this task to always run, to make sure I get test results even if some tests fail

Once all this was done and the build/release run, I got the test results I needed

image

 

I can drill into my detailed test reports as needed

image

So I have a functioning release pipeline that can run all the various types of automated tests within my solution.

Migrating work items to VSTS with custom fields using TFS Integration Platform

If you wish to migrate work items from TFS to VSTS your options are limited. You can of course just pull over work items, without history, using Excel. If you have no work item customisation then OpsHub is an option, but if you have work item customisation then you are going to have to use TFS Integration Platform. And we all know what a lovely experience that is!

Note: TFS Integration Platform will cease to be supported by Microsoft at the end of May 2016. This does not mean the tool is going away, just that there will be no support via forums.

In this post I will show how you can use TFS Integration Platform to move custom fields over to VSTS, including the original TFS work item ID, thus enabling migrations with history as detailed in my MSDN article

TFS Integration Platform Setup

Reference Assemblies

TFS Integration Platform, being a somewhat old tool designed for TFS 2010, does not directly support TFS 2015 or VSTS. You have to select the Dev11 connection options (TFS 2012 by its internal code name). However, this will still cause problems, as it fails to find all the assemblies it expects

The solution to this problem is provided in this post, the key being to add dummy registry entries:

  1. Install either Visual Studio or Team Explorer
  2. Add the following registry key after you have installed Team Explorer or equiv.
    Windows Registry Editor Version 5.00

    [HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\VisualStudio\11.0\InstalledProducts\Team System Tools for Developers]
    @="#101"
    "LogoID"="#100"
    "Package"="{97d9322b-672f-42ab-b3cb-ca27aaedf09d}"
    "ProductDetails"="#102"
    "UseVsProductID"=dword:00000001

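    To apply this, save the block above as a .reg file and import it, either by double-clicking the file in Explorer or from an elevated command prompt (the file name here is arbitrary):

    reg import TfsIntegrationPlatformFix.reg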

MSI

Once this is done the TFS Integration Tools installation should work.

Accept the default options. You will need to select a SQL Server instance for the tool to use as a database to store its progress; the installer will create a DB called tfs_integrationplatform on the SQL instance

Creating a Mappings File

TFS Integration platform needs a mapping file to work out which fields go where.

  1. We assume there is a local TFS server with the source to migrate from and a VSTS instance containing a team project using a reasonably compatible uncustomised process template
  2. Download the TFS Process Mapper and run it.
  3. You need to load the current work item configuration into the process mapper; the tool provides buttons to do this from XML files (exported with WITADMIN) or directly from the TFS/VSTS server.
  4. You should see a list of fields in both the source and target server definitions of the given work item type.
  5. Use the automap button to match the fields
  6. Any unmatched fields will be left in the left column

    image
  7. Some fields you may need to match manually, e.g. handling name changes from ‘Area ID’ to ‘AreaID’
  8. If you have local custom fields you can add matching fields on the VSTS instance; this is done using the process on MSDN.
  9. Once you have added your custom field I have found it best to clear the mapping tool and re-import the VSTS work item definitions. The new fields appear in the list and can be mapped manually to their old equivalents.
  10. I now exported my mappings file.
  11. The process described above is the same as manually editing the mapping file in the form
    <MappedField MapFromSide="Left" LeftName="BM.Custom1" RightName="BMCustom1" />

    There is a good chance one of the fields you want is the old TFS server's work item ID. If you add a mapping as above for System.Id you would expect it to work. However, it does not: the field is empty on the target system. I don’t think this is a bug, just an unexpected behaviour in the way the unique work item IDs are handled by the tool. As a workaround I found I also had to use an aggregated field to force the System.Id to be transferred. In my process customisation on VSTS I created an integer OldId custom field. I then added the following to my mapping; it is important to note that I don’t use the MappedField line in the MappedFields block, I use an AggregatedField. 
    <MappedFields>
             <!-- all the auto generated mapping stuff.
             This is where you would expect a line like the one below:
             <MappedField MapFromSide="Left" LeftName="System.Id" RightName="OldID" /> -->
    </MappedFields>
    <AggregatedFields>
           <FieldsAggregationGroup MapFromSide="Left" TargetFieldName="OldID" Format="{0}">
               <SourceField Index="0" SourceFieldName="System.Id" valueMap=""/>
           </FieldsAggregationGroup>
    </AggregatedFields>
  12. I could now use my edited mappings file

Running TFS Integration Platform

I could now run the TFS Integration tools using the mappings file

  1. Load TFS Integration Platform
  2. Create a new configuration
  3. Select the option for work items with explicit mappings
  4. Select your source TFS server
  5. Select your target VSTS server
  6. Select the work item query that returns the items we wish to move
  7. Edit the mapping XML, and paste in the edited block from the previous section. Note that if you are moving multiple work item types you will be combining a number of these mapping sections
  8. Save the mapping file; you are now ready to use it in TFS Integration Platform

 

And hopefully the work item migration will progress as you hope. It might take some trial and error, but you should get there in the end.

But really……

This all said, I would still recommend just bringing over the active work item backlog and current source when moving to VSTS. It is easier, faster and gives you a chance to sort out structures without bringing in all your poor choices of the past.

New version of my VSTS Generate Release Notes extension - now supports Builds and Release

I am pleased to announce that I have just made public on the VSTS marketplace a new version of my VSTS Generate Release Notes extension.

This new version now supports both VSTS/TFS vNext Builds and vNext Releases. The previous versions only supported the generation of release notes as part of a build.

Adding support for release has meant I have had to rethink the internals of how the template is processed, as well as the way templates are passed into the task and where results are stored

  • You can now provide a template as a file (usually from source control) as before, but also as an inline property. The latter is really designed for Releases where there is usually no access to source control, only to build artifact drops (though you could put the template in one of these if you wanted)
  • With a build the obvious place to put the release notes file is in the drops location. For a release there is no such artifact drop location, so I just leave the release notes on the release agent; it is up to the user to get this file copied to a sensible location for their release process.

To find out more, check out the documentation on my GitHub repo and have a look at my sample templates to get you started generating release notes

Putting a release process around my VSTS extension development

I have been developing a few VSTS/TFS build related extensions and have published a few in the VSTS marketplace. This has all been a somewhat manual process, a mixture of Gulp and PowerShell has helped a bit, but I decided it was time to try to do a more formal approach. To do this I have used Jesse Houwing’s VSTS Extension Tasks.

Even with this set of tasks I am not sure that what I have is ‘best practice’, but it does work. The doubt is due to the way the marketplace handles revisions and preview flags. What I have works for me, but ‘your mileage may differ’

My Workflow

The core of my workflow is that I build the VSIX package twice, once as a private package and once as a public one. They both contain the same code and have the same version number; they differ only in their visibility flags

I am not using the preview flag option at all. I have found it does not really help me. My workflow is to build the private package, upload it and test it by sharing it with a test VSTS instance. If all is good, I publish the matched public package on the marketplace. In this model there is no need for a preview flag; it just adds complexity I don’t need.

This may not be true for everyone.

Build

The build’s job is to take the code, set the version number and package it into multiple VSIX packages.

  1. First I have the vNext build get my source from my GitHub repo.
  2. I add two build variables $(Major) and $(Minor) that I use to manually manage my version number
  3. I set my build number format to $(Major).$(Minor).$(rev:r), so the final .number is incremented until I choose to increment the major or minor version.
  4. I then use one of Jesse’s tasks to package the extension multiple times using the extension tag model parameter. Each package step uses different visibility settings (circled in red). I also set the version, using the override option, to $(Build.BuildNumber) (circled in green)

    image
  5. As I am using the VSTS hosted build agent I also need to make sure I check the ‘install Tfx-cli’ option in the global settings section
  6. I then add a second identical packaging task, but this time there is no tag set and the visibility is set to public.
  7. Finally I use a ‘publish build artifacts’ task to copy the VSIX packages to a drop location

Release

Now that I have multiple VSIX packages I can use the same family of tasks to create a release pipeline.

I create a new release linked as a continuous deployment of the previously created build, and set its release name format to Release-$(Build.BuildNumber)

My first environment uses three tasks, all using the option to work from a VSIX package.

Note: In all cases I am using the VSIX path in the format $(System.DefaultWorkingDirectory)/GenerateReleaseNotes.Master/vsix/<package name>-<tag>-$(Build.BuildNumber).vsix. I include the build number variable in the path because I chose to put all the packages in a single folder, so path wildcards are not an option; the task would not know which package to use unless I altered my build to put one VSIX package per folder.

My tasks for the first environment are

  1. Publish VSTS Extension – using my private package so it is added as a private package to the marketplace
  2. Share VSTS Extension – to my test VSTS account
  3. Install VSTS Extension – to my test VSTS account

For details on the usage of these tasks and setting up the link to the VSTS Marketplace see Jesse’s wiki

If I only intend an extension to ever be private this is enough. However, I want to make mine public, so I add a second environment that has manual pre-approval (so I have to confirm the public release)

This environment needs only a single task

  1. Publish VSTS Extension – using my public package so it is added as a public package to the marketplace

I can of course add other tasks to this environment, maybe to send a tweet or email to publicise the new version’s release

Summary

So now I have a formal way to release my extensions. The dual packaging model means I can publish two different versions at the same time, one private and the other public

image

It is now just a case of moving all my extensions over to the new model.

Though I am still interested to hear what other people’s views are. Does this seem a reasonable process flow?

Updates to my StyleCop task for VSTS/TFS 2015.2

Tracking the current version of StyleCop is a bit awkward. Last week I got an automated email from CodePlex saying 4.7.52.0 had been released. I thought this was the most up to date version, so I upgraded my StyleCop command line wrapper and my VSTS StyleCop task from 4.7.47.0 to 4.7.52.0.

However, I was wrong about the current version. I had not realised that the StyleCop team had forked the code onto GitHub. GitHub is now the home of the Visual Studio 2015 and C# 6 development of StyleCop, while CodePlex remains the home of the legacy Visual Studio versions. I had only upgraded to a legacy patch version, not the current version.

So I upgraded my StyleCop command line tool and my VSTS StyleCop task to wrap 4.7.59.0, thus, I think, bringing me up to date.

How to build a connection string from other parameters within MSDeploy packages to avoid repeating yourself in Release Management variables

Whilst working with the new Release Management features in VSTS/TFS 2015.2 I found I needed to pass configuration variables (server name, DB name, UID and password) into an Azure Resource Management template release step that creates a SQL server, and a connection string to the same SQL instance into a web site’s web.config, set using an MSDeploy release step with token replacement (as discussed in this post)

Now I could just create RM configuration variables for both the connection string and ARM settings,

image

 

However, this seems wrong for a couple of reasons:

  1. You should not repeat yourself; it is too easy to get the two values out of step
  2. I don’t really want to obfuscate the whole of a connection string in RM, when only the password really needs to be hidden (note the connection string variable is not set as secure in the above screenshot)

What did not work

I first considered nesting the RM variables, e.g. setting the connection string variable to ‘Server=tcp:$(DatabaseServer).database.windows.net,1433;Database=$(DatabaseName)…’, but this does not give the desired result; the $(DatabaseServer) and $(DatabaseName) variables are not expanded at runtime, you just get a string with the variable names in it.

How I got what I was after…

(In this post, as a sample, I am using the Fabrikam Fiber solution. This means I need to provide a value for the FabrikamFiber-Express connection string.)

I wanted to build the connection string from the other variables in the MSDeploy package. So to get the behaviour I want…

  1. In Visual Studio load the Fabrikam web site solution.
  2. In the web project, use the publish option to create a publish profile using the ‘WebDeploy package’ option.
  3. If you publish this package you end up with a setparameters.xml file containing the default connection string
    <setParameter name="FabrikamFiber-Express-Web.config Connection String" value="Your value"/>
    Where ‘Your value’ is the value you set in the publish wizard. So to use this I would need to pass in a whole connection string, when I only want to pass parts of it
  4. To add bespoke parameters to an MSDeploy package you add a parameters.xml file to the project in Visual Studio (I wrote a Visual Studio extension that helps add this file, but you can create it by hand). My tool will create the parameters.xml file based on the AppSettings block of the project’s web.config. So if you have a web.config containing the following
    <appSettings>
        <add key="Location" value="DEVPC" />
      </appSettings>
    It will create a parameters.xml file as follows
    <?xml version="1.0" encoding="utf-8"?>
    <parameters>
      <parameter defaultValue="__LOCATION__" description="Description for Location" name="Location" tags="">
        <parameterentry kind="XmlFile" match="/configuration/appSettings/add[@key='Location']/@value" scope="\\web.config$" />
      </parameter>
    </parameters>
  5. If we publish at this point we will get a setparameters.xml file containing
    <?xml version="1.0" encoding="utf-8"?>
    <parameters>
      <setParameter name="IIS Web Application Name" value="__Sitename__" />
      <setParameter name="Location" value="__LOCATION__" />
      <setParameter name="FabrikamFiber-Express-Web.config Connection String" value="__FabrikamFiberWebContext__" />
    </parameters>
    This is assuming I used the publish wizard to set the site name to __SiteName__ and the DB connection string to __FabrikamFiberWebContext__
  6. The next step is to add my DB related parameters to the parameters.xml file; this I do by hand, as my tool does not help here
    <?xml version="1.0" encoding="utf-8"?>
    <parameters>
      <parameter defaultValue="__LOCATION__" description="Description for Location" name="Location" tags="">
        <parameterentry kind="XmlFile" match="/configuration/appSettings/add[@key='Location']/@value" scope="\\web.config$" />
      </parameter>

      <parameter name="Database Server" defaultValue="__sqlservername__"></parameter>
      <parameter name="Database Name" defaultValue="__databasename__"></parameter>
      <parameter name="Database User" defaultValue="__SQLUser__"></parameter>
      <parameter name="Database Password" defaultValue="__SQLPassword__"></parameter>
    </parameters>
  7. If I publish again, this time the new variables also appear in the setparameters.xml file
  8. Now I need to suppress the auto-generated creation of the connection string parameter, and replace it with a parameter that uses the other parameters to generate the connection string. You would think this was a case of adding more text to the parameters.xml file, but that does not work. If you add the block you would expect (making sure the name matches the auto-generated connection string name), as below
    <parameter 
      defaultValue="Server=tcp:{Database Server}.database.windows.net,1433;Database={Database Name};User ID={Database User}@{Database Server};Password={Database Password};Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"
      description="Enter the value for FabrikamFiber-Express connection string"
      name="FabrikamFiber-Express-Web.config Connection String"
      tags="">
      <parameterentry
        kind="XmlFile"
        match="/configuration/connectionStrings/add[@name='FabrikamFiber-Express']/@connectionString"
        scope="\\web.config$" />
    </parameter>

    It does add the entry to setparameters.xml, but this blocks successful operation at deployment. It seems that if a value needs to be generated from other variables, there can be no entry for it in the setparameters.xml. The documentation hints that you can set the tag to ‘Hidden’, but this does not appear to work.

    One option would be to let the setparameters.xml file be generated and then remove the offending line prior to deployment, but this feels wrong and prone to human error
  9. To get around this you need to add a file named <projectname>.wpp.targets to the same folder as the project (and add it to the project). In this file place the following
    <?xml version="1.0" encoding="utf-8"?>
    <Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <Target Name="DeclareCustomParameters"
              BeforeTargets="Package">
        <ItemGroup>
          <MsDeployDeclareParameters Include="FabrikamFiber-Express">
            <Kind>XmlFile</Kind>
            <Scope>Web.config</Scope>
            <Match>/configuration/connectionStrings/add[@name='FabrikamFiber-Express']/@connectionString</Match>
            <Description>Enter the value for FabrikamFiber-Express connection string</Description>
            <DefaultValue>Server=tcp:{Database Server}.database.windows.net,1433;Database={Database Name};User ID={Database User}@{Database Server};Password={Database Password};Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;</DefaultValue>
            <Tags></Tags>
            <ExcludeFromSetParameter>True</ExcludeFromSetParameter>
          </MsDeployDeclareParameters>
        </ItemGroup>
      </Target>
      <PropertyGroup>
        <AutoParameterizationWebConfigConnectionStrings>false</AutoParameterizationWebConfigConnectionStrings>
      </PropertyGroup>
    </Project>

    The first block declares the parameter I wish to use to build the connection string. Note the ‘ExcludeFromSetParameter’ setting, which keeps this parameter out of the setparameters.xml file; this is the setting you cannot express in parameters.xml

    The second block stops the auto generation of the connection string. (Thanks to Sayed Ibrahim Hashimi for various posts on getting this working)
  10. Once the edits are made, unload and reload the project, as the <project>.wpp.targets file is cached on loading by Visual Studio.
  11. Make sure the publish profile is not set to generate a connection string

    image
  12. Now when you publish the project, you should get a setparameters.xml file with only the four SQL variables, the AppSettings variables and the site name.
    (Note I have set the values for all of these to the format __NAME__; this is so I can use token replacement in my release pipeline)
    <?xml version="1.0" encoding="utf-8"?>
    <parameters>
      <setParameter name="IIS Web Application Name" value="__Sitename__" />
      <setParameter name="Location" value="__LOCATION__" />
      <setParameter name="Database Server" value="__sqlservername__" />
      <setParameter name="Database Name" value="__databasename__" />
      <setParameter name="Database User" value="__SQLUser__" />
      <setParameter name="Database Password" value="__SQLPassword__" />
    </parameters>
  13. If you deploy the web site, the web.config should have your values from the setparameters.xml file in it
    <appSettings>
       <add key="Location" value="__LOCATION__" />
    </appSettings>
    <connectionStrings>
         <add name="FabrikamFiber-Express" connectionString="Server=tcp:__sqlservername__.database.windows.net,1433;Database=__databasename__;User ID=__SQLUser__@__sqlservername__;Password=__SQLPassword__;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;" providerName="System.Data.SqlClient" />
    </connectionStrings>

You are now in a position to manage the values in the setparameters.xml file however you wish. My choice is to use the ‘Replace Tokens’ build/release task from Colin’s ALM Corner Build & Release Tools extension, as this task correctly handles secure/encrypted RM variables as long as you use the ‘Secret Tokens’ option on the advanced menu.

image



 

Summary

So yes, it all seems a bit too complex, but it does work, and I think it makes for a cleaner deployment solution, less prone to human error. Which is what any DevOps solution must always strive for.

Depending on the values you put in the <project>.wpp.targets file, you can parameterise the connection string however you need.

In place upgrade times from TFS 2013 to 2015

There is no easy way to work out how long a TFS in-place upgrade will take; there are just too many factors to make any calculation reasonable:

  • Start and end TFS version
  • Quality/Speed of hardware
  • Volume of source code
  • Volume of work items
  • Volume of work item attachments
  • The list goes on….

The best option I have found is to graph the various upgrades I have done and make an estimate based on the shape of the curve. I did this for 2010 > 2013 upgrades, and now I think I have enough data from upgrades of sizable TFS instances to do the same for 2013 to 2015.

image

 

Note: I extracted this data from the TFS logs using the script in this blog post; it is also in my git repo

So as a rule of thumb: the upgrade process will pause around step 100 (the exact number varies depending on your starting 2013.x release); time this pause, and expect the upgrade to complete in about 10x this period. For example, if the pause lasts 30 minutes, budget around five hours for the whole upgrade.

It is not 100% accurate, but close enough to know whether you have time for a coffee/meal/pub or bed for the night