But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Scroll bars in MTM Lab Center had me foxed – User too stupid error

I thought I had a problem with our TFS Lab Manager setup: 80% of our environments had disappeared. I wondered if it was a rights issue – was it just showing environments I owned? No, it was not that.

Turns out the issue was a UX/scrollbar issue.

I had MTM full screen in ‘Test Center’ mode, with a list of test suites so long that a scroll bar was needed, and I had scrolled to the bottom of the list.

I then switched to ‘Lab Center’ mode. This list was shorter, not needing a scrollbar, but the pane listing the environments (the one that had been showing the test suites) was still scrolled to the bottom. The scroll position was unexpected and I just missed the scrollbar visually (in my defence it is light grey on white). Exiting and reloading MTM had no effect; the scroll position did not reset on a reload or a change of Test Plan/Team Project.

In fact I only realised the solution to the problem when it was pointed out by another member of our team after I asked if they were experiencing issues with Labs; the same had happened to them. Between us we wasted a fair bit of time on this issue!

Just goes to show how you can miss standard UX signals when you are not expecting them.

Using Visual Studio Code to develop VSTS Build Tasks with PowerShell and Pester tests

Background

I am finding myself writing a lot of PowerShell at present, mostly for VSTS build extensions. Here I hit a problem (or is it an opportunity for choice?) as to which development environment to use:

  • PowerShell ISE is the ‘best’ experience for debugging a script, but has no source control integration – and it is on all PCs
  • Visual Studio Code has good Git support, but you need to jump through some hoops to get debugging working.
  • Visual Studio PowerShell tools are just too heavyweight; they are not even in the frame for me for this job.

So I have found myself getting the basic scripts working in the PowerShell ISE, then moving to VS Code to package up the task/extensions, as that stage also means writing JSON – an awkward split.

This gets worse when I want to add Pester-based unit tests. I needed a better way of working, and I chose to focus on VS Code.

The PowerShell Extension for VS Code

Visual Studio Code now supports PowerShell. Once you have installed VS Code you can install the extension as follows

  1. Open the command palette (Ctrl+Shift+P)
  2. Type “Extension”
  3. Select “Install Extensions”. 
  4. Once the extensions list loads, type PowerShell and press Enter.

Once this extension is installed you get Intellisense etc. as you would expect. So you have a good editor experience, but we still need a F5 debugging experience.

Setting up the F5 Debugging experience

Visual Studio Code can launch any tool to provide a debugging experience. The PowerShell extension provides the tools to get this running for PowerShell.

I found Keith Hill provided a nice walkthrough with screenshots of the setup, but here is my quick summary

  1. Open VS Code and load a folder structure; for me this will usually be a Git repo
  2. Assuming the PowerShell extension is installed, go to the debug page in VS Code
  3. Press the cog at the top of the page and a .vscode\launch.json file will be added to the root of the folder structure currently loaded i.e. the root of your Git repo
  4. As Keith points out, the important line – the program, i.e. the file/task to run when you press F5 – is left empty, a strange default.

image

We need to edit this file to tell it what to run when we press F5. I have two options, and which I use depends on what I am putting in my Git repo.

  • If we want to run the PowerShell file we have in focus in VS Code (at the moment we press F5) then we need the line

              "program": "${file}"

  • However, I soon realised this was not that useful, as I wanted to run Pester-based tests. I was usually editing a script file but wanted to run a test script, so this meant changing the file in focus prior to pressing F5. In this case I decided it was easier to hard code the program setting to a script that ran all the Pester tests in my folder structure

               "program": "${workspaceRoot}/Extensions/Tests/runtests.ps1"

    Where my script contained the single line to run the tests in the script’s folder and below

               Invoke-Pester $PSScriptRoot –Verbose

Note: I have seen some comments that if you edit the launch.json file you need to reload VS Code for the new value to be read, but this has not been my experience.
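For reference, the complete launch.json ends up looking something like the sketch below. The values beyond program (name, type, request) follow what the cog generated for me at the time of writing, so check what your version of the PowerShell extension generates rather than copying this verbatim:

```json
{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "PowerShell",
            "type": "PowerShell",
            "request": "launch",
            "program": "${workspaceRoot}/Extensions/Tests/runtests.ps1",
            "args": [],
            "cwd": "${workspaceRoot}"
        }
    ]
}
```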

So now when I press F5 my Pester tests run and I can debug into them as I want, but that raises some new issues due to the requirements of VSTS build tasks.

Changes to my build task to enable testing

A VSTS build task is basically a PowerShell script that takes some parameters. The problem is that I needed to dot-source the .PS1 script to allow the Pester tests to execute functions defined in the script file. This is done using the form

 

# Load the script under test
. "$PSScriptRoot\..\..\..\versioning\versiondacpactask\Update-DacPacVersionNumber.ps1"

Problem 1: If any of the parameters for the script are mandatory, this dot-sourcing fails with errors over missing values. The fix is to make sure that any mandatory parameters are passed, or to make them non-mandatory – I chose the latter, as I can mark any task parameter as ‘required’ in the task.json file instead.
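As a sketch, the top of the task script then ends up with a param block along these lines (the parameter names match the guard test shown below; nothing is marked mandatory and the defaults are empty strings, with ‘required’ enforced in task.json instead):

```powershell
# Illustrative param block for a task script: no [Parameter(Mandatory=$true)]
# attributes, so the script can be dot-sourced by Pester without prompting
param
(
    [string]$VersionNumber = "",
    [string]$path = ""
)
```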

Problem 2: When you dot-source the script it is executed – not what I wanted at all. I had to put a guard test at the top of the script to exit if the required parameters were not at least reasonable – I can’t think of a neater solution.

# check if we are in test mode i.e. no runtime parameters were passed
If ($VersionNumber -eq "" -and $path -eq "") {Exit}
# the rest of my code …..

Once these changes were made I was able to run the Pester tests with an F5 as I wanted, using mocks to help test program flow logic.

 

# Load the script under test
. "$PSScriptRoot\..\..\..\versioning\versiondacpactask\Update-DacPacVersionNumber.ps1"

Describe "Use SQL2012 ToolPath settings" {
    Mock Test-Path  {return $false} -ParameterFilter {
            $Path -eq "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\Microsoft.SqlServer.Dac.Extensions.dll"
        }
    Mock Test-Path  {return $true} -ParameterFilter {
            $Path -eq "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\Microsoft.SqlServer.Dac.Extensions.dll"
        }    
 
    It "Find DLLs" {
        $path = Get-Toolpath -ToolPath ""
        $path | Should be "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120"
    }
}

Summary

So I think I now have a workable solution: a good IDE with a reasonable F5 debug experience. OK, the PowerShell console in VS Code is not as rich as that in the PowerShell ISE, but I can live with that given the quality of the rest of the debug tools.

Updated Reprint - Migrating a TFS TFVC team project to a Git team project

This is a copy of the guest post done on the Microsoft UK web site published on the 7th June 2016

This is a revised version of a post originally published in August 2014. In this revision I have updated version numbers and links for tools used and added a discussion of adapting the process to support VSTS.

The code for this post can be found in my GitHub Repo


In the past I've written on the theory behind migrating TFVC to Git with history. I've since used this process for real, as opposed to as a proof of concept, and this post documents my experiences. The requirement was to move an on-premises TFS 2013.2 Scrum Team Project using TFVC to another on premises TFS 2013.2 Scrum Team Project, but this time using Git.

This process is equally applicable to any version of TFS that supports Git, and to VSTS.

Create new team project

On the target server create a new team project using the same (or as close as possible) process template as was used on the source TFS server. As we were using the same non-customised process template for both the source and the target we did not have to worry over any work item customisation. However, if you were changing the process template, this is where you would do any customisation required.

Remember that if you are targeting VSTS your customisation options are limited. You can add custom fields to VSTS as of the time of writing (May 2016), but that is all.

Adding a field to all Work Item Types

We need to be able to associate the old work item ID with the new migrated one. For on-premises TFS servers, the TFS Integration Platform has a feature to do this automatically, but it suffers from a bug: it is meant to add a field for this purpose automatically, but the field actually needs to be added manually prior to the migration.

To do this edit we need to either:

  1. Edit the process templates in place using the Process Template Editor Power Tool
  2. Export the work item types with WITADMIN.exe, edit them in Notepad, and re-import them

In either case the field to add to ALL WORK ITEM TYPES is as follows:

<FIELD refname="TfsMigrationTool.ReflectedWorkItemId" name="ReflectedWorkItemId" type="String" />

Once the edit is made the revised work item types need to be re-imported back into the new Team project.
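If you take the WITADMIN route, the export/edit/import cycle looks something like this (the collection URL, project and work item type names are examples; repeat for every work item type):

```
witadmin exportwitd /collection:http://tfsserver:8080/tfs/DefaultCollection /p:MyProject /n:Bug /f:Bug.xml
rem ... edit Bug.xml in Notepad to add the FIELD element shown above ...
witadmin importwitd /collection:http://tfsserver:8080/tfs/DefaultCollection /p:MyProject /f:Bug.xml
```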

If you are using VSTS this way of adding the field is not an option, but we can add custom fields to a work item type in VSTS. If we do this you will need to use the TFS Integration Mapper tool (mentioned below) to make sure the old work item ID ends up in your custom field. TFS Integration Platform will not do this by default, but I have documented this process in an associated post.

The Work Item Migration

The actual work item migration is done using the TFS Integration Platform. This tool says it only supports TFS 2012, but it will function with newer versions of TFS as well as VSTS. This will move over all work item types from the source team project to the target team project. The process is as follows:

  1. Install TFS Integration Platform.
  2. Load TFS Integration Platform, as it seems it must be loaded after the team project is created, else it gets confused!
  3. Select 'Create New'.
  4. Pick the 'Team Foundation Server\WorkItemTracking' template. As we are migrating with the same process template this is OK. If you need to change field mappings use the template for field matching and look at the TFS Integration Mapper tool.
  5. Provide a sensible name for the migration. Not really needed for a one-off migration, but if testing, it's easy to end up with many test runs all of the same name, which is confusing in the logs.
  6. Pick the source server and team project as the left server.
  7. Pick the target server and team project as the right server.
  8. Accept the defaults and save to database.
  9. On the left menu select Start. The UI of this tool is not great. Avoid looking at the Output tab, as this seems to slow the process. Also, altering the refresh time in the options to once a minute seems to help performance. All details of actions are placed in log files, so nothing is lost by these changes.
  10. The migration should complete without any issues, assuming there are no outstanding template issues that need to be resolved.

Article image

Add the New ID to the Changesets on the source server

The key to this migration process is retaining the links between the work items and source code check-ins. This is done using the technique I outlined in the previous post, i.e. editing the comment field of each changeset on the source team project prior to migrating the source, adding #123-style references that point to the new work items on the target server.

To do this I used some PowerShell. This PowerShell was written before the new TFS REST API was available, hence uses the older C# API. If I was writing it now I would have used the REST API.

function Update-TfsCommentWithMigratedId
{

<# 
.SYNOPSIS 
This function is used as part of the migration for TFVC to Git to help retain checkin associations to work items 
 
.DESCRIPTION 
This function takes two team project references and looks up changeset associations in the source team project, it then looks for 
the revised work item ID in the new team project and updates the source changeset 
 
.PARAMETER SourceCollectionUri 
Source TFS Collection URI 
 
.PARAMETER TargetCollectionUri 
Target TFS Collection URI 
 
.PARAMETER SourceTeamProject 
Source Team Project Name 
 
.EXAMPLE 
 
Update-TfsCommentWithMigratedId -SourceCollectionUri "http://server1:8080/tfs/defaultcollection" -TargetCollectionUri "http://server2:8080/tfs/defaultcollection" -SourceTeamProject "Scrumproject" 
 
#> 
 
    Param 
    ( 
    [Parameter(Mandatory=$true)] 
    [uri] $SourceCollectionUri,  
 
    [Parameter(Mandatory=$true)] 
    [uri] $TargetCollectionUri, 
 
    [Parameter(Mandatory=$true)] 
    [string] $SourceTeamProject 
 
    ) 
 
    # get the source TPC 
    $sourceTeamProjectCollection = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection($sourceCollectionUri) 
    # get the TFVC repository 
    $vcService = $sourceTeamProjectCollection.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer]) 
    # get the target TPC 
    $targetTeamProjectCollection = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection($targetCollectionUri) 
    #Get the work item store 
    $wiService = $targetTeamProjectCollection.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore]) 
 
    # Find all the changesets for the selected team project on the source server 
    foreach ($cs in $vcService.QueryHistory("$/$SourceTeamProject", [Microsoft.TeamFoundation.VersionControl.Client.RecursionType]::Full, [Int32]::MaxValue)) 
    { 
        if ($cs.WorkItems.Count -gt 0) 
        { 
            foreach ($wi in $cs.WorkItems) 
            { 
                "Changeset {0} linked to workitem {1}" -f $cs.ChangesetId, $wi.Id 
                # find new id for each changeset on the target server 
                foreach ($newwi in $wiService.Query("select id  FROM WorkItems WHERE [TfsMigrationTool.ReflectedWorkItemId] = '" + $wi.id + "'")) 
                { 
                    # if ID found update the source server if the tag has not already been added 
                    # we have to esc the [ as gets treated as a regular expression 
                    # we need the white space around between the [] else the TFS agent does not find the tags  
                    if ($cs.Comment -match ("\[ Migrated ID #{0} \]" -f $newwi.Id)) 
                    { 
                        Write-Output ("New Id {0} already associated with changeset {1}" -f $newwi.Id , $cs.ChangesetId) 
                    } else { 
                        Write-Output ("New Id {0} being associated with changeset {1}" -f $newwi.Id, $cs.ChangesetId ) 
                        $cs.Comment += "[ Migrated ID #{0} ]" -f $newwi.Id 
                    } 
                } 
            } 
            $cs.Update() 
        } 
    } 
}
     

With the usage:

Update-TfsCommentWithMigratedId -SourceCollectionUri "http://localhost:8080/tfs/defaultcollection" -TargetCollectionUri "http://localhost:8080/tfs/defaultcollection" -SourceTeamProject "Old team project"  

NOTE: This script is written so that it can be run multiple times, but only adds the migration entries once for any given changeset. This means both it and TFS Integration Platform can be run repeatedly on the same migration to do a staged migration e.g. get the bulk of the content over first whilst the team is using the old team project, then do a smaller migration of the later changes when the actual swap over happens.

When this script is run expect to see output similar to:

Article image

You can see the impact of the script in Visual Studio Team Explorer or the TFS web client when looking at changesets in the old team project. Expect to see a changeset comment in the form shown below with new [ Migrated ID #123 ] blocks in the comment field, with 123 being the work item ID on the new team project. Also note the changeset is still associated with the old work item ID on the source server.

Article image

NOTE: The space after the #123 is vital. If it is not there, then the TFS job agent cannot find the tag to associate the commit to a work item after the migration.

Source code migration

The source code can now be migrated. This is done by cloning the TFVC code to a local Git repo and then pushing it up to the new TFS Git repo using Git TF. We clone the source to a local repo in the folder localrepo, with the --deep option used to retain history.

git tf clone http://typhoontfs:8080/tfs/defaultcollection '$/Scrum TFVC Source/Main' localrepo --deep

NOTE: I have seen problems with this command. On larger code bases we saw the error 'TF 400732 server cancelled error' as files were said to be missing or we had no permission – neither of which was true. This problem was repeated on a number of machines, including one that had in the past managed to do the clone. It was thought the issue was with server connectivity, but no errors were logged.

As a workaround the Git-TFS tool was used. This community tool uses the .NET TFS API, unlike the Microsoft one, which uses the Java TFS API. Unfortunately, it also gave TF400732 errors, but it did provide a suggested command line to continue, which resumed from where it had errored.

The command to do the clone was:

git tfs clone http://typhoontfs:8080/tfs/defaultcollection '$/Scrum TFVC Source/Main' localrepo

The command to continue after an error was (from within the repo folder):

git tfs fetch

It should be noted that Git-TFS seems a good deal faster than Git TF, presumably due to being a native .NET client as opposed to using the Java VM. Also, Git-TFS has support for converting TFVC branches to Git branches, something Git TF is not able to do. So for some people, Git-TFS will be a better tool to use.

Once the clone is complete, we need to add the TFS Git repo as a remote target and then push the changes up to the new team project. The exact commands for this stage are shown on the target TFS server. Load the web client, go to the code section and you should see the commands needed:

git remote add origin http://typhoontfs:8080/tfs/DefaultCollection/_git/newproject 
git push -u origin --all  

Once this stage is complete the new TFS Git repo can be used. The Git commits should have the correct historic date and work item associations as shown below. Note now that the migration ID comments match the work item associations.

Article image

NOTE: There may be a lag in the associations being shown immediately after the git push. This is because the associations are made by a background TFS job process, which may take a while to catch up when there are a lot of commits. On one system I worked on this took days, not hours! Be patient.

Shared Test Steps

At this point all work items have been moved over and their various associations with source commits are retained e.g. PBIs link to test cases and tasks. However, there is a problem that any test cases that have shared steps will be pointing to the old shared step work items. As there is already an open source tool to do this update, there was no immediate need to rewrite it as a PowerShell tool. So to use the open source tool use the command line: 

UpdateSharedStep.exe http://localhost:8080/tfs/defaultcollection myproject

Test Plans and Suites

Historically in TFS, test plans and suites were not work items; they only became work items in TFS 2013.3. This means that if you need these moved over too, you have to use the TFS API.

Though these scripts were written for TFS 2013.2, there is no reason for these same API calls not to work with newer versions of TFS or VSTS. Just remember to exclude the Test Plan and Test Suite work items from the migration performed by TFS Integration Platform so you don't move them twice.

This script moves the three test suite types as follows:

  1. Static - Creates a new suite, finds the migrated IDs of the test cases on the source suite and adds them to the new suite.
  2. Dynamic - Creates a new suite using the existing work item query. IMPORTANT - The query is NOT edited, so it may or may not work depending on what it actually contained. These suites will need to be checked by a tester manually in all cases and their queries 'tweaked'.
  3. Requirements - Creates a new suite based on the migrated IDs of the requirement work items. This is the only test suite type where we edit the name, to make it consistent with the new requirement ID rather than the old one.

The script is as follows: 

function Update-TestPlanAfterMigration
{
<# 
.SYNOPSIS 
This function migrates a test plan and all its child test suites to a different team project 
 
.DESCRIPTION 
This function migrates a test plan and all its child test suites to a different team project, reassigning work item IDs as required 
 
.PARAMETER SourceCollectionUri 
Source TFS Collection URI 
 
.PARAMETER SourceTeamProjectName 
Source Team Project Name 
 
.PARAMETER TargetCollectionUri 
Target TFS Collection URI 
 
.PARAMETER TargetTeamProjectName 
Target Team Project Name 
 
 
.EXAMPLE 
 
Update-TestPlanAfterMigration -SourceCollectionUri "http://server1:8080/tfs/defaultcollection" -TargetCollectionUri "http://server2:8080/tfs/defaultcollection" -SourceTeamProjectName "Old project" -TargetTeamProjectName "New project" 
 
#> 
    param( 
    [Parameter(Mandatory=$true)] 
    [uri] $SourceCollectionUri, 
 
    [Parameter(Mandatory=$true)] 
    [string] $SourceTeamProjectName, 
 
    [Parameter(Mandatory=$true)] 
    [uri] $TargetCollectionUri, 
 
    [Parameter(Mandatory=$true)] 
    [string] $TargetTeamProjectName 
 
    ) 
 
    # Get TFS connections 
    $sourcetfs = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($SourceCollectionUri) 
    try 
    { 
        $Sourcetfs.EnsureAuthenticated() 
    } 
    catch 
    { 
        Write-Error "Error occurred trying to connect to project collection: $_ " 
        exit 1 
    } 
    $targettfs = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($TargetCollectionUri) 
    try 
    { 
        $Targettfs.EnsureAuthenticated() 
    } 
    catch 
    { 
        Write-Error "Error occurred trying to connect to project collection: $_ " 
        exit 1 
    } 
 
    # get the actual services 
    $sourcetestService = $sourcetfs.GetService("Microsoft.TeamFoundation.TestManagement.Client.ITestManagementService") 
    $targettestService = $targettfs.GetService("Microsoft.TeamFoundation.TestManagement.Client.ITestManagementService") 
    $sourceteamproject = $sourcetestService.GetTeamProject($sourceteamprojectname) 
    $targetteamproject = $targettestService.GetTeamProject($targetteamprojectname) 
    # Get the work item store 
    $wiService = $targettfs.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore]) 
 
 
    # find all the plans in the source 
     foreach ($plan in $sourceteamproject.TestPlans.Query("Select * From TestPlan")) 
     { 
         if ($plan.RootSuite -ne $null -and $plan.RootSuite.Entries.Count -gt 0) 
         { 
            # copy the plan to the new tp 
            Write-Host("Migrating Test Plan - {0}" -f $plan.Name)  
            $newplan = $targetteamproject.TestPlans.Create(); 
            $newplan.Name = $plan.Name 
            $newplan.AreaPath = $plan.AreaPath 
            $newplan.Description = $plan.Description 
            $newplan.EndDate = $plan.EndDate 
            $newplan.StartDate = $plan.StartDate 
            $newplan.State = $plan.State 
            $newplan.Save(); 
            # we use a function as it can be recursive 
            MoveTestSuite -sourceSuite $plan.RootSuite -targetSuite $newplan.RootSuite -targetProject $targetteamproject -targetPlan $newplan -wiService $wiService 
            # and have to save the test plan again to persist the suites 
            $newplan.Save(); 
 
         } 
     } 
}

# The '-' is missing from the name, so this helper function is not exposed when the module is loaded 
function MoveTestSuite 
{
<# 
.SYNOPSIS 
This function migrates a test suite and all its child test suites to a different team project 
 
.DESCRIPTION 
This function migrates a test suite and all its child test suites to a different team project. It is a helper for Update-TestPlanAfterMigration and will probably not be called directly from the command line 
 
.PARAMETER SourceSuite 
Source TFS test suite 
 
.PARAMETER TargetSuite 
Target TFS test suite 
 
.PARAMETER TargetPlan 
The new test plan the tests suite are being created in 
 
.PARAMETER targetProject 
The new team project test suite are being created in 
 
.PARAMETER WiService 
Work item service instance used for lookup 
 
 
.EXAMPLE 
 
MoveTestSuite -sourceSuite $plan.RootSuite -targetSuite $newplan.RootSuite -targetProject $targetteamproject -targetPlan $newplan -wiService $wiService 
 
#> 
    param  
    ( 
        [Parameter(Mandatory=$true)] 
        $sourceSuite, 
 
        [Parameter(Mandatory=$true)] 
        $targetSuite, 
 
        [Parameter(Mandatory=$true)] 
        $targetProject, 
 
        [Parameter(Mandatory=$true)] 
        $targetplan, 
 
        [Parameter(Mandatory=$true)] 
        $wiService 
    ) 
 
    foreach ($suite_entry in $sourceSuite.Entries) 
    { 
       # get the suite to a local variable to make it easier to pass around 
       $suite = $suite_entry.TestSuite 
       if ($suite -ne $null) 
       { 
           # we have to build a suite of the correct type 
           if ($suite.IsStaticTestSuite -eq $true) 
           { 
                Write-Host("    Migrating static test suite - {0}" -f $suite.Title)       
                $newsuite = $targetProject.TestSuites.CreateStatic() 
                $newsuite.Title = $suite.Title 
                $newsuite.Description = $suite.Description  
                $newsuite.State = $suite.State  
                # need to add the suite to the plan else you cannot add test cases 
                $targetSuite.Entries.Add($newSuite) > $null # sent to null as the call returns output 
                foreach ($test in $suite.TestCases) 
                { 
                    $migratedTestCaseIds = $targetProject.TestCases.Query("Select * from [WorkItems] where [TfsMigrationTool.ReflectedWorkItemId] = '{0}'" -f $Test.Id) 
                    # we assume we only get one match 
                    if ($migratedTestCaseIds[0] -ne $null) 
                    { 
                        Write-Host ("        Test {0} has been migrated to {1} and added to suite {2}" -f $Test.Id , $migratedTestCaseIds[0].Id, $newsuite.Title) 
                        $newsuite.Entries.Add($targetProject.TestCases.Find($migratedTestCaseIds[0].Id)) > $null # sent to null as the call returns output 
                    } 
                } 
           } 
 
    
           if ($suite.IsDynamicTestSuite -eq $true) 
           { 
               Write-Host("    Migrating query based test suite - {0} (Note - query may need editing)" -f $suite.Title)       
               $newsuite = $targetProject.TestSuites.CreateDynamic() 
               $newsuite.Title = $suite.Title 
               $newsuite.Description = $suite.Description  
               $newsuite.State = $suite.State  
               $newsuite.Query = $suite.Query 
 
               $targetSuite.Entries.Add($newSuite) > $null # sent to null as the call returns output 
               # we don't need to add tests as this is done dynamically 
   
           } 
 
           if ($suite.IsRequirementTestSuite -eq $true) 
           { 
               $newwis = $wiService.Query("select *  FROM WorkItems WHERE [TfsMigrationTool.ReflectedWorkItemId] = '{0}'" -f $suite.RequirementId)   
               if ($newwis[0] -ne $null) 
               { 
                    Write-Host("    Migrating requirement based test suite - {0} to new requirement ID {1}" -f $suite.Title, $newwis[0].Id )     
        
                    $newsuite = $targetProject.TestSuites.CreateRequirement($newwis[0]) 
                    $newsuite.Title = $suite.Title -replace $suite.RequirementId, $newwis[0].Id 
                    $newsuite.Description = $suite.Description  
                    $newsuite.State = $suite.State  
                    $targetSuite.Entries.Add($newSuite) > $null # sent to null as the call returns output 
                    # we don't need to add tests as this is done dynamically 
               } 
           } 
   
           # look for child test cases 
           if ($suite.Entries.Count -gt 0) 
           { 
                 MoveTestSuite -sourceSuite $suite -targetSuite $newsuite -targetProject $targetProject -targetPlan $targetplan -wiService $wiService 
           } 
        } 
    } 
}
     

NOTE: This script needs PowerShell 3.0 installed. This appears to be because some of the TFS assemblies are .NET 4.5, which is not supported by earlier PowerShell versions. If the version is wrong, the test suite migration will fail as the TestPlan (ITestPlanHelper) object will be null.
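A simple guard at the top of the script makes this failure mode explicit rather than leaving you with a puzzling null object (a sketch using the standard $PSVersionTable check):

```powershell
# Fail fast on PowerShell 2.0 or earlier, where the .NET 4.5
# TFS assemblies cannot be loaded
if ($PSVersionTable.PSVersion.Major -lt 3)
{
    Write-Error "This script requires PowerShell 3.0 or later"
    exit 1
}
```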

The command to run the migration of test plans is:

Update-TestPlanAfterMigration -SourceCollectionUri "http://typhoontfs:8080/tfs/defaultcollection" -TargetCollectionUri "http://typhoontfs:8080/tfs/defaultcollection" -SourceTeamProjectName "Scrum TFVC Source" -TargetTeamProjectName "NewProject"  

This will create the new set of test plans and suites in addition to any already in place on the target server. It should give an output similar to:

Article image

Summary

Once all this is done you should have migrated a TFVC team project to a new team project based on Git on either on-premises TFS or VSTS, retaining as much history as is possible. I hope you find this of use!

This article was first published on Microsoft’s UK Developers site as ‘Migrating a TFS TFVC based team project to a Git team project - a practical example’, originally published on 15 August 2014 and updated on 7 June 2016.


    

Running Test Suites within a network Isolated Lab Management environment when using TFS vNext build and release tooling

Background

As I have posted many times we make use of TFS Lab Management to provide network isolated dev/test environments. Going forward I see us moving to Azure Dev Labs and/or Azure Stack with ARM templates, but that isn’t going to help me today, especially when I have already made the investment in setting up a Lab Management environments and they are ready to use.

One change we are making now is a move from the old TFS Release Management (2013 generation) to the new VSTS and TFS 2015.2 vNext release tools. This means I need to be able to trigger automated tests on VMs within Lab Management network isolated environments from a step inside my new build/release process. I have posted on how to do this with the older generation Release Management tools; it turns out it is in some ways a little simpler with the newer tooling – no need to fiddle with shadow accounts and the like.

My Setup

image

Constraints

The constraints are these

  • I need to be able to trigger tests on the Client VM in the network isolated lab environment. These tests are all defined in automated test suites within Microsoft Test Manager.
  • The network isolated lab already has a TFS Test Agent deployed on all the VMs in the environment linked back to the TFS Test Controller on my corporate domain, these agents are automatically installed and managed, and are handling the ‘magic’ for the network isolation – we can’t fiddle with these without breaking the Labs 
  • The new build/release tools assume that you will auto-deploy a 2015 generation Test Agent via a build task as part of the build/release process. This is a fresh test agent install, so it removes any already installed Test Agent – we don’t want this, as it breaks the existing agent/network isolation.
  • So my only options to trigger the tests by using TCM (as we did in the past) from some machine in the system. In the past (with the old tools) this had to be within the isolated network environment due to the limitation put in place by the use of shadow accounts.  
  • However, TCM (as shipped with VS 2015) does not ‘understand’ vNext builds, so it can’t seem to find them by definition name/number – we have to find builds by their drop location, and I think this needs to be a UNC share, not a drop back onto the TFS server. So using TCM.EXE (and any wrapper scripts) probably is not going to deliver what I want i.e. the test run associated with a vNext build and/or release.
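To make the limitation in that last constraint concrete, the TCM approach would mean an invocation along these lines, where the build can only be identified by its drop share via /builddir – the IDs and paths here are purely illustrative:

```
tcm run /create /title:"Smoke Tests" /planid:123 /suiteid:456 /configid:7 /settingsname:"Test Settings" /collection:http://tfsserver.domain.com:8080/tfs/defaultcollection /teamproject:"My Project" /builddir:\\server\drops\MyBuild
```

Note the absence of any way to reference a vNext build definition by name or a release, which is exactly the gap the script below fills.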

My Solution

The solution I adopted was to write a PowerShell script that performs the same function as the TCMEXEC.PS1 script that used to be run within the network isolated Lab Environment by the older Release Management products.

The difference is the old script shelled out to run TCM.EXE; my new version makes calls to the new TFS REST API (and unfortunately also to the older C# API, as some features, notably those for Lab Management services, are not exposed via REST). This script can be run from anywhere. I chose to run it on the TFS vNext build agent, as this is easiest and that machine already had Visual Studio installed, so the TFS C# API was available.

You can find this script on my VSTSPowerShell GitHub Repo.

The usage of the script is

TCMReplacement.ps1
    -Collectionuri http://tfsserver.domain.com:8080/tfs/defaultcollection/
    -Teamproject "My Project"
    -testplanname "My test plan"
    -testsuitename "Automated tests"
    -configurationname "Windows 8"
    -buildid 12345
    -environmentName "Lab V.2.0"
    -testsettingsname "Test Setting"
    -testrunname "Smoke Tests"
    -testcontroller "mytestcontroller.domain.com"
    -releaseUri "vstfs:///ReleaseManagement/Release/167"
    -releaseenvironmenturi "vstfs:///ReleaseManagement/Environment/247"

Note

  • The last two parameters are optional; all the others are required. If the last two are not used the test results will not be associated with a release.
  • There is also a pollinginterval parameter which defaults to 10 seconds. The script starts a test run then polls on this interval to see if it has completed.
  • If there are any failed tests then the script calls Write-Error, so the TFS build process sees this as a failed step.
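The start-then-poll behaviour described above can be sketched as follows. This is an illustrative outline of the pattern rather than an extract from the actual script; the function and parameter names are my own:

```powershell
# Poll a state-returning script block until the test run completes.
# In the real script the state comes from the test run REST endpoint,
# and a non-zero failed test count then triggers a Write-Error call.
function Wait-TestRun {
    param (
        [scriptblock]$GetState,      # returns e.g. 'InProgress' or 'Completed'
        [int]$PollingInterval = 10   # seconds between polls, matching the script default
    )
    while ((& $GetState) -ne 'Completed') {
        Start-Sleep -Seconds $PollingInterval
    }
}
```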

In some ways I think this script is an improvement over the TCMEXEC script. The old one needed you to know the IDs for many of the settings (loads of poking around in Microsoft Test Manager to find them); I allow the common names of settings to be passed in, which I then use to look up the required values via the APIs (this is where I needed to use the older C# API, as I could not find a way to get the Configuration ID, Environment ID or Test Settings ID via REST).
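As a sketch of the name-to-ID lookup, a test plan ID can be found via the REST API something like this. The endpoint shape follows the public TFS 2015 test management REST documentation, but treat the details as illustrative rather than an extract from my script:

```powershell
# Look up a test plan ID by its display name using the TFS REST API.
function Get-TestPlanId {
    param (
        $collectionUri,   # e.g. http://tfsserver.domain.com:8080/tfs/defaultcollection
        $teamProject,
        $testPlanName
    )
    $uri = "$collectionUri/$teamProject/_apis/test/plans?api-version=1.0"
    # the endpoint returns an object with a 'value' array of plans,
    # each entry carrying id and name properties
    $response = Invoke-RestMethod -Uri $uri -UseDefaultCredentials
    ($response.value | Where-Object { $_.name -eq $testPlanName }).id
}
```

The same lookup-by-name pattern applies to suites and configurations, which is what saves all the poking around in MTM.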

There is nothing stopping you running this script from the command line, but I think it is more likely to be made part of a release pipeline using the ‘PowerShell on local machine’ task in the build system. When used this way you can get many of the parameters from environment variables, so the command arguments become something like the following (and of course you can make all the string values build variables too if you want)

 

    -Collectionuri $(SYSTEM.TEAMFOUNDATIONCOLLECTIONURI)
    -Teamproject $(SYSTEM.TEAMPROJECT)
    -testplanname "My test plan"
    -testsuitename "Automated tests"
    -configurationname "Windows 8"
    -buildid $(BUILD.BUILDID)
    -environmentName "Lab V.2.0"
    -testsettingsname "Test Settings"
    -testrunname "Smoke Tests"
    -testcontroller "mytestcontroller.domain.com"
    -releaseUri $(RELEASE.RELEASEURI)
    -releaseenvironmenturi $(RELEASE.ENVIRONMENTURI)

 

Obviously this script is potentially a good candidate for a TFS build/release task, but as per my usual practice I will make sure I am happy with its operation before wrapping it up into an extension.

Known Issues

  • If you run the script from the command line targeting a completed build and release, the tests run and are shown in the release report as well as on the test tab, as we would expect.

    [screenshot: test results shown in the release report]

    However, if you trigger the test run from within a release pipeline, the tests run OK and you can see the results in the test tab (and MTM), but they are not associated with the release. My guess is that this is because the release has not completed when the data update is made. I am investigating how to address this issue.

So hopefully you will find this a useful tool if you are using network isolated environments and TFS vNext builds.

Running WebTests as part of a VSTS VNext Release pipeline

Background

Most projects will have a range of tests

  • Unit tests (maybe using a mocking framework) running inside the build process
  • Integration/UX and load tests run as part of a release pipeline
  • and finally manual tests

In a recent project we were using WebTests to provide some integration tests (in addition to integration tests written using unit testing frameworks) as a means to test a REST/ODATA API, injecting data via the API, pausing while a backend Azure WebJob processed the injected data, then checking a second API to make sure the processed data was correctly presented. Basically mimicking user operations.

In past iterations we ran these tests via TFS Lab Management’s tooling, using the Test Agent that it deploys when an environment is created.

The problem was that we are migrating to VSTS/TFS 2015.2 Release Management. This uses the new Functional Testing Task, which uses the newer Test Agent that is deployed on demand as part of the release pipeline (not pre-installed), and this agent does not support running WebTests at present.

This means my only option was to use MSTest if I wanted to continue using this form of webtest. However, there is no out of the box MSTest task for VSTS, so I needed to write a script to do the job that I could deploy as part of my build artifacts.

Now I could write a build/release task to make this nice and easy to use, but that is more work and I suspect I am not going to need this script too often in the future (I might be wrong here, only time will tell). Also I hope that Microsoft will at some point provide an out of the box task to do the job, either by providing an MSTest task or by adding webtest support to the functional test task.

This actually reflects my usual work practice for build tasks: get the script working first locally, use it as a PowerShell script in the build, and if I see enough reuse make it a task/extension.

So what did I actually need to do?

Preparation

  1. Install Visual Studio on the VM where the tests will be run. I need to do this because though MSTest was already present, it fails to run .webtest tests unless a suitable SKU of Visual Studio is installed.
  2. Set the solution configuration so that the projects containing the webtests are not built; we only need the .webtest files copied to the drops location. If you build the project the files get duplicated into the bin folder, which we don’t need, as we would then have to work out which copy to use.
  3. Make sure the solution contains a .testsettings file that switches on ‘Think Times’, and that this file is copied as a build artifact. This stalled me for ages; I could not work out why tests worked in Visual Studio and failed from the command line. Without this file there is no think time at all, so my background process never had time to run.

    [screenshot: the ‘Think Times’ option in the .testsettings editor]
  4. Write a script that finds all my .webtest files, and place the script in source control so that it is copied to the build’s drop location.
param
(
    $tool = "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\MSTest.exe",
    $path,
    $include = "*.webtest",
    $results,
    $testsettings
)

# find all the webtest files under the supplied path
$web_tests = Get-ChildItem -Path $path -Recurse -Include $include

# build an argument array, one /testcontainer entry per test file
$testArgs = @()
foreach ($item in $web_tests) {
    $testArgs += "/testcontainer:$item"
}

& $tool $testArgs "/resultsfile:$results" "/testsettings:$testsettings"
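Called locally, the script ends up being invoked something like this – the RunMSTest.ps1 name matches the release step described below, and the paths are illustrative:

```powershell
.\RunMSTest.ps1 -path "\\server\drops\MyBuild\Src\WebtestsProject" `
                -results "C:\temp\webtests.trx" `
                -testsettings "\\server\drops\MyBuild\src\webtest.testsettings"
```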

 

Build

Once the script and other settings were made, I altered the build so that the .webtests (including their associated JSON test data sub folders), the script and the .testsettings files are all copied to the drops location.

 

[screenshot: build artifact contents]

 

Release

In the release pipeline I need to call my script with suitable parameters so it finds the tests, uses the .testsettings and creates a .TRX results file. I then need to use the ‘Publish Test Results’ task to upload these MSTest format results.

[screenshot: the release pipeline tasks]

So for the PowerShell MSTest task I set the following

  • Script name is $(System.DefaultWorkingDirectory)/MyBuild\drop\Scripts\RunMSTest.ps1 
  • The argument is -path $(System.DefaultWorkingDirectory)\MyBuild\drop\Src\WebtestsProject -results $(System.DefaultWorkingDirectory)\webtests.trx -testsettings $(System.DefaultWorkingDirectory)\MyBuild\drop\src\webtest.testsettings

And for the publish test results task.

  • Format – VSTest
  • Arguments - $(System.DefaultWorkingDirectory)\webtests.trx
  • I also set this task to always run, to make sure I get test results even if some tests fail

Once all this was done and the build/release run, I got the test results I needed.

[screenshot: test results summary]

 

I can drill into my detailed test reports as needed

[screenshot: a detailed test report]

So I have a functioning release pipeline that can run all the various types of automated tests within my solution.

Building bridges - getting DevOps working through Devs and IT Pros talking and learning from each other

I was lucky enough to attend and be on a panel at yesterday’s WinOps London conference; it was a different and very interesting view on DevOps for me. I spend most of my time consulting with test and development teams. With these teams it is very rare to come across a team not using source control, and they commonly have some form of automated build too. This means any DevOps discussion usually comes from the side of ‘how can I extend my build into deployment…’.

At the conference yesterday, where there seemed to be more IT Pro attendees than developers, this ‘post build’ view was not the norm. Much of the conference content was focused around the provisioning and configuration of infrastructure, getting the environment ‘ready for deployment of a build’. What surprised me most was how repeatedly speakers stressed the importance of using source control to manage scripts and hence control the version of the environments being provisioned.

So what does this tell us?

The obvious fact to me is that the bifurcation of our industry between Devs and IT Pros means there is huge scope for swapping each group’s best practices. What seems ingrained best practice for one role is new and interesting for the other. We can all learn from each other – assuming we communicate.

This goes to the core of DevOps, that it is not a tool but a process based around collaboration.

If you want to find out more about how we see DevOps at Black Marble we are running events and are out and about at user groups. Keep an eye on the Black Marble events site or drop me an email.

Migrating work items to VSTS with custom fields using TFS Integration Platform

If you wish to migrate work items from TFS to VSTS your options are limited. You can of course just pull over work items, without history, using Excel. If you have no work item customisation then OpsHub is an option, but if you have work item customisation then you are going to have to use TFS Integration Platform. And we all know what a lovely experience that is!

Note: TFS Integration Platform will cease to be supported by Microsoft at the end of May 2016, this does not mean the tool is going away, just that there will be no support via forums.

In this post I will show how you can use TFS Integration Platform to move custom fields over to VSTS, including the original TFS work item ID, thus enabling migrations with history as detailed in my MSDN article.

TFS Integration Platform Setup

Reference Assemblies

TFS Integration Platform, being a somewhat old tool designed for TFS 2010, does not directly support TFS 2015 or VSTS. You have to select the Dev11 connection options (which is TFS 2012, by its internal code name). However, this will still cause problems as it fails to find all the assemblies it expects.

The solution to this problem is provided in this post; the key is to add dummy registry entries.

  1. Install either
  2. Add the following registry key after you have installed Team Explorer or equiv.
    Windows Registry Editor Version 5.00

    [HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\VisualStudio\11.0\InstalledProducts\Team System Tools for Developers]
    @="#101"
    "LogoID"="#100"
    "Package"="{97d9322b-672f-42ab-b3cb-ca27aaedf09d}"
    "ProductDetails"="#102"
    "UseVsProductID"=dword:00000001


MSI

Once this is done the TFS Integration Tools installation should work.

Accept the default options; you will need to select a SQL server for the tool to use as a database to store its progress. The installer will create a DB called tfs_integrationplatform on the SQL instance.

Creating a Mappings File

TFS Integration platform needs a mapping file to work out which fields go where.

  1. We assume there is a local TFS server with the source to migrate from, and a VSTS instance containing a team project using a reasonably compatible uncustomised process template
  2. Download the TFS Process Mapper and run it.
  3. You need to load into the process mapper the current work item configuration; the tool provides buttons to do this from XML files (exported with WITADMIN) or directly from the TFS/VSTS server.
  4. You should see a list of fields in both the source and target server definitions of the given work item type.
  5. Use the automap button to match the fields
  6. Any unmatched fields will be left in the left column

    [screenshot: unmatched fields in the process mapper]
  7. Some fields you may need to match manually e.g. handling name changes from ‘Area ID’ to ‘AreaID’
  8. If you have local custom fields you can add matching fields on the VSTS instance, this is done using the process on MSDN.
  9. Once you have added your custom fields I have found it best to clear the mapping tool and re-import the VSTS work item definitions. The new fields appear in the list and can be mapped manually to their old equivalents.
  10. I now exported my mappings file.
  11. The process described above is equivalent to manually editing the mapping file to add lines of the form
    <MappedField MapFromSide="Left" LeftName="BM.Custom1" RightName="BMCustom1" />

    There is a good chance one of the fields you want is the old TFS server’s work item ID. If you add a mapping as above for System.Id you would expect it to work. However, it does not; the field is left empty on the target system. I don’t think this is a bug, just an unexpected behaviour in the way the unique WI IDs are handled by the tool. As a workaround I found I had to use an aggregated field to force the System.Id to be transferred. In my process customisation on VSTS I created an integer OldId custom field. I then added the following to my mapping; it is important to note that I don’t use the MappedField line in the MappedFields block, I used an AggregatedField.
    <MappedFields>
        <!-- the auto generated mapping entries go here.
             This is where you would expect a line like the one below
             <MappedField MapFromSide="Left" LeftName="System.Id" RightName="OldID" /> -->
    </MappedFields>
    <AggregatedFields>
        <FieldsAggregationGroup MapFromSide="Left" TargetFieldName="OldID" Format="{0}">
            <SourceField Index="0" SourceFieldName="System.Id" valueMap=""/>
        </FieldsAggregationGroup>
    </AggregatedFields>
  12. I could now use my edited mappings file

Running TFS Integration Platform

I could now run the TFS Integration tools using the mappings file

  1. Load TFS Integration Platform
  2. Create a new configuration
  3. Select the option for work items with explicit mappings
  4. Select your source TFS server
  5. Select your target VSTS server
  6. Select the work item query that returns the items we wish to move
  7. Edit the mapping XML and paste in the edited block from the previous section. Note that if you are moving multiple work item types then you will be combining a number of these mapping sections
  8. Save the mapping file, you are now ready to use it in TFS Integration Platform

 

And hopefully work item migration will progress as you hope. It might take some trial and error but you should get there in the end.

But really……

This all said, I would still recommend just bringing over the active work item backlog and current source when moving to VSTS. It is easier, faster and gives you a chance to sort out structures without bringing in all your poor choices of the past.

New version of my VSTS Generate Release Notes extension - now supports Builds and Release

I am pleased to announce that I have just made public on the VSTS marketplace a new version of my VSTS Generate Release Notes extension.

This new version now supports both VSTS/TFS vNext Builds and vNext Releases. The previous versions only supported the generation of release notes as part of a build.

The adding of support for releases has meant I have had to rethink the internals of how the template is processed, as well as the way templates are passed into the task and where results are stored.

  • You can now provide a template as a file (usually from source control) as before, but also as an inline property. The latter is really designed for releases, where there is usually no access to source control, only to build artifact drops (though you could put the template in one of these if you wanted)
  • With a build, the obvious place to put the release notes file is the drops location. For a release there is no such artifact drop location, so I just leave the release notes on the release agent; it is up to the user to get this file copied to a sensible location for their release process.

To find out more, check out the documentation on my GitHub repo and have a look at my sample templates to get you started generating release notes.