But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Getting ‘… is not a valid URL’ when using Git TF Clone

I have been attempting to use the Git TF technique to migrate some content between TFS servers. I needed to move a folder structure that contains spaces in folder names from a TPC that also contains spaces in its name. So I thought my command line would be

git tf clone "http://tfsserver1:8080/tfs/My Tpc" "$/My Folder" oldrepo --deep

But this gave the error

git-tf: "http://tfsserver1:8080/tfs/My Tpc" is not a valid URL

At first I suspected it was the quotes I was using, as I had had problems here before, but swapping from ‘ to “ made no difference.

The answer was to use the URL-encoded form %20 for the space, so this version of the command worked

git tf clone http://tfsserver1:8080/tfs/My%20Tpc "$/My Folder" oldrepo --deep

Interestingly you don’t need to use %20 for the folder name.
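
If you are unsure what the encoded form of a collection URL should be, PowerShell will work it out for you; a minimal sketch, using the example URL from above:

[uri]::EscapeUriString("http://tfsserver1:8080/tfs/My Tpc")
# returns http://tfsserver1:8080/tfs/My%20Tpc - the URL structure is left alone, only the space is encoded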

Build failing post TFS 2013.3 upgrade with ‘Stack empty. (type InvalidOperationException)’

Just started seeing a build error on a build that was working until we upgraded the build agent to TFS 2013.3

Exception Message: Stack empty. (type InvalidOperationException)
Exception Stack Trace:    at Microsoft.VisualStudio.TestImpact.Analysis.LanguageSignatureParser.NotifyEndType()
   at Microsoft.VisualStudio.TestImpact.Analysis.SigParser.ParseType()
   at Microsoft.VisualStudio.TestImpact.Analysis.SigParser.ParseRetType()
   at Microsoft.VisualStudio.TestImpact.Analysis.SigParser.ParseMethod(Byte num1)
   at Microsoft.VisualStudio.TestImpact.Analysis.SigParser.Parse(Byte* blob, UInt32 len)
   at Microsoft.VisualStudio.TestImpact.Analysis.LanguageSignatureParser.ParseMethodName(MethodProps methodProps, String& typeName, String& fullName)
   at Microsoft.VisualStudio.TestImpact.Analysis.AssemblyMethodComparer.AddChangeToList(DateTime now, List`1 changes, CodeChangeReason reason, MethodInfo methodInfo, MetadataReader metadataReader, Guid assemblyIdentifier, SymbolReader symbolsReader, UInt32 sourceToken, LanguageSignatureParser& languageParser)
   at Microsoft.VisualStudio.TestImpact.Analysis.AssemblyMethodComparer.CompareAssemblies(String firstPath, String secondPath, Boolean lookupSourceFiles)
   at Microsoft.TeamFoundation.TestImpact.BuildIntegration.BuildActivities.GetImpactedTests.CompareBinary(CodeActivityContext context, String sharePath, String assembly, IList`1 codeChanges)
   at Microsoft.TeamFoundation.TestImpact.BuildIntegration.BuildActivities.GetImpactedTests.CompareBuildBinaries(CodeActivityContext context, IBuildDefinition definition, IList`1 codeChanges)
   at Microsoft.TeamFoundation.TestImpact.BuildIntegration.BuildActivities.GetImpactedTests.Execute(CodeActivityContext context)
   at System.Activities.CodeActivity.InternalExecute(ActivityInstance instance, ActivityExecutor executor, BookmarkManager bookmarkManager)
   at System.Activities.Runtime.ActivityExecutor.ExecuteActivityWorkItem.ExecuteBody(ActivityExecutor executor, BookmarkManager bookmarkManager, Location resultLocation)

I assume the issue is a DLL mismatch between what is installed as part of the build agent and something in the 2012-generation build process template in use.

The immediate fix, until I get a chance to swap the template for a newer one, was to disable Test Impact Analysis, which I was not using for this project anyway.


Once I did this my build completed OK and the tests ran OK.

Reprint: Migrating a TFS TFVC based team project to a Git team project - a practical example

This article was first published on Microsoft’s UK Developers site as Migrating a TFS TFVC based team project to a Git team project - a practical example on 15th August 2014


In the past I've written on the theory behind migrating TFVC to Git with history. I've recently done this for real, rather than just as a proof of concept, and this post documents my experiences. The requirement was to move a TFS 2013.2 Scrum Team Project using TFVC to another TFS 2013.2 Scrum Team Project using Git. The process used was as follows:

Create new team project

On the target server create a new team project using the Scrum 2013.2 process template. As we were using the same non-customised process template for both the source and the target we did not have to worry about any work item customisation. However, if you were changing process templates this is where you would do any customisation required.

Adding a field to all Work Item Types

We need to be able to associate the old work item ID with the new migrated one. The TFS Integration Platform has a feature to do this automatically, but it suffers from a bug. It is meant to automatically add a field for this purpose, but it actually needs the field to be manually added prior to the migration.

To make this edit we need to either:

  • Edit the process templates in place using the Process Template Editor Power Tool
  • Export the work item type definitions with WITADMIN.exe, edit them in Notepad and re-import them (see the witadmin sketch below)

    In either case the field to add to ALL WORK ITEM TYPES is as follows

    <FIELD refname="TfsMigrationTool.ReflectedWorkItemId" name="ReflectedWorkItemId" type="String" />

    Once the edit is made the revised work item types need to be re-imported back into the new Team project.
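
    As a sketch of the WITADMIN.exe route, the export and import for a single work item type look something like this when run from a Visual Studio command prompt (the collection URL, project and file names are placeholders, and the same pair of commands needs repeating for each work item type in the template):

    witadmin exportwitd /collection:http://tfsserver1:8080/tfs/MyTpc /p:MyProject /n:"Product Backlog Item" /f:pbi.xml
    # edit pbi.xml in Notepad to add the FIELD element shown above, then re-import it
    witadmin importwitd /collection:http://tfsserver1:8080/tfs/MyTpc /p:MyProject /f:pbi.xml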

    The Work Item Migration

    The actual work item migration is done using the TFS Integration Platform. This will move over all work item types from the source team project to the target team project.

    The process is as follows...

    1. Install TFS Integration Platform.
    2. Load TFS Integration Platform, as it seems it must be loaded after the team project is created, else it gets confused!
    3. Select 'Create New'.
    4. Pick the 'Team Foundation Server\WorkItemTracking' template. As we are migrating with the same process template this is OK. If you need to change field mappings use the template for field matching and look at the TFS Integration Mapper tool.
    5. Provide a sensible name for the migration. Not really needed for a one-off migration, but if testing, it is easy to end up with many test runs all of the same name, which is confusing in the logs.
    6. Pick the source server and team project as the left server.
    7. Pick the target server and team project as the right server.
    8. Accept the defaults and save to database.
    9. On the left menu select Start. The UI on this tool is not great. Avoid looking at the output tab as this seems to slow the process. Also altering the refresh interval in the options to once a minute seems to help performance. All details of actions are placed in log files so nothing is lost by these changes.
    10. The migration should complete without any issues, assuming there are no outstanding template issues that need to be resolved.


    Add the New ID to the Changesets on the source server

    The key to this migration process is to retain the links between the work items and source code check-ins. This is done using the technique I outlined in the previous post i.e. editing the comments field of the changesets on the source team project, prior to migrating the source, to add #123 style references that point to the new work items on the target server.

    To do this I used some PowerShell

            function Update-TfsCommentWithMigratedId
            {
    
            <#
            .SYNOPSIS
            This function is used as part of the migration for TFVC to Git to help retain checkin associations to work items
    
            .DESCRIPTION
            This function takes two team project references and looks up the changeset associations in the source team project, it then looks for 
            the revised work item ID in the new team project and updates the source changeset
    
            .PARAMETER SourceCollectionUri
            Source TFS Collection URI
    
            .PARAMETER TargetCollectionUri
            Target TFS Collection URI
    
            .PARAMETER SourceTeamProject
            Source Team Project Name
    
            .EXAMPLE
    
            Update-TfsCommentWithMigratedId -SourceCollectionUri "http://server1:8080/tfs/defaultcollection" -TargetCollectionUri "http://server2:8080/tfs/defaultcollection" -SourceTeamProject "Scrumproject"
    
            #>
    
                Param
                (
                [Parameter(Mandatory=$true)]
                [uri] $SourceCollectionUri, 
    
                [Parameter(Mandatory=$true)]
                [uri] $TargetCollectionUri,
    
                [Parameter(Mandatory=$true)]
                [string] $SourceTeamProject
    
                )
    
                # get the source TPC
                $sourceTeamProjectCollection = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection($sourceCollectionUri)
                # get the TFVC repository
                $vcService = $sourceTeamProjectCollection.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer])
                # get the target TPC
                $targetTeamProjectCollection = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection($targetCollectionUri)
                #Get the work item store
                $wiService = $targetTeamProjectCollection.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore])
        
                # Find all the changesets for the selected team project on the source server
                foreach ($cs in $vcService.QueryHistory("$/$SourceTeamProject", [Microsoft.TeamFoundation.VersionControl.Client.RecursionType]::Full, [Int32]::MaxValue))
                {
                    if ($cs.WorkItems.Count -gt 0)
                    {
                        foreach ($wi in $cs.WorkItems)
                        {
                            "Changeset {0} linked to workitem {1}" -f $cs.ChangesetId, $wi.Id
                            # find new id for each changeset on the target server
                            foreach ($newwi in $wiService.Query("select id  FROM WorkItems WHERE [TfsMigrationTool.ReflectedWorkItemId] = '" + $wi.id + "'"))
                            {
                                # if ID found update the source server if the tag has not already been added
                                # we have to esc the [ as gets treated as a regular expression
                                # we need the white space around between the [] else the TFS agent does not find the tags 
                                if ($cs.Comment -match ("\[ Migrated ID #{0} \]" -f $newwi.Id))
                                {
                                    Write-Output ("New Id {0} already associated with changeset {1}" -f $newwi.Id , $cs.ChangesetId)
                                } else {
                                    Write-Output ("New Id {0} being associated with changeset {1}" -f $newwi.Id, $cs.ChangesetId )
                                    $cs.Comment += "[ Migrated ID #{0} ]" -f $newwi.Id
                                }
                            }
                        }
                        $cs.Update()
                    }
                }
            }
    
    

    With the usage

    Update-TfsCommentWithMigratedId -SourceCollectionUri "http://localhost:8080/tfs/defaultcollection" -TargetCollectionUri "http://localhost:8080/tfs/defaultcollection" -SourceTeamProject "Old team project"
    
    

         

    NOTE: This script is written so that it can be run multiple times, but only adds the migration entries once for any given changeset. This means both it and TFS Integration Platform can be run repeatedly on the same migration to do a staged migration e.g. get the bulk of the content over first whilst the team is using the old team project, then do a smaller migration of the later changes when the actual swap over happens.

    When this script is run expect to see output similar to:

    [Screenshot of the script's console output]

    You can see the impact of the script in Visual Studio Team Explorer or the TFS web client when looking at changesets in the old team project. Expect to see a changeset comment in the form shown below, with new [ Migrated ID #123 ] blocks in the comment field, where 123 is the work item ID on the new team project. Also note the changeset is still associated with the old work item ID on the source server.

    [Screenshot of a changeset comment showing the added Migrated ID block]

    NOTE: The space after the #123 is vital. If it is not there then the TFS job agent cannot find the tag to associate the commit to a work item after the migration.

    Source code migration

    The source code can now be migrated. This is done by cloning the TFVC code to a local Git repo and then pushing it up to the new TFS Git repo using Git TF. We clone the source to a local repo in the folder localrepo; the --deep option is used to retain history.

    git tf clone http://typhoontfs:8080/tfs/defaultcollection '$/Scrum TFVC Source/Main' localrepo --deep
    

    NOTE: I have seen problems with this command. On larger code bases we saw the error 'TF 400732 server cancelled error' as files were said to be missing or we had no permission - neither of which was true. This problem was repeated on a number of machines, including one that had in the past managed to do the clone. It was thought the issue was on the server connectivity, but no errors were logged.

    As a workaround the Git-TFS tool was used. This older community tool uses the .NET TFS API, unlike the Microsoft one which uses the Java TFS API. It was hoped this would not suffer the same issue. However it also gave TF400732 errors, but did provide a suggested command line to continue, which picked up from where it had errored.

    The command to do the clone was:

    git tfs clone http://typhoontfs:8080/tfs/defaultcollection "$/Scrum TFVC Source/main" e:\repo1
    

    The command to continue after an error was (from within the repo folder)

    git tfs fetch
    

    It should be noted that Git-TFS seems a good deal faster than Git TF, presumably due to being a native .NET client as opposed to using the Java VM.

    Once the clone was complete, we needed to add the TFS Git repo as a remote target and then push the changes up to the new team project. The exact commands for this stage are shown on the target TFS server: load the web client, go to the code section and you should see the commands needed e.g.

    git remote add origin http://typhoontfs:8080/tfs/DefaultCollection/_git/newproject
    git push -u origin --all      

    Once this stage is complete the new TFS Git repo can be used. The Git commits should have the correct historic date and work item associations as shown below. Note now that the migration id comments match the work item associations.

    [Screenshot of the migrated Git commits showing historic dates and work item associations]

    NOTE: There may be a lag in the associations being shown immediately after the git push. This is because the associations are done by a background TFS job process which may take a while to catch up when there are a lot of commits. On one system I worked on this took days not hours! Be patient.

    Shared Test Steps

    At this point all work items have been moved over and their various associations with source commits are retained e.g. PBIs link to test cases and tasks. However there is a problem that any test cases that have shared steps will be pointing to the old shared steps work items. As there is already an open source tool to do this update there was no immediate need to rewrite it as a PowerShell tool. So to use the open source tool use the command line:

    UpdateSharedStep.exe http://localhost:8080/tfs/defaultcollection myproject
    
    

    Test Plans and Suites

    Historically in TFS, test plans and suites are not work items (a change coming in TFS 2013.3). This means if you need these moved over too there is more PowerShell needed.

    This script moves the three test suite types as follows:

  • Static - Creates a new suite, finds the migrated IDs of the test cases on the source suite and adds them to the new suite.
  • Dynamic - Creates a new suite using the existing work item query. IMPORTANT - The query is NOT edited so may or may not work depending on what it actually contained. So these suites will need to be checked by a tester manually in all cases and their queries probably 'tweaked'.
  • Requirements - Creates a new suite based on the migrated IDs of the requirement work items. This is the only test suite type where we edit the name, to make it consistent with the new requirement ID rather than the old.

    The script is:

     

            function Update-TestPlanAfterMigration
            {
            <#
            .SYNOPSIS
            This function migrates a test plan and all its child test suites to a different team project
    
            .DESCRIPTION
            This function migrates a test plan and all its child test suites to a different team project, reassign work item IDs as required
    
            .PARAMETER SourceCollectionUri
            Source TFS Collection URI
    
            .PARAMETER SourceTeamProjectName
            Source Team Project Name
    
            .PARAMETER TargetCollectionUri
            Target TFS Collection URI
    
            .PARAMETER TargetTeamProjectName
            Target Team Project Name
    
    
            .EXAMPLE
    
            Update-TestPlanAfterMigration -SourceCollectionUri "http://server1:8080/tfs/defaultcollection" -TargetCollectionUri "http://server2:8080/tfs/defaultcollection" -SourceTeamProjectName "Old project" -TargetTeamProjectName "New project"
    
            #>
                param(
                [Parameter(Mandatory=$true)]
                [uri] $SourceCollectionUri,
    
                [Parameter(Mandatory=$true)]
                [string] $SourceTeamProjectName,
    
                [Parameter(Mandatory=$true)]
                [uri] $TargetCollectionUri,
    
                [Parameter(Mandatory=$true)]
                [string] $TargetTeamProjectName
    
                )
    
                # Get TFS connections
                $sourcetfs = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($SourceCollectionUri)
                try
                {
                    $Sourcetfs.EnsureAuthenticated()
                }
                catch
                {
                    Write-Error "Error occurred trying to connect to project collection: $_ "
                    exit 1
                }
                $targettfs = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($TargetCollectionUri)
                try
                {
                    $Targettfs.EnsureAuthenticated()
                }
                catch
                {
                    Write-Error "Error occurred trying to connect to project collection: $_ "
                    exit 1
                }
    
                # get the actual services
                $sourcetestService = $sourcetfs.GetService("Microsoft.TeamFoundation.TestManagement.Client.ITestManagementService")
                $targettestService = $targettfs.GetService("Microsoft.TeamFoundation.TestManagement.Client.ITestManagementService")
                $sourceteamproject = $sourcetestService.GetTeamProject($sourceteamprojectname)
                $targetteamproject = $targettestService.GetTeamProject($targetteamprojectname)
                # Get the work item store
                $wiService = $targettfs.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore])
       
     
                # find all the plans in the source
                 foreach ($plan in $sourceteamproject.TestPlans.Query("Select * From TestPlan"))
                 {
                     if ($plan.RootSuite -ne $null -and $plan.RootSuite.Entries.Count -gt 0)
                     {
                        # copy the plan to the new tp
                        Write-Host("Migrating Test Plan - {0}" -f $plan.Name) 
                        $newplan = $targetteamproject.TestPlans.Create();
                        $newplan.Name = $plan.Name
                        $newplan.AreaPath = $plan.AreaPath
                        $newplan.Description = $plan.Description
                        $newplan.EndDate = $plan.EndDate
                        $newplan.StartDate = $plan.StartDate
                        $newplan.State = $plan.State
                        $newplan.Save();
                        # we use a function as it can be recursive
                        MoveTestSuite -sourceSuite $plan.RootSuite -targetSuite $newplan.RootSuite -targetProject $targetteamproject -targetPlan $newplan -wiService $wiService
                        # and have to save the test plan again to persit the suites
                        $newplan.Save();
     
                     }
                 }
    
    
    
            }
    
            # the '-' is deliberately missing from the name so this helper is not exposed as a standard verb-noun cmdlet when the module is loaded
            function MoveTestSuite
            {
            <#
            .SYNOPSIS
            This function migrates a test suite and all its child test suites to a different team project
    
            .DESCRIPTION
            This function migrates a test suite and all its child test suites to a different team project; it is a helper for Update-TestPlanAfterMigration and will probably not be called directly from the command line
    
            .PARAMETER SourceSuite
            Source TFS test suite
    
            .PARAMETER TargetSuite
            Target TFS test suite
    
            .PARAMETER TargetPlan
            The new test plan the test suites are being created in
    
            .PARAMETER targetProject
            The new team project the test suites are being created in
    
            .PARAMETER WiService
            Work item service instance used for lookup
    
    
            .EXAMPLE
    
            MoveTestSuite -sourceSuite $plan.RootSuite -targetSuite $newplan.RootSuite -targetProject $targetteamproject -targetPlan $newplan -wiService $wiService
    
            #>
                param 
                (
                    [Parameter(Mandatory=$true)]
                    $sourceSuite,
    
                    [Parameter(Mandatory=$true)]
                    $targetSuite,
    
                    [Parameter(Mandatory=$true)]
                    $targetProject,
    
                    [Parameter(Mandatory=$true)]
                    $targetplan,
            
                    [Parameter(Mandatory=$true)]
                    $wiService
                )
    
                foreach ($suite_entry in $sourceSuite.Entries)
                {
                   # get the suite to a local variable to make it easier to pass around
                   $suite = $suite_entry.TestSuite
                   if ($suite -ne $null)
                   {
                       # we have to build a suite of the correct type
                       if ($suite.IsStaticTestSuite -eq $true)
                       {
                            Write-Host("    Migrating static test suite - {0}" -f $suite.Title)      
                            $newsuite = $targetProject.TestSuites.CreateStatic()
                            $newsuite.Title = $suite.Title
                            $newsuite.Description = $suite.Description 
                            $newsuite.State = $suite.State 
                            # need to add the suite to the plan else you cannot add test cases
                            $targetSuite.Entries.Add($newSuite) > $null # redirect to null as the Add call produces unwanted output
                            foreach ($test in $suite.TestCases)
                            {
                                $migratedTestCaseIds = $targetProject.TestCases.Query("Select * from [WorkItems] where [TfsMigrationTool.ReflectedWorkItemId] = '{0}'" -f $Test.Id)
                                # we assume we only get one match
                                if ($migratedTestCaseIds[0] -ne $null)
                                {
                                    Write-Host ("        Test {0} has been migrated to {1} and added to suite {2}" -f $Test.Id , $migratedTestCaseIds[0].Id, $newsuite.Title)
                                    $newsuite.Entries.Add($targetProject.TestCases.Find($migratedTestCaseIds[0].Id)) > $null # redirect to null as the Add call produces unwanted output
                                }
                            }
                       }
    
               
                       if ($suite.IsDynamicTestSuite -eq $true)
                       {
                           Write-Host("    Migrating query based test suite - {0} (Note - query may need editing)" -f $suite.Title)      
                           $newsuite = $targetProject.TestSuites.CreateDynamic()
                           $newsuite.Title = $suite.Title
                           $newsuite.Description = $suite.Description 
                           $newsuite.State = $suite.State 
                           $newsuite.Query = $suite.Query
    
                           $targetSuite.Entries.Add($newSuite) > $null # redirect to null as the Add call produces unwanted output
                           # we don't need to add tests as this is done dynamically
              
                       }
    
                       if ($suite.IsRequirementTestSuite -eq $true)
                       {
                           $newwis = $wiService.Query("select *  FROM WorkItems WHERE [TfsMigrationTool.ReflectedWorkItemId] = '{0}'" -f $suite.RequirementId)  
                           if ($newwis[0] -ne $null)
                           {
                                Write-Host("    Migrating requirement based test suite - {0} to new requirement ID {1}" -f $suite.Title, $newwis[0].Id )    
                   
                                $newsuite = $targetProject.TestSuites.CreateRequirement($newwis[0])
                                $newsuite.Title = $suite.Title -replace $suite.RequirementId, $newwis[0].Id
                                $newsuite.Description = $suite.Description 
                                $newsuite.State = $suite.State 
                                 $targetSuite.Entries.Add($newSuite) > $null # redirect to null as the Add call produces unwanted output
                                # we don't need to add tests as this is done dynamically
                           }
                       }
              
                       # look for child test cases
                       if ($suite.Entries.Count -gt 0)
                       {
                             MoveTestSuite -sourceSuite $suite -targetSuite $newsuite -targetProject $targetProject -targetPlan $targetplan -wiService $wiService
                       }
                    }
                }
             }
    
    

    NOTE: This script needs PowerShell 3.0 installed. This appears to be because some of the TFS assemblies are .NET 4.5, which is not supported by earlier PowerShell versions. If the version is wrong the test suite migration will fail as the TestPlan (ITestPlanHelper) object will be null.
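
    A quick guard at the top of the script makes this failure mode obvious; a minimal sketch:

    if ($PSVersionTable.PSVersion.Major -lt 3)
    {
        throw "PowerShell 3.0 or later is required - the TFS 2013 assemblies target .NET 4.5"
    }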

    The command run to do the migration of test plans is:

    Update-TestPlanAfterMigration -SourceCollectionUri "http://typhoontfs:8080/tfs/defaultcollection" -TargetCollectionUri "http://typhoontfs:8080/tfs/defaultcollection" -SourceTeamProjectName "Scrum TFVC Source" -TargetTeamProjectName "NewProject"
    
    

    This will create the new set of test plans and suites in addition to any already in place on the target server. It should give an output similar to:

    [Screenshot of the test plan migration output]

    Summary

    So once all this is done you should have migrated a TFVC team project to a new Git-based team project, retaining as much history as possible.

    Hope you find this of use.



    Listing all the PBIs that have no acceptance criteria

    Update 24 Aug 2014: Changed the PowerShell to use a pipe-based filter as opposed to nested foreach loops

    The TFS Scrum process template’s Product Backlog Item work item type has an acceptance criteria field. It is good practice to make sure any PBI has this field completed; however it is not always possible to enter this content when the work item is initially created i.e. before it is approved. We often find we add a PBI that is basically just a title, and add the summary and acceptance criteria as the product is planned.

    It would be really nice to have a TFS work item query that listed all the PBIs that did not have the acceptance criteria field completed. Unfortunately there is no way to check that a rich text or HTML field is empty in TFS queries. It has been requested via UserVoice, but there is no sign of it appearing in the near future.

    So we are left with the TFS API to save the day; the following PowerShell function does the job, returning a list of non-completed PBI work items that have an empty Acceptance Criteria field.

     

    # Load the one we have to find, might be more than we truly need for this single function
    # but I usually keep all these functions in a single module so share the references
    $ReferenceDllLocation = "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\ReferenceAssemblies\v2.0\"
    Add-Type -Path $ReferenceDllLocation"Microsoft.TeamFoundation.Client.dll" -ErrorAction Stop -Verbose
    Add-Type -Path $ReferenceDllLocation"Microsoft.TeamFoundation.Common.dll" -ErrorAction Stop -Verbose
    Add-Type -Path $ReferenceDllLocation"Microsoft.TeamFoundation.WorkItemTracking.Client.dll" -ErrorAction Stop -Verbose


     

    function Get-TfsPBIWIthNoAcceptanceCriteria
    {
    <#
    .SYNOPSIS
    This function gets the list of PBI work items that have no acceptance criteria

    .DESCRIPTION
    This function allows a check to be made that all PBIs have a set of acceptance criteria

    .PARAMETER CollectionUri
    TFS Collection URI

    .PARAMETER TeamProject
    Team Project Name

    .EXAMPLE

    Get-TfsPBIWIthNoAcceptanceCriteria -CollectionUri "http://server1:8080/tfs/defaultcollection" -TeamProject "My Project"

    #>
        Param
        (
        [Parameter(Mandatory=$true)]
        [uri] $CollectionUri,

        [Parameter(Mandatory=$true)]
        [string] $TeamProject
        )

        # get the source TPC
        $teamProjectCollection = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection($CollectionUri)
        try
        {
            $teamProjectCollection.EnsureAuthenticated()
        }
        catch
        {
            Write-Error "Error occurred trying to connect to project collection: $_ "
            exit 1
        }

        # get the work item store
        $wiService = $teamProjectCollection.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore])

        # find each candidate work item; we can't check the acceptance criteria state in the query itself
        $pbi = $wiService.Query("SELECT [System.Id] FROM WorkItems WHERE [System.TeamProject] = '{0}' AND [System.WorkItemType] = 'Product Backlog Item' AND [System.State] <> 'Done' ORDER BY [System.Id]" -f $teamproject)

        # using a single piped line to filter the work items
        $pbi | Where-Object { $_.Fields | Where-Object {$_.ReferenceName -eq 'Microsoft.VSTS.Common.AcceptanceCriteria' -and $_.Value -eq ""}}

        # this is equivalent to the following nested loops for those who like a more long-winded structure
        # $results = @()
        # foreach ($wi in $pbi)
        # {
        #     foreach ($field in $wi.Fields)
        #     {
        #         if ($field.ReferenceName -eq 'Microsoft.VSTS.Common.AcceptanceCriteria' -and $field.Value -eq "")
        #         {
        #             $results += $wi
        #         }
        #     }
        # }
        # $results
    }
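
    The usage is along these lines (the collection URL and team project name are placeholders), piping the returned work items to show just their IDs and titles:

    Get-TfsPBIWIthNoAcceptanceCriteria -CollectionUri "http://server1:8080/tfs/defaultcollection" -TeamProject "My Project" | Select-Object Id, Title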

    Why is my TFS report not failing when I really think it should?

    Whilst creating some custom reports for a client we hit a problem: though the reports worked on my development system and their old TFS server, they failed on their new one. The error was that Microsoft_VSTS_Scheduling_CompletedWork was an invalid column name.


    Initially I suspected the problem was a warehouse reprocessing issue, but other reports worked so it could not have been that.

    It must really be that the column was missing, and that sort of makes sense. On the new server the team was using the Scrum process template; the Microsoft_VSTS_Scheduling_CompletedWork and Microsoft_VSTS_Scheduling_OriginalEstimate fields are not included in this template, and the plan had been to add them to allow some analysis of estimate accuracy. This had been done on my development system, but not on the client's new server. Once these fields were added to the Task work item the report leapt into life.
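
    For reference, the additions to the Task work item type were along these lines. The exact attributes here are from memory of the standard MSF Agile definitions, so treat them as an assumption and check your own template; the reportable="measure" setting is what gets a field into the warehouse, where the dots in the refnames become the underscores you see in the report's column names.

    <FIELD name="Completed Work" refname="Microsoft.VSTS.Scheduling.CompletedWork" type="Double" reportable="measure" />
    <FIELD name="Original Estimate" refname="Microsoft.VSTS.Scheduling.OriginalEstimate" type="Double" reportable="measure" />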

    The question then is, why did this work on the old TFS server? The team project on the old server being used to test the reports did not have the customisation either. However, remember the OLAP cube for the TFS warehouse is shared between ALL team projects on a server, so as one of the other team projects was using the MSF Agile template the fields were present, hence the report worked.

    Remember that shared OLAP cube; it can trip you up over and over again.

    Where have my Freeview tuners gone?

    I have been a long time happy user of Windows Media Center since its XP days. My current system is a Windows 8.1 Atom-based Acer Revo with a pair of USB PCTV Nanostick T2 Freeview HD tuners. For media storage I used a USB-attached StarTech RAID disk subsystem. This has been working well for a good couple of years, sitting in a cupboard under the stairs. However, I am about to move house and all the kit is going to have to go under the TV. The Revo is virtually silent, but the RAID crate was going to be an issue. It sounds like an aircraft taking off as the disks spin up.

    A change of kit was needed….

    I decided the best option was to move to a NAS, thus allowing the potentially noisy disks to be anywhere in the house. So I purchased a Netgear ReadyNAS 104. It shows how prices have dropped over the past few years as this was about half the price of my StarTech RAID, holds well over twice as much and provides much more functionality. I wait to see if it is reliable; only time will tell!

    So I popped the NAS on the LAN and started to copy over content from the RAID crate, at the same time (and this, it seems, was the mistake) reconfiguring MCE to point at the NAS. All seemed OK, MCE reconfigured and background copies running, until I tried to watch live TV. MCE said it was trying to find a tuner; I waited. In the end I gave up and went to bed, assuming all would be OK in the morning when the media copy was finished and I could reboot the PC.

    Unfortunately it was not; after a reboot it still said it could find no tuner. If I tried to rescan for TV channels it just hung (for well over 48 hours, I left it while I went away). All the other functions of MCE seemed fine. I tried removing the USB tuners, both physically and by un-installing the drivers, but it had no effect. It seemed I had corrupted the MCE DB, something I had done before, looking back at my older posts.

    In the end I had to reset MCE as detailed on Ben Drawbaugh’s blog. Basically I deleted the contents of c:\programdata\microsoft\ehome and reran the MCE Live TV setup wizard. I was not bothered about my channel list order or series recording settings, so I did not bother with mcbackup for the backup and restore steps.
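
    For reference, the reset boiled down to little more than the following (a rough sketch; close Media Center and stop its services first, and check the path before deleting anything):

    # clear out the Media Center store so the Live TV setup wizard rebuilds it
    Remove-Item -Path "$env:ProgramData\Microsoft\eHome\*" -Recurse -Force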

    Once this was done the tuners both worked again, though the channel scan took a good hour.

    Interestingly I had assumed clearing out the ehome folder would mean I lost all my MCE settings including the media library settings, but I didn't; my MCE was still pointing at the new NAS shares, so a small win.

    One point I had not considered over the move to a NAS is that MCE cannot record TV to a network share. Previously I had written all media to the locally attached RAID crate. The solution was to let MCE save TV to the local C: drive, but use a scheduled job to run ROBOCOPY to move the files to the NAS overnight. I can't see why it shouldn't work; again only time will tell.
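
    The scheduled job should be little more than a single ROBOCOPY call; a sketch, with paths that are assumptions for illustration:

    # /MOV deletes each recording from the local disk once it has been copied,
    # /MINAGE:1 leaves anything recorded today alone in case it is still being written to
    robocopy "C:\Users\Public\Recorded TV" "\\readynas\media\Recorded TV" *.wtv /MOV /MINAGE:1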

    Update:

    Forgot to mention another advantage of moving to the NAS. Previously I had to use the Logitech media server to serve music to my old Roku 1000 unit connected to my even older HiFi; now the Roku can use the NAS directly, making the system setup far easier.

    Getting the Typemock TFS build activities to work on a TFS build agent running in interactive mode

    Windows 8 store applications need to be built on a TFS build agent running in interactive mode if you wish to run any tests. So whilst rebuilding all our build systems I decided to try to have all the agents running interactive. As we tend to run one agent per VM this was not going to be a major issue I thought.

    However, whilst testing we found that any of our builds that used the Typemock build activities failed when the build agent was running interactive, but worked perfectly when it was running as a service. The error was

     

    Exception Message: Access to the registry key 'HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\TypeMock' is denied. (type UnauthorizedAccessException)
    Exception Stack Trace:    at Microsoft.Win32.RegistryKey.Win32Error(Int32 errorCode, String str)
       at Microsoft.Win32.RegistryKey.CreateSubKeyInternal(String subkey, RegistryKeyPermissionCheck permissionCheck, Object registrySecurityObj, RegistryOptions registryOptions)
       at Microsoft.Win32.RegistryKey.CreateSubKey(String subkey, RegistryKeyPermissionCheck permissionCheck)
       at Configuration.RegistryAccess.CreateSubKey(RegistryKey reg, String subkey)
       at TypeMock.Configuration.IsolatorRegistryManager.CreateTypemockKey()
       at TypeMock.Deploy.AutoDeployTypeMock.Deploy(String rootDirectory)
       at TypeMock.CLI.Common.TypeMockRegisterInfo.Execute()
       at TypeMock.CLI.Common.TypeMockRegisterInfo..ctor()   at System.Activities.Statements.Throw.Execute(CodeActivityContext context)
       at System.Activities.CodeActivity.InternalExecute(ActivityInstance instance, ActivityExecutor executor, BookmarkManager bookmarkManager)
       at System.Activities.Runtime.ActivityExecutor.ExecuteActivityWorkItem.ExecuteBody(ActivityExecutor executor, BookmarkManager bookmarkManager, Location resultLocation)

     

    So the issue was registry access. Irrespective of whether running interactive or as a service I used the same domain service account, which was a local admin on the build agent. The only thing that changed was the mode of running.

    After some thought I focused on UAC being the problem, but disabling this did not seem to fix the issue. I was stuck, or so I thought.

    However, Robert Hancock, unknown to me, was suffering a similar problem with a TFS build that included a post-build event that was failing to xcopy a BizTalk custom functoid DLL to ‘Program Files’. He kept getting an ‘exit code 4 access denied’ error when the build agent was running interactive. It turns out the solution he found on Daniel Petri's blog also fixed my issues, as they were both UAC/desktop interaction related.

    The solution was to create a group policy for the build agent VMs that set the following (a sketch of the same settings as local registry values follows the list):

    • User Account Control: Behavior of the elevation prompt for administrators in Admin Approval Mode - Set its value to Elevate without prompting.
    • User Account Control: Detect application installations and prompt for elevation - Set its value to Disabled.
    • User Account Control: Only elevate UIAccess applications that are installed in secure locations - Set its value to Disabled.
    • User Account Control: Run all administrators in Admin Approval Mode - Set its value to Disabled.
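
    If you want to test the effect on a single agent VM before rolling out a GPO, the same settings can be made as local registry values. This mapping of policy names to registry values is my understanding of the standard UAC settings, so verify it before relying on it:

    # the four UAC policy settings above as local registry values (0 = disabled / elevate without prompting)
    $key = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System'
    Set-ItemProperty -Path $key -Name ConsentPromptBehaviorAdmin -Value 0  # elevate without prompting
    Set-ItemProperty -Path $key -Name EnableInstallerDetection -Value 0    # don't detect application installations
    Set-ItemProperty -Path $key -Name EnableSecureUIAPaths -Value 0        # don't restrict UIAccess apps to secure locations
    Set-ItemProperty -Path $key -Name EnableLUA -Value 0                   # turn off Admin Approval Mode
    # reboot the VM for the changes to take effect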

    Once this GPO was pushed out to the build agent VMs and they were rebooted, my Typemock-based builds and Robert's BizTalk builds all worked as expected.

    AddBizTalkHiddenReferences error in TFS build when installing ProjectBuildComponent via a command line setup

    I have been trying to script the installation of all the tools and SDKs we need on our TFS build agent VMs. This included BizTalk. A quick check on MSDN showed the setup command line parameter I needed to install the build components was

     

    /ADDLOCAL ProjectBuildComponent

    So I ran this via my VM setup PowerShell script; all appeared OK, but when I tried a build I got the error

     

    C:\Program Files (x86)\MSBuild\Microsoft\BizTalk\BizTalkCommon.targets (189): The "AddBizTalkHiddenReferences" task failed unexpectedly.
    System.ArgumentNullException: Value cannot be null.
    Parameter name: path1
       at System.IO.Path.Combine(String path1, String path2)
       at Microsoft.VisualStudio.BizTalkProject.Base.HiddenReferencesHelper.InitializeHiddenReferences()
       at Microsoft.VisualStudio.BizTalkProject.Base.HiddenReferencesHelper.get_HiddenReferences()
       at Microsoft.VisualStudio.BizTalkProject.Base.HiddenReferencesHelper.GetHiddenReferencesNotAdded(IList`1 projectReferences)
       at Microsoft.VisualStudio.BizTalkProject.BuildTasks.AddBizTalkHiddenReferences.Execute()
       at Microsoft.Build.BackEnd.TaskExecutionHost.Microsoft.Build.BackEnd.ITaskExecutionHost.Execute()
       at Microsoft.Build.BackEnd.TaskBuilder.<ExecuteInstantiatedTask>d__20.MoveNext()

    The strange thing was that if I ran the BizTalk installer via the UI and selected just the ‘Project Build Components’ option, my build did not give this error.

    On checking the BizTalk setup logs I saw that the UI-based install does not run

     

    /ADDLOCAL ProjectBuildComponent

    but

     

    /ADDLOCAL WMI,BizTalk,AdditionalApps,ProjectBuildComponent

    Once this change was made to my PowerShell script the TFS build worked OK.
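
    For completeness, the corrected line in the VM setup script ended up looking something like this; the setup path is an assumption for illustration, the /ADDLOCAL list is the important part:

    # run the BizTalk setup with the full component list needed for a build-only VM
    Start-Process -FilePath "E:\BizTalk\Setup.exe" -ArgumentList "/ADDLOCAL WMI,BizTalk,AdditionalApps,ProjectBuildComponent" -Wait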

    TFS 2013 wizard allows you to proceed to verification even if you have no SQL admin access

    Had an interesting issue during an upgrade from TFS 2012 to 2013.2 today. The upgrade of the files proceeded as expected and the wizard ran. It picked up the correct data tier, found the tfs_configuration DB and I was able to fill in the service account details.

    However, when I got to the reporting section it found the report server URLs, but when it tried to find the tfs_warehouse DB it seemed to lock up, though the test of the SQL instance on the same page worked OK.

    In the end I used task manager to kill the config wizard.

    I then re-ran the wizard, switching off the reporting. This time it got to the verification step, but seemed to hang again. After a very long wait it came back with an error that the account being used to do the upgrade did not have SysAdmin rights on the SQL instance.

    On checking this turned out to be true; the user’s rights had been removed since the system was originally installed by a DBA. Once the rights were re-added the upgrade proceeded perfectly; though interestingly the first page, where you confirm the tfs_configuration DB, now also had a check box about Always On, which it had not before.
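
    A quick way to check up front whether the account running the upgrade has the required rights is to ask SQL directly before starting the wizard; a minimal sketch, assuming sqlcmd is available and with a placeholder instance name:

    # returns 1 if the current account is in the sysadmin role, 0 if not
    sqlcmd -S SQLSERVER\TFSINSTANCE -Q "SELECT IS_SRVROLEMEMBER('sysadmin')"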

    So the strange thing was not that it failed, I would expect that, but that any of the wizard worked at all. I would have expected a failure to even find the tfs_configuration DB at the start of the wizard, not to have to wait until the verification (or reporting) step.

    Why is the Team Project drop down in Release Management empty?

    The problem

    Today I found I had a problem when trying to associate a Release Management 2013.2 release pipeline with a TFS build. When I tried to select a team project the drop down for the release properties was empty.


    The strange thing was this installation of Release Management had been working OK the week before. What had changed?

    I suspected an issue connecting to TFS, so in the Release Management client’s ‘Managing TFS’ tab I tried to verify the active TFS server linked to Release Management. As soon as I tried this I got the following error saying the TFS server was not available.

    [Screenshot of the verification error dialog]

    I switched the TFS URL from HTTPS to HTTP, retried the verification and it worked. Going back to my release properties I could now see the build definitions again in the drop down. So I knew I had an SSL issue.

    The strange thing was we use SSL as our default connection, and none of our developers were complaining that they could not connect via HTTPS.

    However, on checking I found there was an issue on some of our build VMs. If on those VMs I tried to connect to TFS in a browser with an HTTPS URL I got a certificate chain error.

    But stranger, on my PC, where I was running the Release Management client, I could access TFS over HTTPS from a browser and Visual Studio, but the Release Management verification failed.

    The solution

    It turns out we had an intermediate certificate issue with our TFS server. An older DigiCert intermediate certificate had expired over the weekend, and though the new cert was in place, and had been for a good few months since we renewed our wildcard cert, the active wildcard cert insisted on using the old version of the intermediate cert on some machines.

    As an immediate fix we ended up having to delete the old intermediate cert manually on the machines showing the error. Once this was done the HTTPS connection worked again.
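
    A sketch of how to spot an expired intermediate on an affected machine (in this case run on the Release Management server); the subject filter is an assumption for illustration:

    # look for expired certs in the machine's Intermediate Certification Authorities store
    Get-ChildItem Cert:\LocalMachine\CA |
        Where-Object { $_.Subject -like '*DigiCert*' -and $_.NotAfter -lt (Get-Date) } |
        Format-List Subject, Thumbprint, NotAfter
    # once identified, the offending cert can be removed by thumbprint, e.g.
    # Remove-Item Cert:\LocalMachine\CA\<thumbprint>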

    It turns out the real culprit was a group policy used to push out intermediate certs that are required to be trusted for some document automation we use. This old group policy was pushing the wrong version of the cert to some server VMs. Once this policy was updated with the correct cert and pushed out, it overwrote the problem cert and the problem went away.

    One potentially confusing thing here is that the ‘verify the TFS link’ check in Release Management verifies that the Release Management server can see the TFS server, not the PC running the Release Management client. It was on the Release Management server that I had to delete the dead cert (and run a gpupdate /force to get the new policy). Hence why I was confused by my own PC working for Visual Studio and not for Release Management.

    So I suspect the issue of the drop down being empty is always really going to mean the Release Management server cannot see the TFS server for some reason, so check certs, permissions or basic network connectivity.