But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Moving our BlogEngine.NET server to Azure

As part of our IT refresh we have decided to move this BlogEngine.NET server from a Hyper-V VM in our office to an Azure website.

BlogEngine.NET is now a gallery item for Azure websites, so a few clicks and you should be up and running.


However, if you want to use SQL as opposed to XML as the datastore you need to do a bit more work. This process is well documented in the video ‘Set BlogEngine.NET to use SQL provider in Azure’, but we found we needed to perform some extra steps due to where our DB was coming from.

Database Fixes

The main issue was that our on premises installation of BlogEngine.NET used a SQL 2012 availability group. This, amongst other things, adds some extra settings that stop the ‘Deploy Database to Azure’ feature in SQL Management Studio from working. To address these issues I did the following:

Took a SQL backup of the DB from our production server and restored it to a local SQL 2012 Standard edition instance. I then tried the Deploy Database to Azure option


But got the errors I was expecting


There were three types

Error SQL71564: Element User: [BLACKMARBLE\AUser] has an unsupported property AuthenticationType set and is not supported when used as part of a data package.
Error SQL71564: Element Column: [dbo].[be_Categories].[CategoryID] has an unsupported property IsRowGuidColumn set and is not supported when used as part of a data package.
Error SQL71564: Table Table: [dbo].[be_CustomFields] does not have a clustered index.  Clustered indexes are required for inserting data in this version of SQL Server.

The first was fixed by simply deleting the listed users in SQL Management Studio, or via a query
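The original query is not reproduced here, but a minimal sketch of the sort of statement involved, using the user name from the first error above, would be:

```sql
-- repeat for each user listed in the SQL71564 errors
DROP USER [BLACKMARBLE\AUser];
```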


The second was addressed by removing the ‘IsRowGuidColumn’ property in Management Studio


or via the query
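Again the query itself was shown as a screenshot; a sketch of the equivalent T-SQL, using the column from the second error above (repeated for each column flagged), is:

```sql
ALTER TABLE [dbo].[be_Categories]
    ALTER COLUMN [CategoryID] DROP ROWGUIDCOL;
```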


Finally I had to replace the non-clustered index with a clustered one. I got the required definition from the setup folder of our BlogEngine.NET installation, and ran the command

DROP INDEX [idx_be_CustomType_ObjectId_BlogId_Key] ON [dbo].[be_CustomFields]

CREATE CLUSTERED INDEX [idx_be_CustomType_ObjectId_BlogId_Key] ON [dbo].[be_CustomFields]
(
    [CustomType] ASC,
    [ObjectId] ASC,
    [BlogId] ASC,
    [Key] ASC
)

Once all this was done in Management Studio I could use Deploy Database to Azure, so after a minute or two I had a BlogEngine.NET DB on Azure

Azure SQL Login

The new DB did not have any user accounts associated with it, so I had to create one.

On the SQL server’s master DB I ran

CREATE LOGIN usrBlog WITH password='a_password';

And then on the new DB I ran

EXEC sp_addrolemember N'db_owner', usrBlog
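One gotcha worth noting: on Azure SQL a login usually also needs a user created in the target DB before it can be added to a role, so if the role assignment fails the sketch below (same login name as above, run against the new BlogEngine.NET DB, not master) shows the extra step:

```sql
-- map the server login to a database user, then grant the role
CREATE USER usrBlog FOR LOGIN usrBlog;
EXEC sp_addrolemember N'db_owner', N'usrBlog';
```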

Azure Website

At this point we could have created a new Azure website using the BlogEngine.NET template in the gallery. However, I chose to create an empty site as our version of BlogEngine.NET (3.x) is newer than the version in the Azure gallery (2.9).

Due to the history of our blog server we have a non-default structure; the BlogEngine.NET code is not in the root. We retain some folders with redirection to allow old URLs to still work. So via an FTP client we created the following structure, copying up the content from our on premises server:

  • \site\wwwroot  - the root site, we have a redirect here to the blogs folder
  • \site\wwwroot\bm-bloggers – again a redirect to the blogs folder, dating back to our first shared blog
  • \site\wwwroot\blogs – our actual server, this needs to be a virtual application

    Next I set the virtual application in the Configure section for the new website, right at the bottom of the page


    At this point I was back in line with the video, so needed to link our web site to the DB. This is done using the link button on the Azure web site’s management page. I entered the credentials for the new SQL DB and the DB and web site were linked. I could then get the connection string for the DB and enter it into the web.config.

  • Unlike in the video, the only edit I needed to make was to the connection string, as all the other edits had already been made for the on premises SQL

    Once the revised web.config was uploaded the site started up, and you should be seeing it now

    Publishing more than one Azure Cloud Service as part of a TFS build

    Using the process in my previous post you can get a TFS build to create the .CSCFG and .CSPKG files needed to publish a Cloud Service. However, you hit a problem if your solution contains more than one Cloud Service project (as opposed to a single cloud service project with multiple roles, which is not a problem).

    The method outlined in the previous post drops the two files into a Packages folder under the drops location. The .CSPKG files are fine, as they have unique names. However, there is only one ServiceConfiguration.cscfg: whichever one was created last overwrites the others.

    Looking in the cloud service projects I could find no way to rename the ServiceConfiguration file. It seems to be like an app.config or web.config file, i.e. its name is hard coded.

    The only solution I could find was to add a custom target that is set to run after the publish target. This was added to the end of each .CCPROJ file using a text editor, just before the closing </Project>

     <Target Name="CustomPostPublishActions" AfterTargets="Publish">
        <Exec Command="IF '$(BuildingInsideVisualStudio)'=='true' exit 0
        echo Post-PUBLISH event: Active configuration is: $(ConfigurationName) renaming the .cscfg file to avoid name clashes
        echo Renaming the .CSCFG file to match the project name $(ProjectName).cscfg
        ren $(OutDir)Packages\ServiceConfiguration.*.cscfg $(ProjectName).cscfg
        " />
     </Target>
     <PropertyGroup>
        <PostBuildEvent>echo NOTE: This project has a post publish event</PostBuildEvent>
     </PropertyGroup>

    Using this I now get unique names for the .CSCFG files as well as for the .CSPKG files in my drops location, all ready for Release Management to pick up


    • I echo out a message in the post build event too, just as a reminder that I have added a custom target that cannot be seen in Visual Studio, so is hard to discover
    • I use an IF test to make sure the commands are only run on the TFS build box, not on a local build. The main reason for this is that the path names are different for local builds as opposed to TFS builds. If you do want a rename on a local build you need to change the $(OutDir)Packages path to $(OutDir)app.publish. However, it seemed more sensible to let the default behaviour occur when running locally
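For reference, a sketch (untested) of what the target might look like if you did want the rename to happen for local builds too, using the app.publish path mentioned above:

```xml
<Target Name="CustomPostPublishActions" AfterTargets="Publish">
  <!-- local Visual Studio builds drop the package under app.publish -->
  <Exec Condition="'$(BuildingInsideVisualStudio)'=='true'"
        Command="ren $(OutDir)app.publish\ServiceConfiguration.*.cscfg $(ProjectName).cscfg" />
  <!-- TFS builds drop it under Packages -->
  <Exec Condition="'$(BuildingInsideVisualStudio)'!='true'"
        Command="ren $(OutDir)Packages\ServiceConfiguration.*.cscfg $(ProjectName).cscfg" />
</Target>
```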

    Getting the correct path and name for a project to pass as an MSBuild argument in TFS Build

    I have been sorting out some builds for use with Release Management that include Azure Cloud Solutions. To get the correct packages built by TFS I have followed the process in my past blog post. The problem was I kept getting the build error

    The target "Azure Packages\BlackMarble.Win8AppBuilder.AzureApi" does not exist in the project.

    The issue was I could not get the solution folder/project name right for the MSBUILD target parameter. Was it the spaces in the folder name? I just did not know.

    The solution was to check the .PROJ file that was actually being run by MSBUILD. As you may know, a .SLN file is not in MSBUILD format, so you can’t just open it in Notepad and look (unlike .CSPROJ or .VBPROJ files); it is converted by MSBUILD on the fly. To see this generated code, at a developer’s command prompt, run the following commands

    cd c:\mysolutionroot
    Set MSBuildEmitSolution=1

    When the MSBUILD command is next run, whether the build works or not, a mysolution.sln.metaproj file should be created. If you look in this file you will see the actual targets MSBUILD thinks it is dealing with.

    In my case I could see

    <Target Name="Azure Packages\BlackMarble_Win8AppBuilder_AzureApi:Publish">

    So the first issue was that my ‘.’ characters had been replaced by ‘_’

    I changed my MSBUILD target argument to that shown in the file, but still had a problem. However, once I changed each space in the solution folder name to %20 all was OK. So my final MSBUILD argument was
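The original argument was lost in publishing; based on the substitutions described above (spaces to %20, dots to underscores), it would presumably have been of the form:

```
/t:Azure%20Packages\BlackMarble_Win8AppBuilder_AzureApi:Publish
```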



    Deploying a Windows service with Release Management

     I recently needed to deploy a Windows service as part of a Release Management pipeline. In the past, for our internal systems, I have only needed to deploy DBs (via SSDT Dacpacs) and websites (via MSDeploy), so this was a new experience.

    WIX Contents

    The first step was to create an MSI installer for the service. This was done using WIX, with all the fun that usually entails. The key part was a component to do the actual registration and starting of the service

    <Component Id ="ModuleHostInstall" Guid="{3DF13451-6A04-4B62-AFCB-731A572C12C9}" Win64="yes">
       <CreateFolder />
       <Util:User Id="ModuleHostServiceUser" CreateUser="no" Name="[SERVICEUSER]" Password="[PASSWORD]" LogonAsService="yes" />
       <File Id="CandyModuleHostService" Name ="DataFeed.ModuleHost.exe" Source="$(var.ModuleHost.TargetDir)\ModuleHost.exe" KeyPath="yes" Vital="yes"/>
       <ServiceInstall Id="CandyModuleHostService" Name ="ModuleHost" DisplayName="Candy Module Host" Start="auto" ErrorControl="normal" Type="ownProcess"  Account="[SERVICEUSER]" Password="[PASSWORD]" Description="Manages the deployment of Candy modules" />
       <ServiceControl Id="CandyModuleHostServiceControl" Name="ModuleHost" Start="install" Stop="both" Wait="yes" Remove="uninstall"/>
    </Component>

    So nothing that special here, but worth remembering that if you miss out the ServiceControl block the service will not automatically start, nor be removed by the MSI’s uninstall

    You can see that we pass in the service account to be used to run the service as a property. This is an important technique for using WIX with Release Management: you will want to be able to pass in anything you may want to change at installation time as a parameter. This means we ended up with a good few properties, such as

      <Property Id="DBSERVER" Value=".\sqlexpress" />
      <Property Id="DBNAME" Value="CandyDB" />
      <Property Id="SERVICEUSER" Value="Domain\serviceuser" />
      <Property Id="PASSWORD" Value="Password1" />

    These tended to equate to app.config settings. In all cases I tried to set sensible default values so in most cases I could avoid passing in an override value.

    These property values were then used to re-write the app.config file after the copying of the files from the MSI onto the target server. This was done using the Util:XmlFile element and some XPath e.g.

    <Util:XmlFile Id="CacheDatabaseName" File="[#ModuleHost.exe.config]"
    ElementPath="/configuration/applicationSettings/DataFeed.Properties.Settings/setting[\[]@name='CacheDatabaseName'[\]]/value" Value="[CACHEDATABASENAME]" Sequence="1" />

    Command Line Testing

    Once the MSI was built it could be tested from the command line using the form

    msiexec /i Installer.msi /Lv msi.log SERVICEUSER="domain\svc_acc" PASSWORD="Password1" DBSERVER="dbserver" DBNAME="myDB" …..

    I soon spotted a problem. As I was equating properties with app.config settings I was passing in connection strings and URLs, so the command line got long very quickly. It was really unwieldy to handle.

    A check of the log file I was creating, msi.log, showed the command line seemed to be truncated. This seemed to occur around 1000 characters. I am not sure if this was an artefact of the logging or the command line, but either way a good reason to try to shorten the property list.

    I therefore decided that I would not pass in whole connection strings, but just the parts that might change; this was especially effective for connection strings for things such as Entity Framework. This meant I did some string building in WIX during the transformation of the app.config file e.g.

    <Util:XmlFile Id='CandyManagementEntities1'
       File='[#ModuleHost.exe.config]' Value='metadata=res://*/MyEntities.csdl|res://*/MyEntities.ssdl|res://*/MyEntities.msl;provider=System.Data.SqlClient;provider connection string=&quot;data source=[DBSERVER];initial catalog=[DBNAME];integrated security=True;MultipleActiveResultSets=True;App=EntityFramework&quot;' />

    This technique had another couple of advantages

    • It meant I did not need to worry over spaces in strings, so I could lose the quotes in the command line. Turns out this is really important later.
    • As I was passing in just a ‘secret value’ as opposed to a whole URL I could use the encryption features of Release Management to hide certain values

    It is at this point I was delayed for a long time. You have to be really careful when installing Windows services via an MSI that your service can actually start. If it cannot then you will get errors saying "… could not be installed. Verify that you have sufficient privileges to install system services". This is probably not really a rights issue, just that some configuration setting is wrong so the service has failed to start. In my case it was down to an incorrect connection string, stray commas and quotes, and a missing DLL that should have been in the installer. You often end up working fairly blind at this point as Windows services don’t give too much information when they fail to load. Persistence, SysInternals Tools and comparing to the settings/files on a working development PC are the best options

    Release Management Component

    Once I had a working command line I could create a component in Release Management. On the Configure Apps > Components page I already had an MSI deployer, but this did not expose any properties. I therefore copied this component to create an MSI deployer specific to my new service installer and started to edit it.

    All the edits were on the deployment tab, adding the extra properties that could be configured.


    Note: It might be possible to do something with the pre/post deployment configuration variables, as we do with MSDeploy, allowing the MSI to run and then editing the app.config later. However, given that MSI service installers tend to fail if they cannot start the new service, I think passing the correct properties into MSIEXEC is the better option. It also means the behaviour is consistent for anyone using the MSI via the command line.

    On the Deployment tab I changed the Arguments to

    -File ./msiexec.ps1 -MsiFileName "__Installer__" -MsiCustomArgs 'SERVICEUSER="__SERVICEUSER__" PASSWORD="__PASSWORD__" DBSERVER="__DBSERVER__" DBNAME="__DBNAME__" …. '

    I had initially assumed I needed the quotes around property values. Turns out I didn’t, and due to the way Release Management runs the component they made matters much, much worse: MSIEXEC kept failing instantly. If I ran the command line by hand on the target machine it actually showed the Help dialog, so I knew the command line was invalid.

    Turns out the issue is that Release Management calls PowerShell.EXE to run the script, passing in the Arguments. This in turn calls a PowerShell script which does some argument processing before launching a process to run MSIEXEC.EXE with some parameters. You can see there are loads of places where the escaping and quoting of parameters could get confused.

    After much fiddling, swapping ' for ", I realised I could just forget most of the quotes. I had already edited my WIX package to build complex strings, so the actual values were simple with no spaces. Hence my command line became

    -File ./msiexec.ps1 -MsiFileName "__Installer__" -MsiCustomArgs "SERVICEUSER=__SERVICEUSER__ PASSWORD=__PASSWORD__ DBSERVER=__DBSERVER__ DBNAME=__DBNAME__ …. "

    Once this was set my release pipeline worked, resulting in a system with DBs, web services and a Windows service all up and running.

    As is often the case it took a while to get this first MSI running, but I am sure the next one will be much easier.

    Got around to updating my Nokia 820 to WP8.1 Update 1

    I had been suffering from the 0x80188308 error when I tried to update my Nokia 820 to WP8.1 Update 1 because I had the developer preview installed. I had been putting off what appeared to be the only solution, doing a reset as discussed in the forums, as it seemed a bit drastic; I thought I would wait for Microsoft to sort out the process. I got bored waiting…

    Turns out that as long as you do the backup first it is fairly painless; it took about an hour of uploads and downloads over WiFi

    1. Created a manual backup of the phone: Settings>backup>apps+settings>backup now.
    2. Reset the phone to factory settings (DP 8.1), leaving any SD card alone: Settings>about>reset your phone.
    3. When prompted logged in with the same ID as used for the backup
    4. Restored the phone using the backup just created.  
    5. Reconnected to all of the other accounts and let the phone download all of the apps.
    6. Signed back into the Preview for Developers app – else you won’t see the updates!
    7. The update came down without a problem as one large package

    Let’s have a go with a UK aware version of Cortana….

    Getting ‘… is not a valid URL’ when using Git TF Clone

    I have been attempting to use the Git TF technique to migrate some content between TFS servers. I needed to move a folder structure that contains spaces in folder names from a TPC that also contains spaces in its name. So I thought my command line would be

    git tf clone “http://tfsserver1:8080/tfs/My Tpc” “$/My Folder” oldrepo --deep

    But this gave the error

    git-tf: “http://tfsserver1:8080/tfs/My Tpc” is not a valid URL

    At first I suspected it was the quotes I was using, as I had had problems here before, but swapping from ‘ to “ made no difference.

    The answer was to use the ASCII code %20 for the space, so this version of the command worked

    git tf clone http://tfsserver1:8080/tfs/My%20Tpc “$/My Folder” oldrepo --deep

    Interestingly you don’t need to use %20 for the folder name

    Build failing post TFS 2013.3 upgrade with ‘Stack empty. (type InvalidOperationException)’

    Just started seeing a build error on a build that was working until we upgraded the build agent to TFS 2013.3

    Exception Message: Stack empty. (type InvalidOperationException)
    Exception Stack Trace:    at Microsoft.VisualStudio.TestImpact.Analysis.LanguageSignatureParser.NotifyEndType()
       at Microsoft.VisualStudio.TestImpact.Analysis.SigParser.ParseType()
       at Microsoft.VisualStudio.TestImpact.Analysis.SigParser.ParseRetType()
       at Microsoft.VisualStudio.TestImpact.Analysis.SigParser.ParseMethod(Byte num1)
       at Microsoft.VisualStudio.TestImpact.Analysis.SigParser.Parse(Byte* blob, UInt32 len)
       at Microsoft.VisualStudio.TestImpact.Analysis.LanguageSignatureParser.ParseMethodName(MethodProps methodProps, String& typeName, String& fullName)
       at Microsoft.VisualStudio.TestImpact.Analysis.AssemblyMethodComparer.AddChangeToList(DateTime now, List`1 changes, CodeChangeReason reason, MethodInfo methodInfo, MetadataReader metadataReader, Guid assemblyIdentifier, SymbolReader symbolsReader, UInt32 sourceToken, LanguageSignatureParser& languageParser)
       at Microsoft.VisualStudio.TestImpact.Analysis.AssemblyMethodComparer.CompareAssemblies(String firstPath, String secondPath, Boolean lookupSourceFiles)
       at Microsoft.TeamFoundation.TestImpact.BuildIntegration.BuildActivities.GetImpactedTests.CompareBinary(CodeActivityContext context, String sharePath, String assembly, IList`1 codeChanges)
       at Microsoft.TeamFoundation.TestImpact.BuildIntegration.BuildActivities.GetImpactedTests.CompareBuildBinaries(CodeActivityContext context, IBuildDefinition definition, IList`1 codeChanges)
       at Microsoft.TeamFoundation.TestImpact.BuildIntegration.BuildActivities.GetImpactedTests.Execute(CodeActivityContext context)
       at System.Activities.CodeActivity.InternalExecute(ActivityInstance instance, ActivityExecutor executor, BookmarkManager bookmarkManager)
       at System.Activities.Runtime.ActivityExecutor.ExecuteActivityWorkItem.ExecuteBody(ActivityExecutor executor, BookmarkManager bookmarkManager, Location resultLocation)

    I assume the issue is a DLL mismatch between what is installed as part of the build agent and something in the 2012-generation build process template in use.

    The immediate fix, until I get a chance to swap the template for a newer one, was to disable Test Impact Analysis, which I was not using for this project anyway.


    Once I did this my build completed OK and the tests ran OK

    Reprint: Migrating a TFS TFVC based team project to a Git team project - a practical example

    This article was first published on Microsoft’s UK Developers site as ‘Migrating a TFS TFVC based team project to a Git team project - a practical example’ on August 15th 2014

    In the past I've written on the theory behind migrating TFVC to Git with history. I've recently done this for real, as opposed to as a proof of concept, and this post documents my experiences. The requirement was to move a TFS 2013.2 Scrum Team Project using TFVC to another TFS 2013.2 Scrum Team Project using Git. The process used was as follows:

    Create new team project

    On the target server create a new team project using the Scrum 2013.2 process template. As we were using the same non-customised process template for both the source and the target we did not have to worry over any work item customisation. However, if you were changing process template this is where you would do any customisation required.

    Adding a field to all Work Item Types

    We need to be able to associate the old work item ID with the new migrated one. The TFS Integration Platform has a feature to do this automatically, but it suffers from a bug: it is meant to automatically add a field for this purpose, but the field actually needs to be added manually prior to the migration.

    To do this edit we need to either

  • Edit the process templates in place using the Process Template Editor Power Tool
  • Export the WITs with WITADMIN.exe, edit them in Notepad, and re-import them

    In either case the field to add to ALL WORK ITEM TYPES is as follows

    <FIELD refname="TfsMigrationTool.ReflectedWorkItemId" name="ReflectedWorkItemId" type="String" />

    Once the edit is made the revised work item types need to be re-imported back into the new Team project.
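If you take the WITADMIN route, the export/edit/import loop looks roughly like this (the collection URL and work item type name here are illustrative; repeat for every work item type in the process template):

```
witadmin exportwitd /collection:http://yourserver:8080/tfs/DefaultCollection /p:NewProject /n:"Product Backlog Item" /f:pbi.xml
REM edit pbi.xml in Notepad to add the TfsMigrationTool.ReflectedWorkItemId FIELD element, then
witadmin importwitd /collection:http://yourserver:8080/tfs/DefaultCollection /p:NewProject /f:pbi.xml
```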

    The Work Item Migration

    The actual work item migration is done using the TFS Integration Platform. This will move over all work item types from the source team project to the target team project.

    The process is as follows...

    1. Install TFS Integration Platform.
    2. Load TFS Integration Platform, as it seems it must be loaded after the team project is created, else it gets confused!
    3. Select 'Create New'.
    4. Pick the 'Team Foundation Server\WorkItemTracking' template. As we are migrating with the same process template this is OK. If you need to change field mappings use the template for field matching and look at the TFS Integration Mapper tool.
    5. Provide a sensible name for the migration. Not really needed for a one-off migration, but if testing, it is easy to end up with many test runs all of the same name, which is confusing in the logs.
    6. Pick the source server and team project as the left server.
    7. Pick the target server and team project as the right server.
    8. Accept the defaults and save to database.
    9. On the left menu select Start. The UI of this tool is not great. Avoid looking at the output tab as this seems to slow the process. Also, altering the refresh time in the options to once a minute seems to help performance. All details of actions are placed in log files so nothing is lost by these changes.
    10. The migration should complete without any issues, assuming there are no outstanding template issues that need to be resolved.

    Article image

    Add the New ID to the Changesets on the source server

    The key to this migration process is to retain the links between the work items and source code check-ins. This is done using the technique I outlined in the previous post, i.e. editing the comment field of each changeset on the source team project prior to migrating the source, to add #123 style references pointing to the new work items on the target server.

    To do this I used some PowerShell

    function Update-TfsCommentWithMigratedId
    {
        <#
        .SYNOPSIS
        This function is used as part of the migration from TFVC to Git to help retain checkin associations to work items
        .DESCRIPTION
        This function takes two team project references, looks up the changeset associations in the source team project, then looks for
        the revised work item ID in the new team project and updates the source changeset comment
        .PARAMETER SourceCollectionUri
        Source TFS Collection URI
        .PARAMETER TargetCollectionUri
        Target TFS Collection URI
        .PARAMETER SourceTeamProject
        Source Team Project Name
        .EXAMPLE
        Update-TfsCommentWithMigratedId -SourceCollectionUri "http://server1:8080/tfs/defaultcollection" -TargetCollectionUri "http://server2:8080/tfs/defaultcollection" -SourceTeamProject "Scrumproject"
        #>
        param(
            [uri] $SourceCollectionUri,
            [uri] $TargetCollectionUri,
            [string] $SourceTeamProject
        )

        # get the source TPC
        $sourceTeamProjectCollection = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection($SourceCollectionUri)
        # get the TFVC repository
        $vcService = $sourceTeamProjectCollection.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer])
        # get the target TPC
        $targetTeamProjectCollection = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection($TargetCollectionUri)
        # get the work item store
        $wiService = $targetTeamProjectCollection.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore])

        # find all the changesets for the selected team project on the source server
        foreach ($cs in $vcService.QueryHistory("$/$SourceTeamProject", [Microsoft.TeamFoundation.VersionControl.Client.RecursionType]::Full, [Int32]::MaxValue))
        {
            if ($cs.WorkItems.Count -gt 0)
            {
                foreach ($wi in $cs.WorkItems)
                {
                    "Changeset {0} linked to workitem {1}" -f $cs.ChangesetId, $wi.Id
                    # find the new id for each work item on the target server
                    foreach ($newwi in $wiService.Query("select id FROM WorkItems WHERE [TfsMigrationTool.ReflectedWorkItemId] = '" + $wi.Id + "'"))
                    {
                        # if an ID is found update the source server, but only if the tag has not already been added
                        # we have to escape the [ as it gets treated as a regular expression
                        # we need the white space between the [ ] else the TFS job agent does not find the tags
                        if ($cs.Comment -match ("\[ Migrated ID #{0} \]" -f $newwi.Id))
                        {
                            Write-Output ("New Id {0} already associated with changeset {1}" -f $newwi.Id, $cs.ChangesetId)
                        } else {
                            Write-Output ("New Id {0} being associated with changeset {1}" -f $newwi.Id, $cs.ChangesetId)
                            $cs.Comment += "[ Migrated ID #{0} ]" -f $newwi.Id
                            $cs.Update()
                        }
                    }
                }
            }
        }
    }
    With the usage

    Update-TfsCommentWithMigratedId -SourceCollectionUri "http://localhost:8080/tfs/defaultcollection" -TargetCollectionUri "http://localhost:8080/tfs/defaultcollection" -SourceTeamProject "Old team project"


    NOTE: This script is written so that it can be run multiple times, but only adds the migration entries once for any given changeset. This means both it and the TFS Integration Platform can be run repeatedly for the same migration to do a staged migration, e.g. get the bulk of the content over first whilst the team is using the old team project, then do a smaller migration of the later changes when the actual swap over happens.

    When this script is run expect to see output similar to:

    Article image

    You can see the impact of the script in Visual Studio Team Explorer or the TFS web client when looking at changesets in the old team project. Expect to see a changeset comment in the form shown below, with new [ Migrated ID #123 ] blocks in the comment field, where 123 is the work item ID on the new team project. Also note the changeset is still associated with the old work item ID on the source server.

    Article image

    NOTE: The space after the #123 is vital. If it is not there then the TFS job agent cannot find the tag to associate the commit to a work item after the migration.

    Source code migration

    The source code can now be migrated. This is done by cloning the TFVC code to a local Git repo and then pushing it up to the new TFS Git repo using Git TF. We clone the source to a local repo in the folder localrepo; the --deep option is used to retain history.

    git tf clone http://typhoontfs:8080/tfs/defaultcollection '$/Scrum TFVC Source/Main' localrepo --deep

    NOTE: I have seen problems with this command. On larger code bases we saw the error 'TF 400732 server cancelled error' as files were said to be missing or we had no permission - neither of which was true. This problem was repeated on a number of machines, including one that had in the past managed to do the clone. It was thought the issue was on the server connectivity, but no errors were logged.

    As a workaround the Git-TFS tool was used. This older community tool uses the .NET TFS API, unlike the Microsoft one which uses the Java TFS API. It was hoped this would not suffer the same issue. However, it also gave TF400732 errors, but did provide a suggested command line to continue, which picked up from where it errored.

    The command to do the clone was:

    git tfs clone http://typhoontfs:8080/tfs/defaultcollection "$/Scrum TFVC Source/main" e:\repo1

    The command to continue after an error was (from within the repo folder)

    git tfs fetch

    It should be noted that Git-TFS seems a good deal faster than Git TF, presumably due to being a native .NET client as opposed to using the Java VM.

    Once the clone was complete, we need to add the TFS Git repo as a remote target and then push the changes up to the new team project. The exact commands for this stage are shown on the target TFS server. Load the web client, go to the code section and you should see the commands needed e.g.

    git remote add origin http://typhoontfs:8080/tfs/DefaultCollection/_git/newproject
    git push -u origin --all      

    Once this stage is complete the new TFS Git repo can be used. The Git commits should have the correct historic date and work item associations as shown below. Note now that the migration id comments match the work item associations.

    Article image

    NOTE: There may be a lag in the associations being shown immediately after the git push. This is because the associations are created by a background TFS job process, which may take a while to catch up when there are a lot of commits. On one system I worked on this took days not hours! Be patient.

    Shared Test Steps

    At this point all work items have been moved over and their various associations with source commits are retained, e.g. PBIs link to test cases and tasks. However, there is a problem: any test cases that have shared steps will be pointing to the old shared step work items. As there is already an open source tool to do this update, there was no immediate need to rewrite it as a PowerShell tool. So to use the open source tool, use the command line:

    UpdateSharedStep.exe http://localhost:8080/tfs/defaultcollection myproject

    Test Plans and Suites

    Historically in TFS, test plans and suites have not been work items (a change coming in TFS 2013.3). This means that if you need these moved over too, more PowerShell is needed.

    This script moves the three test suite types as follows:

  • Static - Creates a new suite, finds the migrated IDs of the test cases on the source suite and adds them to the new suite.
  • Dynamic - Creates a new suite using the existing work item query. IMPORTANT - The query is NOT edited, so it may or may not work depending on what it actually contained. These suites will need to be checked manually by a tester in all cases and their queries probably 'tweaked'.
  • Requirements - Creates a new suite based on the migrated IDs of the requirement work items. This is the only test suite type where we edit the name, to make it consistent with the new requirement ID rather than the old one.

    The script is:


            function Update-TestPlanAfterMigration
            {
            <#
            .SYNOPSIS
            This function migrates a test plan and all its child test suites to a different team project
            .DESCRIPTION
            This function migrates a test plan and all its child test suites to a different team project, reassigning work item IDs as required
            .PARAMETER SourceCollectionUri
            Source TFS Collection URI
            .PARAMETER SourceTeamProjectName
            Source Team Project Name
            .PARAMETER TargetCollectionUri
            Target TFS Collection URI
            .PARAMETER TargetTeamProjectName
            Target Team Project Name
            .EXAMPLE
            Update-TestPlanAfterMigration -SourceCollectionUri "http://server1:8080/tfs/defaultcollection" -TargetCollectionUri "http://server2:8080/tfs/defaultcollection" -SourceTeamProjectName "Old project" -TargetTeamProjectName "New project"
            #>
                Param
                (
                    [uri] $SourceCollectionUri,
                    [string] $SourceTeamProjectName,
                    [uri] $TargetCollectionUri,
                    [string] $TargetTeamProjectName
                )
                # Get TFS connections
                try
                {
                    $sourcetfs = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($SourceCollectionUri)
                }
                catch
                {
                    Write-Error "Error occurred trying to connect to project collection: $_ "
                    exit 1
                }
                try
                {
                    $targettfs = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($TargetCollectionUri)
                }
                catch
                {
                    Write-Error "Error occurred trying to connect to project collection: $_ "
                    exit 1
                }
                # get the actual test management services
                $sourcetestService = $sourcetfs.GetService("Microsoft.TeamFoundation.TestManagement.Client.ITestManagementService")
                $targettestService = $targettfs.GetService("Microsoft.TeamFoundation.TestManagement.Client.ITestManagementService")
                $sourceteamproject = $sourcetestService.GetTeamProject($SourceTeamProjectName)
                $targetteamproject = $targettestService.GetTeamProject($TargetTeamProjectName)
                # Get the work item store
                $wiService = $targettfs.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore])
                # find all the plans in the source
                foreach ($plan in $sourceteamproject.TestPlans.Query("Select * From TestPlan"))
                {
                    if ($plan.RootSuite -ne $null -and $plan.RootSuite.Entries.Count -gt 0)
                    {
                        # copy the plan to the new team project
                        Write-Host("Migrating Test Plan - {0}" -f $plan.Name)
                        $newplan = $targetteamproject.TestPlans.Create();
                        $newplan.Name = $plan.Name
                        $newplan.AreaPath = $plan.AreaPath
                        $newplan.Description = $plan.Description
                        $newplan.EndDate = $plan.EndDate
                        $newplan.StartDate = $plan.StartDate
                        $newplan.State = $plan.State
                        $newplan.Save()
                        # we use a function as it can be recursive
                        MoveTestSuite -sourceSuite $plan.RootSuite -targetSuite $newplan.RootSuite -targetProject $targetteamproject -targetPlan $newplan -wiService $wiService
                        # and have to save the test plan again to persist the suites
                        $newplan.Save()
                    }
                }
            }

            # - is missing in name so this method is not exposed when module loaded
            function MoveTestSuite
            {
            <#
            .SYNOPSIS
            This function migrates a test suite and all its child test suites to a different team project
            .DESCRIPTION
            This function migrates a test suite and all its child test suites to a different team project; it is a helper for Update-TestPlanAfterMigration and will probably not be called directly from the command line
            .PARAMETER SourceSuite
            Source TFS test suite
            .PARAMETER TargetSuite
            Target TFS test suite
            .PARAMETER TargetPlan
            The new test plan the test suites are being created in
            .PARAMETER TargetProject
            The new team project the test suites are being created in
            .PARAMETER WiService
            Work item service instance used for lookup
            .EXAMPLE
            MoveTestSuite -sourceSuite $plan.RootSuite -targetSuite $newplan.RootSuite -targetProject $targetteamproject -targetPlan $newplan -wiService $wiService
            #>
                Param
                (
                    $sourceSuite,
                    $targetSuite,
                    $targetProject,
                    $targetPlan,
                    $wiService
                )
                foreach ($suite_entry in $sourceSuite.Entries)
                {
                    # get the suite to a local variable to make it easier to pass around
                    $suite = $suite_entry.TestSuite
                    if ($suite -ne $null)
                    {
                        # we have to build a suite of the correct type
                        if ($suite.IsStaticTestSuite -eq $true)
                        {
                            Write-Host("    Migrating static test suite - {0}" -f $suite.Title)
                            $newsuite = $targetProject.TestSuites.CreateStatic()
                            $newsuite.Title = $suite.Title
                            $newsuite.Description = $suite.Description
                            $newsuite.State = $suite.State
                            # need to add the suite to the plan else you cannot add test cases
                            $targetSuite.Entries.Add($newsuite) > $null # sent to null as we get output
                            foreach ($test in $suite.TestCases)
                            {
                                $migratedTestCaseIds = $targetProject.TestCases.Query("Select * from [WorkItems] where [TfsMigrationTool.ReflectedWorkItemId] = '{0}'" -f $test.Id)
                                # we assume we only get one match
                                if ($migratedTestCaseIds[0] -ne $null)
                                {
                                    Write-Host ("        Test {0} has been migrated to {1} and added to suite {2}" -f $test.Id, $migratedTestCaseIds[0].Id, $newsuite.Title)
                                    $newsuite.Entries.Add($targetProject.TestCases.Find($migratedTestCaseIds[0].Id)) > $null # sent to null as we get output
                                }
                            }
                        }
                        if ($suite.IsDynamicTestSuite -eq $true)
                        {
                            Write-Host("    Migrating query based test suite - {0} (Note - query may need editing)" -f $suite.Title)
                            $newsuite = $targetProject.TestSuites.CreateDynamic()
                            $newsuite.Title = $suite.Title
                            $newsuite.Description = $suite.Description
                            $newsuite.State = $suite.State
                            $newsuite.Query = $suite.Query
                            $targetSuite.Entries.Add($newsuite) > $null # sent to null as we get output
                            # we don't need to add tests as this is done dynamically
                        }
                        if ($suite.IsRequirementTestSuite -eq $true)
                        {
                            $newwis = $wiService.Query("select * FROM WorkItems WHERE [TfsMigrationTool.ReflectedWorkItemId] = '{0}'" -f $suite.RequirementId)
                            if ($newwis[0] -ne $null)
                            {
                                Write-Host("    Migrating requirement based test suite - {0} to new requirement ID {1}" -f $suite.Title, $newwis[0].Id)
                                $newsuite = $targetProject.TestSuites.CreateRequirement($newwis[0])
                                $newsuite.Title = $suite.Title -replace $suite.RequirementId, $newwis[0].Id
                                $newsuite.Description = $suite.Description
                                $newsuite.State = $suite.State
                                $targetSuite.Entries.Add($newsuite) > $null # sent to null as we get output
                                # we don't need to add tests as this is done dynamically
                            }
                        }
                        # look for child test suites
                        if ($suite.Entries.Count -gt 0)
                        {
                            MoveTestSuite -sourceSuite $suite -targetSuite $newsuite -targetProject $targetProject -targetPlan $targetPlan -wiService $wiService
                        }
                    }
                }
            }

    NOTE: This script needs PowerShell 3.0 installed. This appears to be because some of the TFS assemblies are .NET 4.5, which is not supported by earlier PowerShell versions. If the version is wrong the test suite migration will fail, as the TestPlan (ITestPlanHelper) object will be null.

    The command run to do the migration of test plans is:

    Update-TestPlanAfterMigration -SourceCollectionUri "http://typhoontfs:8080/tfs/defaultcollection" -TargetCollectionUri "http://typhoontfs:8080/tfs/defaultcollection" -SourceTeamProjectName "Scrum TFVC Source" -TargetTeamProjectName "NewProject"

    This will create the new set of test plans and suites in addition to any already in place on the target server. It should give an output similar to:

    Article image


    So once all this is done you should have migrated a TFVC team project to a new Git-based team project, retaining as much history as possible.

    Hope you find this of use.

    This article was first published on Microsoft’s UK Developers site as Migrating a TFS TFVC based team project to a Git team project - a practical example on August 15th 2014

  • Listing all the PBIs that have no acceptance criteria

    Update 24 Aug 2014:  Changed the PowerShell to use a pipe based filter as opposed to nested foreach loops

    The TFS Scrum process template’s Product Backlog Item work item type has an acceptance criteria field. It is good practice to make sure every PBI has this field completed; however, it is not always possible to enter this content when the work item is initially created i.e. before it is approved. We often find we add a PBI that is basically just a title, then add the summary and acceptance criteria as the product is planned.

    It would be really nice to have a TFS work item query that listed all the PBIs that did not have the acceptance criteria field completed. Unfortunately there is no way to check that a rich text or HTML field is empty in TFS queries. It has been requested via UserVoice, but there is no sign of it appearing in the near future.

    So we are left with the TFS API to save the day; the following PowerShell function does the job, returning a list of non-completed PBI work items that have empty acceptance criteria.


    # Load the TFS assemblies we need; this might be more than we truly need for this single function
    # but I usually keep all these functions in a single module so they share the references
    $ReferenceDllLocation = "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\ReferenceAssemblies\v2.0\"
    Add-Type -Path $ReferenceDllLocation"Microsoft.TeamFoundation.Client.dll" -ErrorAction Stop -Verbose
    Add-Type -Path $ReferenceDllLocation"Microsoft.TeamFoundation.Common.dll" -ErrorAction Stop -Verbose
    Add-Type -Path $ReferenceDllLocation"Microsoft.TeamFoundation.WorkItemTracking.Client.dll" -ErrorAction Stop -Verbose


    function Get-TfsPBIWIthNoAcceptanceCriteria
    {
    <#
    .SYNOPSIS
    This function gets the list of PBI work items that have no acceptance criteria
    .DESCRIPTION
    This function allows a check to be made that all PBIs have a set of acceptance criteria
    .PARAMETER CollectionUri
    TFS Collection URI
    .PARAMETER TeamProject
    Team Project Name
    .EXAMPLE
    Get-TfsPBIWIthNoAcceptanceCriteria -CollectionUri "http://server1:8080/tfs/defaultcollection" -TeamProject "My Project"
    #>
        Param
        (
            [Parameter(Mandatory=$true)]
            [uri] $CollectionUri,

            [Parameter(Mandatory=$true)]
            [string] $TeamProject
        )
        # get the source TPC
        $teamProjectCollection = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection($CollectionUri)
        try
        {
            $teamProjectCollection.EnsureAuthenticated()
        }
        catch
        {
            Write-Error "Error occurred trying to connect to project collection: $_ "
            exit 1
        }
        # Get the work item store
        $wiService = $teamProjectCollection.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore])
        # find each candidate work item; we can't check for the acceptance criteria state in the query itself
        $pbi = $wiService.Query("SELECT [System.Id] FROM WorkItems WHERE [System.TeamProject] = '{0}' AND [System.WorkItemType] = 'Product Backlog Item' AND [System.State] <> 'Done' ORDER BY [System.Id]" -f $TeamProject)
        # filter the work items using a single piped line
        $pbi | Where-Object { $_.Fields | Where-Object { $_.ReferenceName -eq 'Microsoft.VSTS.Common.AcceptanceCriteria' -and $_.Value -eq "" } }
        # this is equivalent to the following nested loops for those who like a more long-winded structure
        # $results = @()
        # foreach ($wi in $pbi)
        # {
        #     foreach ($field in $wi.Fields)
        #     {
        #         if ($field.ReferenceName -eq 'Microsoft.VSTS.Common.AcceptanceCriteria' -and $field.Value -eq "")
        #         {
        #             $results += $wi
        #         }
        #     }
        # }
        # $results
    }

    Why is my TFS report not failing when I really think it should?

    Whilst creating some custom reports for a client we hit a problem: though the reports worked on my development system and their old TFS server, they failed on their new one. The error was that Microsoft_VSTS_Scheduling_CompletedWork was an invalid column name


    Initially I suspected the problem was a warehouse reprocessing issue, but other reports worked so it could not have been that.

    It must really be that the column is missing, and that sort of makes sense. On the new server the team was using the Scrum process template; the Microsoft_VSTS_Scheduling_CompletedWork and Microsoft_VSTS_Scheduling_OriginalEstimate fields are not included in this template, but the plan had been to add them to allow some analysis of estimate accuracy. This had been done on my development system, but not on the client's new server. Once these fields were added to the Task work item the report leapt into life.

    The question then is: why did this work on the old TFS server? The team project on the old server being used to test the reports did not have the customisation either. However, remember that the OLAP cube for the TFS warehouse is shared between ALL team projects on a server; as one of those other team projects was using the MSF Agile template, the fields were present, hence the report worked.

    Remember that shared OLAP cube: it can trip you up over and over again.