But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Using git tf to migrate code between TFS servers retaining history

Martin Hinshelwood did a recent post on moving source code between TFS servers using git tf. He mentioned that you could use the --deep option to get the whole changeset check-in history.

Being fairly new to using Git in anything other than the simplest scenarios, it took me a while to get the commands right. This is what I used in the end (using the Brian Keller VM for sample data) …

C:\tmp\git> git tf clone http://vsalm:8080/tfs/fabrikamfibercollection $/fabrikamfiber/Main oldserver --deep

Connecting to TFS...

Cloning $/fabrikamfiber/Main into C:\Tmp\git\oldserver: 100%, done.

Cloned 5 changesets. Cloned last changeset 24 as 8b00d7d

C:\tmp\git> git init newserver

Initialized empty Git repository in C:/tmp/git/newserver/.git/

C:\tmp\git> cd newserver

C:\tmp\git\newserver [master]> git pull ..\oldserver --depth=100000000

remote: Counting objects: 372, done.

remote: Compressing objects: 100% (350/350), done.

96% (358/372), 2.09 MiB | 4.14 MiB/s

Receiving objects: 100% (372/372), 2.19 MiB | 4.14 MiB/s, done.

Resolving deltas: 100% (110/110), done.

From ..\oldserver

* branch HEAD -> FETCH_HEAD

C:\tmp\git\newserver [master]> git tf configure http://vsalm:8080/tfs/fabrikamfibercollection $/fabrikamfiber/NewLocation

Configuring repository

C:\tmp\git\newserver [master]> git tf checkin --deep --autosquash

Connecting to TFS...

Checking in to $/fabrikamfiber/NewLocation: 100%, done.

Checked in 5 changesets, HEAD is changeset 30

The key was I had missed the --autosquash option on the final checkin.

Once this was run I could see my check-in history. The process is quick and, once you have the right command line, straightforward. However, just like the TFS Integration Platform, time is compressed; and unlike the TFS Integration Platform, you also lose the ownership of the original edits.


This all said, another useful tool in the migration arsenal.

Where did my parameters go when I edited that standard TFS report?

I have been doing some editing of the standard scrum TFS 2012 Sprint Burndown report in SQL 2012 Report Builder. When I ran the report after editing the MDX query in the dsBurndown DataSet to return an extra column I got an error:

  • on a remote PC it just said there was an error with the dsBurndown dataset
  • on the server hosting Reporting Services, or in Report Builder, I got a bit more information: it said the TaskName parameter was not defined.

On checking the state of the dataset parameters before and after my edit, I could see that the TaskName parameter had been lost.


Manually re-adding it fixed the problem.

Interestingly, which parameters were lost seemed to depend on the MDX query edit I made; I assume something is inferring the parameters from the MDX query.

So certainly one to keep an eye on. I suspect this is a feature of Report Builder; maybe I am better off just using trusty Notepad to edit the .RDL file. Oh how I love to edit XML in Notepad!
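If you do resort to editing the .RDL directly, the dataset parameters live in the report XML. A hypothetical fragment (element names from the RDL schema; data source name and query content here are illustrative, your report will differ) showing where a lost TaskName parameter would be re-added by hand:

```xml
<DataSet Name="dsBurndown">
  <Query>
    <DataSourceName>TfsOlapReportDS</DataSourceName>
    <QueryParameters>
      <!-- the parameter Report Builder dropped; maps the report
           parameter back into the MDX query -->
      <QueryParameter Name="TaskName">
        <Value>=Parameters!TaskName.Value</Value>
      </QueryParameter>
    </QueryParameters>
    <CommandText>/* MDX query elided */</CommandText>
  </Query>
</DataSet>
```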

Visual Studio 2013 announcement at TechEd USA

Today at TechEd USA Brian Harry announced Visual Studio 2013; have a look at his blog for details of the new ALM features. These include…

  • Agile Portfolio Management
  • Git source control on premises
  • Revised Team Explorer, including pop-out windows
  • Improvements in code editing and annotation
  • Improvement in web based test management
  • Team Room – chat like collaboration
  • Cloud based web load testing
  • The start of the addition of release management to TFS via the purchase of InRelease

For more info see the various sessions up on Channel 9

My session on TFS at the ‘Building Applications for the Future’

Thanks to everyone who attended my session on ‘TFS for Developers’ at the Grey Matter’s ‘Building Applications for the Future’ event today. As you will have noticed my session was basically slide free, so not much to share there.

As I said at the end of my session to find out more have a look at

Also a couple of people asked me about TFS and Eclipse, which I only mentioned briefly at the end. For more on Team Explorer Everywhere, look at the video I did last year on that very subject.

Webinar on PreEmptive Analytics tools on the 28th of May

A key requirement for any DevOps strategy is the reporting on how your solution is behaving in the wild. PreEmptive Analytics™ for Team Foundation Server (TFS) can provide a great insight in this area, and there is a good chance you are already licensed for it as part of MSDN.

So why not have a look on the UK MSDN site for more details of the free Microsoft-hosted event.

MSDN Webinar Improve Software Quality, User Experience and Developer Productivity with Real Time Analytics
Tuesday, May 28 2013: 4:00 – 5:00 pm (UK Time)

Also why not sign up for Black Marble’s webinar event in June on DevOps process and tools in the Microsoft space.

Why do all my TFS labels appear to be associated with the same changeset?

If you look at the labels tab in the source control history in Visual Studio 2012 you could be confused by the changeset numbers. How can all the labels added by my different builds, done over many days, be associated with the same changeset?


If you look at the same view in VS 2010 the problem is not so obvious, but that is basically due to the column not being shown.


The answer is that the value shown in the first screenshot is for the root element associated with the label. If you drill into the label you can see all the labelled folders and files, with their changeset values at the time the label was created.


So it is just that the initial screen is confusing; drilling in makes it all clearer.

Setting up a TFS 2012 proxy in a cross domain system

Today I have been setting up a cross domain TFS proxy. The developers are in one domain and the TFS server in another. Given there is no trust between these domains, you have to use a trick to get it to work.

So I created a local user tfsproxy.local on both the TFS server and proxy with the same password on each. At the proxy end I made this local user a local admin.

Next I ran the TFS 2012.2 wizard, setting the proxy account to the tfsproxy.local user. It all passed verification, but then I got an error:

TF400371: Failed to add the service account 'TFSPROXY\TFSProxy.local' to Proxy Service Accounts Group. Details: TF14045: The identity with type 'System.Security.Principal.WindowsIdentity' and identifier 'S-1-5-21-4198714966-1643845615-1961851592-1024' could not be found..

It seems this is a known issue with TFS 2012. It is meant to be fixed in TFS 2012.3, so I pulled down the ‘go live’ CTP and installed this on the proxy. It made no difference; I assumed it actually needs to be installed on the server end and not just the proxy, as this is where the user lookup occurs. However, I did not have access to do that upgrade today.

I was about to follow the workaround of removing the proxy from the domain, configuring it and then putting it back. But then I had an idea: the step it was failing on was granting rights, so I did it manually. On the TFS server end I added the tfsproxy.local user to the ‘Proxy Service Accounts Group’. Once this was done the configuration completed without error.

A quick test showed the proxy was working as expected.

How healthy is my TFS server?

If you want to know the health of your TFS server there are a number of options, from a full System Center MOM pack downwards. A good starting point is the performance reports and administrative report pack produced by Grant Holiday. Though the performance pack is designed for TFS 2008, the reports work on 2010 and 2012, but you do need to do a bit of editing.

  1. As the installation notes state, create a new shared data source called “TfsActivityReportDS”
    1. Set the connection string to: Data Source=[your SQL server];Initial Catalog=Tfs_[your TPC name] - this is the big change, as it used to point to the Tfs_ActivityLogging DB, which (as of TFS 2010) is now all rolled into your Team Project Collection DB, so you need to alter the connection string to match your TPC. Also note that if you use multiple TPCs you will need multiple data sources and reports.
    2. Credentials: domain\user that has access to the Tfs_[TPC Name] database
    3. Tick ‘Use as Windows credentials when connecting to the data source’
    4. Once uploaded, each report needs to be edited via the web manage option to change its Data Source to match the newly created source

    5. You also need to edit each report in the pack via Report Builder, as the SQL queries all contain the full database path. For each dataset (and each report can have a few) you need to edit the query to contain only the table name, not the whole SQL path

      i.e. From TfsActivityLogging.dbo.tbl_Command to tbl_Command


Once this is done most of the reports should work, giving a good insight into the performance of your server.

Some reports, such as Source Control Requests and Top Users Bypassing Proxies, take a bit more SQL query fiddling.

  • Server Status - Top Users Bypassing Proxies – you need to alter the Users part of the query to something like the following (note the hard-coded table path; I am sure we could do better, but I don’t usually need this report as I have few proxies, so I have not made much effort on it)

        ) AS

            SELECT [personSK]
                  ,[Domain] + '\' + [Alias] as FullyQualifiedAlias
              FROM [Tfs_2012_Warehouse].[dbo].[DimPerson] with (nolock)

  • Source Control Requests – runs from a straight web service endpoint, so you need to edit the URL it targets to something like


Unlike the performance reports, the admin report pack is designed for TFS 2010/2012, so it works once you make sure the reports are connected to the correct shared data sources.

However, remember that the new web-based Admin Tools in TFS 2012 actually address many of these areas out of the box.

Getting going with the TFS Java API

If you are using the TFS 2012 Java API it is important that you read the release notes. It is not enough just to reference the com.microsoft.tfs.sdk-11.0.0.jar file in your classpath as you might expect. You also have to pass a Java system property that associates com.microsoft.tfs.jni.native.base-directory with the location of the native library files that provide the platform-specific implementations for method calls. The command line for this is of the form

java.exe -D"com.microsoft.tfs.jni.native.base-directory=C:\Users\Username\YourApplication\native"
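If you would rather not depend on the launch configuration, the same property can also be set programmatically, as long as it happens before the first SDK call that touches the native libraries. A minimal sketch (the class name is mine, and the path is the same illustrative one as in the command line above):

```java
public class NativeDirSetup {
    public static void main(String[] args) {
        // equivalent to the -D command line switch; must run before the
        // first TFS SDK call that loads the native libraries
        System.setProperty(
            "com.microsoft.tfs.jni.native.base-directory",
            "C:\\Users\\Username\\YourApplication\\native");

        System.out.println(
            System.getProperty("com.microsoft.tfs.jni.native.base-directory"));
    }
}
```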

If you don’t set this property you get an exception similar to

Exception in thread "main" java.lang.UnsatisfiedLinkError: com.microsoft.tfs.jni.internal.platformmisc.NativePlatformMisc.nativeGetEnvironmentVariable(Ljava/lang/String;)Ljava/lang/String;

Now setting this property on the command line is all well and good, but how do you do this if you are working in Eclipse?

The answer is that you set the argument via Run > Run Configurations. Select your configuration and enter the VM argument as shown below.


Once this is set you can run and debug your application inside Eclipse.

Accessing TFS work item tags via the API

With TFS 2012.2 Microsoft have added tags to work items. These provide a great way to add custom information to work items without the need to customise the process template to add custom fields. This is important for users of the hosted http://tfs.visualstudio.com as this does not, at this time, allow any process customisation.

It is easy to add tags to any work item via the TFS web client, just press the Add.. button and either select an existing tag or add a new one. In the following PBI work item example I have added two tags, Tag1 and Tag2.


However, the problem with tags, at present, is that they can only be used as filters within the result of a work item query in the web client, as shown below.


They are not available inside work item queries and are not published to the TFS warehouse/cube for reporting purposes. Hopefully these limitations will be addressed in the future, but not today.

Given all this, I was recently asked by a client if they could use tags to mark PBI work items scheduled for a given release with a view to using this information to produce release notes. Obviously given the current limitations this cannot be done via work item queries or reporting, but you can use the TFS 2012.2 API to do this easily in .NET or Java.

The tags are stored as a semicolon-separated list in a string field. In C# there is a property on the API to get the tags …

using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

namespace BlackMarble
{
    public class TFSDemo
    {
        public static string[] GetTagsForWorkItem(Uri tfsUri, int workItemId)
        {
            // get a reference to the team project collection
            using (var projectCollection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(tfsUri))
            {
                // get a reference to the work item tracking service
                var workItemStore = projectCollection.GetService<WorkItemStore>();

                // and get the work item
                var wi = workItemStore.GetWorkItem(workItemId);

                // the Tags property is a semicolon-separated string
                return wi.Tags.Split(';');
            }
        }
    }
}

but in Java you have to get the field yourself …

import java.net.URI;

import com.microsoft.tfs.core.TFSTeamProjectCollection;
import com.microsoft.tfs.core.clients.workitem.WorkItem;
import com.microsoft.tfs.core.clients.workitem.WorkItemClient;
import com.microsoft.tfs.core.httpclient.Credentials;
import com.microsoft.tfs.core.httpclient.DefaultNTCredentials;

public class TFSDemo {
    public static String[] GetTagsForWorkItem(URI tfsUri, int workItemId) {
        // get a reference to the team project collection
        Credentials credentials = new DefaultNTCredentials();
        TFSTeamProjectCollection projectCollection = new TFSTeamProjectCollection(tfsUri, credentials);

        // get a reference to the work item tracking service
        WorkItemClient wic = projectCollection.getWorkItemClient();

        // get the work item
        WorkItem wi = wic.getWorkItemByID(workItemId);

        // there is no method for the tags, but we can pull them out of the fields
        return wi.getFields().getField("Tags").getValue().toString().split(";");
    }
}


Given these methods it is possible to write a tool that selects matching work items, allowing you to generate any output you require.
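As a sketch of what the selection step of such a tool might look like, assume the tag strings have already been read from the work items via the API; the class name, map and tag values below are purely illustrative. Note the trim, as tags tend to come back as a "Tag1; Tag2" style list:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class TagFilter {
    // Returns the work item IDs whose tag string contains the wanted tag.
    // The map stands in for values already read from each work item's
    // Tags field; matching is case-insensitive.
    public static List<Integer> itemsWithTag(Map<Integer, String> tagsById, String wanted) {
        List<Integer> matches = new ArrayList<Integer>();
        for (Map.Entry<Integer, String> entry : tagsById.entrySet()) {
            for (String tag : entry.getValue().split(";")) {
                if (tag.trim().equalsIgnoreCase(wanted)) {
                    matches.add(entry.getKey());
                    break;
                }
            }
        }
        return matches;
    }
}
```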

Update 14 May 2013

Just had it confirmed that at present there is no API to write tags. I had not tried, as I only needed a read-only solution. Keep an eye open for future releases of the SDKs for a write method.