But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Making the drops location for a TFS build match the assembly version number

A couple of years ago I wrote about using the TFSVersion build activity to try to sync the assembly version and build number. I did not want to see build names/drop locations in the format 'BuildCustomisation_20110927.17'; I wanted the build name to contain the actual version number. The problem, as I outlined in that post, was that by fiddling with the BuildNumberFormat you could easily generate duplicate drop folder names, causing an error such as

TF42064: The build number 'BuildCustomisation_20110927.17' already exists for build definition '\MSF Agile\BuildCustomisation'.

I had put this problem aside, thinking there was no way around the issue, until I recently reviewed the new ALM Rangers 'Test Infrastructure Guidance'. Its first hands-on lab includes a solution to the problem. The trick is to use the TFSVersion community extension twice in your build.

  • You use it as normal to set the version of your assemblies after you have got the files into the build workspace, just as the wiki documentation shows
  • You also call it in 'get mode' at the start of the build process, prior to calling the 'Update Build Number' activity. The core issue is that you cannot call 'Update Build Number' more than once, or you tend to see the TF42064 errors. Used in this manner, it sets the BuildNumberFormat to the actual version number you want, which is then used for the drop folder and any assembly versioning.

So what do you need to do?

  1. Open your process template for editing (see the custom build activities documentation if you don't know how to do this)
  2. Find the sequence 'Update Build Number for Triggered Builds' near the top of the process template, and within it:

    • Add a TFSVersion activity – I called mine 'Generate Version number for drop'
    • Add an Assign activity – I called mine 'Set new BuildNumberFormat'
    • Add a WriteBuildMessage activity – this is optional, but I do like to see what was generated
  3. Add a string variable GeneratedBuildNumber with the scope of ‘Update Build Number for Triggered Builds’

  4. The properties for the TFSVersion activity should be set as shown below

    • The Action is the key setting; this needs to be set to GetVersion, as we only need to generate a version number, not set any file versions
    • You need to set the Major, Minor and StartDate settings to match the other copy of the activity in your build process. A good tip is to just copy and paste from the other instance to create this one, so that the bulk of the properties are correct
    • The Version needs to be set to your variable GeneratedBuildNumber; this is the output version value
  5. The properties for the Assign activity are as follows

    • Set To to BuildNumberFormat
    • Set Value to String.Format("$(BuildDefinitionName)_{0}", GeneratedBuildNumber); you can vary this format to meet your own needs [updated 31 Jul 13 - better to use an _ rather than a space as this will be used in the drop path]
  6. I also added a WriteBuildMessage activity that outputs the generated build value, but that is optional

Once all this was done and saved back to TFS it could be used for a build. You now see that the build name and drop location are in the form

[Build name] [Major].[Minor].[Days since start date].[TFS build number]


This is a slight change from what I previously attempted, where the 4th block was the count of builds of a given type on a day; now it is the unique TFS-generated build number (the number shown before the build name is generated). I am happy with that. My key aim is achieved: the drop location contains the product version number, so it is easy to relate a build to a given version without digging into the build reports.
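To make the scheme concrete, here is a rough Python sketch of how the generated number is assembled. The helper function and sample values are my own illustration of the [Major].[Minor].[Days since start date].[Build number] pattern, not part of the TFSVersion activity itself:

```python
from datetime import date

def generate_build_number(definition, major, minor, start_date, tfs_build_id, today):
    """Illustrate the [Build name]_[Major].[Minor].[Days].[Build number] scheme."""
    # TFSVersion derives the third block from the days elapsed since StartDate
    days_since_start = (today - start_date).days
    version = f"{major}.{minor}.{days_since_start}.{tfs_build_id}"
    # The Assign activity then sets BuildNumberFormat to "$(BuildDefinitionName)_<version>"
    return f"{definition}_{version}"

print(generate_build_number("BuildCustomisation", 1, 2, date(2013, 1, 1), 42, date(2013, 7, 31)))
# BuildCustomisation_1.2.211.42
```

Because the final block is the unique TFS build ID rather than a per-day counter, two builds on the same day can no longer collide on the drop folder name.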

Minor issue on TFS 2012.3 upgrade if you are using host headers in bindings

Yesterday I upgraded our production 2012.2 TFS server to update 3. All seemed to go OK and it completed with no errors. It is so much easier now that the update supports the use of SQL 2012 Availability Groups within the update process; there is no need to remove the DBs from the availability group prior to the update.

However, though there were no errors, it did report a warning, and on a quick check users could not connect to the upgraded server on our usual HTTPS URL.

On checking the update log I saw

[Warning@09:06:13.578] TF401145: The Team Foundation Server web application was previously configured with one or more bindings that have ports that are currently unavailable.  See the log for detailed information.
[Info   @09:06:13.578]
[Info   @09:06:13.578] +-+-+-+-+-| The following previously configured ports are not currently available... |+-+-+-+-+-
[Info   @09:06:13.584]
[Info   @09:06:13.584] 1          - Protocol          : https
[Info   @09:06:13.584]            - Host              : tfs.blackmarble.co.uk
[Info   @09:06:13.584]            - Port              : 443
[Info   @09:06:13.584] port: 443
[Info   @09:06:13.585] authMode: Windows
[Info   @09:06:13.585] authenticationProvider: Ntlm

The issue appears if you use host headers, as we do for our HTTPS bindings. The TFS configuration tool does not understand these, so it sees more than one binding, in our case on 443 (our TFS server VM also hosts a NuGet server on HTTPS 443; we use host headers to separate the traffic). As the tool does not know what to do with host headers, it just deletes the bindings it does not understand.

Anyway, the fix was to manually reconfigure the HTTPS bindings in IIS, and all was OK.

On checking with Microsoft, it seems this is a known issue, and it is on their radar to sort out in the future.

A day of TFS upgrades

After last night's release of the new TFS and Visual Studio bits at the Build conference, I spent this morning upgrading my demo VMs. First I upgraded to TFS 2012.3, then took a snapshot before going on to 2013 Preview; so by switching snapshots I can now demo either version. In both cases the upgrade process was as expected, basically a rerun of the configuration wizard with all the fields bar the password pre-filled. Martin Hinshelwood has done a nice post if you want more details on the process.

Looking at the sessions from Build on Channel 9, there are not too many on TFS; to find out more about the new features you are probably better off checking out the TechEd USA or TechEd Europe streams.

Why can’t I find my build settings on a Git based project on TFS Service?

Just wasted a bit of time trying to find the build tab on a TFS Team Project hosted on http://tfs.visualstudio.com using a Git repository. I was looking in Team Explorer, expecting to see something like


But all I was seeing was the Visual Studio Git Changes option (just the top bit of the left panel above).

It took me ages to realise that the issue was that I had cloned the Git repository to my local PC using the Visual Studio Tools for Git, so I was using just the Git tools, not the TFS tools. As far as Visual Studio was concerned this was just some Git repository; it could have been local, on GitHub, on TFS Service, or anywhere else that hosts Git.

To see the full features of TFS Service you need to connect to the service using Team Explorer (the green bits), not just as a Git client (the red bits)


Of course, if you only need Git-based source code management tools, just clone the repository and use the Git tooling, whether inside or outside Visual Studio. The Git repository in TFS is just a standard Git repo, so all tools should work. From the server end, TFS does not care what client you use; in fact it will still associate your commits, irrespective of client, with TFS work items if you use the #1234 syntax for work item IDs in your comments.
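As a rough illustration of that association (my own sketch of a plausible parse, not TFS's actual implementation), picking the work item IDs out of a commit comment might look like:

```python
import re

# Hypothetical helper: find TFS work item IDs referenced with the #1234 syntax.
WORK_ITEM_PATTERN = re.compile(r"#(\d+)")

def referenced_work_items(commit_message):
    """Return the work item IDs mentioned in a commit comment, in order."""
    return [int(item_id) for item_id in WORK_ITEM_PATTERN.findall(commit_message)]

print(referenced_work_items("Fix null reference on login page #1234, also closes #567"))
# [1234, 567]
```

The point is that the link is made server-side from the commit comment text alone, which is why any Git client can create it.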

However, if you are using hosted TFS from Visual Studio, it probably makes more sense to use a Team Explorer connection so that all the other TFS features light up, such as build. The best bit is that all the Git tools are still there, as Visual Studio knows it is still just a Git repository. Maybe doing this will make it less confusing when I come to try to use a TFS feature!

Using SYSPREP’d VM images as opposed to Templates in a new TFS 2012 Lab Management Environment

An interesting change with Lab Management 2012 and SCVMM 2012 is that templates become a lot less useful. In the SCVMM 2008 versions you had a choice when you stored VMs in the SCVMM library:

  • You could store a fully configured VM
  • or a generalised template.

When you added the template to a new environment you could enter details such as the machine name, domain to join, product key etc. If you try this with SCVMM 2012 you just see the message 'These properties cannot be edited from Microsoft Test Manager'.


So you are meant to use SCVMM to manage everything about the templates; not great if you want to do everything from MTM. However, is that the only solution?

An alternative is to store a SYSPREP'd VM as a virtual machine in the SCVMM library. This VM can be added as many times as required to an environment (though if added more than once you are asked if you are sure).


This method does, however, bring problems of its own. When the environment is started, assuming it is network isolated, the second network adaptor is added as expected. However, as there is no agent on the VM it cannot be configured; usually for a template Lab Management would sort all this out, but because the VM is SYSPREP'd it is left sitting at the mini-setup 'Pick your region' screen.

You need to manually configure the VM. The best process I have found is:

  1. Create the environment with your standard VMs and the SYSPREP'd one
  2. Boot the environment; the standard ready-to-use VMs get configured OK
  3. Manually connect to the SYSPREP'd VM and complete the mini setup. You will now have a PC on a workgroup
  4. The PC will have two network adaptors, neither connected to your corporate network; both are connected to the network isolated virtual LAN. You have a choice:
    • Connect the legacy adaptor to your corporate LAN, to get at a network share via SCVMM
    • Mount the TFS Test Agent ISO
  5. Either way, you need to manually install the Test Agent and run its configuration (just select the defaults; it should know where the test controller is). This will configure the network isolated adaptor onto the 192.168.23.x network
  6. Now you can manually join the isolated domain
  7. Reboot the VM (or the environment) and all should be OK

All a bit long winded, but it does mean it is easier to build generalised VMs from MTM without having to play around in SCVMM too much.

I think it would all be a good deal easier if the VM had the agents on it before the SYSPREP. I have not tried this yet, but in my opinion that is true of all VMs used for Lab Management: get the agents on as early as you can, as it just speeds everything up.

Using git tf to migrate code between TFS servers retaining history

Martin Hinshelwood did a recent post on moving source code between TFS servers using git tf. He mentioned that you could use the --deep option to get the whole changeset check-in history.

Being fairly new to using Git in anything other than the simplest scenarios, it took me a while to get the commands right. This is what I used in the end (using the Brian Keller VM for sample data):

C:\tmp\git> git tf clone http://vsalm:8080/tfs/fabrikamfibercollection $/fabrikamfiber/Main oldserver --deep
Connecting to TFS...
Cloning $/fabrikamfiber/Main into C:\Tmp\git\oldserver: 100%, done.
Cloned 5 changesets. Cloned last changeset 24 as 8b00d7d

C:\tmp\git> git init newserver
Initialized empty Git repository in C:/tmp/git/newserver/.git/

C:\tmp\git> cd newserver

C:\tmp\git\newserver [master]> git pull ..\oldserver --depth=100000000
remote: Counting objects: 372, done.
remote: Compressing objects: 100% (350/350), done.
Receiving objects: 100% (372/372), 2.19 MiB | 4.14 MiB/s, done.
Resolving deltas: 100% (110/110), done.
From ..\oldserver
* branch HEAD -> FETCH_HEAD

C:\tmp\git\newserver [master]> git tf configure http://vsalm:8080/tfs/fabrikamfibercollection $/fabrikamfiber/NewLocation
Configuring repository

C:\tmp\git\newserver [master]> git tf checkin --deep --autosquash
Connecting to TFS...
Checking in to $/fabrikamfiber/NewLocation: 100%, done.
Checked in 5 changesets, HEAD is changeset 30

The key was that I had missed the --autosquash option on the final checkin.

Once this was run I could see my check-in history. The process is quick and, once you have the right command line, straightforward. However, just like with the TFS Integration Platform, time is compressed; and unlike the TFS Integration Platform, you also lose the ownership of the original edits.


This all said, another useful tool in the migration arsenal.

Where did my parameters go when I edited that standard TFS report?

I have been doing some editing of the standard Scrum TFS 2012 Sprint Burndown report in SQL 2012 Report Builder. When I ran the report after editing the MDX query in the dsBurndown dataset to return an extra column, I got an error:

  • on a remote PC it just said there was an error with the dsBurndown dataset
  • on the server hosting Reporting Services, or in Report Builder, I got a bit more information: it said the TaskName parameter was not defined

On checking the state of the dataset parameters before and after my edit, I could see that the TaskName parameter had been lost.


Manually re-adding it fixed the problem.

Interestingly, which parameters were lost seemed to depend on the MDX query edit I made; I assume something is inferring the parameters from the MDX query.
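Something along these lines would explain the behaviour; this Python sketch is purely my own guess at what the designer might be doing (scanning the query text for @Name tokens and rebuilding the parameter list from them), not documentation of Report Builder internals:

```python
import re

# Hypothetical sketch: infer dataset parameter names from @Name tokens in a query.
PARAM_PATTERN = re.compile(r"@(\w+)")

def infer_parameters(mdx_query):
    """Return the distinct parameter names referenced in the query, in order of appearance."""
    seen = []
    for name in PARAM_PATTERN.findall(mdx_query):
        if name not in seen:
            seen.append(name)
    return seen

print(infer_parameters("STRTOSET(@IterationParam) ... FILTER(..., @TaskName)"))
# ['IterationParam', 'TaskName']
```

If the designer works like this, an edit that moves or removes a token such as @TaskName would silently drop the corresponding dataset parameter, which matches what I saw.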

So certainly one to keep an eye on. I suspect this is a feature of Report Builder; maybe I am better off just using trusty Notepad to edit the .RDL file. Oh, how I love editing XML in Notepad.

Visual Studio 2013 announcement at TechEd USA

Today at TechEd USA Brian Harry announced Visual Studio 2013; have a look at his blog for details of the new ALM features. These include:

  • Agile Portfolio Management
  • Git source control on premises
  • Revised team explorer including pop out windows
  • Improvements in code editing and annotation
  • Improvement in web based test management
  • Team Room – chat like collaboration
  • Cloud based web load testing
  • The start of the addition of release management to TFS via the purchase of InRelease

For more info see the various sessions up on Channel 9

My session on TFS at the ‘Building Applications for the Future’

Thanks to everyone who attended my session 'TFS for Developers' at Grey Matter's 'Building Applications for the Future' event today. As you will have noticed, my session was basically slide free, so not much to share there.

As I said at the end of my session to find out more have a look at

Also, a couple of people asked me about TFS and Eclipse, which I only mentioned briefly at the end. For more on Team Explorer Everywhere, have a look at the video I did last year on that very subject.

Webinar on PreEmptive Analytics tools on the 28th of May

A key requirement for any DevOps strategy is the reporting on how your solution is behaving in the wild. PreEmptive Analytics™ for Team Foundation Server (TFS) can provide a great insight in this area, and there is a good chance you are already licensed for it as part of MSDN.

So why not have a look on the UK MSDN site for more details of this free Microsoft-hosted event.

MSDN Webinar Improve Software Quality, User Experience and Developer Productivity with Real Time Analytics
Tuesday, May 28 2013: 4:00 – 5:00 pm (UK Time)

Also, why not sign up for Black Marble's webinar event in June on DevOps processes and tools in the Microsoft space.