But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Problem with CollectionView.CurrentChanged event not being fired in a WPF application

I had an interesting issue with one of our WPF applications that uses MVVM Light.


This application is a front end to upload and download folder structures to TFS. On my development PC all was working fine, i.e. when we uploaded a new folder structure to the TFS backend the various combos on the download tab were also updated. However, on another test system they were not.

After a bit of tracing we could see that in both cases the RefreshData method was being called OK, and the CollectionViews were recreated and bound without errors.

private void RefreshData()
{
    // callback-based data service call (the GetData call itself is assumed
    // from the standard MVVM Light template; only the callback body survives here)
    this.dataService.GetData(
        (item, error) =>
        {
            if (error != null)
            {
                logger.Error("MainViewModel: Cannot find dataservice");
            }

            this.ServerUrl = item.ServerUrl.ToString();
            this.TeamProjects = new CollectionView(item.TeamProjects);
            this.Projects = new CollectionView(item.Projects);
        });

    this.TeamProjects.CurrentChanged += new EventHandler(this.TeamProjects_CurrentChanged);
}

So what was the problem? To me it was not obvious.

It turned out that on my development system the TeamProjects CollectionView contained two items, but on the test system only one.

This meant that even though we had recreated the CollectionView and rebound the data and events, calling MoveCurrentToFirst (or any other MoveCurrentTo… for that matter) had no effect: with a single-item collection the currency is already on the first item, so there is nowhere to move to. Hence the CurrentChanged event never fired, which in turn stopped the calling of the methods that repopulated the other combos on the download tab.
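A minimal console sketch of this behaviour (assuming a reference to WindowsBase, where CollectionView lives):

using System;
using System.Windows.Data;

class CurrencyDemo
{
    static void Main()
    {
        var view = new CollectionView(new[] { "only item" });
        view.CurrentChanged += (s, e) => Console.WriteLine("CurrentChanged fired");

        // the currency already sits on the first (and only) item, so this call
        // changes nothing and CurrentChanged is never raised
        view.MoveCurrentToFirst();
    }
}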

The solution was to add the following line at the end of the method, and all was OK.
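A plausible candidate for that line, assuming the intent was simply to force the repopulation logic to run even when the currency cannot change, is a direct call to the handler:

// assumption: invoke the handler directly, as a one-item collection will never raise CurrentChanged itself
this.TeamProjects_CurrentChanged(this, EventArgs.Empty);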


Setting up a TFS 2012 proxy in a cross domain system

Today I have been setting up a cross domain TFS proxy. The developers are in one domain and the TFS server in another. Given there is no trust between these domains, you have to use a trick to get it to work.

So I created a local user tfsproxy.local on both the TFS server and proxy with the same password on each. At the proxy end I made this local user a local admin.

Next I ran the TFS 2012.2 wizard, setting the proxy account to the tfsproxy.local user. It all passed verification, but then I got an error

TF400371: Failed to add the service account 'TFSPROXY\TFSProxy.local' to Proxy Service Accounts Group. Details: TF14045: The identity with type 'System.Security.Principal.WindowsIdentity' and identifier 'S-1-5-21-4198714966-1643845615-1961851592-1024' could not be found..

It seems this is a known issue with TFS 2012, which is meant to be fixed in TFS 2012.3, so I pulled down the 'go live' CTP and installed it on the proxy. It made no difference; I assume it actually needs to be installed on the server end and not just the proxy, as that is where the user lookup occurs. However, I did not have access to do that upgrade today.

I was about to follow the workaround of removing the proxy from the domain, configuring it and then putting it back. But then I had an idea: the step it was failing on was granting rights, so I did that manually. On the TFS server end I added the tfsproxy.local user to the 'Proxy Service Accounts Group'. Once this was done the configuration completed without error.

A quick test showed the proxy was working as expected.

Accessing TFS work item tags via the API

With TFS 2012.2 Microsoft have added tags to work items. These provide a great way to add custom information to work items without the need to customise the process template to add custom fields. This is important for users of the hosted http://tfs.visualstudio.com as this does not, at this time, allow any process customisation.

It is easy to add tags to any work item via the TFS web client, just press the Add.. button and either select an existing tag or add a new one. In the following PBI work item example I have added two tags, Tag1 and Tag2.


However, the problem with tags, at present, is that they can only be used as filters within the result of a work item query in the web client, as shown below.


They are not available inside work item queries and are not published to the TFS warehouse/cube for reporting purposes. Hopefully these limitations will be addressed in the future, but not today.

Given all this, I was recently asked by a client if they could use tags to mark PBI work items scheduled for a given release with a view to using this information to produce release notes. Obviously given the current limitations this cannot be done via work item queries or reporting, but you can use the TFS 2012.2 API to do this easily in .NET or Java.

The tags are stored as a semicolon-separated list in a string field. In C# there is a property on the API to get the tags …

using System;
using System.Linq;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

namespace BlackMarble
{
    public class TFSDemo
    {
        public static string[] GetTagsForWorkItem(Uri tfsUri, int workItemId)
        {
            // get a reference to the team project collection
            using (var projectCollection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(tfsUri))
            {
                // get a reference to the work item tracking service
                var workItemStore = projectCollection.GetService<WorkItemStore>();

                // and get the work item
                var wi = workItemStore.GetWorkItem(workItemId);

                return wi.Tags.Split(';');
            }
        }
    }
}

but in Java you have to get the field yourself …

import java.net.URI;

import com.microsoft.tfs.core.TFSTeamProjectCollection;
import com.microsoft.tfs.core.clients.workitem.WorkItem;
import com.microsoft.tfs.core.clients.workitem.WorkItemClient;
import com.microsoft.tfs.core.httpclient.Credentials;
import com.microsoft.tfs.core.httpclient.DefaultNTCredentials;

public class TFSDemo {
    public static String[] GetTagsForWorkItem(URI tfsUri, int workItemId) {
        // get a reference to the team project collection
        Credentials credentials = new DefaultNTCredentials();
        TFSTeamProjectCollection projectCollection = new TFSTeamProjectCollection(tfsUri, credentials);
        // get a reference to the work item tracking service
        WorkItemClient wic = projectCollection.getWorkItemClient();
        // get the work item and return the tags
        WorkItem wi = wic.getWorkItemByID(workItemId);
        // there is no method for the tags, but we can pull them out of the fields
        return wi.getFields().getField("Tags").getValue().toString().split(";");
    }
}

Given these methods it is possible to write a tool that selects matching work items, thus allowing you to generate any output you require.
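For example, here is a minimal C# sketch of such a selection. The WIQL text and the idea of passing the tag in as a parameter are illustrative assumptions, and the WorkItemStore comes from the earlier snippet:

// illustrative sketch: list PBI work items carrying a given tag
public static void ListWorkItemsWithTag(WorkItemStore workItemStore, string tag)
{
    // the WIQL here is an assumption; narrow it to suit your own process template
    var items = workItemStore.Query(
        "SELECT [System.Id] FROM WorkItems WHERE [System.WorkItemType] = 'Product Backlog Item'");

    foreach (WorkItem wi in items)
    {
        var tags = wi.Tags.Split(';').Select(t => t.Trim());
        if (tags.Contains(tag, StringComparer.OrdinalIgnoreCase))
        {
            Console.WriteLine("{0}: {1}", wi.Id, wi.Title);
        }
    }
}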

Update 14 May 2013

I have just had it confirmed that at present there is no API to write tags; I had not tried, as I only needed a read-only solution. Keep an eye open for future releases of the SDKs for a write method.

Workaround for TF900546: An unexpected error occurred while running the RunTests activity

The problem

I have been working on a project that contains SharePoint 2010 WSP packages and an MSI-distributed WPF application. These projects are all in a single solution with their unit tests. I have been getting a problem with our TFS 2012.2 build system: all the projects compile, but at the test point I get the unhelpful

TF900546: An unexpected error occurred while running the RunTests activity: 'Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information.'.

When I remoted onto the build box, loaded the solution in Visual Studio (which was luckily installed on the build box) and tried to run the tests in the Test Explorer I got


and the event log showed

Application: vstest.executionengine.x86.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an unhandled exception.
Exception Info: System.InvalidProgramException
   at System.ServiceModel.ServiceHost..ctor(System.Object, System.Uri[])
   at Microsoft.VisualStudio.TestPlatform.TestExecutor.TestExecutorMain.Run(System.String[])
   at Microsoft.VisualStudio.TestPlatform.TestExecutor.ServiceMain.Main(System.String[])


Faulting application name: vstest.executionengine.x86.exe, version: 11.0.60315.1, time stamp: 0x5142b4b6
Faulting module name: KERNELBASE.dll, version: 6.1.7601.18015, time stamp: 0x50b83c8a
Exception code: 0xe0434352
Fault offset: 0x0000c41f
Faulting process id: 0x700
Faulting application start time: 0x01ce4663824905bd
Faulting module path: C:\Windows\syswow64\KERNELBASE.dll
Report Id: c2689d15-b256-11e2-80aa-00155d0a5201

Next I tried creating a simple new test project with one unit test; this failed with the same error.

As all the tests worked locally on my development PC, everything pointed to a corrupt install of Visual Studio (and/or the components installed as part of TFS Build) on the build box. It should be noted that this build box had a good number of additional packages installed to support SharePoint, so patching order could be an issue.

The workaround

Robert, another of our ALM consultants, said he had seen a similar problem at a client and suggested changing the test runner.

So in the build definition, under Process > Basic > Automated Tests, I edited the test run settings and changed from the default to the MSTest VS2010 runner.


Once this was done my tests ran. However, I then got a publishing error

API restriction: The assembly 'file:///C:\Builds\2\BM\CTAppBox.Main.CI\Binaries\_PublishedWebsites\WebServiceTestClient\bin\WebServiceTestClient.dll' has already loaded from a different location. It cannot be loaded from a new location within the same appdomain.

The problem was that the build was set to the default test search criterion of *test*. This meant it picked up a project it should not have, due to a poor naming convention. As soon as I changed the filter to *.tests all was OK.

I did retry the VS2012 test runner after fixing this naming issue, but it made no difference.

I know I need to sort out (rebuild) this build box, but now is not the time; I have a working solution that will do for now.

Getting SQL 2012 SSIS packages built on TFS 2012.2

I have been trying to get SQL 2012 SSIS packages built on a TFS 2012.2 build system. As has been pointed out by many people, the problem is that you cannot build SSIS packages with MSBuild. This means you have to resort to calling Visual Studio's DevEnv.exe from within your build.

Jakob Ehn did a great post on this subject, but it is a little dated now due to the release of VS 2012 and SQL 2012.

The command line

But before we get to TFS, let us sort out the actual command line we need to run. Assuming VS2012 is in use, the basic command-line build for a solution will be

"C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\devenv" "MySolution.sln" /build "Release|Any CPU"

If your solution only contains SSIS packages then this command line might be OK. However, you might just want to build a single SSIS project within a larger solution. In this case you might use

"C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\devenv" "MySolution.sln" /build "Release|Any CPU" /project "SSISBits\SSISBis.dtproj"

To work out the command line you need, first make sure VS2012 and the Business Intelligence tools are installed on your build box. Once this is done you can try the command line. I decided for my project to create a second solution file in the root of the source code containing just my two SSIS projects, thus making the command line easier (basically one solution for SSIS packages and another for everything else).

So I ran the command line

"C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\devenv" "MySolution.sln" /build "Release|Any CPU"

and got

Microsoft (R) Microsoft Visual Studio 2012 Version 11.0.60315.1.
Copyright (C) Microsoft Corp. All rights reserved.
Error: Catastrophic failure (Exception from HRESULT: 0x8000FFFF (E_UNEXPECTED))
Error: Catastrophic failure (Exception from HRESULT: 0x8000FFFF (E_UNEXPECTED))
========== Build: 0 succeeded or up-to-date, 2 failed, 0 skipped ==========

Not good. So I checked whether the same solution built OK when loaded in the same copy of Visual Studio 2012.2, and it did.

So it seems there is an issue with command-line builds of SSIS packages in VS2012. A quick search showed it was a logged issue on Microsoft Connect. Luckily a workaround was mentioned: use the VS2010 version of the tools. So my command line became

"C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\devenv" "MySolution.sln" /build "Release|Any CPU"

To try this I had to install the SQL Data Tools from my SQL 2012 ISO (not the SSDT tools from the web, as these free ones don't have the BI features). Once this was installed I could issue my command line and it all built OK.

So I knew I had a working command line. I put the same version of the VS2010 SSDT tools on my TFS build box and moved on to the build process.

The TFS Build Process

Now that I had the command line, I could apply this knowledge to the process Jakob outlined. There are two basic steps:

  1. Run the command line build – this was basically the same
  2. Find the files created and copy them to the drop location – the change here is that the old post looks for .MSI files, whereas now we are looking for .ISPAC files

As I had decided to have two solutions within my build, I used an if block (based on a solution name convention) to choose whether to do an MSBuild or a DevEnv build. So my process flow for the build phase was as follows.



I also had to edit the xcopy block to look for the .ISPAC file extension.
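Following Jakob's xcopy approach, the resulting command ends up along these lines (the paths here are illustrative, not from the original post):

xcopy /s /y "C:\Builds\1\MySolution\Binaries\*.ispac" "\\store\drops\MySolution"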


Other than these changes the template was exactly as Jakob described – even down to using VS2010!


So once all this was done I had a build that created my SSIS packages.

This all seems a lot of work; life would be so much easier if SSDT

  • worked properly under VS2012
  • or, even better, supported MSBuild!

Our upgrade to TFS 2012.2 has worked OK

I have mentioned in past posts the issues we had doing our first quarterly update for TFS 2012. Well, today we had scheduled our upgrade to 2012.2 and I am pleased to say it all seems to have worked.

Unlike the last upgrade, this time we were doing nothing complex such as moving DB-tier SQL instances; it was a straight upgrade of a dual-tier TFS 2012.1 instance with the DBs stored in a SQL 2012 availability group (in previous updates you had to remove the DBs from the availability group for the update; with update 2 this is no longer required).

So we ran the EXE and all the files were copied OK. When we got to the verify stage of the wizard we expected no issues, but the tool reported problems with the server's HTTPS URL. A quick check showed the server had the TFS OData service bound to HTTPS on port 443, but using a different IP address to that used by TFS itself. As soon as this web site was stopped the wizard passed verification and the upgrade proceeded without any errors.

So it would seem that the verification does a rather basic check to see if port 443 is used on any IP address on the server, not just the ones being used by TFS, as identified via either IP address or host header bindings.
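A quick way to see for yourself what is listening on 443 across all addresses is the standard Windows check:

netstat -ano | findstr :443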

The only other thing we have had to do is upgrade Tiago's Team Foundation Task Board Enhancer; without the upgrade, the previous version of this extension did not work.

So not too bad an experience.

Error TF400129: Verifying that the team project collection has space for new system fields when upgrading TFS to 2012.2

Whilst testing an upgrade of TFS 2010 to TFS 2012.2 I was getting a number of verification errors in the TFS configuration upgrade wizard. They were all TF400129-based, such as

TF400129: Verifying that the team project collection has space for new system fields

with variants also mentioning models and schema.

A quick search threw up this thread on the subject, but on checking the DB tables I could see my problem was altogether more basic. The thread talked of TPCs in incorrect states; in my case I had been provided with an empty DB, so TFS could find no tables at all. So I suppose the error message was a bit too specific; it should really have been a 'DB is empty!!!!' error. Once I had restored a valid backup for the TPC in question all was OK.

A bit more digging showed that I could also see an error if I issued the command

tfsconfig remapdbs /sqlinstances:TFS1 /databaseName:TFS1;Tfs_Configuration

This too reported that it could not find a DB it was expecting.

So the tip is: make sure you really have restored the DBs you think you have.

What machine name is being used when you compose an environment from running VMs in Lab Management?

This is a follow-up to my older post on a similar subject.

When composing a new Lab Environment from running VMs, the PC you are running MTM on needs to be able to connect to the running VMs. It does this over IP, so at the most basic level you need to be able to resolve the name of the VM to an IP address.

If your VM is connected to the same LAN as your PC but is not in the same domain, the chances are that DNS name resolution will not work. I find the best option is to put a temporary entry in your local hosts file, keeping it just as long as the creation process takes.

But what should this entry be? Should it be the name of the VM as it appears in the MTM new environment wizard?

It turns out the answer is no; it needs to be the name as it appears in the SCVMM console.


So the hosts file needs to contain the correct entries for the FQDNs (watch out for typos here; a mistyped IP address only adds to the confusion), e.g. wyfrswin7.wyfrs.local and shamrockbay.wyfrs.local.
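The entries end up along these lines (the IP addresses shown are placeholders; use the addresses of your own VMs):

192.168.1.10    wyfrswin7.wyfrs.local
192.168.1.11    shamrockbay.wyfrs.local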

Once all this is set, just follow the process in my older post to enable the connection so that the new environment wizard can verify OK.

Remember the firewall on the VMs may also be an issue. Just for the period of the environment creation I often disable this.
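On the VMs this can be done with the standard commands (remembering to turn it back on afterwards):

netsh advfirewall set allprofiles state off
netsh advfirewall set allprofiles state on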

Also, Wireshark is your friend; it will show whether the machine that is responding is the one you really want.

Lab Management with SCVMM 2012 and /labenvironmentplacementpolicy:aggressive

I did a post a year or so ago about setting up TFS Labs and mentioned the command

C:\Program Files\Microsoft Team Foundation Server 2010\Tools>tfsconfig lab /hostgroup /collectionName:myTpc /labenvironmentplacementpolicy:aggressive /edit /Name:"My hosts group"

This can be used to tell TFS Lab Management to place VMs using memory that is assigned to stopped environments, allowing a degree of over-commitment of resources.

As I discovered today, this command only works for SCVMM 2010-based systems. If you try it you just get a message saying it is not supported on SCVMM 2012, and there appears to be no equivalent for 2012.

However, you can use features such as dynamic memory within SCVMM 2012, so all is not lost.

Installing a DB from a DACPAC using Powershell as part of TFS Lab Management deployment

I have been battling with deploying a DB via the SQL 2012 DAC tools and PowerShell. My environment was a network-isolated pair of machines:

  • DC – the domain controller and SQL 2012 server
  • IIS – A web front end

As the environment is network isolated I could only run scripts on the IIS server, so my DB deployment needed to be remote. The script I ended up with was

param(
    [string]$sqlserver = $( throw "Missing: parameter sqlserver"),
    [string]$dacpac = $( throw "Missing: parameter dacpac"),
    [string]$dbname = $( throw "Missing: parameter dbname") )

Write-Host "Deploying the DB with the following settings"
Write-Host "sqlserver:   $sqlserver"
Write-Host "dacpac: $dacpac"
Write-Host "dbname: $dbname"

# load in DAC DLL (requires config file to support .NET 4.0)
# change file location for a 32-bit OS
add-type -path "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll"

# make DacServices object, needs a connection string
$d = new-object Microsoft.SqlServer.Dac.DacServices "server=$sqlserver"

# register events, if you want 'em
register-objectevent -in $d -eventname Message -source "msg" -action { out-host -in $Event.SourceArgs[1].Message.Message } | Out-Null

# load dacpac from file & deploy it to the named database
$dp = [Microsoft.SqlServer.Dac.DacPackage]::Load($dacpac)
$d.Deploy($dp, $dbname, $true) # the true is to allow an upgrade, could be parameterised, also can add further deploy params

# clean up event
unregister-event -source "msg"

Remember the SQL 2012 DAC tools only work with PowerShell 3.0 as they have a .NET 4 dependency.
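If you are stuck on an older PowerShell, the config file mentioned in the script comment is the usual workaround: a powershell.exe.config placed beside powershell.exe that lets the .NET 4 runtime load. A minimal sketch (the standard trick, not from the original post):

<?xml version="1.0"?>
<configuration>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0.30319" />
    <supportedRuntime version="v2.0.50727" />
  </startup>
</configuration>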

This was called within the Lab Build using the command line


cmd /c powershell $(BuildLocation)\SQLDeploy.ps1 dc $(BuildLocation)\Database.dacpac sabs

The script worked correctly when I ran it locally on the command line, and it was also starting from within the build, but there it failed with errors along the lines of

Deployment Task Logs for Machine: IIS
Accessing the following location using the lab service account: blackmarble\tfslab, \\store\drops.
Deploying the DB with the following settings
sqlserver:   dc
dbname: Database1
Initializing deployment (Start)
Exception calling "Deploy" with "3" argument(s): "Could not deploy package."
Initializing deployment (Failed)
+  $d.Deploy($dp, $dbname, $true) # the true is to allow an upgrade
+  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : DacServicesException
Stopped accessing the following location using the lab service account: blackmarble\tfslab, \\store\drops.

Though not obvious from the error message, the issue was who the script was running as. The TFS agent runs as a machine account, and this had no rights to access the SQL instance on the DC. Once I granted the computer account IIS$ suitable rights on the SQL box all was OK. The alternative would have been to enable mixed-mode authentication and use a connection string in the form

"server=dc;User ID=sa;Password=mypassword"
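For reference, granting the machine account rights amounts to something like the following, run against the SQL instance on the DC (the dbcreator role here is an assumption; grant whatever your deployment actually needs):

sqlcmd -S dc -Q "CREATE LOGIN [BLACKMARBLE\IIS$] FROM WINDOWS; ALTER SERVER ROLE dbcreator ADD MEMBER [BLACKMARBLE\IIS$]"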

So now I can deploy my DB on a new build.