The blogs of Black Marble staff

What is an .xesc file?

Test Professional, after the Lab Management update, now uses Expression Encoder 4.0 to create its videos of screen activity. This means that when you run a test and record a video you end up with an attachment called ScreenCapture.xesc.

Now my PC did not have Expression Encoder 4.0 installed, so it did not know what to do with an .xesc file created within our Lab Management environment. The answer is simple: on any PC that might want to view the video, either:

  1. install Expression Encoder 4
  2. or install just the Screen Capture Codec

Once either of these is done, Windows Media Player can play the .xesc file.

Cannot run Coded UI tests in Lab Management – getting a ‘Build directory of the test run is not specified or does not exist’ error

Interesting ‘user too stupid’ error today whilst adding some Coded UI tests to a Lab Management deployment scenario.

I added the Test Case and associated it with a Coded UI test in Visual Studio.


I made sure my deployment build had the tests selected


I then ran my Lab Deployment build, but got the error

Build directory of the test run is not specified or does not exist.

This normally means the test VM cannot see the share containing the build. I checked that the agent login on the test VM could view the drop location; that was OK, but when I looked for the assembly containing my Coded UI tests, it was just not there.

Then I remembered……..

The Lab build can take loads of snapshots and do a sub-build of the actual product. This is all very good for production scenarios, but when you are learning about Lab Management or debugging scripts it can be really slow. To speed up the process I had told my Deploy build not to take snapshots and to use the last compile/build drop it could find. I had just forgotten to rebuild my application on the build server after I had added the Coded UI tests. So I rebuilt that and tried again, but I got the same problem.

It turns out that though I was missing the assembly, the error occurred before that assembly was even required. The real problem was not the accounts the various agents were running as, but the account the test controller was running as. The key was to check the test run log, which can be accessed from the Test Run results (I seem to have a blind spot when looking for these results).


This showed the problem: I had selected the default ‘Network Service’ account for the test controller and had not granted it rights to the drop location.


I changed the account to my tfs210lab account, as used by the agents, and all was OK.


DefaultSiteCollectionTermStore == null

Since writing the previous post about provisioning Managed Metadata fields declaratively, we decided to remove the hard-coded term store name shown below

var termStore = session.TermStores["Managed Metadata Service"];

and replace it with

TermStore termStore = session.DefaultSiteCollectionTermStore;

We found that while the solution worked locally, when we deployed the feature to the SharePoint 2010 UAT server it didn’t work. It turns out that DefaultSiteCollectionTermStore was null.
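A defensive version of the lookup avoids the NullReferenceException while we investigated. This is only a sketch: it falls back to the hard-coded store name from our earlier code when no default site collection term store is configured, and the helper class and exception message are mine, not part of the SharePoint API.

```csharp
using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Taxonomy;

public static class TermStoreHelper
{
    // Sketch only: prefer the default site collection term store,
    // fall back to the named store used in our earlier sample.
    public static TermStore GetTermStore(SPSite site)
    {
        TaxonomySession session = new TaxonomySession(site);

        // Null when no Managed Metadata Service proxy has the
        // "default storage location" options ticked for this site collection
        TermStore termStore = session.DefaultSiteCollectionTermStore;

        if (termStore == null && session.TermStores.Count != 0)
        {
            termStore = session.TermStores["Managed Metadata Service"];
        }

        if (termStore == null)
        {
            throw new InvalidOperationException("No term store available for this site collection");
        }

        return termStore;
    }
}
```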

After much scratching of heads we narrowed the issue down to the way the Managed Metadata Service proxy had been set up during the installation of the UAT SharePoint server. The IT team responsible for building the UAT servers used PowerShell to automate the building of the server. For the Managed Metadata Service this had a particular impact.

If you highlight the Managed Metadata Service Connection in Central Administration, then click Properties as shown below:


you will see the properties available…


On our development PCs, which had been setup using the Configuration Wizard, all of the available check boxes above were ticked. On the UAT server the opposite was true. Dutifully we ticked all the boxes, hit OK and tried again. Still no joy, so we set up another site collection and activated the feature. This time the feature worked as expected.

Activating the default storage location for keywords and column specific term sets only works for site collections created after the options have been selected (no surprise). At this point we haven’t been able to work out how to associate the default storage with an existing site collection, so the easiest option was to delete the site collection, set up a new one and activate the required features.
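For anyone scripting the server build, the check boxes correspond to switches on the New-SPMetadataServiceApplicationProxy cmdlet, so the proxy can be created with the defaults already ticked. A sketch, with illustrative names (the filter on TypeName is my assumption about how the service application is located):

```powershell
# Sketch: create the Managed Metadata Service proxy with the
# "default storage location" options already enabled.
# Names are illustrative - substitute your own service application.
$sa = Get-SPServiceApplication | Where-Object { $_.TypeName -like "Managed Metadata*" }

New-SPMetadataServiceApplicationProxy -Name "Managed Metadata Service Connection" `
    -ServiceApplication $sa `
    -DefaultProxyGroup `
    -DefaultKeywordTaxonomy `
    -DefaultSiteCollectionTaxonomy
```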

Don’t hardcode that build option

I have been using the ExternalTestRunner 2010 build activity I wrote. I realised that at least one of the parameters I need to set, the ProjectCollection used to publish the test results, was hard-coded in my sample. It was set in the form


This is not that sensible, as this value is available using the build API as


It makes no sense to hard code the name of the server if the build system already knows it.
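As a sketch of what I mean, inside a custom workflow activity the running build can supply its own collection URI via the IBuildDetail extension (the activity class and its body here are illustrative, not the actual ExternalTestRunner code):

```csharp
using System;
using System.Activities;
using Microsoft.TeamFoundation.Build.Client;

public sealed class PublishResultsActivity : CodeActivity
{
    protected override void Execute(CodeActivityContext context)
    {
        // The running build already knows which server it came from
        IBuildDetail buildDetail = context.GetExtension<IBuildDetail>();
        Uri collectionUri = buildDetail.BuildServer.TeamProjectCollection.Uri;

        // ... use collectionUri when publishing the test results,
        // instead of a hard-coded server name ...
    }
}
```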

This simple change means that the build templates can be far more easily passed between Team Project Collections.

Declaratively Provision Managed Metadata Column in SharePoint 2010

As part of a project I have been working on we wanted to assign categories to items in SharePoint 2010 and decided to use Managed Metadata. Wictor Wilén has a good post explaining what Managed Metadata is and how to set it up. Like Wictor, I also prefer to provision site columns and content types using a combination of declarative CAML and feature receivers. I followed Wictor’s post but found it didn’t include the extra steps required to make the solution work.

I’ve outlined the steps we took to provision a managed metadata column to a site content type.

Step 1 - Create a TaxonomyField and a Note field

<?xml version="1.0" encoding="utf-8" ?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <!-- The TaxonomyField that holds the managed metadata value.
       Attributes other than ID and Name are illustrative. -->
  <Field ID="{087C759A-7BD2-4B66-9CF5-277A3399636D}"
         Type="TaxonomyFieldType"
         Name="MMSCategories"
         DisplayName="MMS Categories"
         ShowField="Term1033" />
  <!-- The hidden Note field that stores the string representation -->
  <Field ID="{437B0ED2-A31B-47F7-8C69-6B9DE2C4A4F6}"
         Type="Note"
         Name="MMSCategoriesTaxHTField0"
         DisplayName="MMSCategories_0"
         Hidden="TRUE" />
</Elements>

Step 2 - Add the TaxonomyField and Note Field to a content type
<?xml version="1.0" encoding="utf-8"?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <!-- Parent ContentType: Item (0x01) -->
  <ContentType ID="0x0100aa33de693811427c886a5d27f17ed23d"
               Name="Taxonomy Spike - MMSContentType"
               Group="Custom Content Types"
               Description="My Content Type"
               Inherits="TRUE"
               Version="0">
    <FieldRefs>
      <FieldRef ID="{437B0ED2-A31B-47F7-8C69-6B9DE2C4A4F6}" Name="MMSCategoriesTaxHTField0"/>
      <FieldRef ID="{087C759A-7BD2-4B66-9CF5-277A3399636D}" Name="MMSCategories"/>
    </FieldRefs>
  </ContentType>
</Elements>

Step 3 - Wire up the Note field to the TaxonomyField using a feature receiver

The TaxonomyField has a member called TextField, with the following remark on the MSDN page…

“Every TaxonomyField object contains a related hidden text field that contains a string representation of the taxonomy field value. The hidden text field is identified by the GUID returned by this property.”

…so as well as defining the TaxonomyField we also need to define something to store the string representation.

   1:  public override void FeatureActivated(SPFeatureReceiverProperties properties)
   2:  {
   3:      SPSite site = properties.Feature.Parent as SPSite;
   4:      Guid fieldId = new Guid("{087C759A-7BD2-4B66-9CF5-277A3399636D}");
   5:      if (site.RootWeb.Fields.Contains(fieldId))
   6:      {
   7:          TaxonomySession session = new TaxonomySession(site);
   8:  
   9:          if (session.TermStores.Count != 0)
  10:          {
  11:              var termStore = session.TermStores["Managed Metadata Service"];
  12:              var group = termStore.Groups["Test Store"];
  13:              var termSet = group.TermSets["Categories"];
  14:              TaxonomyField field = site.RootWeb.Fields[fieldId] as TaxonomyField;
  15:              // Connect to MMS 
  16:              field.SspId = termSet.TermStore.Id;
  17:              field.TermSetId = termSet.Id;
  18:              field.TargetTemplate = string.Empty;
  19:              field.AnchorId = Guid.Empty;
  20:              field.TextField = new Guid("{437B0ED2-A31B-47F7-8C69-6B9DE2C4A4F6}");
  21:              field.Update(true);
  22:          }
  23:      }
  24:  }

Line 20 is the important one; this is the code that wires the Note field created declaratively to the TaxonomyField.

"Program too big to fit in memory" when installing a TFS 2010 Test Controller

Just spent a while battling a problem whilst installing the TFS 2010 Test Controller. When I launched the setup program off the .ISO I could select the Test Controller installer, but then a command prompt flashed up and exited with no obvious error. If I went into the TestControllers directory on the mounted .ISO and ran the setup from a command prompt I saw the error "program too big to fit in memory".

As the box I was trying to use only had 1Gb of memory (below the recommended minimum), I upped it to 2Gb and then to 4Gb but still got the same error.

Turns out the problem was a corrupt .ISO; once I had downloaded it again, and dropped my target VM back to 2Gb of memory, all was fine.

Great iPlayer Media Center Plugin

I am very impressed with the iPlayer Media Center plugin I found on the Australian Media Center Community. I, like most people, found it installed fine but failed to add itself to the start menu. However, this was easily fixed using Media Center Studio once I got my head around its user interface. The basic process is:

  1. Load Media Center Studio
  2. Go onto the Start Menu tab
  3. At the bottom of the screen look for the entry points section (it usually needs expanding)
  4. In here you should find the iPlayer application (it has a reddish play button logo)
  5. Drag it onto whichever part of the start menu you want, but be aware there are some limitations as to where it can go; Extensions worked fine for me
  6. Save the changes
  7. Restart Media Center

Once this is done you should be able to view WMV-based iPlayer content from within Media Center. I have seen it take a while to start buffering content, but other than that it seems to work well and certainly looks the part.