But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Problems connecting a Netgear WG111 USB Wifi Dongle to a Netgear DG834GT router

Just spent an interesting hour trying to connect a Netgear WG111v2 USB WiFi dongle to a Netgear (Sky branded) DG834GT router. They are both from the same manufacturer, so you would think they would work together!

This router was set up with its default Sky settings, so the WiFi was configured for WPA.

I installed the WG111 onto an XP laptop with the newly downloaded v5.1.1308 (26 Dec 2007) drivers and tried to connect. The router was spotted without problems and I was prompted to enter my WPA password, which was printed on the bottom of the router (I had logged in to the router via the web admin console to check this was correct). After what seemed like a long delay I was left not connected to the router, but with no obvious error.

I fired up my work laptop, which has built-in WiFi; this saw the router and connected as soon as the password was entered. Strange, I thought: is this an XP or a WG111 problem?

I did a bit of searching and saw this was not an uncommon problem; the WG111 seems to be a troublesome child. In the end I got it working. This was the process I followed:

  • Via the network connection window in XP I looked at the properties of the WG111.
  • On the wireless tab I switched off ‘Use Windows to configure my wireless network settings’.
  • This allowed me to open the Netgear wireless assistant tool to get more diagnostics. I saw that the router was running on the same channel as another local router.
  • Via the web-based admin console of the router I changed the channel to a free one, in my case 10. However, I don’t think this actually fixed the problem.
  • Via the web-based admin console of the router I changed the WiFi mode from ‘b and g’ to ‘g only’. This, I think, is the important one.
  • I saved the changes, rebooted the router, and it all worked.
  • Just to tidy up, via the network connection window in XP I went back into the properties of the WG111 and on the wireless tab switched ‘Use Windows to configure my wireless network settings’ back on.
  • Finally I rebooted the laptop just to check it all still worked; it did.

I suspect the issue here is the WG111 getting confused over whether it is on an 802.11b or 802.11g network, so removing that ambiguity fixed the problem.

TF215097 error when using a custom build activity

Whilst trying to make use of a custom build activity I got the error:

TF215097: An error occurred while initializing a build for build definition \Tfsdemo1\Candy: Cannot create unknown type '{clr-namespace:TfsBuildExtensions.Activities.CodeQuality;assembly=TfsBuildExtensions.Activities.StyleCop}StyleCop'

This occurred when the TFS 2010 build controller tried to parse the build process .XAML file at the start of the build. A check of all the logs gave no information other than this error message; nothing else appeared to have occurred.

If I removed the custom activity from the build process all was OK and the build worked fine.

So my initial thought was that the required assembly had not been checked into source control, or that the ‘version control path to custom assemblies’ had not been set. However, on checking, the file was there and the path was set.

What I had forgotten was that this custom activity assembly had a reference to a TfsBuildExtensions.Activities assembly that contained a base class. It was not that the named assembly was missing but that it could not be loaded because a required assembly was missing. Unfortunately there was no clue to this in the error message or logs.

So if you see this problem, check for references you might have forgotten, and make sure ALL the required assemblies are checked into source control under the custom assemblies path used by the build controller.

0x80004004 when trying to upgrade Live Writer and Messenger

For ages now I have been prompted, when I loaded Live Writer, that there was an upgrade available, and every time I tried to get it the install failed at the end and rolled back. As I did not have time to dig into it I just used the older version.

Well, today, due to upgrades on our LAN, I needed to upgrade Live Messenger, and as this is part of the same Live Essentials 2011 package it is not surprising I hit the same problem. A bit of experimentation showed the issue was that the upgrade was not able to remove the old version. If I tried to remove it via Control Panel it failed with a 0x80004004 error. In the error log I saw:

Product: Windows Live Messenger -- Error 1402. Could not open key: UNKNOWN\Components\A49B6681220C2EA49826913B104EE03B\B55DF58AB1984134795AAE690CDB085B.  System error 5.  Verify that you have sufficient access to that key, or contact your support personnel.

A bit of web research showed this seems to be related to 32/64-bit issues and maybe debris from the beta version of Live Writer.

The answer was to use the Windows Installer Clean Up Utility (remember this is a take-no-prisoners tool, so use it with care) and remove all the packages with the words ‘Microsoft’ and ‘Live’ in their names. Once this was done the Live Essentials 2011 installer was happy to do a new install, and it even remembered my blog settings!

Adding a Visual Basic 6 project to a TFS 2010 Build

Adding a Visual Basic 6 project to your TFS 2010 build process is not as hard as I had expected it to be. I had assumed I would have to write a custom build workflow template, but it turned out I was able to use the default template with just a few parameters changed from their defaults. This is the process I followed.

I created a basic ‘Hello world’ VB6 application. I had previously made sure that my copy of VB6 (SP6) could connect to my TFS 2010 server using the Team Foundation Server MSSCCI Provider, so I was able to check this project into source control.

Next I created an MSBuild script capable of building the VB6 project, as follows:

<Project ToolsVersion="4.0" DefaultTargets="Default" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <TPath>C:\Program Files\MSBuild\ExtensionPack\4.0\MSBuild.ExtensionPack.tasks</TPath>
    <TPath Condition="Exists('C:\Program Files (x86)\MSBuild\ExtensionPack\4.0\MSBuild.ExtensionPack.tasks')">C:\Program Files (x86)\MSBuild\ExtensionPack\4.0\MSBuild.ExtensionPack.tasks</TPath>
  </PropertyGroup>
  <Import Project="$(TPath)"/>
  <PropertyGroup>
    <VBPath>C:\Program Files\Microsoft Visual Studio\VB98\VB6.exe</VBPath>
    <VBPath Condition="Exists('C:\Program Files (x86)\Microsoft Visual Studio\VB98\VB6.exe')">C:\Program Files (x86)\Microsoft Visual Studio\VB98\VB6.exe</VBPath>
  </PropertyGroup>
  <ItemGroup>
    <ProjectsToBuild Include="Project1.vbp">
      <!-- Note the special use of ChgPropVBP metadata to change project properties at build time -->
      <ChgPropVBP>OutDir=$(OutDir)</ChgPropVBP>
    </ProjectsToBuild>
  </ItemGroup>
  <Target Name="Default">
    <!-- Build a collection of VB6 projects -->
    <MSBuild.ExtensionPack.VisualStudio.VB6 TaskAction="Build" Projects="@(ProjectsToBuild)" VB6Path="$(VBPath)"/>
  </Target>
  <Target Name="Clean">
    <Message Text="Cleaning - this is where the deletes would go"/>
  </Target>
</Project>

This uses the MSBuild Extension Pack task to call VB6 from MSBuild; the Extension Pack MSI needed to be installed on the PC being used for development. Points to note about this script:

  • I wanted this build to work on both 32-bit and 64-bit machines, so I had to check both the “Program Files” and “Program Files (x86)” directories; the Condition attribute is useful for this (I could have used an environment variable as an alternative method).
  • The output directory is set to $(OutDir). This is a parameter that will be passed into the MSBuild process (and is in turn set to a Team Build variable by the workflow template so that the build system can find the built files and copy them to the TFS drop directory).

This MSBuild script file can be tested locally on a development PC using MSBUILD.EXE from the .NET Framework directory. When I was happy with the build script, I stored it under source control in the same location as the VB project files (though any location in source control would have done).
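For example, a local test run might look something like the following command line (the framework path and drop directory shown here are illustrative and will vary by machine):

```
C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe build.xml /t:Default /p:OutDir=C:\Drops\VB6Build\
```

Passing OutDir on the command line mimics what Team Build does when it invokes the script, so you can check the built files end up where the build system will look for them.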

The next step was to create a new Team Build using the default build template with a workspace containing my VB6 project.

The first thing to edit was the ‘Items to Build’. I deleted whatever was in the list (sorry, I can’t remember what was there by default), then added the build.xml file I had just created and stored in source control.


I then tried to run the build; this of course failed, as I needed to install VB6 (SP6) and the MSBuild Extension Pack on the build server. Once this was done I tried the build again and it worked. The only issue was a warning that there were no assemblies that Code Analysis could be run against, so I went into the build’s parameters and switched off code analysis and testing, as these were not required on this build.

So the process of building VB6 on TFS 2010 turned out to be much easier than I expected; it just goes to show how flexible the build system in TFS 2010 is. As long as you can express your build as an MSBuild file it should just work.

You can’t edit a TFS 2010 build workflow template with just Team Explorer installed

I tried to open a TFS 2010 build template within the Visual Studio shell (the bit that gets installed when you put Team Explorer onto a PC) and saw the error “The document contains errors that must be fixed before the designer can be loaded”.


At the bottom of the screen it showed that all the underlying assemblies could not be found.

The solution is simple: install a ‘real’ version of Visual Studio (I put on Premium). It seems that the shell does not provide all the assemblies that are needed. Once I did this I could edit the XAML with no problems.

[More] Fun with WCF, SharePoint and Kerberos

This is a follow-up to the post Fun with WCF, SharePoint and Kerberos; well, it looks like fun with hindsight.

When I wrote the last post I thought I had our WCF Kerberos issues sorted; I was wrong. I had not checked what happened when I tried to access the webpart from outside our TMG firewall. When I did this I was back with the error that I had no security token. To sort this we had to make some more changes.

This is the architecture we ended up with.


The problem was that the SharePoint access rule used a listener in TMG that was set up for HTML forms authentication against our AD


and the rule then tried to authenticate to our SharePoint server via Kerberos using the negotiated setting in the rule. This worked for accessing the SharePoint site itself, but the second hop to the WCF service failed. This was due to the transition between authentication methods.

The solution was to change the access rule to use Kerberos constrained delegation (still with the same SharePoint server web application SPN)


The TMG gateway computer account (in AD) then needed to be set to allow delegation. In my previous post we had just set any machines requiring delegation to ‘Trust this computer for delegation to any service’. This did not work this time, as we had forms authentication in the mix. We had to use ‘Trust this computer for delegation to specified services only’ AND ‘Use any authentication protocol’. We then added the server hosting the WCF web service and the SharePoint front end to the list of services that could be delegated to.


So now we had it so that the firewall could delegate to the SharePoint server SPN, but this was the wrong SPN for the webpart to use when trying to talk to the WCF web service. To address this final problem I had to explicitly set the SPN in the programmatic creation of the WCF endpoint:

this.callServiceClient = new CallService.CallsServiceClient(
    new EndpointAddress(new Uri("http://mywcfbox:8080/CallsService.svc"), EndpointIdentity.CreateSpnIdentity("http/mywcfbox:8080")));

By doing this, a different SPN is used to connect to the WCF web service (from inside the webpart hosted in SharePoint) from the one used by the firewall to connect to the SharePoint server itself.
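Incidentally, the same identity can be declared in configuration rather than code. A sketch of the equivalent client endpoint entry (the binding and contract names here are illustrative, not taken from the actual project) would sit in the <system.serviceModel> section and look something like this:

```
<client>
  <endpoint address="http://mywcfbox:8080/CallsService.svc"
            binding="wsHttpBinding"
            contract="CallService.ICallsService">
    <!-- Explicit SPN, so the client requests a Kerberos ticket for the
         WCF service itself rather than for the SPN the firewall used -->
    <identity>
      <servicePrincipalName value="http/mywcfbox:8080" />
    </identity>
  </endpoint>
</client>
```

The programmatic form is handy here because the webpart builds its endpoint at runtime, but the config form is worth knowing if you deploy the same code into environments with different SPNs.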

Simple, isn’t it! The key is that you never authenticated with the firewall using Kerberos, so it could not delegate what it did not have.

Error –4002 on Access services on Sharepoint 2010

We have had an internal timesheeting system written in Access Services running without any problems for the past few months. At the end of last week, when people tried to submit their timesheets, they started getting a -4002 error saying the macro (that saves the weekly sheet) could not be started.

Checking the server event logs, SharePoint logs and Access Services log tables showed nothing. So, as all good IT staff do, we tried the traditional IISRESET command (on both our SharePoint web servers) and it all leapt back into life. The only change on our servers in the past week has been the ASP.NET security fix, and the associated reboot, but I cannot see why this should affect Access Services; it looks as if Access Services just failed to restart fully after the server reboot.

One to keep an eye on.

Experiences running multiple instances of 2010 build service on a single VM

I think my biggest issue with TFS 2010 is that a build controller is tied to a single Team Project Collection (TPC). For a company like mine, where we run a TPC for each client, this means we have had to create a good number of virtualised build controllers/agents. It is especially irritating as I know that the volume of builds on any given controller is low.

A while ago Jim Lamb blogged about how you could define multiple build services on a single box, but the post was full of caveats on how it was not supported/recommended etc. Since that post there has been some discussion of this technique, and I think the general feeling is: yes, it is not supported, but there is no reason it will not function perfectly well, as long as you consider some basic limitations:

  1. The two build controllers don’t know about each other, so you can easily have two builds running at the same time; this will have an unpredictable effect on performance.
  2. You have to make sure that the two instances don’t share any workspace disk locations, else they will potentially start overwriting each other.
  3. Remember that building code is usually IO-bound, not CPU-bound, so when creating your build system think a lot about the disk; throwing memory and CPU at it will have little effect. The fact that we run our build services on VMs, and that these use a SAN, should mitigate much of this potential issue.
  4. The default when you install a controller/agent on a box is for one agent to be created for each core. This rule is still a good idea, but if you are installing two controller/agent sets on a box, make sure you don’t define more agents than cores (for me this means my build VM has to have 2 virtual CPUs, as I am running 2 controller/agent pairs).

Jim’s instructions are straightforward, but I did hit a couple of snags:

  • When you enter the command line to create the instance, make sure there are spaces after the equals signs for the parameters, else you get an error:

sc.exe create buildMachine-collection2 binpath= "C:\Program Files\Microsoft Team Foundation Server 2010\Tools\TfsBuildServiceHost.exe /NamedInstance:buildMachine-collection2" DisplayName= "Visual Studio Team Foundation Build Service Host (Collection2)"

  • I cannot stress enough how important it is to give the new instances sensible names, especially as their numbers grow. Jim suggested naming after the TPC they service; for me this is a bad move, as at any given time we are working for a fairly small number of clients, but the list changes as projects start and stop. It is therefore easier for me to name a controller for the machine it is hosted on, as controllers will be reassigned between TPCs based on need. So I settled on names in the form ‘build1-collection2’, not TPC-based ones. These are easy to associate with the VMs in use when you see them in VS2010.
  • When I first tried to get this all up and running, and launched the admin console from the command prompt, I got the error shown below


After a bit of retyping this went away. I think it was down to stray spaces at the end of the SET variable, but I am not 100% sure about this. I would just make sure your strings match if you see this problem.

[Updated 26 Nov 2010] The batch file to start the management console is in the form

      set TFSBUILDSERVICEHOST=buildMachine-collection2 
      "C:\Program Files\Microsoft Team Foundation Server 2010\Tools\tfsmgmt.exe"

Make sure that you run this batch file as administrator (right click, run as admin); if you don't, the management console picks up the default instance.

  • Also it is a good idea to go into the PC’s services and make sure your new build service instance is set to auto start, to avoid surprises on a reboot.
  • When you configure the new instance, make sure you alter the port it runs on (red box below); I am just incrementing it for each new instance, e.g. 9191 –> 9192. If you don’t alter this, the service will not start, as its endpoint will already be in use.
  • Also remember to set the identity the build service runs as (green box), usually [Domain]\TFSBuild; this is too easy to forget as you click through the create dialogs.


Once this is set you can start the service and configure the controller and agent(s) exactly as normal.

You might want to consider how the workspaces are mapped for your multiple controllers, so that you use different root directories; but that is your call. Thus far, leaving it all as it was when I was using a separate VM for each build is working fine for me.

We shall see how many services I can put onto a single VM, but it is certainly something I don’t want to push too hard. That said, if you are like us, with a relatively low load on the build system, this has to be worth looking at to avoid a proliferation of build VMs.

Stupid mistake over Javascript parameters

I have been using the Google Maps JavaScript API today and lost too much time over a really stupid error. I was trying to set the zoom level on a map using the setZoom call.

I had set my initial zoom level to 5 (the scale is 1 to 17, I think) in the map load. When I called setZoom with 11 all was fine, but if I set it to any other number it reverted to 5. This different effect for different numbers was a real red herring. The problem was down to how I was handling the variable containing the zoom level prior to passing it to the setZoom method. When it was set to 11 it was set explicitly, e.g.

var zoomNumber = 11;

However, when it was any other value it was being pulled from the value property of a combo box, so was actually a string. My problem was that setZoom does not return an error if you pass in something it does not understand; it just reverts to its initial value.

The solution was simple: parse the value to an integer and it works as expected.
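A minimal sketch of the fix (the variable names and the combo box are illustrative; the Google Maps map object is assumed to exist in the page):

```javascript
// The value property of a combo box is always a string (e.g. "9"),
// and setZoom silently reverts to the initial zoom level when it is
// given a value it does not understand.
var zoomValue = "9"; // e.g. document.getElementById('zoomLevel').value
var zoomNumber = parseInt(zoomValue, 10); // convert the string to a number
// map.setZoom(zoomNumber); // now receives the number 9, not the string "9"
```

Always pass the radix (the 10 here) to parseInt, so strings with leading zeros are not misread.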


Problem faking multiple SPLists with Typemock Isolator in a single test

I have found a problem with repeated calls to indexed SharePoint lists with Typemock Isolator 6.0.3. This is what I am trying to do…

The Problem

I am using Typemock Isolator to allow me to develop a SharePoint webpart outside of the SharePoint environment (there is a video about this on the Typemock site). My SharePoint webpart uses data drawn from a pair of SharePoint lists to draw a map using the Google Maps API; so in my test harness web site page I have the following code in the constructor, which fakes out the two SPLists and populates them with test content.


public partial class TestPage : System.Web.UI.Page
{
    public TestPage()
    {
        var fakeWeb = Isolate.Fake.Instance<SPWeb>();
        Isolate.WhenCalled(() => SPControl.GetContextWeb(null)).WillReturn(fakeWeb);

        // return value for 1st call
        Isolate.WhenCalled(() => fakeWeb.Lists["Centre Locations"].Items).WillReturnCollectionValuesOf(CreateCentreList());
        // return value for all other calls
        Isolate.WhenCalled(() => fakeWeb.Lists["Map Zoom Areas"].Items).WillReturnCollectionValuesOf(CreateZoomAreaList());
    }

    private static List<SPListItem> CreateZoomAreaList()
    {
        var fakeZoomAreas = new List<SPListItem>();
        fakeZoomAreas.Add(CreateZoomAreaSPListItem("London", 51.49275, -0.137722222, 2, 14));
        return fakeZoomAreas;
    }

    private static List<SPListItem> CreateCentreList()
    {
        var fakeSites = new List<SPListItem>();
        fakeSites.Add(CreateCentreSPListItem("Aberdeen", "1 The Road, Aberdeen", "Aberdeen@test.com", "www.Aberdeen.test.com", "1111", "2222", 57.13994444, -2.113333333));
        fakeSites.Add(CreateCentreSPListItem("Altrincham", "1 The Road, Altrincham", "Altrincham@test.com", "www.Altrincham.test.com", "3333", "4444", 53.38977778, -2.349916667));
        return fakeSites;
    }

    private static SPListItem CreateCentreSPListItem(string title, string address, string email, string url, string telephone, string fax, double lat, double lng)
    {
        var fakeItem = Isolate.Fake.Instance<SPListItem>();
        Isolate.WhenCalled(() => fakeItem["Title"]).WillReturn(title);
        Isolate.WhenCalled(() => fakeItem["Address"]).WillReturn(address);
        Isolate.WhenCalled(() => fakeItem["Email Address"]).WillReturn(email);
        Isolate.WhenCalled(() => fakeItem["Site URL"]).WillReturn(url);
        Isolate.WhenCalled(() => fakeItem["Telephone"]).WillReturn(telephone);
        Isolate.WhenCalled(() => fakeItem["Fax"]).WillReturn(fax);
        Isolate.WhenCalled(() => fakeItem["Latitude"]).WillReturn(lat.ToString());
        Isolate.WhenCalled(() => fakeItem["Longitude"]).WillReturn(lng.ToString());
        return fakeItem;
    }

    private static SPListItem CreateZoomAreaSPListItem(string areaName, double lat, double lng, double radius, int zoom)
    {
        var fakeItem = Isolate.Fake.Instance<SPListItem>();
        Isolate.WhenCalled(() => fakeItem["Title"]).WillReturn(areaName);
        Isolate.WhenCalled(() => fakeItem["Latitude"]).WillReturn(lat.ToString());
        Isolate.WhenCalled(() => fakeItem["Longitude"]).WillReturn(lng.ToString());
        Isolate.WhenCalled(() => fakeItem["Radius"]).WillReturn(radius.ToString());
        Isolate.WhenCalled(() => fakeItem["Zoom"]).WillReturn(zoom.ToString());
        return fakeItem;
    }
}

The problem is that if I place the following logic in my Webpart

SPWeb web = SPControl.GetContextWeb(Context);
Debug.WriteLine(web.Lists["Centre Locations"].Items.Count);
Debug.WriteLine(web.Lists["Map Zoom Areas"].Items.Count);

I would expect this code to return 2 and then 1 (the two centre items, then the single zoom area item). But I get 1 for both lists, and if I reverse the two Isolate.WhenCalled lines in the constructor I get 2 for both.
So basically only the last Isolate.WhenCalled is being used; this is not what I expect from the Typemock documentation, which states that, worst case, the first Isolate.WhenCalled should be used for the first call and the second for all subsequent calls, and that the index string should be used to differentiate the calls anyway. This is obviously not working. I also tried using null in place of both the index strings and got the same result.

A Workaround

I have managed to work around this problem with a refactor of my code. In my web part I moved all the SPList logic into a pair of methods:

private List<GISPoint> LoadFixedMarkersFromSharepoint(SPWeb web, string listName)
{
    var points = new List<GISPoint>();
    foreach (SPListItem listItem in web.Lists[listName].Items)
    {
        points.Add(new GISPoint(
            listItem["title"],
            listItem["address"],
            listItem["email address"],
            listItem["site Url"],
            listItem["telephone"],
            listItem["fax"],
            listItem["latitude"],
            listItem["longitude"]));
    }
    return points;
}

private List<ZoomArea> LoadZoomAreasFromSharepoint(SPWeb web, string listName)
{
    var points = new List<ZoomArea>();
    foreach (SPListItem listItem in web.Lists[listName].Items)
    {
        points.Add(new ZoomArea(
            listItem["title"],
            listItem["latitude"],
            listItem["longitude"],
            listItem["radius"],
            listItem["zoom"]));
    }
    return points;
}

I then used Isolator to intercept the calls to these methods. This can be done by using the Members.CallOriginal flag to wrap the actual class and intercept the calls to the private methods. Note that I am using different helper methods, which create lists of my own data objects as opposed to List<SPListItem>:

var controlWrapper = Isolate.Fake.Instance<LocationMap>(Members.CallOriginal);
Isolate.Swap.NextInstance<LocationMap>().With(controlWrapper);

Isolate.NonPublic.WhenCalled(controlWrapper, "LoadFixedMarkersFromSharepoint").WillReturn(CreateCentreListAsGISPoint());
Isolate.NonPublic.WhenCalled(controlWrapper, "LoadZoomAreasFromSharepoint").WillReturn(CreateZoomAreaListAsZoomItems());

My workaround is, in my opinion, a weaker test, as I am not testing my conversion of SPListItems to my internal data types, but at least it works.

I have had to go down this route due to a bug in Typemock Isolator (which has been logged and recreated by Typemock, so I am sure we can expect a fix soon). However, it does show how powerful Isolator can be when you have restrictions on the changes you can make to a code base. Wrapping a class with Isolator can open up a whole range of options.