But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Somewhat nasty upgrade experience with Seesmic Desktop 2

A few days ago I got the regular ‘there is a new version of Seesmic’ message.

image

So I pressed update and got the less than helpful message

image

After a quick search on the web I found the log files are in C:\Users\[username]\Documents\Seesmic\Seesmic Desktop 2\Logs – why that cannot be listed on the error dialog I don’t know!

Unfortunately there was nothing obvious in the log, just a list of plug-ins it loaded. So I went back and tried the upgrade again, reading the release notes (I know, a strange idea) and noticed that they mentioned a specific version of Silverlight. A quick check showed I was not up to date, so I went to http://www.microsoft.com/silverlight and ran the installer, which updated my install (I would have expected Windows Update to have done this).

I tried the update again and it failed. I then remembered that IE was running and this could have blocked some of the Silverlight assembly updates, so I closed all browsers down. I also deleted the Seesmic log files, so I could get a clean look at any errors. I reloaded Seesmic and was presented with an updating dialog.

image

And all seems to be working – I’m not sure which step actually fixed it!

Let’s see what the new UI is like to use…

Problems connecting a Netgear WG111 USB WiFi Dongle to a Netgear DG834GT router

Just spent an interesting hour trying to connect a Netgear WG111v2 USB WiFi Dongle to a Netgear (Sky branded) DG834GT router. They are both from the same manufacturer so you would think they would work together!

This router was set up with its default Sky settings, so WiFi was configured for WPA.

I installed the WG111 onto an XP laptop, installed the newly downloaded V5.1.1308 (26 Dec 2007) drivers and tried to connect. The router was spotted without problems and I was prompted to enter my WPA password, which was printed on the bottom of the router (I had logged in to the router via the web admin console to check this was correct). After what seemed like a long delay I was left not connected to the router, but with no obvious error.

I fired up my work laptop, which has built-in WiFi; this saw the router and connected as soon as the password was entered. Strange, I thought – is this an XP or a WG111 problem?

I did a bit of searching and saw this was not an uncommon problem; the WG111 seems to be a troublesome child. In the end I got it working, and this was the process I followed:

  • Via the network connection window in XP I looked at the properties of the WG111
  • On the wireless tab I switched off ‘Use Windows to configure my wireless network settings’.

image

  • This allowed me to open the Netgear wireless assistant tool to get more diagnostics. I saw that the router was running on the same channel as another local router.
  • Via the web-based admin console of the router I changed the channel to a free one, in my case 10 – however, I don’t think this actually fixed the problem.
  • Via the web-based admin console of the router I changed the WiFi mode from ‘b and g’ to ‘g only’ – this is the important one, I think.
  • I saved the changes, rebooted the router and it all worked.
  • Just to tidy up, via the network connection window in XP I went back into the properties of the WG111 and on the wireless tab I switched ‘Use Windows to configure my wireless network settings’ back on.
  • Finally I rebooted the laptop just to check it all still worked – it did.

I suspect the issue here is the WG111 getting confused over whether it is on an 802.11b or 802.11g network, so removing that ambiguity fixed the problem.

TF215097 error when using a custom build activity

Whilst trying to make use of a custom build activity I got the error:

TF215097: An error occurred while initializing a build for build definition \Tfsdemo1\Candy: Cannot create unknown type '{clr-namespace:TfsBuildExtensions.Activities.CodeQuality;assembly=TfsBuildExtensions.Activities.StyleCop}StyleCop'

This occurred when the TFS 2010 build controller tried to parse the build process .XAML at the start of the build. A check of all the logs gave no information other than this error message; nothing else appeared to have happened.

If I removed the custom activity from the build process all was OK and the build worked fine.

So my initial thought was that the required assembly had not been added to source control and the ‘version control path to custom assemblies’ not set. However, on checking, the file was there and the path was set.

What I had forgotten was that this custom activity assembly had a reference to a TfsBuildExtensions.Activities assembly that contained a base class. It was not that the named assembly was missing but that it could not be loaded because a required assembly was missing. Unfortunately there was no clue to this in the error message or logs.

So if you see this problem, check for references you might have forgotten and make sure ALL the required assemblies are checked into the version control path for custom assemblies used by the build controller.
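To illustrate, this is roughly the shape of how such an activity is referenced in a build process template. The xmlns alias is mine and I have left out the activity’s arguments, so treat it as a sketch rather than working XAML – the point is that the namespace declaration only names TfsBuildExtensions.Activities.StyleCop, but creating the type also needs the TfsBuildExtensions.Activities assembly it references:

<Activity xmlns="http://schemas.microsoft.com/netfx/2009/xaml/activities"
          xmlns:tac="clr-namespace:TfsBuildExtensions.Activities.CodeQuality;assembly=TfsBuildExtensions.Activities.StyleCop">
  <!-- The controller can only create this type if TfsBuildExtensions.Activities.StyleCop.dll
       AND the TfsBuildExtensions.Activities.dll it references are both in the
       'version control path to custom assemblies' -->
  <tac:StyleCop />
</Activity>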

0x80004004 when trying to upgrade Live Writer and Messenger

For ages now I have been prompted when I loaded Live Writer that there was an upgrade available, and every time I tried to get it, the install failed at the end and rolled back. As I did not have time to dig into it I just used the older version.

Well today, due to upgrades on our LAN, I needed to upgrade Live Messenger, and as this is part of the same Live Essentials 2011 package it is not surprising I hit the same problem. A bit of experimentation showed the issue was that the upgrade was not able to remove the old version. If I tried to remove it via Control Panel it failed with a 0x80004004 error. In the error log I saw

Product: Windows Live Messenger -- Error 1402. Could not open key: UNKNOWN\Components\A49B6681220C2EA49826913B104EE03B\B55DF58AB1984134795AAE690CDB085B.  System error 5.  Verify that you have sufficient access to that key, or contact your support personnel.

A bit of web research showed this seems to be related to 32/64bit issues and maybe debris from the beta version of Live Writer.

The answer was to use the Windows Clean Up Utility (remember this is a take-no-prisoners tool, so use it with care) and remove all the packages with the words ‘Microsoft’ and ‘Live’ in their names. Once this was done the Live Essentials 2011 installer was happy to do a new install, and it even remembered my blog settings!

Adding a Visual Basic 6 project to a TFS 2010 Build

Adding a Visual Basic 6 project to your TFS 2010 build process is not as hard as I had expected it to be. I had assumed I would have to write a custom build workflow template, but it turned out I was able to use the default template with just a few parameters changed from their defaults. This is the process I followed.

I created a basic ‘Hello world’ VB6 application. I had previously made sure that my copy of VB6 (SP6) could connect to my TFS 2010 server using the Team Foundation Server MSSCCI Provider, so I was able to check this project into source control.

Next I created an MSBuild script capable of building the VB project, as follows:

<Project ToolsVersion="4.0" DefaultTargets="Default" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  
  <PropertyGroup>
    <TPath>C:\Program Files\MSBuild\ExtensionPack\4.0\MSBuild.ExtensionPack.tasks</TPath>
    <TPath Condition="Exists('C:\Program Files (x86)\MSBuild\ExtensionPack\4.0\MSBuild.ExtensionPack.tasks')">C:\Program Files (x86)\MSBuild\ExtensionPack\4.0\MSBuild.ExtensionPack.tasks</TPath>
  </PropertyGroup>
  <Import Project="$(TPath)"/>
 
  <PropertyGroup>
    <VBPath>C:\Program Files\Microsoft Visual Studio\VB98\VB6.exe</VBPath>
    <VBPath Condition="Exists('C:\Program Files (x86)\Microsoft Visual Studio\VB98\VB6.exe')">C:\Program Files (x86)\Microsoft Visual Studio\VB98\VB6.exe</VBPath>
  </PropertyGroup>
 
  <ItemGroup>
    <ProjectsToBuild Include="Project1.vbp">
      <OutDir>$(OutDir)</OutDir>
      <!-- Note the special use of ChgPropVBP metadata to change project properties at Build Time -->
      <ChgPropVBP>RevisionVer=4;CompatibleMode="0"</ChgPropVBP>
    </ProjectsToBuild>
  </ItemGroup>
  <Target Name="Default">
    <!-- Build a collection of VB6 projects -->
    <MSBuild.ExtensionPack.VisualStudio.VB6 TaskAction="Build" Projects="@(ProjectsToBuild)" VB6Path="$(VBPath)"/>
  </Target>
 
  <Target Name="clean">
    <Message Text="Cleaning - this is where the deletes would go"/>
    
  </Target>
  
</Project>

This used the MSBuild Extension Pack task to call VB6 from MSBuild; this MSI needed to be installed on the PC being used for development. Points to note about this script are:

  • I wanted this build to work on both 32bit and 64bit machines, so I had to check both the “Program Files” and “Program Files (x86)” directories; the Condition attribute is useful for this (I could have used an environment variable as an alternative method).
  • The output directory is set to $(OutDir). This is a parameter that will be passed into the MSBuild process (and is in turn set to a Team Build variable by the workflow template so that the build system can find the built files and copy them to the TFS drop directory).

This MSBuild script file can be tested locally on a development PC using MSBUILD.EXE from the .NET Framework directory. When I was happy with the build script, I stored it under source control in the same location as the VB project files (though any location in source control would have done).
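For reference, a local test run from a command prompt might look something like this, assuming the .NET 4 framework directory and passing OutDir explicitly – the drop folder here is just an illustrative local path:

"%windir%\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe" build.xml /p:OutDir=C:\Drops\VB6\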

The next step was to create a new Team Build using the default build template with a workspace containing my VB6 project.

The first thing to edit was the ‘Items to Build’. I deleted whatever was in the list (sorry, can’t remember what was there by default). I then added the build.xml file I had just created and stored in source control.

image

I then tried to run the build; this of course failed as I needed to install VB6 (SP6) and the MSBuild Extension Pack on the build server. Once this was done I tried the build again and it worked. The only issue was I got a warning that there were no assemblies that Code Analysis could be run against. So I went into the build’s parameters and switched off code analysis and testing as these were not required for this build.

So the process of building VB6 on TFS 2010 turned out to be much easier than I expected; it just goes to show how flexible the build system in TFS 2010 is. As long as you can express your build as an MSBuild file it should just work.

You can’t edit a TFS 2010 build workflow template with just Team Explorer installed

I tried to open a TFS 2010 build template within the Visual Studio shell (the bit that gets installed when you put Team Explorer onto a PC) and saw the error “The document contains errors that must be fixed before the designer can be loaded”.

image

At the bottom of the screen it showed that all the underlying assemblies could not be found.

The solution is simple: install a ‘real’ version of Visual Studio – I put on Premium. It seems that the shell does not provide all the assemblies that are needed. Once I did this I could edit the XAML with no problems.

[More] Fun with WCF, SharePoint and Kerberos

This is a follow-up to the post Fun with WCF, SharePoint and Kerberos – well, it looks like fun with hindsight.

When I wrote the last post I thought I had our WCF Kerberos issues sorted; I was wrong. I had not checked what happened when I tried to access the webpart from outside our TMG firewall. When I did this I was back with the error that I had no security token. To sort this we had to make some more changes.

This is the architecture we ended up with.

image

The problem was that the SharePoint access rule used a listener in TMG that was set up for HTML form authentication against our AD

image

and the rule then tried to authenticate against our SharePoint server via Kerberos using the negotiated setting in the rule. This worked for accessing the SharePoint site itself, but the second hop to the WCF service failed. This was due to us transitioning between authentication methods.

The solution was to change the access rule to Constrained Kerberos (still with the same SharePoint server web application SPN)

image

The TMG gateway computer (in AD) then needed to be set to allow delegation. In my previous post we had just set up any machines requiring delegation with ‘Trust this computer for delegation to any service’. This did not work this time as we had forms authentication in the mix. We had to use ‘Trust this computer for delegation to specified services only’ AND ‘Use any authentication protocol’. We then added the server hosting the WCF web service and the SharePoint front end into the list of services that could be delegated to.

image

So now we had it so that the firewall could delegate to the SharePoint server SPN, but this was the wrong SPN for the webpart to use when trying to talk to the WCF web service. To address this final problem I had to explicitly set the SPN in the programmatic creation of the WCF endpoint:

this.callServiceClient = new CallService.CallsServiceClient(
    callServiceBinding, 
    new EndpointAddress(new Uri("http://mywcfbox:8080/CallsService.svc"), EndpointIdentity.CreateSpnIdentity("http/mywcfbox:8080")));

By doing this, a different SPN is used to connect to the WCF web service (from inside the webpart hosted in SharePoint) from the one used by the firewall to connect to the SharePoint server itself.

Simple isn’t it! The key is that you never authenticated with the firewall using Kerberos, so it could not delegate what it did not have.
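As an aside, that http/mywcfbox:8080 SPN also has to be registered in AD against the account the WCF service runs as, if it is not already. Something along these lines would do it – the domain and service account names here are purely illustrative:

setspn -A http/mywcfbox:8080 MYDOMAIN\wcfserviceaccount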

Error –4002 on Access Services on SharePoint 2010

We have had an internal timesheeting system written in Access Services running without any problems for the past few months. At the end of last week, when people tried to submit their timesheets they started getting a -4002 error saying the macro (that saves the weekly sheet) could not be started.

Checking the server event logs, SharePoint logs and Access Services log tables showed nothing. So, as all good IT staff do, we tried the traditional IISRESET command (on both our SharePoint web servers) and it all leapt back into life. The only change on our servers in the past week has been the ASP.NET security fix, and the associated reboot, but I cannot see why this should affect Access Services; it looks as if Access Services just failed to restart fully after the server reboot.

One to keep an eye on.

Experiences running multiple instances of 2010 build service on a single VM

I think my biggest issue with TFS 2010 is that a build controller is tied to a single Team Project Collection (TPC). For a company like mine, where we run a TPC for each client, this means we have had to create a good number of virtualised build controllers/agents. It is especially irritating as I know that the volume of builds on any given controller is low.

A while ago Jim Lamb blogged about how you could define multiple build services on a single box, but the post was full of caveats on how it was not supported/recommended etc. Well, since that post there has been some discussion of this technique, and I think the general feeling is: yes, it is not supported, but there is no reason it will not function perfectly well as long as you consider some basic limitations:

  1. The two build controllers don’t know about each other, so you can easily have two builds running at the same time; this will have an unpredictable effect on performance.
  2. You have to make sure that the two instances don’t share any workspace disk locations, else they will potentially start overwriting each other.
  3. Remember building code is usually IO bound not CPU bound, so when creating your build system think a lot about the disk; throwing memory and CPU at it will have little effect. The fact we run our build services on VMs and these use a SAN should mitigate much of this potential issue.
  4. The default when you install a controller/agent on a box is for one agent to be created for each core on the box. This rule is still a good idea, but if you are installing two controller/agent sets on a box make sure you don’t define more agents than cores (for me this means my build VM has 2 virtual CPUs as I am running 2 controller/agent pairs).

Jim’s instructions are straightforward, but I did hit a couple of snags:

  • When you enter the command line to create the instance, make sure there are spaces after the equals signs for the parameters, else you get an error

sc.exe create buildMachine-collection2 binpath= "C:\Program Files\Microsoft Team Foundation Server 2010\Tools\TfsBuildServiceHost.exe /NamedInstance:buildMachine-collection2" DisplayName= "Visual Studio Team Foundation Build Service Host (Collection2)"

  • I cannot stress enough how important it is to give the new instances sensible names, especially as their numbers grow. Jim suggested naming them after the TPC they service; for me this is a bad move as at any given time we are working for a fairly small number of clients, but the list changes as projects start and stop. It is therefore easier for me to name a controller for the machine it is hosted on, as controllers will be reassigned between TPCs based on need. So I settled on names in the form ‘build1-collection2’, not TPC-based ones. These are easy to associate with the VMs in use when you see them in VS2010.
  • When I first tried to get this all up and running and launched the admin console from the command prompt, I got the error shown below

image

After a bit of retyping this went away. I think it was down to stray spaces at the end of the SET variable, but I’m not 100% sure about this. I would just make sure your strings match if you see this problem.

[Updated 26 Nov 2010] The batch file to start the management console is in the form:

      set TFSBUILDSERVICEHOST=buildMachine-collection2 
      "C:\Program Files\Microsoft Team Foundation Server 2010\Tools\tfsmgmt.exe"

Make sure that you run this batch file as administrator (right click, Run as administrator); if you don’t, the management console picks up the default instance.

  • Also it is a good idea to go into the PC’s Services console and make sure your new build service instance is set to auto start, to avoid surprises on a reboot (see the command sketch below).
  • When you configure the new instance make sure you alter the port it runs on (red box below); I am just incrementing it for each new instance, e.g. 9191 –> 9192. If you don’t alter this the service will not start as its endpoint will already be in use.
  • Also remember to set the identity the build service runs as (green box), usually [Domain]\TFSBuild – too easy to forget as you click through the create dialogs.

image

Once this is set you can start the service and configure the controller and agent(s) exactly as normal.
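On the auto start point mentioned above, one way to set this from an elevated command prompt is something like the following, using the instance name created earlier (note sc.exe again wants the space after the equals):

sc.exe config buildMachine-collection2 start= auto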

You might want to consider how the workspaces are mapped for your multiple controllers, so that you use different root directories, but that is your call. Thus far, leaving it all as it was when I was using a separate VM for each build is working fine for me.

We shall see how many services I can put onto a single VM, but it is certainly something I don’t want to push too hard. That said, if you are like us with a relatively low load on the build system, this has to be worth looking at to avoid a proliferation of build VMs.

Stupid mistake over JavaScript parameters

I have been using the Google Maps JavaScript API today. I lost too much time over a really stupid error. I was trying to set the zoom level on a map using the call

map.setZoom(<number>);

I had set my initial zoom level to 5 (the scale is 1-17 I think) in the map load. When I called setZoom with 11 all was fine, but if I set it to any other number it reverted to 5. This different behaviour for different numbers was a real red herring. The problem was down to how I was handling the variable containing the zoom level prior to passing it to the setZoom method. When it was set to 11 it was set explicitly, e.g.

var zoomNumber = 11;

However, when it was any other value it was being pulled from the value property of a combo box, so it was actually a string. My problem was that setZoom does not return an error if you pass in something it does not understand; it just reverts to its initial value.

The solution was simple: parse the value to an integer and it works as expected

map.setZoom(parseInt(zoomNumber, 10));
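To make the combo box case explicit, here is a minimal sketch of the pattern (the element id is hypothetical):

// the value property of a form element is always a string, so convert it before calling setZoom
var zoomNumber = parseInt(document.getElementById('zoomLevel').value, 10);
map.setZoom(zoomNumber);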