But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Every home should have one

Scientific American has an interesting article on a data center in a box. Basically it is a standard shipping container full of Sun servers. I am sure our IT manager would like one! It should fit on any house's driveway, a stylish addition to any home.

It is an interesting spin on the old IBM data recovery model, where they turned up with a duplicate of your system in a truck in the event of a major failure. The key change is that it is being sold as a much cheaper option than a traditional data center. All it needs is power, a water supply (for cooling) and, I suppose given our current weather, a location not prone to flooding.

How books date

I have been having a phase of re-reading novels I read years ago; this time it is Neuromancer by William Gibson (published 1984). This choice was triggered by it being reviewed on BBC Radio 4's A Good Read, at the suggestion of Bill Thompson, the BBC technology columnist.

I have been surprised by how little it seems to have dated. OK, in places the sizes of data streams and the like seem small (how fast capacity moves on), and maybe the way cyberspace was visualized in Neal Stephenson's Snow Crash (published 1992) was nearer the mark of today's virtual worlds, but on the whole it seems just as believable as I remember it being when I first read it. Still a worthwhile, enjoyable read that has not aged in the way so much Sci-Fi does.

However, when it comes to books dating it seems that history books suffer the most. Currently I am reading Accidental Empires: How the Boys of Silicon Valley Make Their Millions, Battle Foreign Competition and Still Can't Get a Date, a really interesting book that is making me feel old, as I have used many of the systems it talks about; we had a single Tandy TRS-80 at school (it replaced our 5-hole punched paper tape teletype!) and my first IT job was for a PC dealership that sold the IBM PC-AT in the mid 80s.

Accidental Empires was written in the mid 90s (the 2nd edition I have been reading was published in 1996) and I think it shows. In the past 10 years the industry has moved on so much, particularly Microsoft's dominance, Bill Gates' role in Microsoft (and as a philanthropist), not to mention the resurrection of Steve Jobs and Apple.

So what does this teach us? Books often say more about their time of publication than the subject they purport to cover, whether that is history or the future.

TFS Webpart in MOSS2007


I said I would post again when I had something to show in my TFS WorkItem WebParts project; well, here it is. I have created a pair of WebParts, one to list all the work items and the other to display the details of a given item.

For testing I have created a single ASP.NET 2.0 web page that implements a WebPartManager. This needs to be set up to allow a connection between the two WebParts. Note: if you use this sample you will probably have to recreate the connection between the web parts on the test page. Click the edit option (drop-down at the top right) of either WebPart and use the connect option. When running it should look like this.


I had tried to use an ASP:Repeater control to display the list of work items. I added a LinkButton to the first column to allow an item to be selected for display. The problem was that this button disappeared on a postback, a viewstate problem I could not resolve. I therefore swapped from a Repeater to a DataGrid, which did not suffer this problem as it has a specific ButtonColumn type. However, I left the Repeater code in the project as it is a nice worked example of how to programmatically add a Repeater, and I also think it is potentially more flexible (if it worked!)
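The DataGrid swap can be sketched roughly as below; this is not the project's actual code, just a minimal illustration with the grid built in code as the Repeater was, and the column bindings and event handler name are my own inventions:

```csharp
// Sketch: inside the list WebPart's CreateChildControls override.
// Uses System.Web.UI.WebControls; "Id"/"Title" bindings and the
// Grid_ItemCommand handler are illustrative names only.
protected override void CreateChildControls()
{
    DataGrid grid = new DataGrid();
    grid.AutoGenerateColumns = false;

    // A ButtonColumn's LinkButton survives postback, unlike the
    // LinkButton I had added to the Repeater's first column
    ButtonColumn selectColumn = new ButtonColumn();
    selectColumn.ButtonType = ButtonColumnType.LinkButton;
    selectColumn.DataTextField = "Id";
    selectColumn.CommandName = "Select";
    grid.Columns.Add(selectColumn);

    BoundColumn titleColumn = new BoundColumn();
    titleColumn.DataField = "Title";
    titleColumn.HeaderText = "Title";
    grid.Columns.Add(titleColumn);

    // Fired when the link in the ButtonColumn is clicked
    grid.ItemCommand += new DataGridCommandEventHandler(Grid_ItemCommand);
    Controls.Add(grid);
}
```

The selected work item's ID can then be read in the ItemCommand handler and passed over the WebPart connection.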

The details WebPart just lists all the fields in the WorkItem; this is because each type of WorkItem can have its own fields. This does not look that great, as there is data the user does not need to see, but without creating a display for each work item type I cannot see another solution.
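The field dump is essentially a loop over the WorkItem's Fields collection; a minimal sketch (not the project's actual code - the method name is mine, and the work item is assumed to arrive over the WebPart connection):

```csharp
// Sketch: render every field of a TFS work item as an HTML table.
// 'item' is assumed to be the WorkItem passed over the WebPart connection.
// Requires Microsoft.TeamFoundation.WorkItemTracking.Client and System.Text.
private string RenderFields(WorkItem item)
{
    StringBuilder html = new StringBuilder("<table>");
    foreach (Field field in item.Fields)
    {
        // Each work item type has its own field set, so just dump them all
        html.AppendFormat("<tr><td>{0}</td><td>{1}</td></tr>",
            field.Name, field.Value);
    }
    html.Append("</table>");
    return html.ToString();
}
```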

Test Harness

If you look in the tester's web.config you will see the TFS section where you can specify the TFS server, UID, domain etc. The server must be filled in (and is shown on the list web part as a reminder of what you set it to), but if the others are blank the user's current authenticated identity in the browser should be used; if the other fields are filled in, these are used instead.

Both these authentication models work fine in the test harness.

SharePoint 2007

After I got the tester working I moved the DLLs onto my test SharePoint 2007 VPC. Now this is not a simple operation, but this VPC already had the changes I specified in my last post, so it was fairly quick to do. In summary the steps are:

  • Put the TFS Client API DLLs in the GAC
  • Edit the SharePoint virtual server web.config to give full error messages, run at full trust, have my DLLs marked as safe, and add the TFS entries from the tester web site.
  • Copy my TFS WebPart DLL to the SharePoint virtual server bin directory
  • Copy the two .webpart files in the project to the SharePoint virtual server wpcatalog directory
  • Add the CSS styles for the controls to the SharePoint CSS file - but if you miss this out they still look OK.

Now my VPC was not in our main domain, so for testing I initially used a hard-coded UID and domain in the web.config. Once this was all done I could add the two WebParts to a page and use the connection option on the edit menu to wire them together, and lo and behold it all worked.

I then added my test MOSS server to our domain, made sure the authentication was set to Windows, and that a valid TFS user account was set as the owner of the site.

I then removed the UID settings from the web.config and tried again; it failed. After debugging I could see that my HttpContext.Current.User.Identity and Microsoft.SharePoint.SPContext.Current.Web.CurrentUser objects were the correct user, but System.Net.CredentialCache.DefaultCredentials always returned what appeared to be blank credentials - a problem, as this is the one used by TFS to authenticate. Unfortunately neither of the others can be cast to the ICredentials interface required for TFS.
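The two authentication paths can be sketched as follows; a minimal illustration only, where the server URL and the NetworkCredential values are placeholders standing in for the web.config entries:

```csharp
// Sketch of the two credential paths, using the TFS 2005 client API
// (Microsoft.TeamFoundation.Client) and System.Net.
// Path 1: explicit credentials from web.config - works, but hard coded.
ICredentials explicitCreds =
    new NetworkCredential("[uid]", "[pwd]", "[domain]");

// Path 2: the identity of the current process/request - inside MOSS this
// came back effectively blank, so TFS authentication failed.
ICredentials defaultCreds = CredentialCache.DefaultCredentials;

// Both SharePoint identities below showed the correct user, but neither
// implements ICredentials, so neither can be handed to the TFS client API:
//   HttpContext.Current.User.Identity
//   Microsoft.SharePoint.SPContext.Current.Web.CurrentUser
TeamFoundationServer tfs =
    new TeamFoundationServer("http://mytfsserver:8080", explicitCreds);
```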

Problem 1: So at this point you are left with hard-coded account details, which is not that useful. The options are to throw up a login form, or have the values set by properties on the WebPart. I intend to continue to try to find a way around this problem.

Problem 2: As I said, my test server is on a VPC and hence 32-bit, but our live MOSS 2007 servers are all running 64-bit. I tried to install my WebParts on our live server, but was stopped, as at this time there are no 64-bit TFS Client DLLs, and for some reason the TFS Client DLLs cannot be loaded under WOW64 by the MOSS server. I am trying to find out when we can expect to see 64-bit DLLs (or a workaround).


So not a perfect job, but a good learning experience for both TFS and MOSS.

You can download the samples here

In theory, to get it running in the test harness you just need to load the solution in VS.NET 2005:

  • Make sure you can reference the TFS DLLs (usually in c:\Program Files\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies)
  • Edit the TeamFoundationServer server entry in the test site web.config to point at your TFS server.
  • As it is a WebPart-based tester, when first run a database will be created in the test site's APP_DATA directory.

I am interested to hear anyone's comments.

Empty Reports with the eScrum Template

After I installed the eScrum template everything seemed fine bar the reports; only the work item report showed anything, and in fact on the other reports even the parameter combo boxes were empty. The key difference is that the work item report is drawn from the SQL DBs, while all the other reports are drawn from the OLAP warehouse cube.

If you connect to the SQL server and run the command

select * from dbo._WarehouseConfig where Id in ('LastProcessTimeStamp','RunIntervalSeconds')

it should tell you when the last update from the SQL DB to the OLAP cube happened. In my case it was days ago, before I put eScrum on the system; hence the empty OLAP-based reports.

So I ran the tool on Eric Lee's blog that allows you to force an update. After checking the TFS server's application event log I saw that this had caused the Warehouse Controller application to start, and an update occurred. Why it had not started before I have no idea.

One to keep an eye on.

Losing all files when moving a project in TFS Source Control


Today I tried to move a VS2005 solution in TFS source control from one source control location to another (on the same server), as I had put it in a stupid location to start with.

So I unbound the source control (File -> Source Control -> Change Source Control), went into Source Control Explorer, deleted the directory and then committed the pending delete. This removed the files in source control BUT ALSO the local ones on my disk, due to the workspace mapping. This was not what I expected; you use source control so you always have a copy, a safety net, and I had just lost all of my copies by using source control!

The immediate fix

So I checked the recycle bin - nothing - so I knew I had to use the TFS Power Tools to sort this out. The command required is

tfpt rollback

The problem here is that you have to be in a directory that is mapped in your workspace so it can poll your TFS server to find projects. So I changed directory to my C:\projects\myproject and ran the tool. It does a get latest against the server, then shows a dialog where you can find the changeset to roll back to. I selected the one with the project deletes and hit go, and it errored. It said

Postponing undelete to wait for the filename C:\projects\myproject no longer be occupied.

It turns out that as the changeset included a directory delete, tfpt must be able to create the directory, which it could not do as it already existed and I was running tfpt.exe from it.

So I had to change directory to one also in the workspace but not part of this solution, and also rename or delete the old directory, and then the command worked OK.
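The working sequence was, in summary, something like the following (the paths are examples only; tfpt.exe comes with the TFS Power Tools):

```shell
rem Run from a mapped folder that is NOT the one being undeleted
cd /d C:\projects\someotherproject

rem Get the old directory out of the way so tfpt can recreate it
ren C:\projects\myproject myproject.old

rem Shows a changeset picker; select the changeset with the deletes
tfpt rollback
```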

Once this was done I was at least back to where I started. I could get all my files, build the project, and check in and out.

How it should be done

The way I should have moved the project, assuming I did not care about the change history, is as follows:

  1. In VS2005 load the solution, then File -> Source Control -> Change Source Control -> unbind the project.
  2. Then select the solution or project in Solution Explorer and File -> Source Control -> Add Selected Projects to Source Control. The key point is that here you are asked if you want to use the old location or a new one. This option is not shown if you do not go via the File menu. Pick a new location.
  3. Once it is moved this way the old branch can be removed. Note that you will have to map the old location to the local disk in the workspace so that the delete options become enabled.

Hope this saves someone a bit of time.

Another stupid way to waste an hour or two.....

When developing a SharePoint add-in, try to remember whether you put a previous version of the DLL in the GAC or not. I have just wasted a couple of hours trying to work out why a new version of a DLL was not being read from a site collection's bin directory, when there was an old version in the GAC.

Of course this would not have been an issue if I had been checking the version number properly! A good argument for pair programming to avoid these stupid blind spots.

First thoughts on using the TFS API inside Sharepoint 2007

I have been looking at writing some SharePoint 2007 (MOSS) web parts to show information about TFS workitems. This turned out to be a bit more complex than expected.

I wrote an ASP.NET 2.0 WebPart using the TFS WorkItem Tracking Web Project on CodeProject as a starting point. My first WebPart just listed work items in a project. This went OK, and the WebPart worked fine in a simple ASP.NET WebPartManager-based test page.
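The core of such a listing can be sketched as below; this is not the project's actual code, just an illustration using the TFS 2005 client API, where the server URL and project name are placeholders:

```csharp
using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

// Sketch only: "http://mytfsserver:8080" and "MyProject" are placeholders.
public class WorkItemLister
{
    public static void ListWorkItems()
    {
        // Connect to the server (credentials come from DefaultCredentials)
        TeamFoundationServer tfs =
            TeamFoundationServerFactory.GetServer("http://mytfsserver:8080");
        WorkItemStore store =
            (WorkItemStore)tfs.GetService(typeof(WorkItemStore));

        // WIQL query for all work items in one team project
        WorkItemCollection items = store.Query(
            "SELECT [System.Id], [System.Title] FROM WorkItems " +
            "WHERE [System.TeamProject] = 'MyProject'");

        foreach (WorkItem item in items)
        {
            Console.WriteLine("{0}: {1}", item.Id, item.Title);
        }
    }
}
```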

Note: work item types are specific to the process template in use. This does mean you have to be careful when rendering data from a work item - the fields you expect might not be there in all types of work item - but that was not my major problem.
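Checking for a field before reading it avoids exceptions across process templates; a small sketch (the field name here is just an example, not one the WebPart necessarily uses):

```csharp
// Sketch: guard against fields that only exist in some process templates.
// 'item' is a Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItem.
if (item.Fields.Contains("Microsoft.VSTS.Common.Rank"))
{
    object rank = item.Fields["Microsoft.VSTS.Common.Rank"].Value;
    // render the value...
}
```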

When I was happy with my first version I tried to load it into my test MOSS VPC, and I had problems. These were the steps I followed to get it working:

  • I copied the mywebpart.DLL and mywebpart.PDB into the C:\Inetpub\wwwroot\wss\VirtualDirectories\8080\bin directory (8080 was the port of my test MOSS server). Note: I later made the copying of these files a post-build event in VS2005, but when I started I expected it to just work - I was young and naive!
  • I copied the mywebpart.webpart file to C:\Inetpub\wwwroot\wss\VirtualDirectories\8080\wpcatalog (I had to create this directory). This file contains the following information that defines how the WebPart is listed in MOSS:

  <webParts>
    <webPart xmlns="http://schemas.microsoft.com/WebPart/v3">
      <metaData>
        <type name="BlackMarble.WebParts.TFS.WorkItemList" />
        <importErrorMessage>Cannot import a TFS Web Part.</importErrorMessage>
      </metaData>
      <data>
        <properties>
          <property name="Title" type="string">TFS Web Part</property>
          <property name="Description" type="string">A webpart that allows presentation of TFS data</property>
          <property name="AllowClose" type="bool">FALSE</property>
        </properties>
      </data>
    </webPart>
  </webParts>

  • I also copied the TFS DLLs that were in my VS2005 project's bin directory to the C:\Inetpub\wwwroot\wss\VirtualDirectories\8080\bin directory
  • When I tried to add the new WebPart to a page, the page refused to render. I knew this was due to the WebPart not being trusted, so I needed to edit the web.config in the C:\Inetpub\wwwroot\wss\VirtualDirectories\8080 directory to have the settings required to allow my WebPart to be accessed:

<!-- In sectionGroups, to allow custom web.config entries -->
<sectionGroup name="TeamFoundationIntegration">
    <section name="TeamFoundationServer" type="MyNameSpace.TeamFoundationConfigurationManager" allowLocation="true" allowDefinition="Everywhere"/>
</sectionGroup>

<!-- In Configuration, the hard coded details are just for initial testing -->
<TeamFoundationServer server="[my server]" userName="[uid]" password="[pwd]" domain="[domain]" project="[project]"></TeamFoundationServer>

<!-- In SafeControls -->
<SafeControl Assembly="BlackMarble.WebParts.TFS, Version=, Culture=neutral, PublicKeyToken=8ce1ffa442ef12af" Namespace="BlackMarble.WebParts.TFS" TypeName="*" Safe="True" />

<!-- I edited the following to get error messages -->
<customErrors mode="Off" />
<compilation batch="false" debug="true" />

  • Now that I was getting more detailed error messages, I saw that I needed the [assembly: AllowPartiallyTrustedCallers] attribute on my DLL; I added this and rebuilt the DLL. However, this did not fix the problem, as the TFS DLLs also needed this attribute, and I could not add it to them as they were pre-compiled.
  • Just to prove the problem was the TFS DLLs, I commented out the lines that called the TFS DLLs and the WebPart rendered without a problem.
  • So I tried moving the TFS DLLs to the GAC, as this is a more trusted location, but it did not help.
  • I next edited the web.config again to reduce the MOSS security level

<!-- Edited <trust level="WSS_Minimal" originUrl="" /> to -->
   <trust level="Full" originUrl="" />

  • Once this was done I got a new error, a file access error, so it seemed the TFS DLLs were now trusted enough to be used - I was going in the right direction!
  • I used FileMon on the server to check exactly which files were the problem, and saw I had to give the MYSERVER/IUSR_MYSERVER user create rights to the directory C:\WINDOWS\system32\inetsrv\Microsoft\Team Foundation\1.0\Cache; in fact the Microsoft directory did not exist. This seemed a strange user, as the MOSS application pool was running under the Network Service account. This sort of problem is a good reason to use FileMon, as it shows the failing file and user.

When all this was done, lo and behold, the WebPart leapt into life, showing a list of projects and their work items on my TFS server. Now I need to do more to make this useful, such as:

  • Get the user identity from SharePoint, not a config file
  • Create an associated work item details webpart to allow editing and creation.
  • Sort out the CSS Styling
  • Make sure it works on our main 64-bit MOSS server (I have doubts about this one being possible), not just on 32-bit.
  • Add some AJAX support to make the user experience smoother.

I will post some more, maybe with some samples if it works.

Good pointer to DNR TV from Colin @ DDD5

I have just watched the Dot Net Rocks TV show that Colin Mackay recommended during his DDD5 session on Mock Objects (and posted about on his blog).

I cannot agree more that it is a great practical intro to using the Model/View/Presenter pattern, easy to follow and not getting bogged down, as is so often the case with anything on patterns.

I must get round to watching more DNR TV; I listen to the DNR Radio podcasts regularly, but subjects like this are far easier to understand when you see the code. However, between DNR Radio, DNR TV, Channel 9, MSDN TV etc. and wanting some form of home life, I am not sure how I will find the time.

Technorati Tag ddd5

DDD5, thoughts the day after...........

After not really enough sleep, it seems a good time to reflect on DDD5.

My session

I felt my session went well, and the initial feedback was good. It is always good to be asked questions all day about your session; it proves it got people thinking. I certainly learnt a few tricks for TFS from these conversations.

A common question I got was 'I am using Product XYZ, should I change to TFS?'. To this my answer is always a question: 'Does your system, whether off the shelf or home grown, work for you?' If it does, then don't rush to change it. Certainly look at TFS and see if it is a better solution for your project needs - if it is not, don't change to TFS. If you don't have a working system to manage software development, certainly look at TFS, or anything for that matter. Good software only comes out of a structured process; whether the process is Agile or formal in style, all the team must know what it is and adhere to it.

For those at the session, I apologize again for the projector problems that caused the slides to be offset a bit to the right. Unfortunately I did not know about the super secret adjustment knob under the floor panel in Memphis until after I had finished.

Also, I was certainly glad I had taken two laptops; I had intended one for PowerPoint and one for VPCs, but only one would sync with the projector. I have seen this before with Vista, but it was strange as they were virtually identical Acers with the same ATI chipset and our standard Vista install image. Anyway, I think I got away with one PC; who needs to see their speaker notes anyway?

Other sessions I went to

As is so often the case, when you present at a conference you miss half the sessions with your own prep and follow-up conversations. I did manage to make it to two, both very good:

  • Colin Angus Mackay's An introduction to Unit Testing with Mock Objects
  • Abid Quereshi's Windows Workflow Foundation for Your Automated Build

Abid's session particularly addressed the interesting issue of automation, not from the more common TDD/CI side, but as a flexible means of creating deployment tools. Something I had not considered before.

Next DDD

In the wrap-up notices it was mentioned that the next DDD might be in the North; this caused an interesting mixture of cheers and boos, plus one voice asking 'where is the north?'.

It will be interesting to see how much backing any moving of the location around the country gets. After yesterday's drive from Yorkshire, a full day at the conference and then a drive back, I certainly would appreciate less distance to travel, as I am sure others would.

My next DDD Session

After chatting to a few people I think I will propose a general introduction to Scrum for the next DDD, based on the http://www.scrumalliance.org/ materials I got on my Certified Scrum Master course.

In my last two sessions at DDD I have mentioned Scrum, but I realize now that quite a few attendees have little experience of Scrum, knowing it just as a name, so a good general overview would be a good idea. I think I will link this to using Scrum with TFS.

Am I right - would this kind of session be of interest?


Technorati Tag