But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

TFS 2008 Media on MSDN

Don't make the mistake I did with Team Foundation Server 2008 media on MSDN downloads.

The MSDN file en_visual_studio_team_system_2008_team_foundation_server_workgroup_x86_x64wow_dvd_X14-29253.iso is the workgroup edition, as the file name suggests. The problem is that you cannot upgrade it to a full edition. The TFS documentation says that a trial or workgroup edition can be upgraded by entering a valid CD key in Add/Remove Programs maintenance mode; however, this option is greyed out if you install from this media. I checked with Microsoft and there is no way round this for this ISO image.

So if you want to install or upgrade to the TFS 2008 full edition, make sure you start with the right media, else you will downgrade your installation to a five-user workgroup.

TFS WebPart for viewing workitems in SharePoint 2007

I have been trying to get a simple means for our clients to log faults into our TFS system whilst inside our MOSS2007 based customer portal. I had been suffering from two major problems as my previous posts mentioned. However, I now have some solutions or at least workarounds:

  • Historically all users who connected to the TFS server needed a Team Foundation Client CAL - this issue has been addressed by changes in Microsoft's licensing for TFS; basically it is now free to view work items and add bugs
  • The way the SharePoint and TFS APIs handle user identity (SPUser and ICredentials) do not match and are not interchangeable. There is no way round the user having to re-enter their credentials for each system - so my web part logs into TFS as a user set via its parameters. This is not the same user as the credentials used to authenticate into SharePoint; it is in effect a proxy user for accessing TFS
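As a hedged sketch of this proxy-user idea (the server URL, domain, user name and password below are placeholders, not the web part's real settings), a connection with explicit credentials looks something like this against the TFS 2008 object model:

```powershell
# Load the TFS client API - on a real server this comes from the
# installed Team Explorer / TFS client DLLs
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client")

# Build credentials for the dedicated proxy user (placeholder values)
$cred = New-Object System.Net.NetworkCredential("tfswebpartuser", "P@ssw0rd", "mydomain")

# Connect and authenticate as that user, regardless of who is
# logged into SharePoint
$tfs = New-Object Microsoft.TeamFoundation.Client.TeamFoundationServer("http://tfs.mydomain.com:8080", $cred)
$tfs.Authenticate()
"Connected as " + $tfs.AuthenticatedUserName
```

Whatever identity SharePoint authenticated is simply never passed on; every call the web part makes runs as this one TFS account.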

So where does this leave us?


I have posted a set of sample code that provides

  • A web part that lists work items
  • A web part that shows the details of a work item (using the first webpart for all communications to the TFS server)
  • An ASP.NET test harness
  • A .WSP and batch file to install the web parts on a SharePoint Server

ASP.NET Usage:

  1. Load the solution in VS2008 (if you need to use VS2005 you will need to recreate the solution file and point at the right TFS API DLLs, but everything else should work)
  2. Run the test harness project (note: as we are using web parts it will have to create a local ASPNETDB.MDF file the first time it runs. The DB contains the config for the web parts, so you will see nothing on the first loading until you set up the parameters)
  3. In the test page select the edit mode at the top of the page, then edit the list webpart in WebPartZone1, enter the following:
    - TFSServerURL – the TFS server e.g. http://tfs.mydomain.com:8080
    - TFSDomain – the domain used to authenticate against e.g. mydomain
    - TFSUsername – the user name to connect to the TFS server as; we create a dedicated user for this web part to log in as.
    - TFSPassword – the password used to authenticate with (shown in clear text)
    - TFSAllowedWorkItemTypes – a comma-separated list of work item types to be listed in the control; must match types in the [System.WorkItemType] field in the TFS DB. The types will vary depending on the process template in use, but as a start most templates have a ‘Bug’ type.
    - TFSDefaultProject – the name of the default TFS project to select on loading, can be left blank
    - TFSPagingSize – the number of rows to show in the list of work items
    - TFSShowOnlyDefaultProject – if this is set only the default project is listed in the available projects – this means a single TFS user, who can see many projects, can be used on different web pages with the project shown locked down by this parameter
    - TFSUsePaging – set if the list of work items should be paged
  4. Once this is all done and saved you should be able to see a list of projects and work items in the first web part.
  5. To wire the two webparts together select the connection mode radio button at the top of the page
  6. On the web part in WebPartZone2 select the connect option
  7. In the connections zone that appears create a new connection to link the two webparts
  8. Once this is done you should see the detail of any given workitem when it is selected from the list. The problem is you see all the fields in the work item (useful for debugging)
  9. Put web page back into edit mode and edit the settings on the details web part
    - TFSFieldsToShow – a comma-separated list of field names to be shown.
    - TFSShowAllField – if checked, TFSFieldsToShow is ignored
  10. When all the configuration is done you have the option to create new bug workitems and add notes to existing ones.
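Under the covers the list web part has to run some form of work item query filtered by project and type. As a minimal sketch of that kind of query (the server URL, project name and type list below are example values, not the web part's actual code):

```powershell
# Load the TFS client and work item tracking APIs
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.WorkItemTracking.Client")

$tfs   = New-Object Microsoft.TeamFoundation.Client.TeamFoundationServer("http://tfs.mydomain.com:8080")
$store = $tfs.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore])

# WIQL query limited to the allowed work item types in one project
$wiql = "SELECT [System.Id], [System.Title] FROM WorkItems " +
        "WHERE [System.TeamProject] = 'MyProject' " +
        "AND [System.WorkItemType] IN ('Bug')"

# List the id and title of each matching work item
foreach ($wi in $store.Query($wiql)) {
    "" + $wi.Id + " - " + $wi.Title
}
```

The TFSAllowedWorkItemTypes and TFSDefaultProject parameters map onto the IN list and the [System.TeamProject] clause respectively.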

If you want to use the webparts in SharePoint you need to install the feature pack using the .WSP package - I assume anyone doing this will know enough about WSP files and SharePoint to get going.

This all said, it is not as if there are not still problems and quirks:

  • You do need the TFS client on the server hosting the web parts, or at least the referenced DLLs - a bit obvious really, that one.
  • When trying to connect to TFS you might get an error about not being able to access the TFS cache - use SysInternals Filemon (or maybe the event logs) to check the directory being used, and you will find the problem concerns the user running the hosting process (usually a member of the IIS_WPG group) not having rights to fully access the cache directory. It is also a good idea to delete all cache files before retrying, as some people report they had to rebuild the cache to clear the error.
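A sketch of that fix, assuming the cache path you found with Filemon (the path below is only an example; use whatever directory the error actually points at):

```powershell
# Example cache path only - check Filemon/the event log for the real one
$cache = "C:\Documents and Settings\Default User\Local Settings\Application Data\Microsoft\Team Foundation\1.0\Cache"

# Grant the IIS worker process group full control of the cache directory
$acl  = Get-Acl $cache
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule( `
            "IIS_WPG", "FullControl", "ContainerInherit,ObjectInherit", "None", "Allow")
$acl.AddAccessRule($rule)
Set-Acl $cache $acl

# Clear the old cache files so they are rebuilt on the next connection
Remove-Item (Join-Path $cache "*") -Recurse -Force
```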
  • An interesting point I discovered that altered the design - though the pair of web parts worked perfectly in an ASP.NET test harness, the connection options were greyed out when in SharePoint. It turns out you have to add a second parameter to the consumer declaration, else the default name is used for all web parts, which confuses SharePoint e.g.

[System.Web.UI.WebControls.WebParts.ConnectionConsumer("Work Item List Consumer", "wilc")] // the second param is an ID for connection
public void InitializeProvider(IWorkItemListToDetails provider)
{
      this.workItemListProvider = provider;
}

However, the last problem was negated by the fact that in ASP.NET you can have pairs of connections to get bi-directional communication between web parts; in SharePoint you are only allowed a single connection between any two web parts. Hence the current design uses some strange boolean flags and logic to manage callbacks in the pre-render stage. I left the older code in place, commented out, as a sample.

  • And the killer problem for me - you can only run these web parts on a 32-bit SharePoint, as there are no 64-bit TFS DLLs. A major problem for us, as our SharePoint servers are 64-bit. It seems we need to wait for Rosario before TFS moves to 64-bit; even though 32-bit CTPs of Rosario are available, as yet there is no date for a 64-bit CTP. I also checked, and WOW64 will not help to wrap the 32-bit DLLs on a 64-bit OS. I have checked all this with Microsoft support.

So what we have here is a sample solution for 32-bit environments. I am going to modify this to work for 64-bit by putting all the TFS API bits in a separate web service hosted on a 32-bit server. I will post about this when it is done.

Powershell and SourceSafe

I posted yesterday on using Powershell to email a TFS user if they had files checked out. Well, we still run a legacy set of SourceSafe databases for old projects that are under maintenance, but not major redevelopment. (Our usual practice is to migrate projects to TFS at major release points).

Anyway, these SourceSafe repositories are just as likely, if not more so, to have files left checked out as TFS. The following script emails a list of all checked-out files in a SourceSafe DB.

# to run this script without signing need to first run
#  Set-ExecutionPolicy  Unrestricted
# (the other option is to sign the script)

#  then run it using
#  .\VSSstatus.ps1

function CheckOutVSSFileForUser(
    [string]$ssdir,
    [string]$to,
    [string]$from,
    [string]$server    )
{
    # get the open file list
    [char]10 + "Checking checked out files in " + $ssdir

    # set the environment variable without this you cannot access the DB
    $env:ssdir=$ssdir

    # we assume the logged in user has rights, as the -Yuid,pwd ss.exe
    # parameter does not work due to the ,
    # could used a named user that takes no password as another option
    # can use the -U option to limit the user listed
    $filelist = &"C:\Program Files\Microsoft Visual Studio\VSS\win32\ss.exe" status $/ -R

    # we have the results as an array of rows, so insert some line feeds
    foreach ($s in $filelist) { $emailbody = $emailbody + [char]10 + $s }
    # note the strange concatenation for the email body - a newline char, not a + as I suspected!
    $title = "Files currently checked out in " + $ssdir
    SendEmail $to  $from $server  $title  $emailbody
}

function SendEmail(
    [string]$to,
    [string]$from,
    [string]$server,
    [string]$title,
    [string]$body    )
{

    # send the email
    $SmtpClient = new-object system.net.mail.smtpClient
    $SmtpClient.host = $server
    $SMTPClient.Send($from,$to,$title,$body)
    "Email sent to " + $to
}

# the main body
$emailServer = "mail.domain.com"
$from = "vss@domain.com"
$to = "admin@domain.com"
$ssdir = ("\\server\sourcesafe1",
          "\\server\sourcesafe2",
          "\\server\sourcesafe3")

foreach ($s in $ssdir) {
    CheckOutVSSFileForUser $s $to $from $emailServer
}

Update 10-Oct-2008: Corey Furman has extended this script on his blog http://codeslabs.wordpress.com/2008/10/09/who-has-what-checked-out


Alt.Net.UK

I am far from the first to post about the Alt.Net.UK conference next year. I think this is a great idea; I have posted a few times on how I find the most useful conferences to be the ones about best practice. New technology is great, we all like new toys, but it is the engineering practices we use day to day that have the most effect on the quality of our work, not the IDE we use.

This is an open spaces event. I have only been to one of these before, but it was certainly an interesting way to work. Hopefully this should enhance the 'best practice' nature of the conference; I am looking forward to it.

Using Powershell to remind users of checked out files from TFS

With any source control system it is possible to leave files checked out. This is especially true if your IDE does the checking out behind the scenes. This is made worse still by the fact you can have a number of workspaces on the same PC in TFS. It is too easy to forget.

It is therefore a good idea to check from time to time that the files you have checked out are the ones you think you have. There is nothing worse than trying to work on a project to find a key file is checked out or locked to another user or PC.

To this end I have written the following Powershell script to check for the files checked out by a team of developers. In this version you have to list the users by name, but I am sure it could be extended to pick up users from an AD or TFS group.
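For example, the hard-coded list could be replaced by reading the members of an AD group. A hedged sketch (the group name 'Developers' is a placeholder; adjust the filter for your domain):

```powershell
# Find the group in the current domain
$searcher = New-Object System.DirectoryServices.DirectorySearcher
$searcher.Filter = "(&(objectCategory=group)(cn=Developers))"
$group = $searcher.FindOne()

# Resolve each member DN back to an account name
$users = @()
foreach ($dn in $group.Properties["member"]) {
    $member = [ADSI]("LDAP://" + $dn)
    $users += [string]$member.Properties["sAMAccountName"][0]
}
```

The resulting $users array could then be fed into the same foreach loop the script below uses.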


# To run this script without signing need to first run
#     Set-ExecutionPolicy  Unrestricted
# If you want run it from a timer you will need to sign it

# Then run it using
#    .\TFstatus.ps1

function CheckOutTFSFileForUser(
    [string]$user,
    [string]$domain,
    [string]$server,
    [string]$from     )
{
    # get the open file list,
    # we put a newline at the start of the line,
    # used an ASCII code as `n did not seem to work
    [char]10 + "Checking checked out file for " + $user
    $filelist = &"C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\tf.exe" status /user:$user /s:
https://vsts.domain.com:8443

    # we have the results as an array of rows
    #
so insert some line feeds
    foreach ($s in $filelist) { $emailbody = $emailbody + [char]10 + $s }
    # note the strange concatenation for the email to field, not a + as I suspected being new to Powershell
    $title = "Files currently checked out by " + $user + " in TFS"
    if ($user -eq "*" )
    {
        # if they have asked for all user email send to the admin/from account
        $to = $from
    } else
    {
        $to = $user + $domain
    }
    SendEmail $to  $from $server  $title  $emailbody
}

function SendEmail(
    [string]$to,
    [string]$from,
    [string]$server,
    [string]$title,
    [string]$body    )
{

    # send the email
    $SmtpClient = new-object system.net.mail.smtpClient
    $SmtpClient.host = $server
    $SMTPClient.Send($from,$to,$title,$body)
    "Email sent to " + $to
}

# the main body
$domain = "@domain.com"
$emailServer = "mail.domain.com"
$from = "admin@domain.com"

# the list of users to check, the * means all
$users = ("*","anne","bill","chris")
# loop through the list of users
foreach ($u in $users) { CheckOutTFSFileForUser $u $domain $emailServer $from }

Last night I met a spaceman

Last night I went to a lecture by Dr Alexander Martynov and Colonel Alexander Volkov organised by Space Connections on the Russian space efforts in both the Soviet and current era.

The thing that struck me was the often spoken of difference between the US/NASA technology-based solutions and the Russian 'simple first' philosophy e.g. the cosmonaut should be able to fix it themselves with the tools to hand, a philosophy the Mir space station repeatedly showed.

This can also be seen in the supposed story of the huge cost NASA incurred designing a space pen, while the Russians just took a pencil. Though I have heard this is an urban myth, as free-floating graphite from a pencil is a problem for electronics in zero-g, so both sides used grease pencils.

This said, the Russian proposed Mars mission they discussed (interestingly, I cannot seem to find any pages to link to here) is based on an electric engine, very clean (no nuclear) and state of the art, like the ESA SMART-1. However, when you look deeper it is still the idea of very reliable, simple, replicated systems designed for the long haul, the proposed 'Mars station' making up to seven Earth-to-Mars-and-back trips over a fifteen-year period.

I have to say this all appeals to my luddite side; technology is great in its place, but I do like the simple 'pencil-like' solution if possible, the simplest thing that will do the job. Now that is a very Agile way of thinking, isn't it, if you can say such a thing about a space programme.

Intellisense not working in Visual Studio 2008

Since I upgraded my VS2008 Beta 2 to the RTM, Intellisense has not been working. I have seen a few posts about this, some suggesting you need to reset the configuration by running

devenv.exe /safemode

(see http://msdn2.microsoft.com/en-us/library/ms241278(VS.80).aspx)

but this did not work for me.

So I had a poke about in Tools | Options and found that on the Text Editor | All Languages page the three checkboxes for Statement Completion were showing neither empty nor checked but a fully coloured box - which usually means an unknown setting. So I set these all to checked (a tick) and my Intellisense started working.