But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Using SYSPREP’d VM images as opposed to Templates in a new TFS 2012 Lab Management Environment

An interesting change with Lab Management 2012 and SCVMM 2012 is that templates become a lot less useful. In the SCVMM 2008 versions you had a choice when you stored VMs in the SCVMM library…

  • You could store a fully configured VM
  • or a generalised template.

When you added the template to a new environment you could enter details such as the machine name, domain to join, product key etc. If you try this with SCVMM 2012 you just see the message ‘These properties cannot be edited from Microsoft Test Manager’.

[screenshot: the ‘These properties cannot be edited from Microsoft Test Manager’ message]

So you are meant to use SCVMM to manage everything about the templates, which is not great if you want to do everything from MTM. However, is that the only solution?

An alternative is to store a SYSPREP’d VM as a Virtual Machine in the SCVMM library. This VM can be added as many times as required to an environment (though if it is added more than once you are asked if you are sure).

[screenshot: the confirmation prompt shown when the same VM is added to an environment more than once]

This method does, however, bring problems of its own. When the environment is started, assuming it is network isolated, the second network adaptor is added as expected. However, as there is no agent on the VM it cannot be configured automatically; for a template, Lab Management would sort all this out, but because the VM is SYSPREP’d it is left sitting at the mini setup ‘Pick your region’ screen.

You need to manually configure the VM. The best process I have found is:

  1. Create the environment with your standard VMs and the SYSPREP’d one
  2. Boot the environment; the standard ready-to-use VMs get configured OK
  3. Manually connect to the SYSPREP’d VM and complete the mini setup. You will now have a PC in a workgroup
  4. The PC will have two network adaptors, neither connected to your corporate network; both are connected to the network isolated virtual LAN. You have a choice:
    • Connect the legacy adaptor to your corporate LAN, to get at a network share via SCVMM
    • Mount the TFS Test Agent ISO
  5. Either way you need to manually install the Test Agent and run its configuration (just select the defaults; it should know where the test controller is). This will configure the network isolated adaptor onto the 192.168.23.x network
  6. Now you can manually join the isolated domain (see the sketch after this list)
  7. Reboot the VM (or the environment) and all should be OK
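
For step 6, a sketch of the domain join from an elevated prompt inside the VM (the domain name and credentials are illustrative; use your isolated lab domain’s values):

netdom join %COMPUTERNAME% /domain:lab.local /UserD:LAB\administrator /PasswordD:*

The /PasswordD:* switch prompts for the password rather than embedding it in the command line.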

All a bit long-winded, but it does mean it is easier to build generalised VMs from MTM without having to play around in SCVMM too much.

I think it would all be a good deal easier if the VM had the agents on it before the SYSPREP. I have not tried this yet, but in my opinion that is true of all VMs used for Lab Management: get the agents on as early as you can, it just speeds everything up.
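
As a reminder, the generalise step itself is the standard Sysprep invocation, run inside the VM before it is stored in the library (path as per a default Windows install):

C:\Windows\System32\Sysprep\sysprep.exe /generalize /oobe /shutdown

The /shutdown switch leaves the VM powered off, ready to be saved into the SCVMM library.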

Using git tf to migrate code between TFS servers retaining history

Martin Hinshelwood did a recent post on moving source code between TFS servers using git tf. He mentioned that you could use the --deep option to get the whole changeset check-in history.

Being fairly new to using Git in anything other than the simplest scenarios, it took me a while to get the commands right. This is what I used in the end (using the Brian Keller VM for sample data)…

C:\tmp\git> git tf clone http://vsalm:8080/tfs/fabrikamfibercollection $/fabrikamfiber/Main oldserver --deep
Connecting to TFS...
Cloning $/fabrikamfiber/Main into C:\Tmp\git\oldserver: 100%, done.
Cloned 5 changesets. Cloned last changeset 24 as 8b00d7d

C:\tmp\git> git init newserver
Initialized empty Git repository in C:/tmp/git/newserver/.git/

C:\tmp\git> cd newserver

C:\tmp\git\newserver [master]> git pull ..\oldserver --depth=100000000
remote: Counting objects: 372, done.
remote: Compressing objects: 100% (350/350), done.
Receiving objects: 100% (372/372), 2.19 MiB | 4.14 MiB/s, done.
Resolving deltas: 100% (110/110), done.
From ..\oldserver
 * branch HEAD -> FETCH_HEAD

C:\tmp\git\newserver [master]> git tf configure http://vsalm:8080/tfs/fabrikamfibercollection $/fabrikamfiber/NewLocation
Configuring repository

C:\tmp\git\newserver [master]> git tf checkin --deep --autosquash
Connecting to TFS...
Checking in to $/fabrikamfiber/NewLocation: 100%, done.
Checked in 5 changesets, HEAD is changeset 30

The key was that I had missed the --autosquash option on the final checkin.

Once this was run I could see my check-in history; the process is quick and, once you have the right command line, straightforward. However, just as with the TFS Integration Platform, time is compressed, and unlike the TFS Integration Platform you also lose the ownership of the original edits.

[screenshot: the migrated check-in history on the new server]

This all said, another useful tool in the migration arsenal.

Where did my parameters go when I edited that standard TFS report?

I have been doing some editing of the standard Scrum TFS 2012 Sprint Burndown report in SQL 2012 Report Builder. When I ran the report, after editing the MDX query in the dsBurndown DataSet to return an extra column, I got an error:

  • On a remote PC it just said there was an error with the dsBurndown dataset
  • On the server hosting Reporting Services, or in Report Builder, I got a bit more information: the TaskName parameter was not defined.

On checking the state of the dataset parameters before and after my edit I could see that the TaskName parameter had been lost.

[screenshot: the dataset parameter list before and after the edit, with TaskName missing]

Manually re-adding it fixed the problem.

Interestingly, which parameters were lost seemed to depend on the MDX query edit I made; I assume something is inferring the parameters from the MDX query.

So it is certainly one to keep an eye on. I suspect this is a feature of Report Builder; maybe I am better off just using trusty Notepad to edit the .RDL file. Oh how I love to edit XML in Notepad!
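
If you do end up in the XML, the parameter that goes missing lives under the dataset’s Query element in the .RDL file. A minimal sketch of what needs re-adding (the element names are standard RDL; the value expression is an assumption based on how the standard reports usually wire report parameters to dataset parameters):

<DataSet Name="dsBurndown">
  <Query>
    ...
    <QueryParameters>
      <QueryParameter Name="TaskName">
        <Value>=Parameters!TaskName.Value</Value>
      </QueryParameter>
    </QueryParameters>
  </Query>
</DataSet>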

Visual Studio 2013 announcement at TechEd USA

Today at TechEd USA Brian Harry announced Visual Studio 2013; have a look at his blog for details of the new ALM features. These include…

  • Agile Portfolio Management
  • Git source control on premises
  • Revised team explorer including pop out windows
  • Improvements in code editing and annotation
  • Improvement in web based test management
  • Team Room – chat like collaboration
  • Cloud based web load testing
  • The start of the addition of release management to TFS via the purchase of InRelease

For more info see the various sessions up on Channel 9.

My session on TFS at the ‘Building Applications for the Future’

Thanks to everyone who attended my session on ‘TFS for Developers’ at Grey Matter’s ‘Building Applications for the Future’ event today. As you will have noticed, my session was basically slide-free, so there is not much to share here.

As I said at the end of my session, to find out more have a look at

Also a couple of people asked me about TFS and Eclipse, which I only mentioned briefly at the end. For more on Team Explorer Everywhere, look at the video I did last year on that very subject.

Webinar on PreEmptive Analytics tools on the 28th of May

A key requirement for any DevOps strategy is reporting on how your solution is behaving in the wild. PreEmptive Analytics™ for Team Foundation Server (TFS) can provide great insight in this area, and there is a good chance you are already licensed for it as part of MSDN.

So why not have a look on the UK MSDN site for more details of the free Microsoft-hosted event:

MSDN Webinar Improve Software Quality, User Experience and Developer Productivity with Real Time Analytics
Tuesday, May 28 2013: 4:00 – 5:00 pm (UK Time)

Also why not sign up for Black Marble’s webinar event in June on DevOps process and tools in the Microsoft space.

Why do all my TFS labels appear to be associated with the same changeset?

If you look at the labels tab in the source control history in Visual Studio 2012 you could be confused by the changeset numbers. How can all the labels added by my different builds, done over many days, be associated with the same changeset?

[screenshot: the VS 2012 labels tab showing the same changeset number against every label]

If you look at the same view in VS 2010 the problem is not so obvious, but that is basically because the changeset column is not shown.

[screenshot: the equivalent VS 2010 view, without the changeset column]

The answer is that the value shown in the first screenshot is for the root element associated with the label. If you drill into the label you can see all the labelled folders and files, with the changeset values they had when the label was created.

[screenshot: the label contents, showing per-file changeset values]

So it is just that the initial screen is confusing; drilling in makes it all clearer.
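
You can get the same per-item detail from the command line with tf.exe; for example (the label name here is illustrative):

tf labels "MyBuild_20130528.1" /format:detailed

The detailed format lists each labelled item with the changeset version it was captured at, which is the value the drill-down view shows.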

Setting up a TFS 2012 proxy in a cross domain system

Today I have been setting up a cross domain TFS proxy; the developers are in one domain and the TFS server in another. Given there is no trust between these domains, you have to use a trick to get it to work.

So I created a local user tfsproxy.local on both the TFS server and proxy with the same password on each. At the proxy end I made this local user a local admin.
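
A sketch of that account setup from an elevated prompt (the password placeholder is to be replaced, and must match on both machines):

REM on both the TFS server and the proxy
net user tfsproxy.local <password> /add

REM on the proxy only, make the account a local admin
net localgroup Administrators tfsproxy.local /add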

Next I ran the TFS 2012.2 wizard, setting the proxy account to the tfsproxy.local user. It all passed verification, but then I got an error:

TF400371: Failed to add the service account 'TFSPROXY\TFSProxy.local' to Proxy Service Accounts Group. Details: TF14045: The identity with type 'System.Security.Principal.WindowsIdentity' and identifier 'S-1-5-21-4198714966-1643845615-1961851592-1024' could not be found..

It seems this is a known issue with TFS 2012. It is meant to be fixed in TFS 2012.3, so I pulled down the ‘go live’ CTP and installed it on the proxy. It made no difference; I assume it actually needs to be installed on the server end, not just the proxy, as that is where the user lookup occurs. However, I did not have access to do that upgrade today.

I was about to follow the workaround of removing the proxy from the domain, configuring it and then putting it back, but then I had an idea: the step it was failing on was granting rights, so I did it manually. On the TFS server end I added the tfsproxy.local user to the ‘Proxy Service Accounts Group’. Once this was done the configuration completed without error.

A quick test showed the proxy was working as expected.

How healthy is my TFS server?

If you want to know the health of your TFS server there are a number of options, from a full System Center MOM pack downwards. A good starting point is the performance reports and administrative report pack produced by Grant Holliday. Though the performance pack is designed for TFS 2008 the reports work on 2010 and 2012, but you do need to do a bit of editing.

  1. As the installation notes state, create a new shared data source called “TfsActivityReportDS”
    1. Set the connection string to: Data Source=[your SQL server];Initial Catalog=Tfs_[your TPC name]  -  this is the big change, as this used to point to the TfsActivityLogging DB. That data (as of TFS 2010) is now all rolled into your Team Project Collection DB, so you need to alter the connection string to match your TPC. Also note that if you use multiple TPCs you will need multiple data sources and reports.
    2. Credentials: domain\user that has access to the Tfs_[TPC Name] database
    3. Use as Windows credentials when connecting to the data source
    4. Once uploaded, each report needs to be edited via the web manage option to change its Data Source to match the newly created source

      [screenshot: the web manage option for setting the report’s shared data source]
    5. You also need to edit each report in the pack via Report Builder, as the SQL queries all contain the full path. For each dataset (and each report can have a few) you need to edit the query to contain only the table name, not the whole SQL path, as in the example after this list

      i.e. FROM TfsActivityLogging.dbo.tbl_Command becomes FROM tbl_Command

      [screenshot: editing the dataset query in Report Builder]
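
Expressed as a query edit, the change is just to the FROM clause; a trivial illustration (the pack’s real queries select specific columns rather than *):

SELECT * FROM TfsActivityLogging.dbo.tbl_Command   -- original, fails as that DB no longer exists
SELECT * FROM tbl_Command                          -- edited to run against the TPC database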

Once this is done most of the reports should be working, giving a good insight into the performance of your server.

Some reports, such as Source Control Requests and Top Users Bypassing Proxies, take a bit more SQL query fiddling.

  • Server Status - Top Users Bypassing Proxies – you need to alter the Users part of the query to something like the following (note the hard-coded table path; I am sure we could do better, but I don’t usually need this report as I have few proxies, so I have not made much effort on it)

    Users(
            ID,
            UserName,
            FullyQualifiedAlias,
            eMail
        ) AS
        (

            SELECT [personSK]
                  ,[Name]
                  ,[Domain] + '\' + [Alias] as FullyQualifiedAlias
                  ,[Email]
              FROM [Tfs_2012_Warehouse].[dbo].[DimPerson] with (nolock)
        )

  • Source Control Requests – this runs from a straight web service endpoint, so you need to edit the URL it targets to something like

http://localhost:8080/versioncontrol/v3.0/administration.asmx

Unlike the performance reports, the admin report pack is designed for TFS 2010/2012, so it works once you make sure the reports are connected to the correct shared data sources.

However, remember that the new web-based Admin Tools in TFS 2012 actually address many of these areas out of the box.

Getting going with the TFS Java API

If you are using the TFS 2012 Java API it is important that you read the release notes. It is not enough just to reference the com.microsoft.tfs.sdk-11.0.0.jar file in your classpath as you might expect. You also have to pass a Java system property that associates com.microsoft.tfs.jni.native.base-directory with the location of the native library files that provide platform-specific implementations for method calls. The command line for this takes the form

java.exe -D"com.microsoft.tfs.jni.native.base-directory=C:\Users\Username\YourApplication\native"

If you don’t set this property you get an exception similar to

Exception in thread "main" java.lang.UnsatisfiedLinkError: com.microsoft.tfs.jni.internal.platformmisc.NativePlatformMisc.nativeGetEnvironmentVariable(Ljava/lang/String;)Ljava/lang/String;

Now setting this property on the command line is all well and good, but how do you do this if you are working in Eclipse?

The answer is that you set the argument via Run > Run Configurations. Select your configuration and enter the VM argument as shown below.

[screenshot: the Eclipse run configuration with the VM argument set]

Once this is set you can run and debug your application inside Eclipse.
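
If you would rather not rely on launch settings at all, the property can also be set in code, as long as it happens before the SDK loads its native libraries. A minimal sketch, assuming the class names used in the SDK samples, with illustrative server details and credentials:

import java.net.URI;

import com.microsoft.tfs.core.TFSTeamProjectCollection;
import com.microsoft.tfs.core.httpclient.Credentials;
import com.microsoft.tfs.core.httpclient.UsernamePasswordCredentials;

public class TfsConnect {
    public static void main(final String[] args) throws Exception {
        // Equivalent to the -D command line switch; must run before any
        // SDK class that touches the native libraries is used
        System.setProperty("com.microsoft.tfs.jni.native.base-directory",
                "C:\\Users\\Username\\YourApplication\\native");

        // Illustrative credentials and collection URL
        final Credentials credentials =
                new UsernamePasswordCredentials("domain\\user", "password");
        final TFSTeamProjectCollection tpc = new TFSTeamProjectCollection(
                new URI("http://vsalm:8080/tfs/fabrikamfibercollection"), credentials);

        System.out.println("Connected to " + tpc.getBaseURI());
    }
}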