But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

When you try to run a test in MTM you get a dialog ‘Object reference not set to an instance of an object’

When trying to run a newly created manual test in MTM I got the error dialog

‘You cannot run the selected tests, Object reference not set to an instance of an object’.

[Screenshot: the 'You cannot run the selected tests' error dialog]

On checking the Windows event log I saw

Detailed Message: TF30065: An unhandled exception occurred.

Web Request Details Url: http://……/TestManagement/v1.0/TestResultsEx.asmx

So not really that much help in diagnosing the problem!

Turns out the problem was that I had been editing the test case work item type. Though it had saved/imported without any errors (it is validated during these processes), something was wrong with it. I suspect it was to do with filtering the list of users in the ‘assigned to’ field, as this is what I last remember editing, but I might be wrong; it was on a demo TFS instance I had not used for a while.

The solution was to revert the test case work item type back to a known good version and recreate the failing test(s). It seems that once a test was created from the bad template there was nothing you could do to fix it.

Once this was done MTM ran the tests without any issues.

When I have some time I will do an XML compare of the exported good and bad work item types to see what the problem really was.
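
If you want to script the export and re-import of the work item type definition (for example to diff the good and bad versions), the witadmin tool that ships with TFS 2010 does it; this is only a sketch, and the collection URL, project name and file names below are placeholders rather than my real ones.

    rem export the suspect Test Case definition so it can be diffed against a known good copy
    witadmin exportwitd /collection:http://tfsserver:8080/tfs/DefaultCollection /p:MyProject /n:"Test Case" /f:TestCase-current.xml

    rem import the known good definition back over the broken one
    witadmin importwitd /collection:http://tfsserver:8080/tfs/DefaultCollection /p:MyProject /f:TestCase-good.xml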

Where has my mouse cursor gone? Unable to record a video in Microsoft Test Manager

MTM has a feature whereby you can use Expression Encoder 4 to record the test run as a video. To enable this feature, after you install MTM, you have to install the basic version of Expression Encoder and a few patches; see the notes here for a list of files and the process.

I recently did this on a PC and tried to record a video. As soon as the recording process started the PC virtually stopped. It ran at 50%+ load on a dual core 2.3GHz CPU and the mouse cursor disappeared. As soon as I stopped MTM (via Task Manager, using the keyboard alone) it became responsive again. If I ran MTM without video recording it was fine. It should be noted that if I ran Expression Encoder on its own (not via MTM) I got the same problem.

Turns out the problem was not the PC performance but the nVidia video drivers. Once I updated these to the current ones from the nVidia site it all worked as expected.

Recording Silverlight actions on Microsoft Test Manager

Whilst trying to record some manual tests in MTM for a new Silverlight application I found I was not getting any actions recorded, just loads of “Fail to record the object corresponding to this action” warnings in the actions window.

Turns out that to fix this you have to do three things:

  1. Install the Visual Studio 2010 Feature Pack 2 (MSDN Subscribers Only) – this adds Silverlight support to MTM (this I had already done)
  2. In your Silverlight application you need to reference the Microsoft.VisualStudio.TestTools.UITest.Extension.SilverlightUIAutomationHelper.dll from the folder C:\Program Files (x86)\Common Files\microsoft shared\VSTT\10.0\UITestExtensionPackages\SilverlightUIAutomationHelper (a sketch of the resulting project file fragment is shown after this list)
  3. Finally if using IE9 you need to run IE in IE8 mode. To do this in IE9 press F12 and select the browser mode
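
For step 2, the reference in the Silverlight project file ends up looking something like the fragment below. This is only a sketch: the HintPath assumes the DLL sits in the folder mentioned above, so adjust it to match your machine.

    <ItemGroup>
      <Reference Include="Microsoft.VisualStudio.TestTools.UITest.Extension.SilverlightUIAutomationHelper">
        <HintPath>C:\Program Files (x86)\Common Files\microsoft shared\VSTT\10.0\UITestExtensionPackages\SilverlightUIAutomationHelper\Microsoft.VisualStudio.TestTools.UITest.Extension.SilverlightUIAutomationHelper.dll</HintPath>
      </Reference>
    </ItemGroup>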

[Screenshot: the IE9 developer tools showing the Browser Mode selection]

Once this was done I got the actions I would expect

[Screenshot: the recorded actions now appearing in the actions window]

Experiences upgrading to Lab Manager 2010 RC

Whilst preparing for my session at Techdays I have upgraded my 2010 Beta2 Lab Manager to the RC. I am pleased to say the process is far more straightforward than the initial install. Again I used the Lab Manager team’s blog as my guide; they have revised their ‘Getting started with Lab Management 2010 RC’ Parts 1, 2, 3 and 4 posts to help.

I was able to skip through the initial OS/VMM setup as this has not altered. I chose to throw away my TestVM (with its Beta agents) and create a new one. I upgraded my TFS 2010 instance to the RC. The only awkward bit was that I had to extract the RC version of the Lab Build Template from a newly created RC Team Project Collection and load it over my existing Beta2 version. I then recreated the Lab E2E build – and it just worked. My basic sample created for Beta2 built and tested OK.

I got confident then and so decided to build my own application with Coded UI tests and, surprise, surprise, it worked. OK, this was after some reconfiguring of the Test VM to allow interactive UI testing, and a few dead ends, but basically the underlying system worked and I think I now have a working understanding of it.

The whole process is just much slicker than it was, and the online MSDN documentation is much more useful too. This is certainly becoming a more accessible product, but you still do need a good mixture of ITPro and developer skills that I bet many teams are going to struggle to find in a single person. The best thing I can recommend is to build your own system step by step (following the blog posts) so you know how all the moving parts interact. Once you do this you will find it is less daunting.

VHD boot and c00002e2 Errors

For some reason that is beyond me now I did not set up my Lab Manager test system to be a VHD boot. So before installing the 2010 RC version I decided to P2V this system (on the same hardware) to make backups easier whilst testing. All seemed to go well:

  1. I used IMAGEX to create a WIM of the disk
  2. Created an empty VHD
  3. Used IMAGEX to apply the WIM to the VHD
  4. Formatted the PC with a default Windows 7 install
  5. Added a VHD boot Windows Server 2008R2 to the PC, tested this all booted OK
  6. Replaced the test VHD with my own and rebooted

…. and it just went into a reboot cycle. Pressing F8 and stopping the reboot on error I saw I had a “c00002e2 Directory Services could not start” error. I managed to get into the PC by pressing F8 and using the AD recovery mode (safe mode did not work). After much fiddling around I eventually noticed that my boot drive was D: not C: as I would have expected; my VHD and parent drive had reversed letter assignments. So when the AD services tried to start they looked on the parent Windows 7 partition (C:) for their data and hence failed.

I think the root cause was the way I had attached the empty VHD in order to use IMAGEX. I had not done it using WinPE, but had just created the VHD in my Windows 7 instance and attached it as drive D: before applying the WIM.
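
For comparison with the revised method below, this is roughly the sequence I had used, all from inside the running Windows 7 instance (the VHD size, paths and file names here are illustrative rather than the exact ones I used):

    diskpart
      create vdisk file=c:\vhds\w2k8r2.vhd maximum=80000 type=expandable
      select vdisk file=c:\vhds\w2k8r2.vhd
      attach vdisk
      create partition primary
      format fs=ntfs quick
      assign letter=d
      exit

    imagex /apply c:\vhds\w2k8r2.wim 1 d:\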

So my revised method was:

  1. I used IMAGEX to create a WIM of the disk (actually used the one I already had as there was nothing wrong with it, which was a good job as I had formatted the disk)
  2. Formatted the PC with a default Windows 7 install
  3. Added a VHD boot Windows Server 2008R2 to the PC, tested this all booted OK
  4. Copied my WIM file to the same directory as my newly created W2k8R2.VHD
  5. Copied IMAGEX to this directory
  6. Booted off a Win7 DVD
  7. Pressed Shift+F10 to get a command prompt at the first opportunity
    1. Ran DISKPART
    2. Select Disk 1
    3. Select Part 1
    4. Detail Part – this was the 100MB system partition Windows 7 creates and it was assigned as drive C: (note that when Windows 7 itself boots the drive letters get reassigned just to confuse you; looking at it from here you would expect your Windows 7 boot drive to be D:)
    5. Assign Letter = Q – this set the system partition to be drive Q, but any unused letter would do
    6. Select vdisk file=d:\vhd\w2k8r2.vhd
    7. attach vdisk – this loaded the VHD and assigned it the letter C: as this was now not in use
    8. list disk
    9. Select disk 2
    10. Select Part 1
    11. detail Part – checked the drive letter was correct
    12. I then exited DISKPART and from the same command prompt ran IMAGEX to put the WIM onto this new drive C: (the whole sequence is shown as a single listing after this list)
  8. Rebooted and it worked
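
For reference, here is the DISKPART and IMAGEX sequence from step 7 as a single listing. The disk numbers and drive letters are the ones from my setup and the WIM file name is just illustrative, so check with list disk and detail partition before applying an image anywhere.

    diskpart
      select disk 1
      select partition 1
      rem detail partition shows the 100MB system partition, currently C:
      detail partition
      assign letter=q
      select vdisk file=d:\vhd\w2k8r2.vhd
      attach vdisk
      list disk
      select disk 2
      select partition 1
      rem detail partition should now show the VHD's partition as C:
      detail partition
      exit

    imagex /apply d:\vhd\w2k8r2.wim 1 c:\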

So the technical tip is to make sure your drive letter assignments are what you think they are; it may not be as obvious as you expect.

MTLM becomes MTM

You may have noticed that Microsoft have had another burst of renaming. The tester’s tool in VS2010 started with the codename of Camaro during the CTP phase; this became Microsoft Test & Lab Manager (MTLM) in Beta 1 and 2, and now in the RC it is called Microsoft Test Manager (MTM).

Other than me constantly referring to things by the wrong name, the main effect of this is to make searching on the Internet a bit awkward; you have to try all three names to get good coverage. In my small corner of the Internet, I will try to help by updating my existing MTLM tag to MTM and updating the description appropriately.

At last, my creature it lives……..

I have at last worked all the way through setting up my portable end to end demo of testing using Microsoft Test and Lab Manager. The last error I had to resolve was the tests not running in the lab environment (though they worked locally on the development PC). My Lab Workflow build was recorded as a partial success, i.e. it built and deployed, but all the tests failed.

I have not found a way to see the detail of why the tests failed in VS2010 Build Explorer. However, if you:

  1. Go into MTLM,
  2. Pick Testing Center
  3. Select the Test Tab
  4. Pick the Analyze Test Results link
  5. Pick the test run you want view
  6. The last item in the summary is the error message; as you can see, in my case it was the whole run that failed, not any of the individual tests themselves

[Screenshot: the test run summary with the error message as its last item]

So my error was “Build directory of the test run is not specified or does not exist”. This was because the Test Controller (for me running as Network Service) could not see the contents of the drop directory. The drop directory is where the test automation assemblies are published as part of the build. Once I gave Network Service read rights on the \\TFS2010\Drops share my tests, and hence my build, ran to completion.
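
If you prefer to script that permission change rather than use the share and security dialogs, something along these lines does it on the machine hosting the share (the folder path is illustrative; note also that if the Test Controller runs on a different machine from the share, a service running as Network Service authenticates on the network as that machine’s computer account, e.g. DOMAIN\MACHINENAME$, so that is the account to grant instead):

    rem NTFS permissions on the folder behind the \\TFS2010\Drops share
    icacls c:\Drops /grant "NETWORK SERVICE":(OI)(CI)RX

    rem share-level permissions, shown here as part of creating the share
    net share Drops=c:\Drops /GRANT:"NETWORK SERVICE",READ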

It has been an interesting journey to get this system up and running. MTLM, when you initially look at it, is very daunting; you have to get a lot of ducks in a row and there are many pitfalls on the way. If any part fails then nothing works, so it feels like a bit of a house of cards. However, if you work through it step by step I think you will come to see that the underlying architecture of how it hangs together is not as hard to understand as it initially seems. It is complex and has to be done right, but you can at least see why things need to be done. Much of this perceived complexity, for me as a developer, is that I had to set up a number of ITPro products I am just not that familiar with, such as SCVMM and Hyper-V Manager. Maybe the answer is to make your evaluation of this product a joint Dev/ITPro project so you both learn.

I would say that getting the first build going (and hence the underlying infrastructure) seems to be the worst part. I feel that now I have a platform I understand reasonably well, producing different builds will not be too bad. I suspect the next raft of complexity will appear when I need a radically different test VM (or worse still a network of VMs) to deploy and test against.

So my recommendation to anyone who is interested in this product is to get your hands dirty; you are not going to understand it by reading or watching videos, you need to build one. So find some hardware, lots of hardware!

ASPNETCOMPILER: error 1003 with TFS2010 Team build

I have been looking at TFS 2010 Lab Manager recently. One problem I had was that, using the sample code from the Lab Manager Blog Walkthru, the building of the Calc ASP.NET web site failed on the build server; I got the error

ASPNETCOMPILER: error 1003 The directory ‘c:\build\1LabWalkthru\Calculator –Build\Calc’ does not exist.

and the build service was right, it didn’t exist; it should have been ‘c:\build\1LabWalkthru\Calculator –Build\Source\Calc’.

This was due to a problem detailed here. The Solution file had the wrong path in the Debug.AspNetCompiler.PhysicalPath property. It was set to “..\Calc” when it should have been “.\Calc”. Once this was altered the build could find the files.
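
For anyone hunting for it, the property lives in the WebsiteProperties section of the web site’s entry in the .sln file. The fragment below is illustrative only (the GUIDs are placeholders and the other properties are trimmed) and shows the corrected value:

    Project("{website-project-type-guid}") = "Calc", "...", "{project-guid}"
        ProjectSection(WebsiteProperties) = preProject
            Debug.AspNetCompiler.VirtualPath = "/Calc"
            Debug.AspNetCompiler.PhysicalPath = ".\Calc"
            ...
        EndProjectSection
    EndProject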

So you want to demo VS2010 Lab Manager…….

I recently decided to build a demo system for VS2010 Lab Manager. This was for a number of reasons, not least that I just wanted to have a proper play with it, but also that I was hoping to do a session on Microsoft Test and Lab Manager at DDD8 (as it turns out my session did not get voted for; maybe better luck for DDS, you can still vote for that conference’s sessions).

Anyway, if any of you have looked at the Lab Manager side of MTLM you will know that getting it going is no quick task. Firstly, I cannot recommend highly enough the Lab Management team’s blog posts ‘Getting started with Lab Management’ Parts 1, 2, 3 and 4. This type of walkthrough post is a great way to move into a complex new product such as this. It provides the framework to get you going; it doesn’t fix all your problems but gives you a map to follow into the main documentation or other blog posts.

The architecture I was trying to build was as below. My hardware was a Shuttle PC, as this was all I could find in the office that could take 8GB of memory, the bare minimum for this setup. Not as convenient as a laptop for demos, but I was not going to bankrupt myself getting an 8GB laptop!

[Diagram: the architecture of the demo system]

As I wanted my system to be mobile, it needed to be its own domain (demo.com). This was my main problem during the install. MTLM assumes the host server and all the VMs are in the same domain, but that the domain controller (DC) is on some other device on the domain. I installed the DC on the host server; this meant I had to do the following to get it all to work (I should say I did all of these to get my system running; they may not all be essential, but they are all sensible practice so probably worth doing):

  • Run the VMM Host as a user other than the default of Local System (this is an option set during the installation). The default Local System user has reduced rights on a domain controller, and so is not able to do all that it needs to. I created a new domain account (demo\VMMserver) and used this as the service account for the VMM.
  • The ‘Getting Started’ blog posts suggest a basic install of TFS, this just installs source control, work item tracking and build services using a SQL Express instance. This is fine, but this mode defaults to using the Network Service account to run the TFS web services. This has the same potential issues as the Local System account on the DC, so I swapped this to use a domain account (demo\TFSservice) using the TFS Administration console. 
  • AND THIS IS THE WEIRD ONE AND I SUSPECT THE MOST IMPORTANT. As I was using the host system as a DNS and DHCP server, the VMs needed to be connected to the physical LAN of the host machine to make use of these services. However, as I did not want them to pick up my office’s DHCP service, I left the physical server’s Ethernet port unplugged. This meant that when I tried to create a new lab environment I got a TF259115 error. Plugging in a standalone Ethernet hub (connected to nothing else) fixed this problem. I am told this is because part of the LAN stack on the physical host is disabled due to the lack of a physical Ethernet link, even though the DNS and DHCP services were unaffected. The other option would have been to run the DNS, DHCP etc. on Hyper-V VM(s).
  • When configuring the virtual lab in the TFS Administration console the ‘Network Location’ was blank. If you ignore the missing Network Location, or manually enter it, you get a TF259210 error when you verify the settings in TFS Administration. This is a known problem in SCVMM and was fixed by overriding the discovered network and entering demo.com.

So I now had a working configuration, but when I tried to import my prepared test VM into Lab Center, I got an “Import failed, the specified owner is not a valid Active Directory Domain Services account, Specify a valid Active Directory Domain Services account and try again” error. When I checked the SCVMM job logs (in the SCVMM Admin console) I saw this was an Error 813 in the ‘create hardware setup’ step. However, the account the job was running as was a domain user, as was the service account the host was running under (after I had made the changes detailed above), so I was confused.

This turns out to be a ‘user too stupid’ error; I was logged in as the TFS server’s local administrator (tfs2010\administrator), not the domain one (demo\administrator), or indeed any domain account with VMM administrator rights. Once I logged in on the TFS server (where I was running MTLM) as a domain account all was OK. Actually I suspect moving to the VMMserver and TFSservice accounts was not vital, but it did no harm.

I could now create my virtual test environment and actually start to create Team Builds that make use of my test lab environment. Also, I think that having worked through these problems I have a better understanding of how all the various parts underpinning MTLM hang together, a vital piece of knowledge if you intend to make real use of these tools.

Oh and thanks to everyone who helped me when I got stuck