The blogs of Black Marble staff

What machine name is being used when you compose an environment from running VMs in Lab Management?

This is a follow-up to my older post on a similar subject.

When composing a new Lab Environment from running VMs, the PC you are running MTM on needs to be able to connect to the running VMs. It does this over IP, so at the most basic level you need to be able to resolve the name of the VM to an IP address.

If your VM is connected to the same LAN as your PC, but is not in the same domain, the chances are that DNS name resolution will not work. I find the best option is to put a temporary entry in your local hosts file, keeping it for just as long as the creation process takes.

But what should this entry be? Should it be the name of the VM as it appears in the MTM new environment wizard?

It turns out the answer is no; it needs to be the name as it appears in the SCVMM console.


So the hosts table contains the correct entries for the FQDNs (watch out for typos here, a mistyped IP address only adds to the confusion) e.g. wyfrswin7.wyfrs.local and shamrockbay.wyfrs.local
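For example, the temporary hosts file entries would look something like the following (the IP addresses here are placeholders; use whatever addresses your VMs have actually been given):

```
192.168.1.10    wyfrswin7.wyfrs.local
192.168.1.11    shamrockbay.wyfrs.local
```

On the PC running MTM the hosts file lives at C:\Windows\System32\drivers\etc\hosts, and you will need to edit it as an administrator.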

Once all this is set, just follow the process in my older post to enable the connection so that the new environment wizard can verify OK.

Remember the firewall on the VMs may also be an issue; I often disable it just for the period of the environment creation.

Also, Wireshark is your friend; it will show whether the machine you think is responding is the one you really want.

Lab Management with SCVMM 2012 and /labenvironmentplacementpolicy:aggressive

I did a post a year or so ago about setting up TFS Labs and mentioned the command

C:\Program Files\Microsoft Team Foundation Server 2010\Tools>tfsconfig lab /hostgroup /collectionName:myTpc /labenvironmentplacementpolicy:aggressive /edit /Name:"My hosts group"

This can be used to tell TFS Lab Management to place VMs using any memory that is assigned to stopped environments, allowing a degree of over-commitment of resources.

As I discovered today, this command only works for SCVMM 2010 based systems. If you try it you just get a message saying it is not supported on SCVMM 2012. There appears to be no equivalent for 2012.

However, you can use features such as dynamic memory within SCVMM 2012, so all is not lost.

Kerbal Space Program - It's educational and written in Mono too!


My son is really taken with Kerbal Space Program. This great game allows you to design your own spacecraft and run your own ongoing space program, all with a realistic physics engine.

What is particularly nice is that this cross-platform Mono-based application is being built in a very agile manner, with a new release most weeks, each adding features as well as bug fixes. There also seems to be an active community of people building plug-ins for extra spacecraft components and rovers.

I am not sure how much orbital mechanics will appear in his school exams this year, but it is certainly educational in the longer term.

TF900548 when using my Typemock 2012 TFS custom build activity

Using the Typemock TFS 2012 build activity I created, I had started seeing the error

TF900548: An error occurred publishing the Visual Studio test results. Details: 'The following id must have a positive value: testRunId.'

I thought it might be down to having patched our build boxes to TFS 2012 Update 1; maybe the activity needed to be rebuilt due to some dependency? However, on trying the build activity on my development TFS server I found it ran fine.

I made sure I had the same custom assemblies, Typemock autorun folder and build definition on both systems; I did, so it was not that.

Next I tried running the build but targeting an agent not on the same VM as the build controller. This worked, so it seemed I had a build controller issue. So I ran Windows Update to make sure the OS was patched up to date; it applied a few patches and rebooted, and all was OK, my tests ran again.

It does seem that for many build issues the standard "switch it off and back on again" does the job.

Black Marble is hosting the Yorkshire Chapter of the Global Windows Azure Bootcamp on the 27th of April

Black Marble is hosting the Yorkshire Chapter of the Global Windows Azure Bootcamp, taking place in several locations globally on April 27th, 2013. This free community-organised event is a one-day deep dive class that will get you up to speed on developing for Windows Azure. The class includes a trainer with deep real-world experience with Windows Azure, as well as a series of labs so you can practice what you just learned.

Black Marble’s event will be run by Robert Hogg (Microsoft Integration MVP) and Steve Spencer (Windows Azure MVP). Come along and join the global Azure event of the year!

Check out the prerequisites you need to install on your PC and here to sign up


Kinect 1.7 update now out

Kinect for Windows has just had an update, to version 1.7. The Kinect team are calling it "our most significant update to the SDK since we released the first version".

And they are right, it is just awesome. We will be talking a lot about Kinect 1.7 over the next few months.

The new version includes:

  • Kinect Fusion – a 3D object scanning application that creates live 3D models
  • New recognisable gestures – push-to-press buttons and grip-to-pan
  • New multiple user and two-person interactions

Along with these, Microsoft has pushed a bunch of code samples to CodePlex in a move to deliver open source samples; get them here.

Get the SDK here, and the Developer Toolkit is here. Remember you need both kits (install the SDK first).


Changing from Incandescent to LED Lighting

Posts on this blog are usually IT related so I thought it might make a change to write about something that, while still technology related, is a little different…

As some of my colleagues will attest to, I firmly believe in a well insulated house and buying products for the home that are as efficient as possible. My Home Server, for example, runs on a low power CPU in an effort to reduce the day-to-day running costs.

Many of our lights at home have already been converted to use CFL bulbs, replacing the original incandescent bulbs. I’m not a great fan of CFL bulbs however for a couple of reasons:

  • The start-up time of some bulbs still seems to be long (not that that is necessarily an issue on a winter morning when I don’t want to be immediately blinded when switching the lights on)
  • The bulbs contain mercury (albeit in small amounts); if you break a bulb, both the bulb and the items you use to clean up should be treated as hazardous waste (see for details)

On the plus side however, exchanging the incandescent bulbs for CFL ones has significantly reduced the number of bulbs I have to change. Our living room light, for example, used to need a bulb changing on average once per month as opposed to about once every 2-3 years for CFL bulbs. In addition, my (albeit approximate) calculations suggest that we save about £5-7 per bulb per year in energy costs, so in general the CFL bulbs pay for themselves within a few months.
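As a sanity check on that figure, a rough calculation comes out in the same range. The wattages, usage hours and unit price below are illustrative assumptions, not measured values:

```python
# Rough annual saving from swapping a 60 W incandescent for an 11 W CFL.
# All the input figures are illustrative assumptions.
incandescent_w = 60        # typical incandescent bulb
cfl_w = 11                 # roughly equivalent CFL
hours_per_day = 3          # assumed daily usage
price_per_kwh = 0.13       # assumed unit price in GBP

kwh_saved = (incandescent_w - cfl_w) / 1000 * hours_per_day * 365
saving_gbp = kwh_saved * price_per_kwh
print(f"~{kwh_saved:.0f} kWh, about GBP {saving_gbp:.2f} saved per bulb per year")
```

Vary the usage hours and tariff and the answer moves around, but for any plausible numbers the bulb pays for itself within months.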

Many of our new lights however use GU10 incandescent bulbs and while GU10 CFL bulbs are available, they are typically longer than the original incandescent bulbs and so will not fit in many of the housings (e.g. in spotlights etc.)

LED bulbs are however available and with recent improvements to LED technology, can now match the light output of the incandescent bulbs they replace. Even better, many of the GU10 LED replacements are exactly the same size as the original bulbs and are therefore direct replacements. I like LED bulbs for the following reasons:

  • Fast start – LED lights are at full output almost immediately. In fact they beat an incandescent bulb, one of the reasons that they are used in brake lights on many cars these days
  • ‘Warm white’ bulbs are now available. White LEDs always used to be ‘cold white’, i.e. showed a significant blue cast, making them a very harsh light. Great for some specific applications, but not so nice for everyday use. Dimmable bulbs are also available.

The bulbs I favour are as follows:


These have 20 SMD LEDs and produce a light output equivalent to a 50W incandescent. Hopefully, if individual LEDs within the bulb fail, the failure will be gradual rather than the bulb suddenly stopping working completely, giving us time to source replacements.

I have, however, found an issue when replacing a set of incandescent GU10 bulbs with their LED equivalents. We replaced a set of 4 bulbs in a kitchen fitting with LED bulbs and found that when the lights were switched off, the bulbs still glowed gently. With a single incandescent bulb and 3 LED bulbs in the fitting, the issue didn’t occur. Following a little research, it became obvious that even with a properly earthed system, the capacitively coupled power from live to switched live is enough to cause an LED bulb to glow gently. With an incandescent bulb in the fitting, the resistance of this bulb was low enough to effectively absorb the leakage current and stop the LEDs from glowing.

There is a solution to the issue, which is to fit a resistor-capacitor combination (a contact suppressor; not designed for this purpose, but it works perfectly well) across the terminals of the light fitting. This can be a DIY solution, but I have found a product recommended for this purpose, which is a combination of a 0.1uF capacitor and a 100 ohm resistor in a package suitable for use with 240V AC:


The link goes to a Farnell page, but I am sure these are also available from other quality electronics resellers. There is also a 0.22uF version for longer circuits, should that be required. I'd strongly recommend using a bit of heat-shrink tubing on each lead of the package to ensure that the supply cannot come into contact with the light casing.
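A quick calculation shows why such a small part does the job: at 50 Hz mains the 0.1uF/100 ohm combination presents an impedance of only about 32 kOhm, low enough to shunt the tiny capacitively coupled leakage current away from the much higher impedance of an idle LED driver. This is just an illustrative sketch of the arithmetic:

```python
import math

# Impedance of the 0.1 uF / 100 ohm contact suppressor at 50 Hz mains.
f = 50.0          # UK mains frequency, Hz
c = 0.1e-6        # snubber capacitance, farads
r = 100.0         # snubber series resistance, ohms

xc = 1 / (2 * math.pi * f * c)   # capacitive reactance of the 0.1 uF part
z = math.hypot(r, xc)            # magnitude of the series RC impedance
print(f"Xc = {xc / 1000:.1f} kOhm, |Z| = {z / 1000:.1f} kOhm at {f:.0f} Hz")
```

The 0.22uF version halves that impedance again, which is why it is suggested for longer circuits with more leakage.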

Adding one of these devices to our kitchen light has completely solved the issue of the bulbs glowing even when switched off. The LED bulbs are saving something like 90% of the energy (and therefore the running costs) that would be consumed by incandescent bulbs and hopefully they will have a very long life span.

Installing a DB from a DACPAC using PowerShell as part of TFS Lab Management deployment

I have been battling with setting up a DB deployed via the SQL 2012 DAC tools and PowerShell. My environment was a network isolated pair of machines:

  • DC – the domain controller and SQL 2012 server
  • IIS – A web front end

As this is network isolated I could only run scripts on the IIS server, so my DB deploy needed to be remote. The script I ended up with was:

param(
    [string]$sqlserver = $( throw "Missing: parameter sqlserver"),
    [string]$dacpac = $( throw "Missing: parameter dacpac"),
    [string]$dbname = $( throw "Missing: parameter dbname") )

Write-Host "Deploying the DB with the following settings"
Write-Host "sqlserver:   $sqlserver"
Write-Host "dacpac: $dacpac"
Write-Host "dbname: $dbname"

# load in DAC DLL (requires config file to support .NET 4.0)
# change file location for a 32-bit OS
add-type -path "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll"

# make DacServices object, needs a connection string
$d = new-object Microsoft.SqlServer.Dac.DacServices "server=$sqlserver"

# register events, if you want 'em
register-objectevent -in $d -eventname Message -source "msg" -action { out-host -in $Event.SourceArgs[1].Message.Message } | Out-Null

# Load dacpac from file & deploy to the database named in $dbname
$dp = [Microsoft.SqlServer.Dac.DacPackage]::Load($dacpac)
$d.Deploy($dp, $dbname, $true) # the true is to allow an upgrade, could be parameterised, also can add further deploy params

# clean up event
unregister-event -source "msg"

Remember the SQL 2012 DAC tools only work with PowerShell 3.0 as they have a .NET 4 dependency.

This was called within the Lab Build using the command line


cmd /c powershell $(BuildLocation)\SQLDeploy.ps1 dc $(BuildLocation)\Database.dacpac sabs

All my scripts worked correctly when I ran them locally from the command line; they were also starting from within the build, but failing with errors along the lines of:

Deployment Task Logs for Machine: IIS
Accessing the following location using the lab service account: blackmarble\tfslab, \\store\drops.
Deploying the DB with the following settings
sqlserver:   dc
dbname: Database1
Initializing deployment (Start)
Exception calling "Deploy" with "3" argument(s): "Could not deploy package."
Initializing deployment (Failed)
+  $d.Deploy($dp, $dbname, $true) # the true is to allow an upgrade
+  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : DacServicesException
Stopped accessing the following location using the lab service account: blackmarble\tfslab, \\store\drops.

Though not obvious from the error message, the issue was who the script was running as. The TFS agent runs as a machine account, and this had no rights to access SQL on the DC. Once I granted the computer account IIS$ suitable rights on the SQL box all was OK. The alternative would have been to enable mixed mode authentication and use a connection string in the form

“server=dc;User ID=sa;Password=mypassword”
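For reference, granting the machine account rights looks something like the following T-SQL run on the SQL server. The domain and account names follow this example, and the exact roles needed will depend on what the deployment has to do; dbcreator is a guess at a sensible minimum for creating a new DB:

```sql
-- Run on the SQL 2012 server (dc in this example).
-- A computer account is referenced as domain\machinename$.
CREATE LOGIN [blackmarble\IIS$] FROM WINDOWS;

-- dbcreator lets the deployment create/upgrade the database (SQL 2012 syntax).
ALTER SERVER ROLE [dbcreator] ADD MEMBER [blackmarble\IIS$];
```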

So now I can deploy my DB on a new build.

You don’t half get strange errors when two servers have the same SID

You don’t half get strange errors when building a test environment if, when you SYSPREP’d each copy of your VM base image, you forgot to check the ‘generalize’ box.


If you forget this, as I did, each VM has a different name but the same SID. Basically the domain/AD is completely confused as to who is what. The commonest error I saw was that I could not set up applications (Reporting Services, SharePoint 2010 and TFS 2012) with domain service accounts. In all cases I got messages about missing rights or being unable to communicate with the domain controller.

The fix was basically to start again. I re-SYSPREP’d one of the pair of boxes to reset its SID, stripped off what I was trying to install, re-added the server to the domain and installed the applications again. Once this was done all was fine.
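For reference, re-running Sysprep with the generalize option from an elevated command prompt looks like this (these are the standard flags for Windows Server 2008 R2 era systems; check the options on your OS version):

```
REM Generalize removes machine-specific information, including the SID,
REM so the VM gets a fresh one on its next boot.
C:\Windows\System32\Sysprep\sysprep.exe /generalize /oobe /shutdown
```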

For more on SIDs and SYSPREP see Mark Russinovich’s blog

Windows 8 Developer Rewards (free phones,games)

The great people over in the Windows team have set up a rewards program to thank developers for producing great Windows 8 applications.

If you are developing, or want to develop, for Windows, take a look at and register with

for some great rewards.

The amount and value of the rewards are staggering, from Xbox subscriptions to Samsung TVs.

There are added bonuses for early adopters, such as free Windows Phones and copies of Halo.

Register today, it’s free.

When you go to the site to claim your points, if you fill in who referred you as either

Robert Hogg or Black Marble

it helps us get more support for future Windows 8 events. The site has all of the data you need to start.

For quick starts on using Windows 8, I have several posts on using Windows 8 for developers and a Windows 8 overview video.