But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Lab Management with SCVMM 2012 and /labenvironmentplacementpolicy:aggressive

I did a post a year or so ago about setting up TFS Labs which mentioned the command

C:\Program Files\Microsoft Team Foundation Server 2010\Tools>tfsconfig lab /hostgroup /collectionName:myTpc /labenvironmentplacementpolicy:aggressive /edit /Name:"My hosts group"

This can be used to tell TFS Lab Management to place VMs using any memory that is assigned to stopped environments, allowing a degree of over-commitment of resources.

As I discovered today, this command only works for SCVMM 2010 based systems. If you try it you just get a message saying it is not supported on SCVMM 2012. There appears to be no equivalent for 2012.

However, you can use features such as dynamic memory within SCVMM 2012, so all is not lost.
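For example, dynamic memory can be switched on for a lab VM directly on the Hyper-V host. This is only a minimal sketch using the Windows Server 2012 Hyper-V cmdlets rather than anything Lab Management specific, and the VM name and memory sizes are just example values:

# Sketch only: enable dynamic memory on a lab VM on the Hyper-V host.
# The VM must be stopped before its memory configuration can be changed.
Stop-VM -Name "Lab_MyTestVM"

Set-VMMemory -VMName "Lab_MyTestVM" -DynamicMemoryEnabled $true -StartupBytes 1GB -MinimumBytes 512MB -MaximumBytes 4GB

Start-VM -Name "Lab_MyTestVM"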

Installing a DB from a DACPAC using PowerShell as part of a TFS Lab Management deployment

I have been battling setting up a DB deployed via the SQL 2012 DAC tools and PowerShell. My environment was a network isolated pair of machines:

  • DC – the domain controller and SQL 2012 server
  • IIS – A web front end

As this is network isolated I could only run scripts on the IIS server, so my DB deploy needed to be remote. The script I ended up with was:

param(
    [string]$sqlserver = $( throw "Missing: parameter sqlserver"),
    [string]$dacpac = $( throw "Missing: parameter dacpac"),
    [string]$dbname = $( throw "Missing: parameter dbname") )

Write-Host "Deploying the DB with the following settings"
Write-Host "sqlserver:   $sqlserver"
Write-Host "dacpac: $dacpac"
Write-Host "dbname: $dbname"


# load in DAC DLL (requires config file to support .NET 4.0)
# change file location for a 32-bit OS
add-type -path "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll"

# make DacServices object, needs a connection string
$d = new-object Microsoft.SqlServer.Dac.DacServices "server=$sqlserver"

# register events, if you want 'em
register-objectevent -in $d -eventname Message -source "msg" -action { out-host -in $Event.SourceArgs[1].Message.Message } | Out-Null

# Load the dacpac from file & deploy it to the database named in $dbname
$dp = [Microsoft.SqlServer.Dac.DacPackage]::Load($dacpac)
$d.Deploy($dp, $dbname, $true) # the true is to allow an upgrade, could be parameterised, also can add further deploy params

# clean up event
unregister-event -source "msg"

Remember the SQL 2012 DAC tools only work with PowerShell 3.0, as they have a .NET 4 dependency.
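A quick way to check what the target machine is actually running (this is just standard PowerShell, nothing DAC-specific):

# Quick check on the lab VM: PowerShell 3.0 hosts the .NET 4 CLR,
# which the DAC assemblies need
$PSVersionTable.PSVersion    # expect 3.0 or later
$PSVersionTable.CLRVersion   # expect 4.0.x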

This was called within the Lab Build using the command line


cmd /c powershell $(BuildLocation)\SQLDeploy.ps1 dc $(BuildLocation)\Database.dacpac sabs

All my scripts worked correctly when I ran them locally on the command line; they also started from within the build, but failed with errors along the lines of:

Deployment Task Logs for Machine: IIS
Accessing the following location using the lab service account: blackmarble\tfslab, \\store\drops.
Deploying the DB with the following settings
sqlserver:   dc
dacpac:
\\store\drops\Main.CI\Main.CI_20130314.2\Debug\Database.dacpac
dbname: Database1
Initializing deployment (Start)
Exception calling "Deploy" with "3" argument(s): "Could not deploy package."
Initializing deployment (Failed)
At
\\store\drops\Main.CI\Main.CI_20130314.2\Debug\SQLDeploy.ps1:26
char:2
+  $d.Deploy($dp, $dbname, $true) # the true is to allow an upgrade
+  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : DacServicesException
Stopped accessing the following location using the lab service account: blackmarble\tfslab, \\store\drops.

Though not obvious from the error message, the issue was who the script was running as. The TFS agent runs as a machine account, and this had no rights to access SQL on the DC. Once I granted the computer account IIS$ suitable rights on the SQL box all was OK. The alternative would have been to enable mixed mode authentication and use a connection string in the form

“server=dc;User ID=sa;Password=mypassword”
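If you prefer to stay with Windows authentication, the grant for the machine account can itself be scripted. This is just a rough sketch using Invoke-Sqlcmd; the account name and role are examples from my environment, and dbcreator may not be enough for every dacpac:

# Sketch only: give the IIS machine account enough rights on the SQL box to deploy the DB.
# Run against the SQL server; adjust the domain, account and role to suit.
Import-Module sqlps -DisableNameChecking

Invoke-Sqlcmd -ServerInstance "dc" -Query @'
CREATE LOGIN [BLACKMARBLE\IIS$] FROM WINDOWS;
ALTER SERVER ROLE [dbcreator] ADD MEMBER [BLACKMARBLE\IIS$];
'@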

So now I can deploy my DB on a new build.

You don’t half get strange errors when two servers have the same SID

You don’t half get strange errors when building a test environment if, when you SYSPREP each copy of your VM base image, you forget to check the ‘Generalize’ box


If you forget this, as I did, each VM has a different name but the same SID. Basically the domain/AD is completely confused as to who is what. The commonest error I saw was that I could not set up applications (Reporting Services, SharePoint 2010 and TFS 2012) with domain service accounts. In all cases I got messages about missing rights or not being able to communicate with the domain controller.

The fix was basically to start again. I re-SYSPREP’d one of the pair of boxes I had to reset its SID, stripped off what I was trying to install, re-added the server to the domain and installed the applications again. Once this was done all was fine.
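For reference, the generalize pass can also be kicked off from a command line inside the VM rather than via the SYSPREP dialog; the flags below are the standard ones, nothing lab-specific:

# Run inside the VM: generalize (reset the SID), go back to OOBE and shut down
& "$env:windir\System32\Sysprep\sysprep.exe" /generalize /oobe /shutdown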

For more on SIDs and SYSPREP see Mark Russinovich’s blog

Cannot run Microsoft Fakes based tests if Typemock Isolator is enabled

With Microsoft Fakes moving to the Premium SKU of Visual Studio in 2012.2 (CTP4 is now available) more people will be looking at using them.

I have just installed CTP4 and have seen a behaviour I don’t think I have seen in previous versions of Visual Studio (I need to check, because as well as CTP4 I have recently installed the new version of Typemock Isolator, 7.3.0, that addresses issues with Windows 8 and Visual Studio 2012).

Anyway, the error you see when you run a Fakes based test is ‘UnitTestIsolation instrumentation failed to initialize. Please restart Visual Studio and rerun this test’


The solution is to disable Typemock Isolator (menu Typemock > Suspend Mocking); when this is done, without a reboot, the Fakes based tests run.

It does mean you can’t have a solution using both Fakes and Isolator, but why would you?

TFS TPC Databases and SQL 2012 availability groups

Worth noting that when you create a new TPC in TFS 2012, and the TFS configuration DB and other TPC DBs are in SQL 2012 availability groups, the new TPC DB is not placed in this or any other availability group. You have to add it manually, and historically remove it again when servicing TFS. The need to remove it for servicing changes with TFS 2012.2, which allows servicing of high availability DBs.
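For what it’s worth, the manual add is only a couple of commands. A sketch run against the primary replica is below; the server, availability group and collection DB names are just examples, and you also need to restore the backup on the secondary replica and join the database to the group there:

# Sketch: add a new TPC database to an existing availability group (run against the primary).
# Server, AG and database names are examples.
Import-Module sqlps -DisableNameChecking

# The DB must be in the full recovery model and have a full backup before it can join
Backup-SqlDatabase -ServerInstance "sql01" -Database "Tfs_MyNewCollection" -BackupFile "\\store\backups\Tfs_MyNewCollection.bak"

Invoke-Sqlcmd -ServerInstance "sql01" -Query "ALTER AVAILABILITY GROUP [TfsAG] ADD DATABASE [Tfs_MyNewCollection];"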

Recovering network isolated lab management environments if you have to recreate your SC-VMM server’s DB

Whilst upgrading our Lab Management system we lost the SC-VMM DB. This meant we needed to recreate environments that were already running on Hyper-V hosts but were unknown to TFS. If they are not network isolated this is straightforward: just recompose the environment (after clearing out the XML in the VM description fields). However, if they are network isolated and running, then you have to play around a bit.

This is the simplest method I have found thus far; I am interested to hear if you have a better way. (There is a rough PowerShell sketch of the SC-VMM steps at the end of this post.)

  • In SC-VMM (or via PowerShell) find all the VMs in your environment. They are going to have names in the form Lab_[GUID]. If you look at the properties of the VMs, in the description field you can see the XML that defines the lab they belong to.


If you are not sure which VMs you need you can of course cross-reference the internal machine names with the AD within the network isolated environment. Remember this environment is running, so you can log in to it.

  • Via SC-VMM, shut down each VM
  • Via SC-VMM, store the VM in the library
  • Wait a while…….
  • When all the VMs have been stored, navigate to them in SC-VMM. For each one in turn open the properties and
    • CHECK THE DESCRIPTION XML TO MAKE SURE YOU HAVE THE RIGHT VM AND KNOW THEIR ROLE
    • Change the name to something sensible (not essential if you like GUIDs in environment member names, but I think it helps) e.g. change Lab_[guid] to ‘My Test DC’
    • Delete all the XML in the Description field
    • In the hardware configuration, delete the ‘legacy network’ and connect the ‘Network adaptor’ to your main network – this will all be recreated when you create the new lab


Note that a DC will not have any connections to your main network as it is network isolated. For the purpose of this migration it DOES need to be reconnected. Again, this will be sorted out by the tooling when you create the new environment.

  • When all have been updated in SC-VMM, open MTM and import the stored VMs into the team project
  • You can now create a new environment using these stored VMs. It should deploy out OK, but I have found you might need to restart it before all the test agents connect correctly
  • And that should be it; the environment is known to TFS Lab Management and is running network isolated

You might want to delete the stored VMs once you have the environment running, but this will be down to your policies; they are not needed, as you can store the environment as a whole to archive it or duplicate it with network isolation.
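As promised above, here is a rough PowerShell sketch of the SC-VMM legwork (finding the Lab_[GUID] VMs and tidying them up once stored). It assumes the VMM 2012 cmdlets, all the names are examples, and I would still eyeball the description XML before changing anything:

# Rough sketch of the SC-VMM steps above; names are examples.
Import-Module virtualmachinemanager

# Find the lab VMs and show which environment each belongs to (the XML is in the description)
Get-SCVirtualMachine | Where-Object { $_.Name -like "Lab_*" } |
    Select-Object Name, HostName, Description

# Once a VM is stored in the library and you know its role,
# give it a sensible name and clear the lab XML from the description
$vm = Get-SCVirtualMachine -Name "Lab_00000000-0000-0000-0000-000000000000"   # substitute the real Lab_[GUID] name
Set-SCVirtualMachine -VM $vm -Name "My Test DC" -Description ""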

Fix for ‘Cannot install test agent on these machines because another environment is being created using the same machines’

We have recently been upgrading our TFS 2012 QU1 Lab Management system from SCVMM 2008 to SCVMM 2012 SP1. This has not been the nicest experience; Rik, Rob and I are preparing at least one joint blog post on the joys. Yes, it did take all of us: a lovely cross-skill problem.

We are now in the process of sorting out the state of running environments transferred between the systems. I thought I would start with an easy one, a single non-domain joined VM.

So after the upgrade this VM appeared as a running VM on one of the Hyper-V hosts, and I could add it to a new environment, which I did. All appeared OK and the environment was created, but Lab Management could not upgrade the test agents. If you tried the ‘reinstall agents’ option in Test Manager you got the error ‘cannot install test agent on these machines because another environment is being created using the same machines’


All very confusing and a hunt on the web found nothing on this error.

The fix I found (there might be another way) was to:

  1. Connect to the VM via MTM
  2. Uninstall the TFS 2012 RTM Test Agent.
  3. Install the TFS 2012 QU1 test agent from the Test Agents ISO (which I had unpacked to a network share so I could run vstf_testagent.exe easily)
  4. When the install was complete the configuration wizard ran; I set the agent to run as the local administrator and pointed it at one of our test controllers


Once the Test Agent configuration completed and the agent restarted it connected to the test controller and the environment reported itself as being ready.

So one (simple) environment down, plenty more to go

Running an external command line tool as part of a Wix install

I have recently been battling running a command line tool within a Wix 3.6 installer. I eventually got it going but learnt a few things. Here is a code fragment that outlines the solution.

<Product ………>
……… loads of other Wix bits

<!-- The command line we wish to run is set via a property. Usually you would set this with a <Property /> block, but in this case it has to be
     done via a CustomAction, as we want to build the command from other Wix properties that can only be evaluated at runtime. So we set the
     whole command line, including command line arguments, in a CustomAction that runs immediately i.e. in the first phase of the MSIExec
     process while the command set is being built.

     Note that the documentation says the command line should go in a property called QtExecCmdLine, but this is only true if the CustomAction
     is to be run immediately. The CustomAction to set the property is immediate but the command line's execution CustomAction is deferred, so we
     have to set the property to the name of the execution CustomAction and not QtExecCmdLine -->

<CustomAction Id='PropertyAssign' Property='SilentLaunch' Value='&quot;[INSTALLDIR]mycopier.exe&quot; &quot;[ProgramFilesFolder]Java&quot; &quot;[INSTALLDIR]my.jar&quot;' Execute='immediate' />

  <!-- Next we define the actual CustomAction that does the work. This needs to be deferred (until after the files are installed) and set to not be
       impersonated, so it runs as the same elevated account as the rest of the MSIExec actions (assuming your command line tool needs admin rights). -->
  <CustomAction Id="SilentLaunch" BinaryKey="WixCA"  DllEntry="CAQuietExec" Execute="deferred" Return="check" Impersonate="no" />
 
  <!-- Finally we set where in the install sequence the CustomActions run, and that they are only called on a new install.
       Note that we don't tidy up the actions of this command line tool on an uninstall. -->
  <InstallExecuteSequence>
   <Custom Action="PropertyAssign" Before="SilentLaunch">NOT Installed </Custom>
   <Custom Action="SilentLaunch" After="InstallFiles">NOT Installed </Custom>
  </InstallExecuteSequence>

 
</Product>

So the usual set of non-obvious Wix steps, but we got there in the end
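One other thing worth noting: CAQuietExec lives in WixUtilExtension (the WixCA binary), so as far as I recall you have to reference that extension when compiling and linking, along these lines (the file names are just examples):

candle.exe Product.wxs -ext WixUtilExtension
light.exe Product.wixobj -ext WixUtilExtension -out Setup.msi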

More on rights being stripped from the [team project]\contributors group in TFS 2012 when QU1 is applied, and how to sort it

I recently wrote a post that discussed how contributor rights had been stripped off areas in a TFS 2012 server when QU1 was applied; this included details of the patches to apply and the manual steps to resolve the problem.

Well, today I found that it is not just in area security that you can see this problem. We found it too in the main source code repository. Again the [Team project]\contributors group was completely missing and I had to re-add it manually. Once this was done all was OK for the users.
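If you prefer the command line to the UI for the re-add, something like the following tf.exe call should do it; the project name, rights list and collection URL are examples, so match them to your own defaults:

tf permission $/MyProject /group:"[MyProject]\Contributors" /allow:Read,PendChange,Checkin,Label,Lock,Merge /collection:http://tfsserver:8080/tfs/DefaultCollection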


FYI: You might ask how I missed this before; most of the users on this project had higher levels of rights granted by being members of other groups. It was not until someone was re-assigned between teams that we noticed.

Getting Windows Phone 7.8 on my Lumia 800

Microsoft have released Windows Phone 7.8 in the last few days. As usual the rollout appears to be phased, I think based on the serial number of your phone. As with previous versions you can force the update, so jumping the phased rollout queue. The process is:

  1. Put the phone in flight mode (so no data connection)
  2. Connect it to your PC running Zune; it will look to see if there is an OS update. If it finds one, great, let it do the upgrade
  3. If it does not find it, select the settings menu (top right in Zune)
  4. Select the update menu option on the left menu
  5. Zune will check for an update; about a second or two after it starts this process, disconnect the PC from the Internet. This should allow Zune to get the list of updates, but not the filter list of serial numbers, so it assumes the update is for you
  6. You should get the update available message; reconnect the Internet (it needs to download the file) and continue to do the upgrade

You will probably have to repeat step 5 a few times to get the timing correct

I also had to repeat the whole process three times, for three different firmware and OS updates, before I ended up with 7.8.


But now I have multi size tiles and the new lock screen.

Or if you don’t fancy all the hassle you could just wait a few days