What I wish I had known when I started developing Lability DevTest Lab Environments

At Black Marble we have been migrating our DevTest labs from on-premises TFS Lab Management to a mixture of on-premises and Azure-hosted Lability-defined labs, as discussed by Rik Hepworth on his blog. I have only been tangentially involved in this effort until recently, consuming the labs but not creating the definitions.

So this post is one of those I write so that I don't forget things I learnt the hard way; or, to put it another way, things I learnt by asking Rik or Chris after watching a 2-hour environment deployment fail for the Xth time.

  • You can't log too much. The log files are your friends, both the DSC ones and any generated by tools triggered by DSC. This is because most of the configuration process happens during reboots, so there is no UI to watch.
  • The DSC log is initially created in the working folder that the .MOF file is in on the target VM; but after a reboot (e.g. after joining a domain) the next and subsequent DSC log files are created in C:\Windows\System32\Configuration\ConfigurationStatus.
  • Make sure you specify the full path for any bespoke logging you do; relative paths make it too easy to lose the log file.
  • Stupid typos get you every time. Many will be spotted when the MOF file is generated, but too many, such as ones in command lines or arguments, are only spotted when you deploy an environment. Worse, many of these don't actually cause error messages, they just mean nothing happens. So if you expect a script/tool to be run and it isn't, check the log and the definition for mismatches in names.
  • If you are using the Package DSC resource to install an EXE or MSI there are a couple of gotchas:
    • For MSIs the Name parameter must exactly match the ProductName in the MSI definition, and the ProductId must match the MSI's ProductCode GUID. Both of these can be found using the Orca tool

      image

    • Package MongoDb {
          PsDscRunAsCredential = $DomainCredentialsAtDomain
          DependsOn = '[Package]VCRedist'
          Ensure = 'Present'
          Arguments = "/qn /l*v c:\bootstrap\MongoDBInstall.log INSTALLLOCATION=`"C:\Program Files\MongoDB\Server\3.6\`""
          Name = "MongoDB 3.6.2 2008R2Plus SSL (64 bit)"
          Path = "c:\bootstrap\mongodb-win32-x86_64-2008plus-ssl-3.6.2-signed.msi"
          ProductId = "88B5F0D8-0692-4D86-8FF4-FB3CDBC6B40F"
          ReturnCode = 0
      }

    • For EXEs the Name does not appear to be as critical, but you still need the ProductId. You can get this with PowerShell on a machine that already has the EXE installed:
    • Get-WmiObject Win32_Product | Format-Table IdentifyingNumber, Name, Version
  • I had network issues; they could mostly be put down to incorrect Network Address Translation. In my case this should have been set up when Lability was initially configured, and the commands ran OK, creating a virtual switch and a NetNat, but I ended up with a Windows fallback network address of 169.x.x.x when I should have had an address of 192.168.x.x on my virtual switch. So if in doubt check the settings on your virtual switch, in the Windows 'Network and Sharing Center', before you start doubting your environment definitions; the snippet below shows the sort of checks I mean.
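As a quick way to rule host networking in or out, something like the following shows whether the switch, the NAT rule and the host vNIC address all line up (a minimal sketch using the standard Hyper-V and NetNat cmdlets; nothing here is Lability-specific):

# Is the expected internal virtual switch present?
Get-VMSwitch | Format-Table Name, SwitchType

# Is the NAT rule that should have been created actually there?
Get-NetNat | Format-Table Name, InternalIPInterfaceAddressPrefix

# What address did the host's vNIC really get? A 169.254.x.x address here
# means the expected static/DHCP assignment never happened
Get-NetIPAddress -AddressFamily IPv4 | Format-Table InterfaceAlias, IPAddress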

Hope these pointers help others, as well as myself, the next time Lability definitions are written.

Creating test data for my Generate Release Notes Extension for use in CI/CD process

As part of the continued improvement to my CI/CD process I needed to provide a means so that whenever I test my Generate Release Notes task, within its CI/CD process, new commits and work item associations are made. This is required because the task only picks up new commits and work items since the last successful run of a given build. So if the last release of the task extension was successful, the next set of tests would have no associations to go in the release notes, not exactly exercising all the code paths!

In the past I added this test data by hand, a new manual commit to the repo prior to a release; but why have a dog and bark yourself? Better to automate the process.

This can be done using a PowerShell file, run inline or stored in the build's source repo and run within a VSTS build. You can pass in the required parameters, but I set sensible defaults for my purposes; a sketch of the approach is shown below.
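The shape of the script is roughly as follows (a minimal sketch, not my production script; it assumes the agent's OAuth token is exposed as $env:SYSTEM_ACCESSTOKEN and uses hypothetical account, project and repo names):

param
(
    $account = "yourinstance",                                        # hypothetical VSTS account
    $project = "YourProject",                                         # hypothetical team project
    $repoUrl = "https://yourinstance.visualstudio.com/_git/YourRepo"  # hypothetical repo
)

# create a work item via the REST API so there is something to associate
$uri = "https://$account.visualstudio.com/DefaultCollection/$project/_apis/wit/workitems/`$Task?api-version=1.0"
$body = '[{"op":"add","path":"/fields/System.Title","value":"Test work item for release notes"}]'
$wi = Invoke-RestMethod -Uri $uri -Method Patch -Body $body `
          -ContentType "application/json-patch+json" `
          -Headers @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }

# make a trivial commit; a #<id> in the commit message associates the work item
git clone $repoUrl repo
Set-Location repo
Add-Content -Path dummydata.txt -Value (Get-Date)
git add dummydata.txt
git commit -m "Adding test data for release notes tests #$($wi.id)"
git push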

For this PowerShell code to work you do need to make some security changes to allow the build agent service user to write to the Git repo. This is documented by Microsoft.

The PowerShell task to run this code is placed in a build as the only task.

image

This build is then triggered as part of the release process

image

Note that the triggering of this build has to be such that it runs on a non-blocking build agent as discussed in my previous posts. In my case I trigger the build to add the extra commits and work items just before triggering the validation build on my private Azure hosted agent.

Now, there is no reason you can't just run the PowerShell directly within the release if you want to. I chose to use a build so that it could be reused between different VSTS extension CI/CD pipelines; remember I have two Generate Release Notes extensions, one PowerShell-based and one Node.js-based.

So, another step towards fully automating the whole release process.

Life gets better in Visual Studio Code for PowerShell

I have been using Visual Studio Code for PowerShell development, but got a bit behind on reading release notes. Today I realised I can make the integrated terminal in Code a PowerShell instance.

In File > Preferences > User Settings (settings.json) enter the following:


// Place your settings in this file to overwrite the default settings
{
     // The path of the shell that the terminal uses on Windows.
    "terminal.integrated.shell.windows": "C:\\windows\\system32\\WindowsPowerShell\\v1.0\\powershell.exe"
}

Now my terminal is a PowerShell instance, and you can see it has loaded my profile, so posh-git is working as well.

image

So I think we have reached the 'goodbye PowerShell ISE' point.

Using Visual Studio Code to develop VSTS Build Tasks with PowerShell and Pester tests

Background

I am finding myself writing a lot of PowerShell at present, mostly for VSTS build extensions. Here I hit a problem (or is it an opportunity for choice?) as to which development environment to use:

  • PowerShell ISE is the ‘best’ experience for debugging a script, but has no source control integration – and it is on all PCs
  • Visual Studio Code has good Git support, but you need to jump through some hoops to get debugging working.
  • Visual Studio PowerShell tools are just too heavyweight; they are not even in the frame for me for this job.

So I have found myself getting the basic scripts working in the PowerShell ISE, then moving to VS Code to package up the tasks/extensions, as this also means writing JSON – an awkward way of working.

This got worse when I wanted to add Pester-based unit tests. I needed a better way of working, and I chose to focus on VS Code.

The PowerShell Extension for VS Code

Visual Studio Code now supports PowerShell. Once you have installed VS Code you can install the extension as follows

  1. Open the command palette (Ctrl+Shift+P)
  2. Type “Extension”
  3. Select “Install Extensions”. 
  4. Once the extensions list loads, type PowerShell and press Enter.

Once this extension is installed you get Intellisense etc. as you would expect. So you have a good editor experience, but we still need a F5 debugging experience.

Setting up the F5 Debugging experience

Visual Studio Code can launch any tool to provide a debugging experience. The PowerShell extension provides the tools to get this running for PowerShell.

I found Keith Hill provided a nice walkthrough with screenshots of the setup, but here is my quick summary

  1. Open VS Code and load a folder structure, for me this usually this will be a Git repo
  2. Assuming the PowerShell extension is installed, go to the debug page in VS Code
  3. Press the cog at the top of the page and a .vscode\launch.json file will be added to the root of the folder structure currently loaded i.e. the root of your Git repo
  4. As Keith points out, the important line, the program (the file/task to run when you press F5), is empty – a strange default.

image

We need to edit this file to tell it what to run when we press F5. I have decided I have two options, and which I use depends on what I am putting in my Git repo:

  • If we want to run the PowerShell file we have in focus in VS Code (at the moment we press F5) then we need the line

              "program": "${file}"

  • However, I soon realised this was not that useful, as I wanted to run Pester-based tests. I was usually editing a script file but wanted to run a test script, so this meant changing the file in focus prior to pressing F5. In this case I decided it was easier to hard code the program setting to a script that ran all the Pester tests in my folder structure

               "program": "${workspaceRoot}/Extensions/Tests/runtests.ps1"

    Where my script contained the single line to run the tests in the script’s folder and below

               Invoke-Pester $PSScriptRoot -Verbose

Note: I have seen some comments that if you edit the launch.json file you need to reload VS Code for the new value to be read, but this has not been my experience.
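For reference, the whole launch.json then looks something like this (a sketch based on what the extension generated for me; exact fields may differ between extension versions):

{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "PowerShell",
            "type": "PowerShell",
            "request": "launch",
            "program": "${workspaceRoot}/Extensions/Tests/runtests.ps1",
            "args": [],
            "cwd": "${workspaceRoot}"
        }
    ]
}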

So now when I press F5 my Pester tests run and I can debug into them as I want, but that raises some new issues due to the requirements of VSTS build tasks.

Changes to my build task to enable testing

A VSTS build task is basically a PowerShell script that has some parameters. The problem is that I needed to load the .PS1 script to allow any Pester tests to execute functions in the script file. This is done by dot-sourcing it, in the form:


# Load the script under test
. "$PSScriptRoot\..\..\..\versioning\versiondacpactask\Update-DacPacVersionNumber.ps1"

Problem 1: If any of the parameters for the script are mandatory, this include fails with errors over missing values. The fix is to make sure that any mandatory parameters are passed, or that they are not mandatory – I chose the latter, as I can make any task parameter 'required' in the task.json file.

Problem 2: When you include the script it is executed – not what I wanted at all. I had to put a guard 'if' test at the top of the script to exit if the required parameters were not at least reasonable – I can't think of a neater solution.

# check if we are in test mode i.e. no runtime parameters were passed
If ($VersionNumber -eq "" -and $path -eq "") {Exit}
# the rest of my code …..

Once these changes were made I was able to run the Pester tests with an F5 as I wanted, using mocks to help test the program flow logic.


# Load the script under test
. "$PSScriptRoot\..\..\..\versioning\versiondacpactask\Update-DacPacVersionNumber.ps1"

Describe "Use SQL2012 ToolPath settings" {
    Mock Test-Path {return $false} -ParameterFilter {
            $Path -eq "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\Microsoft.SqlServer.Dac.Extensions.dll"
        }
    Mock Test-Path {return $true} -ParameterFilter {
            $Path -eq "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\Microsoft.SqlServer.Dac.Extensions.dll"
        }

    It "Find DLLs" {
        $path = Get-Toolpath -ToolPath ""
        $path | Should be "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120"
    }
}

Summary

So I think I now have a workable solution: a good IDE with a reasonable F5 debug experience. OK, the PowerShell console in VS Code is not as rich as that in the PowerShell ISE, but I think I can live with that given the quality of the rest of the debug tools.

Running Pester PowerShell tests in the VSTS hosted build service

Updated 22 Mar 2016: This task is available in the VSTS Marketplace.

If you are using Pester to unit test your PowerShell code then there is a good chance you will want to include it in your automated build process. To do this, you need to get Pester installed on your build machine. The usual options would be:

  • install it machine-wide with a package manager such as Chocolatey
  • install it with PsGet / PowerShellGet
  • pull it into the build as a NuGet package

If you own the build agent VM then any of these options are good; you can even write the NuGet restore into your build process itself. However there is a problem: the first two options need administrative access, as they put the Pester module in the $PSModules folder (under 'Program Files'), so they can't be used on VSTS's hosted build system, where you are not an administrator.

So this means you are left with copying the module (and associated functions folder) to some local working folder and running it manually; but do you really want to have to store the Pester module in your source repo?

My solution was to write a vNext build task to deploy the Pester files and run the Pester tests.

image

The task takes two parameters

  • The root folder to look for test scripts with the naming convention *.tests.ps1; defaults to $(Build.SourcesDirectory)\*
  • The results file name; defaults to $(Build.SourcesDirectory)\Test-Pester.XML
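Under the hood the task does something roughly equivalent to the following (a sketch of the idea, not the task's actual source; it assumes a Pester 3.x copy is shipped alongside the task script):

param
(
    $scriptFolder = "$env:BUILD_SOURCESDIRECTORY\*",
    $resultsFile = "$env:BUILD_SOURCESDIRECTORY\Test-Pester.XML"
)

# load the copy of Pester deployed with the task, then run the tests
Import-Module "$PSScriptRoot\Pester\Pester.psm1"
$result = Invoke-Pester -Script $scriptFolder -OutputFile $resultsFile -OutputFormat NUnitXml -PassThru

# fail the build step if any test failed; publishing the results file is left
# to the standard test results upload task
if ($result.FailedCount -gt 0) {
    throw "$($result.FailedCount) Pester test(s) failed."
}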

The Pester task does not itself upload the test results; it just throws an error if tests fail. It relies on the standard test results upload task. Add this task and set:

  • it to look for nUnit format files
  • the file name pattern – it already defaults to the correct one
  • IMPORTANT: as the Pester task will stop the build on an error, you need to set 'Always run' to make sure the results are published

image

Once all this is added to your build you can see your Pester test results in the build summary

image

image

You can find the task in my vNextBuild repo.

Using Release Management vNext templates when you don’t want to use DSC scripts – A better script

A couple of months ago I wrote a post on using PowerShell scripts to deploy web sites in Release Management vNext templates as opposed to DSC. In that post I provided a script to help with the translation of Release Management configuration variables to entries in the MSDeploy SetParameters.xml file for web sites.

The code I provided in that post required you to hard code the variables to translate. This quickly became a maintenance problem. However, there is a simple solution.

If we use a naming convention for our RM configuration variables that maps to web.config entries (I chose __NAME__ to be consistent with the old RM agent-based deployment standards) we can let PowerShell do the work.

So the revised script is

$VerbosePreference = 'Continue' # equivalent to -verbose

function Update-ParametersFile
{
    param
    (
        $paramFilePath,
        $paramsToReplace
    )

    write-verbose "Updating parameters file '$paramFilePath'" -verbose
    $content = get-content $paramFilePath
    $paramsToReplace.GetEnumerator() | % {
        Write-Verbose "Replacing value for key '$($_.Name)'" -Verbose
        $content = $content.Replace($_.Name, $_.Value)
    }
    set-content -Path $paramFilePath -Value $content

}

# the script folder
$folder = Split-Path -parent $MyInvocation.MyCommand.Definition
write-verbose "Deploying Website '$package' using script in '$folder'"

# work out the variables to replace using a naming convention

# we make sure that the values are stored in an array even if there is a single item
$parameters = @(Get-Variable -include "__*__" )
write-verbose "Discovered replacement parameters that match the convention '__*__': $($parameters | Out-string)"
Update-ParametersFile -paramFilePath "$ApplicationPath\$packagePath\$package.SetParameters.xml" -paramsToReplace $parameters

write-verbose "Calling '$ApplicationPath\$packagePath\$package.deploy.cmd'"
& "$ApplicationPath\$packagePath\$package.deploy.cmd" /Y  /m:"$PublishUrl" -allowUntrusted /u:"$PublishUser" /p:"$PublishPassword" /a:Basic | Write-Verbose

Note: This script allows deployment to a remote IIS server, which is useful for Azure web sites. If you are running it locally on the IIS server, just trim everything after the /Y on the last line.

So now I provide

  • $PackagePath – the path to our deployment on the deployment VM (relative to the $ApplicationPath local working folder)
  • $Package – name of the MSdeploy package
  • The publish settings you can get from the Azure Portal
  • $__PARAM1__ –  a value to swap in the web.config
  • $__PARAM2__ –  another value to swap in the web.config
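For the swaps to work, the SetParameters.xml inside the MSDeploy package has to contain matching tokens, something like this (a hypothetical example; the setParameter names depend on your package):

<?xml version="1.0" encoding="utf-8"?>
<parameters>
  <setParameter name="SomeConnectionString" value="__PARAM1__" />
  <setParameter name="SomeAppSetting" value="__PARAM2__" />
</parameters>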

In RM it will look like this.

image

So now you can use a single script for all your web deployments.

Lessons learnt using simple PowerShell scripts with vNext Release Management

If you are using basic PowerShell scripts, as opposed to DSC, with Release Management there are a few gotchas I have found.

You cannot pass parameters

Let's look at a sample script that we would like to run via Release Manager:

param
(
    $param1
)

write-verbose -verbose "Start"
write-verbose -verbose "Got var1 [$var1]"
write-verbose -verbose "Got param1 [$param1]"
write-verbose -verbose "End"

In Release Manager we have the following vNext workflow

image

You can see we are setting two custom values which we intend to use within our script, one is a script parameter (Param1), the other one is just a global variable (Var1).

If we do a deployment we get the log

Copying recursively from \\store\drops\rm\4583e318-abb2-4f21-9289-9cb0264a3542\152 to C:\Windows\DtlDownloads\ISS vNext Drops succeeded.

Start

Got var1 [XXXvar1]

Got param1 []

End

You can see the problem: $var1 is set, $param1 is not. It took me a while to get my head around this. The problem is that the RM activity's PSScriptPath is just that, a script path, not a command line that will be executed. Unlike the PowerShell activities in the vNext build tools, you don't have a pair of settings, one for the path to the script and another for the arguments. Here we have no way to set the command line arguments.

Note: The PSConfigurationPath is just for DSC configurations as discussed elsewhere.

So in effect Param1 is not set, as we did not call

test.ps1 -param1 "some value"

This means there is no point using parameters in the script you wish to use with RM vNext. But wait, I bet you are thinking 'I want to run my script externally to Release Manager to test it, and using parameters with validation rules is best practice; I don't want to lose that advantage'.

The best workaround I have found is to use a wrapper script that takes the variables and makes them parameters, something like this:

$folder = Split-Path -Parent $MyInvocation.MyCommand.Definition
& $folder\test.ps1 -param1 $param1

Another gotcha: note that I need to find the path the wrapper script is running in and use it to build the path to my actual script. If I don't do this I get an error saying the test.ps1 script can't be found.

After altering my pipeline to use the wrapper and rerunning the deployment I get the log file I wanted

Copying recursively from \\store\drops\rm\4583e318-abb2-4f21-9289-9cb0264a3542\160 to C:\Windows\DtlDownloads\ISS vNext Drops succeeded.

Start

Got var1 [XXXvar1]

Got param1 [XXXparam1]

End


This is all a bit ugly, but works.

Looking forward, this appears not to be too much of an issue. The next version of Release Management, as shown at Build, is based around the vNext TFS build tooling, which seems to always allow you to pass true PowerShell command line arguments. So this problem should go away in the not too distant future.

Don’t write to the console

The other big problem is any script that writes or reads from the console. Usually this means a write-host call in a script that causes an error along the lines of:

A command that prompts the user failed because the host program or the command type does not support user interaction. Try a host program that supports user interaction, such as the Windows PowerShell Console or Windows PowerShell ISE, and remove prompt-related commands from command types that do not support user interaction, such as Windows PowerShell workflows.
+At C:\Windows\DtlDownloads\ISS vNext Drops\scripts\test.ps1:7 char:1
+ Write-Host "hello 1" -ForegroundColor red

But also watch out for any CLS calls; that has caught me out. I have found that it can be hard to track down the offending lines, especially if there are PowerShell modules loading other modules.

The best recommendation is to just use write-verbose and write-error.

  • write-error if your script has errored. This will let RM know the script has failed, thus failing the deployment – just what we want
  • write-verbose for any logging

Any other form of PowerShell output will not be passed to RM, be warned!

You might also notice in my sample script that I am passing the -verbose argument to the write-verbose command; again, you have to have this maximal level of logging on for the messages to make it out to the RM logs. Probably a better solution, if you think you might vary the level of logging, is to change the script to set $VerbosePreference:

param
(
    $param1
)

$VerbosePreference = 'Continue' # equivalent to -verbose

write-verbose "Start"
write-verbose "Got var1 [$var1]"
write-verbose "Got param1 [$param1]"
write-verbose "End"

So hopefully these are a few pointers to make your deployments a bit smoother.

Cannot run Pester unit tests in Visual Studio but they work Ok from the command prompt

I have been using Pester for some PowerShell tests. From the command prompt all is good, but I kept getting the error 'module cannot be loaded because scripts is disabled on this system' when I tried to run them via the Visual Studio Test Explorer.

image

I found the solution on StackOverflow: I had forgotten that Visual Studio is 32-bit, so you need to set the 32-bit execution policy. Opening the default PowerShell command prompt and setting the policy only affects the 64-bit instance.

  1. Open C:\Windows\SysWOW64\WindowsPowerShell\v1.0\powershell.exe
  2. Run the command Set-ExecutionPolicy RemoteSigned
  3. My tests passed (without restarting Visual Studio)
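If you want to confirm that the two instances really do hold separate policies, you can query each executable directly (a quick check using the standard 32-bit and 64-bit PowerShell paths):

# 32-bit instance (the one the Visual Studio test runner uses)
C:\Windows\SysWOW64\WindowsPowerShell\v1.0\powershell.exe -Command "Get-ExecutionPolicy"

# 64-bit instance (the one a default prompt launches on 64-bit Windows)
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -Command "Get-ExecutionPolicy"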

image

Using Release Management vNext templates when you don’t want to use DSC scripts

Update 21 Aug 2015 – This post contains all the basic information, but there is an improved PowerShell script discussed in Using Release Management vNext templates when you don’t want to use DSC scripts – A better script


Many web sites are basically forms over data, so you need to deploy some DB schema and something like an MVC website. Even for this 'bread and butter' work it is important to have an automated process to avoid human error. Hence the rise in the use of release tools to run your DACPAC and MSDeploy packages.

In the Microsoft space this might lead to the question of how Desired State Configuration (DSC) can help. I, and others, have posted in the past about how DSC can be used to achieve this type of deployment, but this can be complex, and you have to ask: is DSC the best way to manage DACPAC and MSDeploy packages, or is DSC better suited to only the configuration of your infrastructure/OS features?

You might ask why you would not want to use DSC. Well, the most common reason I see is that you need to provide deployment scripts to end clients who don't use DSC, or you have just decided you want basic PowerShell. Only you will be able to judge which is best for your systems, but I thought it worth outlining an alternative way to deploy these packages using Release Management vNext pipelines that does not make use of DSC.

Background

Let us assume we have a system with a SQL server and an IIS web server that have been added to the Release Management vNext environment. These already have SQL and IIS enabled; maybe you used DSC for that?

The vNext release template allows you to run either DSC or PowerShell on the machines. We will ignore DSC here, so what can you do if you want to use simple PowerShell scripts?

Where do I put my Scripts?

We will place the PowerShell scripts (and maybe any tools they call) under source control such that they end up in the build drops location, thus making it easy for Release Management to find them and allowing the scripts (and tools) to be versioned.

Deploying a DACPAC

The script I have been using to deploy DACPACs is as follows

# find the script folder
$folder = Split-Path -parent $MyInvocation.MyCommand.Definition
Write-Verbose "Deploying DACPAC $SOURCEFILE using script in '$folder'" -Verbose
& $folder\sqlpackage.exe /Action:Publish /SourceFile:$folder\..\$SOURCEFILE /TargetServerName:$TARGETSERVERNAME /TargetDatabaseName:$TARGETDATABASENAME | Write-Verbose -Verbose

Note that:

  1. First it finds the folder it is running in; this is the easiest way to find the other resources I need
  2. The only way any logging will end up in the Release Management logs is if it is logged at the verbose level, i.e. write-verbose "your message" -verbose
  3. I have used a simple & my.exe to execute my command, but pass the output via the write-verbose cmdlet to make sure we see the results. The alternative would be to use Start-Process
  4. SQLPACKAGE.EXE (and its associated DLLs) are located in the same SCRIPTS folder as the PowerShell script and are under source control. Of course you could make sure any tools you need are already installed on the target machine.

I pass the three parameters needed for the scripts as custom configuration:

image

Remember that you don't have to be on the SQL server to run SQLPACKAGE.EXE; it can be run remotely (that is why, in the screenshot above, the ServerName is the IIS8 web server and not SQL as you might expect).

Deploying a MSDeploy Package

The script I use to deploy the WebDeploy package is as follows:

function Update-ParametersFile
{
    param
    (
        $paramFilePath,
        $paramsToReplace
    )

    write-verbose "Updating parameters file '$paramFilePath'" -verbose
    $content = get-content $paramFilePath
    $paramsToReplace.GetEnumerator() | % {
        Write-Verbose "Replacing value for key '$($_.Key)'" -Verbose
        $content = $content.Replace($_.Key, $_.Value)
    }
    set-content -Path $paramFilePath -Value $content

}


# the script folder
$folder = Split-Path -parent $MyInvocation.MyCommand.Definition
write-verbose "Deploying Website '$package' using script in '$folder'" -verbose

Update-ParametersFile -paramFilePath "$folder\..\_PublishedWebsites\$($package)_Package\$package.SetParameters.xml" -paramsToReplace @{
      "__DataContext__" = $datacontext
      "__SiteName__" = $siteName
      "__Domain__" = $Domain
      "__AdminGroups__" = $AdminGroups

}

write-verbose "Calling '$package.deploy.cmd'" -verbose
& "$folder\..\_PublishedWebsites\$($package)_Package\$package.deploy.cmd" /Y | Write-Verbose -verbose

Note that:

  1. First I declare a function that I use to replace the contents of the package.setparameters.xml file, a key step in using binary promotion and WebDeploy
  2. Again, I find the folder the script is running in so I can locate other resources
  3. I then declare the parameters I need to replace and call the replacement function 
  4. Finally I call the package.deploy.cmd command, piping the output via write-verbose so it reaches the Release Management logs

This is called as follows

image

Summary

So I think these reusable scripts give a fairly easy way to make use of vNext Release Management pipelines. They can also easily be given to clients who just want to manually run something.

Getting Release Management to fail a release when using a custom PowerShell component

If you have a custom PowerShell script you wish to run, you can create a tool in Release Management (Inventory > Tools) for the script, which deploys the .PS1, .PSM1 files etc. and defines the command line to run it.

The problem we hit was that our script failed, but it did not fail the release step, as the PowerShell.exe running the script exited without an error code. The script had thrown an exception, which was in the output log file, but the step was marked as completed.

The solution was to use a try/catch in the .PS1 script that, as well as writing a message via Write-Error, also sets the exit code to something other than 0 (zero). So you end up with something like the following in your .PS1 file:

param
(
    [string]$Param1,
    [string]$Param2
)

try
{
    # some logic here
}
catch
{
    Write-Error $_.Exception.Message
    exit 1 # to get an error flagged so it can be seen by RM
}

Once this change was made, an exception in the PowerShell caused the release step to fail as required. The output from the script appeared as the Command Output.