But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Overwriting your own parameters in Release Management can cause PowerShell remoting problems

I have been doing some work on vNext Release Management; I managed to waste a good hour today with a stupid error.

In vNext process templates you provide a username and password to be used as the PowerShell remoting credentials (in the red box below).

image

My PowerShell script also took a username parameter, so this was provided as custom configuration too (the green box). This was the issue: unsurprisingly, having two parameters with the same name is a problem. You might get away with it if they have the same value (I did on one stage, which caused more confusion), but if they differ (as mine did in my production stage) the last one set wins, which meant my remote PowerShell session returned the error

System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.AggregateException: One or more errors occurred. ---> Microsoft.TeamFoundation.Release.Common.Helpers.OperationFailedException: Permission denied while trying to connect to the target machine Gadila.blackmarble.co.uk on the port:5985 via power shell remoting.

This is easy to fix once you realise the problem (the clue is a logon failure logged in the event log on the target machine); just make sure your parameter names are unique.
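As a minimal sketch (the script and parameter names here are made up, not anything Release Management mandates), pick a name for your own setting that cannot clash with the username the template already passes for remoting:

# Deploy.ps1 - a hypothetical deployment script
# The value arrives via custom configuration; it is deliberately called
# 'appPoolUserName' rather than 'username', so it cannot collide with the
# PowerShell remoting credentials the vNext template already supplies
if ([string]::IsNullOrEmpty($appPoolUserName))
{
    throw "appPoolUserName was not supplied - check the custom configuration"
}
Write-Verbose "Configuring the site to run as '$appPoolUserName'" -Verbose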

Speaking at Leeds DevOps on the 21st of July

I will be speaking at Leeds DevOps on the 21st of July on the subject of Desired State Configuration (DSC).

‘In the Windows world, due to its API-based architecture, deployment is too often not as simple as copying an EXE and updating a text configuration file. Desired State Configuration is an attempt to ease the pain we suffer in this space, providing a set of tools that can be leveraged by any deployment tooling, whether in a Windows or heterogeneous environment. In this session we will look at what DSC is, what resources are available and how to write your own’.

The event is at The Node in Leeds; tickets are free and are available on Eventbrite or meetup.com.

Using Release Management vNext templates when you don’t want to use DSC scripts

Many web sites are basically forms over data, so you need to deploy some DB schema and something like an MVC website. Even for this ‘bread and butter’ work it is important to have an automated process to avoid human error; hence the rise in the use of release tools to run your DACPAC and MSDeploy packages.

In the Microsoft space this might lead to the question of how Desired State Configuration (DSC) can help. I, and others, have posted in the past about how DSC can be used to achieve this type of deployment, but it can be complex, and you have to ask whether DSC is the best way to manage DACPAC and MSDeploy packages, or whether DSC is better suited to just the configuration of your infrastructure and OS features.

You might ask why you would not want to use DSC. The most common reason I see is that you need to provide deployment scripts to end clients who don’t use DSC, or you have simply decided you want plain PowerShell. Only you can judge which is best for your systems, but I thought it worth outlining an alternative way to deploy these packages using Release Management vNext pipelines that does not make use of DSC.

Background

Let us assume we have a system with a SQL server and an IIS web server that have been added to the Release Management vNext environment. These already have SQL and IIS enabled; maybe you used DSC for that?

The vNext release template allows you to run either DSC or PowerShell on the target machines. We will ignore DSC here, so what can you do if you want to use simple PowerShell scripts?

Where do I put my Scripts?

We place the PowerShell scripts (and any tools they call) under source control so that they end up in the build drops location, making it easy for Release Management to find them and allowing the scripts (and tools) to be versioned.
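For example, given the relative paths used by the scripts later in this post, the drops location for a build might end up looking something like this (all the names are illustrative):

\\server\drops\MyBuild_1.2.3
   Scripts
      Deploy-Dacpac.ps1
      Deploy-Website.ps1
      sqlpackage.exe (and its DLLs)
   MyDatabase.dacpac
   _PublishedWebsites
      MyWeb_Package
         MyWeb.deploy.cmd
         MyWeb.SetParameters.xml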

Deploying a DACPAC

The script I have been using to deploy DACPACs is as follows

# find the script folder
$folder = Split-Path -parent $MyInvocation.MyCommand.Definition
Write-Verbose "Deploying DACPAC $SOURCEFILE using script in '$folder'" -Verbose
& "$folder\sqlpackage.exe" /Action:Publish /SourceFile:"$folder\..\$SOURCEFILE" /TargetServerName:$TARGETSERVERNAME /TargetDatabaseName:$TARGETDATABASENAME | Write-Verbose -Verbose

Note that:

  1. First it finds the folder it is running in; this is the easiest way to locate the other resources the script needs
  2. The only way any logging will end up in the Release Management logs is if it is logged at the verbose level i.e. Write-Verbose "your message" -Verbose
  3. I have used a simple & my.exe to execute the command, but pass the output via the Write-Verbose cmdlet to make sure we see the results. The alternative would be to use Start-Process (see the sketch after this list)
  4. SQLPACKAGE.EXE (and its associated DLLs) is located in the same SCRIPTS folder as the PowerShell script and is under source control. Of course, you could instead make sure any tools you need are already installed on the target machine.
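For completeness, here is a minimal sketch of the Start-Process alternative mentioned in point 3 (the log file location is arbitrary); because Start-Process does not return the tool’s output, you redirect it to a file and replay that into the verbose stream:

# run the tool and wait for it, sending its output to a temporary file
$log = Join-Path $env:TEMP "sqlpackage.log"
Start-Process -FilePath "$folder\sqlpackage.exe" `
    -ArgumentList "/Action:Publish", "/SourceFile:$folder\..\$SOURCEFILE", "/TargetServerName:$TARGETSERVERNAME", "/TargetDatabaseName:$TARGETDATABASENAME" `
    -Wait -NoNewWindow -RedirectStandardOutput $log
# replay the captured output so it reaches the Release Management logs
Get-Content $log | Write-Verbose -Verbose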

I pass the three parameters needed by the script as custom configuration

image

Remember that you don’t have to be on the SQL server to run SQLPACKAGE.EXE; it can deploy to a remote server (that is why, in the screenshot above, the ServerName is the IIS8 web server and not the SQL server as you might expect).
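As the custom configuration values just surface as PowerShell variables, a quick way to test the script by hand is to set those variables in your session first. A hypothetical manual run (file and server names made up) would be:

# set the variables Release Management would normally provide...
$SOURCEFILE = "MyDatabase.dacpac"
$TARGETSERVERNAME = "sqlserver.blackmarble.co.uk"   # a remote SQL box is fine
$TARGETDATABASENAME = "MyDatabase"
# ...then run the script, which picks them up from the session
.\Scripts\Deploy-Dacpac.ps1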

Deploying a MSDeploy Package

The script I use to deploy the WebDeploy package is as follows

function Update-ParametersFile
{
    param
    (
        $paramFilePath,
        $paramsToReplace
    )

    Write-Verbose "Updating parameters file '$paramFilePath'" -Verbose
    $content = Get-Content $paramFilePath
    $paramsToReplace.GetEnumerator() | ForEach-Object {
        Write-Verbose "Replacing value for key '$($_.Key)'" -Verbose
        $content = $content.Replace($_.Key, $_.Value)
    }
    Set-Content -Path $paramFilePath -Value $content
}

# find the script folder
$folder = Split-Path -parent $MyInvocation.MyCommand.Definition
Write-Verbose "Deploying Website '$package' using script in '$folder'" -Verbose

# swap the __TOKEN__ placeholders in the SetParameters.xml file for the real values
Update-ParametersFile -paramFilePath "$folder\..\_PublishedWebsites\$($package)_Package\$package.SetParameters.xml" -paramsToReplace @{
    "__DataContext__" = $datacontext
    "__SiteName__" = $siteName
    "__Domain__" = $Domain
    "__AdminGroups__" = $AdminGroups
}

Write-Verbose "Calling '$package.deploy.cmd'" -Verbose
& "$folder\..\_PublishedWebsites\$($package)_Package\$package.deploy.cmd" /Y | Write-Verbose -Verbose

Note that:

  1. First I declare a function that I use to replace the contents of the package SetParameters.xml file, a key step in using binary promotion with WebDeploy
  2. Again I find the folder the script is running in so I can locate other resources
  3. I then declare the parameters I need to replace and call the replacement function
  4. Finally I call the package.deploy.cmd command, piping the output via Write-Verbose so it appears in the Release Management logs

This is called as follows

image
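If you want to check the token replacement without running a full deployment, the function can be exercised on its own. A hypothetical run (the path and values are made up) might be:

# paste the Update-ParametersFile function into your session, then point it
# at a copy of the SetParameters.xml file from the package
Update-ParametersFile -paramFilePath "C:\scratch\MyWeb.SetParameters.xml" `
    -paramsToReplace @{
        "__SiteName__" = "MyWeb-Test"
        "__Domain__"   = "blackmarble.co.uk"
    }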

Summary

So I think these reusable scripts give a fairly easy way to make use of vNext Release Management pipelines without resorting to DSC. They can also easily be given to clients who just want to run a deployment manually.

Fix for 500 internal errors when trying to trigger a Release Management pipeline from a build via the REST API

With the help of the Release Management team at Microsoft I now have a working REST-based automated TFS Build to Release Management pipeline. Previously we were using a TFS automated build and then manually triggering our agent-based Release Management pipeline. When we moved to a vNext PS/DSC-based RM pipeline I took the chance to automate the link, using REST via a PowerShell script to trigger the initial deployment. However, I hit problems: first a stupid 401 permission error, and later a much stranger 500 internal server error.

Fixing the 401 error

The first problem was that the InitiateReleaseFromBuild.ps1 script defaults to a hardcoded username and password, when you should really be using the current credentials. To do this, make sure the lines around line 60 in the script are as shown below (or enter valid credentials if you don’t want to use default credentials)

$wc = New-Object System.Net.WebClient
$wc.UseDefaultCredentials = $true
# rmuser should be in the RM users list and have permission to trigger the release
#$wc.Credentials = new-object System.Net.NetworkCredential("rmuser", "rmuserpassword", "rmuserdomain")

Fixing the 500 error

The 500 error was stranger; it turned out the issue was the registration of our TFS server in Release Management.

Using the dialogs in the RM client we had registered our TFS server, which generated the URL https://tfs.domain.com:443/tfs. If we ran the InitiateReleaseFromBuild.ps1 script with this URL as a parameter we got the 500 error, and the RM logs showed the workflow could not start. Eventually we realised it was because RM thought it could not access the TFS server. At some point between the script being run and the RM server processing the URL, the :443 had been removed, presumably because this is the default port for HTTPS and some layer was being ‘helpful’. This meant the RM server was trying to string-match the URL https://tfs.domain.com/tfs against https://tfs.domain.com:443/tfs, which failed, so the workflow failed.
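One likely culprit for this sort of ‘helpfulness’ is .NET’s own Uri class, which silently drops a port that is the default for the scheme when it rebuilds a URL; you can see the behaviour from PowerShell:

# System.Uri strips a default port when the URL is round-tripped
$uri = [System.Uri]"https://tfs.domain.com:443/tfs"
$uri.AbsoluteUri    # https://tfs.domain.com/tfs - the :443 has gone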

The fix was to edit the TFS registration in RM to remove the port number, leaving the field empty (not that obvious, as the dialog completes this field for you when you select HTTPS)

image

Once this was done the URL matching worked and the release pipeline triggered as expected.

Failing Ping tests on Application Insights

Whilst setting up Application Insights on one of our web sites I hit a problem: the target site appeared to be working OK, but if I set up a ping test it failed.

Digging into the failure (as with much of Application Insights, you just keep clicking to go deeper), I found the issue was that a CSS file was failing to load.

image

Presumably on this Umbraco site the CSS file is meant to be loaded, but none of its styles are actually used, hence the site renders OK without it.

The fix was to make sure the video.css file was present on the server. So Application Insights found a problem with a production system – just as it is meant to!

So it is important to remember that the ping test is not the simple thing I thought it was; it is actually a full page load, checking that every request returns a 200 OK response.

Windows Media Center issues again

Today was my day for my semi-annual Media Center (MCE) problems. As usual they seemed to start with an unexpected power issue, this time a local power cut; maybe the answer is a UPS for the TV setup? Once the PC was rebooted it had forgotten it had any tuners. If I tried to view live TV or re-run the TV signal setup it just hung with a spinning ‘toilet bowl of death’ cursor. A corrupt TV data DB, I suspect; I have seen it before.

I tried clearing the DB content in C:\programdata\windows\ehome, but no luck. In the end I did the dirty fix of:

  • Going into Windows features
  • Removing Media Center
  • Rebooting
  • Re-adding Media Center
  • Re-running the MCE setup – this took over an hour; it is slow to find Freeview channels

The downside of this fix is that it resets all the series settings, media locations etc., but it does tend to work.

My MCE seems to have been getting slower and generally needing more reboots for a while, which is strange as it has been on the same dedicated hardware for a few years. Given that Windows 10 is on the horizon and has no MCE, I guess it is time to revisit an MCE replacement (or leave my MCE box on Windows 8). Last time I looked, the issues were PVR support for Freeview and general ‘wife friendly’ operation. It does seem that fewer and fewer people are prioritising terrestrial broadcast as a media source; it all seems to be about streaming. I just don’t think I am there yet; I like my PVR. But there is no harm in a trawl of the other current offerings, I might be surprised.

Updated 9pm, when the setup wizard actually finished – it turns out my media library settings were not lost, just the series recording settings.

Strange TFS build process template editing issue with Typemock

I had a strange issue today while editing our standard TFS 2013 XAML build process template to add an optional post-drop script block, allowing a Release Management pipeline to be triggered via REST. Our standard template includes a block for enabling and disabling Typemock; after editing the template to add the new script block (nowhere near the Typemock section) our builds failed with the error

TF215097: An error occurred while initializing a build for build definition \BM\ISS.Expenses.Main.CI: Exception Message: Cannot set unknown member 'TypeMock.TFS2013.TypeMockStart.DisableAutoLink'. (type XamlObjectWriterException) Exception Stack Trace: at System.Xaml.XamlObjectWriter.WriteStartMember(XamlMember property) 

It took ages to find the issue. We hunted for badly formed XAML, but the problem turned out to be that whenever we opened the template in Visual Studio 2013 it added the DisableAutoLink property shown in the first block below.

<If Condition="[UseTypemock = True]" DisplayName="If using Typemock" sap2010:WorkflowViewState.IdRef="If_8">
  <If.Then>
   <Sequence DisplayName="Enabling Typemock" sap2010:WorkflowViewState.IdRef="Sequence_16">
      <tt:TypeMockRegister AutoDeployDir="[TypemockAutoDeployDir]" Company="[TypemockCompany]" sap2010:WorkflowViewState.IdRef="TypeMockRegister_1" License="[TypemockLicense]" />
      <tt:TypeMockStart DisableAutoLink="{x:Null}" EvaluationFolder="{x:Null}" Link="{x:Null}" LogLevel="{x:Null}" LogPath="{x:Null}" ProfilerLaunchedFirst="{x:Null}" Target="{x:Null}" Verbosity="{x:Null}" Version="{x:Null}" AutoDeployDir="[TypemockAutoDeployDir]" sap2010:WorkflowViewState.IdRef="TypeMockStart_1" />
     </Sequence>
  </If.Then>
</If>

It should have been

<If Condition="[UseTypemock = True]" DisplayName="If using Typemock" sap2010:WorkflowViewState.IdRef="If_8">
  <If.Then>
    <Sequence DisplayName="Enabling Typemock" sap2010:WorkflowViewState.IdRef="Sequence_16">
       <tt:TypeMockRegister AutoDeployDir="[TypemockAutoDeployDir]" Company="[TypemockCompany]" sap2010:WorkflowViewState.IdRef="TypeMockRegister_1" License="[TypemockLicense]" />
       <tt:TypeMockStart EvaluationFolder="{x:Null}" Link="{x:Null}" LogLevel="{x:Null}" LogPath="{x:Null}" ProfilerLaunchedFirst="{x:Null}" Target="{x:Null}" Verbosity="{x:Null}" Version="{x:Null}" AutoDeployDir="[TypemockAutoDeployDir]" sap2010:WorkflowViewState.IdRef="TypeMockStart_1" />
    </Sequence>
  </If.Then>
</If>

All I can assume is that this is due to some assembly mismatch between the Typemock DLLs referenced by the XAML build process template and those on my development PC.

The fix for now is to do the editing in a text editor, or at least to check the file to make sure the property has not been added before the template is checked in.
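A simple safety net is to scan the template for the rogue property before check-in. A sketch (the template file name is made up):

# fail noisily if Visual Studio has silently re-added the property
$template = ".\BuildProcess.Custom.xaml"
if (Select-String -Path $template -Pattern 'DisableAutoLink="\{x:Null\}"' -Quiet)
{
    Write-Warning "TypeMockStart has gained a DisableAutoLink property - remove it before check-in"
}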

Generating MsTest wrappers for nUnit tests

Recently, whilst at a client’s, one of our consultants came across an interesting issue: the client was using Selenium to write web tests and wanted to trigger them both from Microsoft Test Manager (MTM) as local automated tests, and from BrowserStack for multi-browser regression testing. The problem was that to import the tests into MTM they needed to be written in MsTest, and for BrowserStack, nUnit.

As they did not want to duplicate each test, what could they do?

After a bit of thought, T4 templates came to the rescue; it was fairly easy to write a proof-of-concept T4 template that generates an MsTest wrapper for each nUnit test at compile time. This is what we did, and the gotchas we discovered.

Prerequisites

Process

[To make life easier this code has all been made available on GitHub]

  1. Create a solution containing a class library with some nUnit tests as test data
  2. Add an MsTest Unit Test project to this solution.
  3. Add a T4 ‘Text Template’ item to the MsTest project

    image
  4. Write the T4 template that uses reflection to find the nUnit tests in the solution and generates the MsTest wrappers. See the source for the template on GitHub
  5. Once this is done, both the nUnit tests and the MsTest wrappers can be run inside Visual Studio

    image
  6. You can now add the tests to either MTM or BrowserStack as needed, each product using the unit tests it can see.

The Gotcha – you have two build engines

The main issues I had were due to me not realising the implications of the T4 template being processed in different ways by Visual Studio and MSBuild.

By default the template is processed whenever the .TT file is edited in Visual Studio. For me this was not the behaviour required; I wanted the template processed every time the nUnit tests were altered. The easiest way to get this is to always regenerate the .CS file from the template on every compile. Oleg again provides great documentation on how to do this; you end up editing the .CSPROJ file.

<!-- Include the T4 processing targets -->
<Import Project="$(VSToolsPath)\TextTemplating\Microsoft.TextTemplating.targets" />
 
<!-- Set parameters we want to access in the transform -->
<ItemGroup>
   <T4ParameterValues Include="slnDir">
     <Value>$(MSBuildProjectDirectory)\..</Value>
     <Visible>false</Visible>
   </T4ParameterValues>
  </ItemGroup>

<ItemGroup>
   <T4ParameterValues Include="configuration">
     <Value>$(Configuration)</Value>
     <Visible>false</Visible>
   </T4ParameterValues>
</ItemGroup>

<ItemGroup>
   <T4ParameterValues Include="projectName">
     <Value>$(MSBuildProjectName)</Value>
     <Visible>false</Visible>
   </T4ParameterValues>
</ItemGroup>
 
<!-- Tell the MSBuild T4 task to make the property available: -->
<PropertyGroup>
   <!-- do the transform -->
   <TransformOnBuild>true</TransformOnBuild>
   <!-- Force a complete reprocess -->
   <TransformOutOfDateOnly>false</TransformOutOfDateOnly>
</PropertyGroup>

I thought that after editing my .CSPROJ file to call the required MSBuild targets and expose the properties I needed from MSBuild, all would be good. However, I quickly found that although building my solution with MSBuild from the command line was fine, a build inside Visual Studio failed. It turns out I had to make my template support both forms of building.

This meant assuming in my .TT file that I was building under MSBuild, and, if I got nulls for the required property values, switching to the Visual Studio way of working, e.g.

    // get the MSBuild variables if we can
    var configName = Host.ResolveParameterValue("-", "-", "configuration");

    if (String.IsNullOrEmpty(configName))
    {
        WriteLine ("// Generated from Visual Studio");

        // get the Visual Studio instance and read the active configuration from it
        IServiceProvider serviceProvider = (IServiceProvider)this.Host;
        DTE dte = serviceProvider.GetService(typeof(DTE)) as DTE;
        configName = dte.Solution.SolutionBuild.ActiveConfiguration.Name;
    }
    else
    {
        WriteLine ("// Generated from MSBuild");
    }

Once this was done, I made sure I could get a successful build both inside Visual Studio and from a command prompt in the folder containing my .SLN file (in my case passing in the Visual Studio version, as I was using a VS2015 RC command prompt but only had the VS2013 SDKs installed) e.g.

msbuild /p:VisualStudioVersion=12.0

So where are we now?

Now I have a nice little proof of concept on GitHub. To use it, add the GeneratedMstests project to your solution, and in that project add references to any of your nUnit projects. Once this is done you should be able to generate wrappers for the nUnit tests.

I am sure I could do a better job of test discovery and of adding references to assemblies, and it would be a good idea to turn the sample code into a Visual Studio template, but it is a start; let’s see if it actually does what is needed.