But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Running Test Suites within a network Isolated Lab Management environment when using TFS vNext build and release tooling

Background

As I have posted many times, we make use of TFS Lab Management to provide network isolated dev/test environments. Going forward I see us moving to Azure Dev Labs and/or Azure Stack with ARM templates, but that isn’t going to help me today, especially as I have already made the investment in setting up Lab Management environments and they are ready to use.

One change we are making now is a move from the old TFS Release Management (2013 generation) to the new VSTS and TFS 2015.2 vNext Release tools. This means I need to be able to trigger automated tests on VMs within Lab Management network isolated environments with a command inside my new build/release process. I have posted on how to do this with the older generation Release Management tools; it turns out to be in some ways a little simpler with the newer tooling, with no need to fiddle with shadow accounts et al.

My Setup

image

Constraints

The constraints are these

  • I need to be able to trigger tests on the Client VM in the network isolated lab environment. These tests are all defined in automated test suites within Microsoft Test Manager.
  • The network isolated lab already has a TFS Test Agent deployed on all the VMs in the environment, linked back to the TFS Test Controller on my corporate domain. These agents are automatically installed and managed, and handle the ‘magic’ for the network isolation – we can’t fiddle with these without breaking the Labs.
  • The new build/release tools assume that you will auto deploy a 2015 generation Test Agent via a build task as part of the build/release process. This is a new test agent install, so it removes any already installed Test Agent – we don’t want this, as it breaks the existing agent/network isolation.
  • So my only option is to trigger the tests using TCM (as we did in the past) from some machine in the system. With the old tools this had to be done from within the isolated network environment, due to the limitations put in place by the use of shadow accounts.
  • However, TCM (as shipped with VS 2015) does not ‘understand’ vNext builds, so it can’t seem to find them by definition name/number – we have to find builds by their drop location, and I think this needs to be a UNC share, not a drop back onto the TFS server. So using TCM.EXE (and any wrapper scripts) probably is not going to deliver what I want, i.e. a test run associated with a vNext build and/or release.

My Solution

The solution I adopted was to write a PowerShell script that performs the same function as the TCMEXEC.PS1 script that used to be run within the network isolated Lab Environment by the older Release Management products.

The difference is that the old script shelled out to run TCM.EXE, whereas my new version makes calls to the new TFS REST API (and unfortunately also to the older C# API, as some features, notably those for the Lab Management services, are not exposed via REST). This script can be run from anywhere; I chose to run it on the TFS vNext build agent, as this is easiest and that machine already had Visual Studio installed, so the TFS C# API was available.

You can find this script on my VSTSPowerShell GitHub Repo.

The usage of the script is

TCMReplacement.ps1
      -Collectionuri http://tfsserver.domain.com:8080/tfs/defaultcollection/
      -Teamproject "My Project"
      -testplanname "My test plan"
      -testsuitename "Automated tests"
      -configurationname "Windows 8"
      -buildid 12345
      -environmentName "Lab V.2.0"
      -testsettingsname "Test Setting"
      -testrunname "Smoke Tests"
      -testcontroller "mytestcontroller.domain.com"
      -releaseUri "vstfs:///ReleaseManagement/Release/167"
      -releaseenvironmenturi "vstfs:///ReleaseManagement/Environment/247"

Note

  • The last two parameters are optional; all the others are required. If the last two are not provided the test results will not be associated with a release.
  • There is also a pollinginterval parameter, which defaults to 10 seconds. The script starts a test run then polls on this interval to see if it has completed.
  • If there are any failed tests the script calls Write-Error, so the TFS build process sees this as a failed step.

In some ways I think this script is an improvement over the TCMEXEC script. The old one needed you to know the IDs for many of the settings (loads of poking around in Microsoft Test Manager to find them); I allow the common names of settings to be passed in, which I then use to look up the required values via the APIs (this is where I needed to use the older C# API, as I could not find a way to get the Configuration ID, Environment ID or Test Settings ID via REST).
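
As an illustration of these name-to-ID lookups, finding a test plan and suite by name can be done with REST calls along the lines of the sketch below. This is only a hedged sketch (not the script’s actual source); the variable names mirror the script parameters above, and I assume default Windows credentials against an on-premises TFS.

# A sketch of the kind of name-to-ID lookups the script performs via the REST API
# (illustrative only, not the script's actual code)
$planList = Invoke-RestMethod -Uri "$Collectionuri/$Teamproject/_apis/test/plans?api-version=1.0" -UseDefaultCredentials
$plan = $planList.value | Where-Object { $_.name -eq $testplanname }

$suiteList = Invoke-RestMethod -Uri "$Collectionuri/$Teamproject/_apis/test/plans/$($plan.id)/suites?api-version=1.0" -UseDefaultCredentials
$suite = $suiteList.value | Where-Object { $_.name -eq $testsuitename }

# The Configuration ID, Environment ID and Test Settings ID are not exposed this way,
# which is why the script falls back to the older C# API for those lookups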

There is nothing stopping you running this script from the command line, but I think it is more likely you will make it part of a release pipeline, using the ‘PowerShell on local machine’ task in the build system. When used this way you can get many of the parameters from environment variables, so the command arguments become something like the following (and of course you can make all the string values build variables too if you want):

 

-Collectionuri $(SYSTEM.TEAMFOUNDATIONCOLLECTIONURI)
-Teamproject $(SYSTEM.TEAMPROJECT)
-testplanname "My test plan"
-testsuitename "Automated tests"
-configurationname "Windows 8"
-buildid $(BUILD.BUILDID)
-environmentName "Lab V.2.0"
-testsettingsname "Test Settings"
-testrunname "Smoke Tests"
-testcontroller "mytestcontroller.domain.com"
-releaseUri $(RELEASE.RELEASEURI)
-releaseenvironmenturi $(RELEASE.ENVIRONMENTURI)

 

Obviously this script is potentially a good candidate for a TFS build/release task, but as per my usual practice I will make sure I am happy with its operation before wrapping it up into an extension.

Known Issues

  • If you run the script from the command line, targeting a completed build and release, the tests run and are shown in the release report as well as on the test tab, as we would expect.

    image

    However, if you trigger the test run from within a release pipeline, the tests run OK and you can see the results in the test tab (and MTM), but they are not associated with the release. My guess is that this is because the release has not completed when the data update is made. I am investigating this to try to address the issue.

So hopefully you will find this a useful tool if you are using network isolated environments and the TFS vNext build and release tooling.

Running WebTests as part of a VSTS VNext Release pipeline

Background

Most projects will have a range of tests

  • Unit tests (maybe using a mocking framework) running inside the build process
  • Integration/UX and load tests run as part of a release pipeline
  • and finally manual tests

In a recent project we were using WebTests to provide some integration tests (in addition to integration tests written using unit testing frameworks) as a means to test a REST/ODATA API, injecting data via the API, pausing while a backend Azure WebJob processed the injected data, then checking a second API to make sure the processed data was correctly presented. Basically mimicking user operations.

In past iterations we ran these tests via TFS Lab Management’s tooling, using the Test Agent that is deployed when an environment is created.

The problem is that we are migrating to VSTS/TFS 2015.2 Release Management. This uses the new Functional Testing Task, which in turn uses the newer Test Agent that is deployed on demand as part of the release pipeline (not pre-installed), and this agent does not support running WebTests at present.

This meant my only option was to use MSTest if I wanted to continue using this form of WebTest. However, there is no out of the box MSTest task for VSTS, so I needed to write a script to do the job, which I could deploy as part of my build artifacts.

Now I could write a build/release task to make this nice and easy to use, but that is more work and I suspect I am not going to need this script too often in the future (I might be wrong here, only time will tell). Also I hope that Microsoft will at some point provide an out of the box task to do the job, either by providing an MSTest task or by adding WebTest support to the Functional Testing task.

This actually reflects my usual working practice for build tasks: get the script working locally first, use it as a PowerShell script in the build, and if I see enough reuse make it a task/extension.

So what did I actually need to do?

Preparation

  1. Install Visual Studio on the VM the tests will be run from. I needed to do this because, though MSTest was already present, it fails to run .webtest tests unless a suitable SKU of Visual Studio is installed.
  2. Set the solution configuration so that the project containing the webtests is not built; we only need the .webtest files copied to the drops location. If you build the project the files get duplicated into the bin folder, which we don’t want as we would then need to work out which copy to use.
  3. Make sure the solution contains a .testsettings file that switches on ‘Think Times’, and that this file is copied as a build artifact. This stalled me for ages; I could not work out why tests worked in Visual Studio but failed from the command line. Without this file there is no think time at all, so my background process never had time to run.

    image
  4. Write a script that finds all my .webtest files and runs them with MSTest, and place it in source control such that it is copied to the build’s drop location.
param
(
    # Path to MSTest.exe; a suitable Visual Studio install is needed to run .webtest files
    $tool = "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\MSTest.exe",
    # Root folder to search for tests under
    $path,
    # File pattern to match
    $include = "*.webtest",
    # Path of the .trx results file to create
    $results,
    # Path of the .testsettings file to use
    $testsettings
)

# Find all the .webtest files under the supplied path
$web_tests = Get-ChildItem -Path $path -Recurse -Include $include

# Build a /TestContainer argument for each test found
$testArgs = ""
foreach ($item in $web_tests) {
    $testArgs += "/TestContainer:$item "
}

# Run MSTest against all the discovered test containers
& $tool $testArgs /resultsfile:$results /testsettings:$testsettings
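
For reference, the script can also be run locally for a quick check; a hypothetical invocation (all the paths here are purely illustrative) would look something like this:

# Example local invocation; the folder and file paths are illustrative only
.\RunMSTest.ps1 -path "C:\drops\MyBuild\drop\Src\WebtestsProject" `
                -results "C:\temp\webtests.trx" `
                -testsettings "C:\drops\MyBuild\drop\src\webtest.testsettings"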

 

Build

Once the script and other settings were in place, I altered the build so that the .webtest files (including their associated JSON test data sub folders), the script and the .testsettings file are all copied to the drops location.

 

image

 

Release

In the release pipeline I need to call my script with suitable parameters so it finds the tests, uses the .testsettings file and creates a .TRX results file. I then need to use the ‘Publish Test Results’ task to upload these MSTest format results.

image

So for the PowerShell MSTest task I set the following

  • Script name is $(System.DefaultWorkingDirectory)/MyBuild\drop\Scripts\RunMSTest.ps1 
  • The argument is -path $(System.DefaultWorkingDirectory)\MyBuild\drop\Src\WebtestsProject -results $(System.DefaultWorkingDirectory)\webtests.trx -testsettings $(System.DefaultWorkingDirectory)\MyBuild\drop\src\webtest.testsettings

And for the publish test results task.

  • Format – VSTest
  • Arguments - $(System.DefaultWorkingDirectory)\webtests.trx
  • I also set this task to always run, to make sure I got test results even if some tests failed

Once all this was done and the build/release run, I got the test results I needed.

image

 

I can drill into my detailed test reports as needed

image

So I have a functioning release pipeline that can run all the various types of automated tests within my solution.

New version of my VSTS Generate Release Notes extension - now supports Builds and Releases

I am pleased to announce that I have just made public on the VSTS marketplace a new version of my VSTS Generate Release Notes extension.

This new version now supports both VSTS/TFS vNext Builds and vNext Releases. The previous versions only supported the generation of release notes as part of a build.

Adding support for releases has meant I have had to rethink the internals of how the template is processed, as well as the way templates are passed into the task and where the results are stored:

  • You can now provide a template as a file (usually from source control) as before, but also as an inline property. The latter is really designed for Releases where there is usually no access to source control, only to build artifact drops (though you could put the template in one of these if you wanted)
  • With a build, the obvious place to put the release notes file is in the drops location. For a release there is no such artifact drop location, so I just leave the release notes on the release agent; it is up to the user to get this file copied to a sensible location for their release process.

To find out more, check out the documentation on my GitHub repo and have a look at my sample templates to get you started generating release notes.

Updates to my StyleCop task for VSTS/TFS 2015.2

Tracking the current version of StyleCop is a bit awkward. Last week I got an automated email from CodePlex saying 4.7.52.0 had been released. I thought this was the most up to date version, so I upgraded my StyleCop command line wrapper and my VSTS StyleCop task from 4.7.47.0 to 4.7.52.0.

However, I was wrong about the current version. I had not realised that the StyleCop team had forked the code onto GitHub. GitHub is now the home of the Visual Studio 2015 and C# 6 development of StyleCop, while CodePlex remains the home of the legacy Visual Studio versions. I had only upgraded to a legacy patch version, not the current version.

So I upgraded my StyleCop Command Line tool and my VSTS StyleCop task to wrap 4.7.59.0, thus, I think, bringing me up to date.

How to build a connection string from other parameters within MSDeploy packages to avoid repeating yourself in Release Management variables

Whilst working with the new Release Management features in VSTS/TFS 2015.2, I found I needed to pass configuration variables (i.e. server name, DB name, UID and password) into an Azure Resource Manager (ARM) template release step to create a SQL server, and also a connection string for the same SQL instance into a web site’s web.config, set using an MSDeploy release step with token replacement (as discussed in this post).

Now I could just create RM configuration variables for both the connection string and ARM settings,

image

 

However, this seems wrong for a couple of reasons:

  1. You should not repeat yourself; it is too easy to get the two values out of step
  2. I don’t really want to obfuscate the whole of a connection string in RM, when only a password really needs to be hidden (note the connection string variable is not set as secure in the above screenshot)

What did not work

I first considered nesting the RM variables, e.g. setting the connection string variable to ‘Server=tcp:$(DatabaseServer).database.windows.net,1433;Database=$(DatabaseName)…’, but this does not give the desired result; the $(DatabaseServer) and $(DatabaseName) variables are not expanded at runtime, you just get a string with the variable names in it.

How I got what I was after…

(In this post as a sample I am using the Fabrikam Fiber solution. This means I need to provide a value for the FabrikamFiber-Express connection string)

I wanted to build the connection string from the other variables in the MSDeploy package. So to get the behaviour I want…

  1. In Visual Studio load the Fabrikam web site solution.
  2. In the web project, use the publish option to create a publish profile using the ‘WebDeploy package’ option.
  3. If you publish this package you end up with a setparameters.xml file containing the default connection string
    <setParameter name="FabrikamFiber-Express-Web.config Connection String" value="Your value"/>
    Where ‘your value’ is the value you set in the Publish wizard. So to use this I would need to pass in a whole connection string, when I only want to pass in parts of it.
  4. To add bespoke parameters to an MSDeploy package you add a parameters.xml file to the project in Visual Studio (I wrote a Visual Studio extension that helps add this file, but you can create it by hand). My tool will create the parameters.xml file based on the appSettings block of the project’s web.config. So if you have a web.config containing the following
    <appSettings>
        <add key="Location" value="DEVPC" />
      </appSettings>
    It will create a parameters.xml file as follows
    <?xml version="1.0" encoding="utf-8"?>
    <parameters>
      <parameter defaultValue="__LOCATION__" description="Description for Location" name="Location" tags="">
        <parameterentry kind="XmlFile" match="/configuration/appSettings/add[@key='Location']/@value" scope="\\web.config$" />
      </parameter>
    </parameters>
  5. If we publish at this point we will get a setparameters.xml file containing
    <?xml version="1.0" encoding="utf-8"?>
    <parameters>
      <setParameter name="IIS Web Application Name" value="__Sitename__" />
      <setParameter name="Location" value="__LOCATION__" />
      <setParameter name="FabrikamFiber-Express-Web.config Connection String" value="__FabrikamFiberWebContext__" />
    </parameters>
    This is assuming I used the publish wizard to set the site name to __SiteName__ and the DB connection string to __FabrikamFiberWebContext__
  6. The next step is to add my DB related parameters to the parameters.xml file; this I do by hand, as my tool does not help here
    <?xml version="1.0" encoding="utf-8"?>
    <parameters>
      <parameter defaultValue="__LOCATION__" description="Description for Location" name="Location" tags="">
        <parameterentry kind="XmlFile" match="/configuration/appSettings/add[@key='Location']/@value" scope="\\web.config$" />
      </parameter>

      <parameter name="Database Server" defaultValue="__sqlservername__"></parameter>
      <parameter name="Database Name" defaultValue="__databasename__"></parameter>
      <parameter name="Database User" defaultValue="__SQLUser__"></parameter>
      <parameter name="Database Password" defaultValue="__SQLPassword__"></parameter>
    </parameters>
  7. If I publish again, this time the new variables also appear in the setparameters.xml file
  8. Now I need to suppress the auto generated connection string parameter, and replace it with a parameter that uses the other parameters to generate the connection string. You would think this was just a case of adding more text to the parameters.xml file, but that does not work. If you add the block you would expect (making sure the name matches the auto generated connection string name) as below
    <parameter 
      defaultValue="Server=tcp:{Database Server}.database.windows.net,1433;Database={Database Name};User ID={Database User}@{Database Server};Password={Database Password};Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"
      description="Enter the value for FabrikamFiber-Express connection string"
      name="FabrikamFiber-Express-Web.config Connection String"
      tags="">
      <parameterentry
        kind="XmlFile"
        match="/configuration/connectionStrings/add[@name='FabrikamFiber-Express']/@connectionString"
        scope="\\web.config$" />
    </parameter>

    It does add the entry to setparameters.xml, but this blocks successful operation at deployment. It seems that if a value needs to be generated from other variables there can be no entry for it in the setparameters.xml file. Documentation hints that you can set the tag to ‘Hidden’, but this does not appear to work.

    One option would be to let the setparameters.xml file be generated and then remove the offending line prior to deployment, but this feels wrong and prone to human error.
  9. To get around this you need to add a file named <projectname>.wpp.targets to the same folder as the project (and add it to the project). In this file place the following
    <?xml version="1.0" encoding="utf-8"?>
    <Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <Target Name="DeclareCustomParameters"
              BeforeTargets="Package">
        <ItemGroup>
          <MsDeployDeclareParameters Include="FabrikamFiber-Express">
            <Kind>XmlFile</Kind>
            <Scope>Web.config</Scope>
            <Match>/configuration/connectionStrings/add[@name='FabrikamFiber-Express']/@connectionString</Match>
            <Description>Enter the value for FabrikamFiber-Express connection string</Description>
            <DefaultValue>Server=tcp:{Database Server}.database.windows.net,1433;Database={Database Name};User ID={Database User}@{Database Server};Password={Database Password};Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;</DefaultValue>
            <Tags></Tags>
            <ExcludeFromSetParameter>True</ExcludeFromSetParameter>
          </MsDeployDeclareParameters>
        </ItemGroup>
      </Target>
      <PropertyGroup>
        <AutoParameterizationWebConfigConnectionStrings>false</AutoParameterizationWebConfigConnectionStrings>
      </PropertyGroup>
    </Project>

    The first block declares the parameter I wish to use to build the connection string. Note the ‘ExcludeFromSetParameter’ setting, which keeps this parameter out of the setparameters.xml file. This is the setting you cannot express in parameters.xml.

    The second block stops the auto generation of the connection string. (Thanks to Sayed Ibrahim Hashimi for various posts on getting this working)
  10. Once the edits are made, unload and reload the project, as the <project>.wpp.targets file is cached by Visual Studio when the project is loaded.
  11. Make sure the publish profile is not set to generate a connection string

    image
  12. Now when you publish the project, you should get a setparameters.xml file with only the four SQL variables, the appSettings variables and the site name.
    (Note I have set the values for all of these to the format __NAME__, so that I can use token replacement in my release pipeline.)
    <?xml version="1.0" encoding="utf-8"?>
    <parameters>
      <setParameter name="IIS Web Application Name" value="__Sitename__" />
      <setParameter name="Location" value="__LOCATION__" />
      <setParameter name="Database Server" value="__sqlservername__" />
      <setParameter name="Database Name" value="__databasename__" />
      <setParameter name="Database User" value="__SQLUser__" />
      <setParameter name="Database Password" value="__SQLPassword__" />
    </parameters>
  13. If you deploy the web site, the web.config should have your values from the setparameters.xml file in it
    <appSettings>
       <add key="Location" value="__LOCATION__" />
    </appSettings>
    <connectionStrings>
         <add name="FabrikamFiber-Express" connectionString="Server=tcp:__sqlservername__.database.windows.net,1433;Database=__databasename__;User ID=__SQLUser__@__sqlservername__;Password=__SQLPassword__;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;" providerName="System.Data.SqlClient" />
    </connectionStrings>

You are now in a position to manage the values in the setparameters.xml file however you wish. My choice is to use the ‘Replace Tokens’ build/release task from Colin’s ALM Corner Build & Release Tools extension, as this task correctly handles secure/encrypted RM variables as long as you use the ‘Secret Tokens’ option on its advanced menu.
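
If you did want to roll your own token replacement rather than use Colin’s task, the core idea is simple enough. The following is a minimal sketch only (not how the real task is implemented), assuming __NAME__ style tokens and that the matching values are available as environment variables; the file name is hypothetical.

# Minimal token replacement sketch (illustrative only, not Colin's task):
# swap __NAME__ style tokens in a SetParameters.xml file for matching environment variables
$file = "MyWebSite.SetParameters.xml"   # hypothetical file name
$content = Get-Content -Path $file -Raw
foreach ($match in [regex]::Matches($content, "__(\w+)__")) {
    $value = [Environment]::GetEnvironmentVariable($match.Groups[1].Value)
    if ($value) { $content = $content.Replace($match.Value, $value) }
}
Set-Content -Path $file -Value $content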

image



 

Summary

So yes, it all seems a bit too complex, but it does work, and I think it makes for a cleaner deployment solution that is less prone to human error – which is what any DevOps solution must always strive for.

Depending on the values you put in the <project>.wpp.targets you can parameterise the connection string however you need.

Announcing release of my vNext build tasks as extensions in the VSTS/TFS Marketplace

In the past I have posted about the vNext TFS build tasks I have made available via my GitHub repo. Over the past few weeks I have been making an effort to repackage these as extensions in the new VSTS/TFS Marketplace, thus making them easier to consume in VSTS or via the new extensions support in TFS 2015.2.

This is an ongoing effort, but I am pleased to announce the release of the first set of extensions:

  • Generate Release Notes – generates a markdown release notes file based on work items associated with a build
  • Pester Test Runner – allows Pester based tests to be run in a build
  • StyleCop Runner – allows a StyleCop analysis to be made of files in a build
  • Typemock TMockRunner – uses TMockRunner to wrap MSTest, allowing Typemock tests to be run on a private build agent

To try to avoid people going down the wrong path I intend to go back through my older blog posts on these tasks to update them to point at new resources.

Hope you find these tasks useful. If you find any problems, log an issue on GitHub.

A vNext build task and PowerShell script to generate release notes as part of TFS vNext build.

Updated 22 Mar 2016: This task is now available as an extension in the VSTS marketplace

A common request I get from clients is ‘how can I create a custom set of release notes for a build?’. The standard TFS build report often includes the information required (work items and changesets/commits associated with the build), but not in a format that is easy to redistribute. So I decided to create a set of tools to try to help.

The tools are available on my GitHub account in two forms:

  • A standalone PowerShell script
  • A vNext build task

Both generate a markdown release notes file based on a template passed into the tool. The output report will be something like the following:

Release notes for build SampleSolution.Master

Build Number: 20160229.3
Build started: 29/02/16 15:47:58
Source Branch: refs/heads/master

Associated work items
  • Task 60 [Assigned by: Bill <TYPHOONTFS\Bill>] Design WP8 client
Associated change sets/commits
  • ID bf9be94e61f71f87cb068353f58e860b982a2b4b Added a template
  • ID 8c3f8f9817606e48f37f8e6d25b5a212230d7a86 Start of the project

The Template

The use of a template allows the user to define the layout and fields shown in the release notes document. It is basically a markdown file with tags to denote the fields (the properties on the JSON response objects returned from the VSTS REST API) to be replaced when the tool generates the report file.

The only real change from standard markdown is the use of the @@TAG@@ blocks to denote areas that should be looped over i.e: the points where we get the details of all the work items and commits associated with the build.

#Release notes for build $defname  
**Build Number**  : $($build.buildnumber)   
**Build started** : $("{0:dd/MM/yy HH:mm:ss}" -f [datetime]$build.startTime)    
**Source Branch** : $($build.sourceBranch) 
###Associated work items 
@@WILOOP@@ 
* **$($widetail.fields.'System.WorkItemType') $($widetail.id)** [Assigned by: $($widetail.fields.'System.AssignedTo')] $($widetail.fields.'System.Title') 
@@WILOOP@@ 
###Associated change sets/commits 
@@CSLOOP@@ 
* **ID $($csdetail.changesetid)$($csdetail.commitid)** $($csdetail.comment)   
@@CSLOOP@@  

Note 1: We can return the build’s startTime and/or finishTime. Remember, if you are running the template within an automated build, the build by definition has not finished, so the finishTime property is empty and can’t be parsed. This does not stop the generation of the release notes, but an error is logged in the build logs.

Note 2: We have some special handling in the @@CSLOOP@@ section: we include both the changesetid and the commitid values, but only one of these will contain a value while the other is blank, thus allowing the template to work for both Git and TFVC builds.

Behind the scenes, each line of the template is evaluated as a line of PowerShell in turn, with the in-memory versions of the objects used to provide the runtime values. The available objects to get data from at runtime are:

  • $build – the build details returned by the REST call Get Build Details
  • $workItems – the list of work items associated with the build returned by the REST call Build Work Items
  • $widetail – the details of a given work item inside the loop returned by the REST call Get Work Item
  • $changesets – the list of changesets/commits associated with the build, returned by the REST call Build Changes
  • $csdetail – the details of a given changeset/commit inside the loop, returned by the REST call to Changes or Commit depending on whether it is a Git or TFVC based build

There is a templatedump.md file in the repo that just dumps out all the available fields, to help you find all the available options.
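
To make the evaluation step concrete, the expansion of a single template line can be thought of as something like the sketch below. This is a simplified illustration of the approach, not the actual implementation; $outputfile is an assumed variable holding the report path.

# Simplified sketch of how a template line is expanded (not the actual implementation).
# Treating the line as a double-quoted string means its $() sub-expressions are
# evaluated against the in-memory $build, $widetail and $csdetail objects.
$line = '* **$($widetail.fields.''System.WorkItemType'') $($widetail.id)** $($widetail.fields.''System.Title'')'
$expanded = $ExecutionContext.InvokeCommand.ExpandString($line)
Add-Content -Path $outputfile -Value $expanded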

Differences between the script and the task

The main difference between the PowerShell script and the build task is the way the connection is made to the REST API. Within the build task we pick up the access token from the build agent’s context. For the PowerShell script we need to pass credentials in some form or other, either via parameters or by using the default Windows credentials.

Usage

PowerShell

The script can be used in a number of ways

To generate a report for a specific build on VSTS

 .\Create-ReleaseNotes.ps1 -collectionUrl https://yoursite.visualstudio.com/defaultcollection -teamproject "Scrum Project" -defname "BuildTest" -outputfile "releasenotes.md" -templatefile "template.md" -buildnumber "yourbuildnum" -password yourpersonalaccesstoken

Or for the last successful build just leave out the buildnumber

 .\Create-ReleaseNotes.ps1 -collectionUrl https://yoursite.visualstudio.com/defaultcollection -teamproject "Scrum Project" -defname "BuildTest" -outputfile "releasenotes.md" -templatefile "template.md" -password yourpersonalaccesstoken

Authentication options

  1. VSTS with a personal access token – just provide the token using the password parameter (see the sketch after this list)
  2. If you are using VSTS and want to use alternate credentials just pass a username and password
  3. If you are using the script with an on-premises TFS, just leave off both the username and password and the default Windows credentials will be used.
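
For option 1, the personal access token ends up being sent as a basic authentication header. The general pattern is shown below; this is a sketch of the technique rather than the script’s exact code, and the URI is just an example.

# General pattern for calling the REST API with a VSTS personal access token
# (illustrative only, not the script's exact code)
$pat = "yourpersonalaccesstoken"
$encoded = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))   # blank username, token as password
$headers = @{ Authorization = "Basic $encoded" }
Invoke-RestMethod -Uri "https://yoursite.visualstudio.com/defaultcollection/_apis/projects?api-version=1.0" -Headers $headers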

In all cases the debug output is something like the following


VERBOSE: Getting details of build [BuildTest] from server [https://yoursite.visualstudio.com/defaultcollection/Scrum Project]
VERBOSE: Getting build number [20160228.2]
VERBOSE:    Get details of workitem 504
VERBOSE:    Get details of changeset/commit ba7e613388c06b8440c9e7601a8d6fa29d588051
VERBOSE:    Get details of changeset/commit 52570b2abb80b61a4a629dfd31c0ce071c487709
VERBOSE: Writing output file  for build [BuildTest] [20160228.2].

You should expect to get a report like the example shown at the start of this post.

Build Task

The build task needs to be built and uploaded as per the standard process detailed on my vNext Build’s Wiki (I am considering creating a build extensions package to make this easier, so keep an eye on this blog).

Once the tool is uploaded to your TFS or VSTS server it can be added to a build process.

image

The task takes two parameters

  • The output file name, which defaults to $(Build.ArtifactStagingDirectory)\releasenotes.md
  • The template file name, which should point to a file in source control.

There is no need to pass credentials; this is done automatically.

When run, you should expect to see build logs as below and a release notes file in your drops location.

image

Summary

So I hope some people find these tools useful in generating release notes, let me know if they help and how they could be improved.

Running Pester PowerShell tests in the VSTS hosted build service

Updated 22 Mar 2016: This task is available in the VSTS Marketplace

If you are using Pester to unit test your PowerShell code then there is a good chance you will want to include it in your automated build process. To do this, you need to get Pester installed on your build machine. The usual options would be

If you own the build agent VM then any of these options are good; you can even write the NuGet restore into your build process itself. However there is a problem: the first two options need administrative access, as they put the Pester module in the $PSModules folder (under ‘Program Files’), so they can’t be used on VSTS’s hosted build system, where you are not an administrator.

So this means you are left with copying the module (and associated functions folder) to some local working folder and running it manually; but do you really want to have to store the Pester module in your source repo?

My solution was to write a vNext build task to deploy the Pester files and run the Pester tests.

image

The task takes two parameters

  • The root folder to look in for test scripts with the naming convention *.tests.ps1. Defaults to $(Build.SourcesDirectory)\*
  • The results file name, which defaults to $(Build.SourcesDirectory)\Test-Pester.XML
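
At its heart the task ends up doing something like the sketch below. This is a hedged illustration of the approach only (the parameter names and module path are assumptions); the real source is in the vNextBuild repo.

# A sketch of the core of the task (illustrative only; see the vNextBuild repo for the real source).
# Assumes a Pester version that supports -OutputFile/-OutputFormat.
param(
    $scriptFolder = $env:BUILD_SOURCESDIRECTORY,
    $resultsFile  = "$env:BUILD_SOURCESDIRECTORY\Test-Pester.XML"
)

# Import the copy of Pester deployed alongside the task, then run the tests,
# writing the results in nUnit format so the standard publish task can read them
Import-Module "$PSScriptRoot\Pester\Pester.psm1"
$result = Invoke-Pester -Script $scriptFolder -OutputFile $resultsFile -OutputFormat NUnitXml -PassThru

# Fail the build step if any test failed
if ($result.FailedCount -gt 0) {
    Write-Error "$($result.FailedCount) Pester test(s) failed"
}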

The Pester task does not itself upload the test results; it just throws an error if any tests fail. It relies on the standard test results publish task. Add this task and set:

  • it to look for nUnit format files
  • the results file name pattern – it already defaults to the correct one
  • IMPORTANT: as the Pester task will stop the build on an error, you need to set the ‘Always run’ option to make sure the results are published.

image

Once all this is added to your build you can see your Pester test results in the build summary

image

image

You can find the task in my vNextBuild repo

A vNext build task to get artifacts from a different TFS server

With the advent of TFS 2015.2 RC (and the associated VSTS release) we have seen the short term removal of the ‘External TFS Build’ option for the Release Management artifacts source. This causes me a bit of a problem, as I wanted to try out the new on-premises vNext based Release Management features on 2015.2, but I don’t want to put the RC on my production server (though there is go-live support). Also, the ability to get artifacts from an on-premises TFS instance when using VSTS opens up a number of scenarios, something I know some of my clients have been investigating.

To get around this blocker I have written a vNext build task that gets a build artifact from the UNC drop, supporting both XAML and vNext builds, thus replacing the built-in artifact linking features.

Usage

To use the new task

  • Get the task from my vNextBuild repo (build using the instructions on the repo’s wiki) and install it on your TFS 2015.2 instance (also use the notes on the repo’s wiki).
  • In your build, disable the auto getting of the artifacts for the environment (though in some scenarios you might choose to use both the built in linking and my custom task)

image

  • Add the new task to your environment’s release process, the parameters are
    • TFS Uri – the Uri of the TFS server inc. The TPC name
    • Team Project – the project containing the source build
    • Build Definition name – name of the build (can be XAML or vNext)
    • Artifact name – the name of the build artifact (seems to be ‘drop’ if a XAML build)
    • Build Number – default is to get the latest successful completed build, but you can pass a specific build number
    • Username/Password – if you don’t want to use default credentials (the user the build agent is running as), these are the ones used. These are passed as ‘basic auth’ so can be used against an on prem TFS (if basic auth is enabled in IIS)  or VSTS (with alternate credentials enabled).

image

 

When the task runs it should drop the artifacts in the same location as the standard mechanism, so they can be picked up by any other tasks in the release pipeline using a path similar to $(System.DefaultWorkingDirectory)\SABS.Master.CI\drop.
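
Under the covers the approach is essentially ‘ask the build REST API where the drop is, then copy it’. For a vNext build with a UNC file share drop, a hedged sketch of that flow is shown below; the $tfsUri, $teamProject, $buildDefinition, $artifactName and $headers variables are assumptions for illustration, and this is not the task’s actual source.

# Hedged sketch: find the latest successful build for a definition and copy its UNC drop
$baseUri = "$tfsUri/$teamProject/_apis/build"

# Find the build definition by name, then its most recent successful build
$definition = (Invoke-RestMethod -Uri "$baseUri/definitions?name=$buildDefinition&api-version=2.0" -Headers $headers).value[0]
$build = (Invoke-RestMethod -Uri "$baseUri/builds?definitions=$($definition.id)&resultFilter=succeeded&`$top=1&api-version=2.0" -Headers $headers).value[0]

# Get the named artifact; for a file share drop the UNC path is in resource.data
$artifact = (Invoke-RestMethod -Uri "$baseUri/builds/$($build.id)/artifacts?api-version=2.0" -Headers $headers).value |
    Where-Object { $_.name -eq $artifactName }
Copy-Item -Path $artifact.resource.data -Destination "$env:SYSTEM_DEFAULTWORKINGDIRECTORY\$buildDefinition\$artifactName" -Recurse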

Limitations

The task in its current form does not provide any linking of artifacts to the build reports, or allow the selection of build versions when the release is created. This removes some audit trail features.

However, it does provide a means to get a pair of TFS servers working together, so it can certainly enable some R&D scenarios while we await the 2015.2 RTM and/or the ‘official’ linking of external TFS builds as artifacts.

vNext Build editor filePath control always returns a path even if you did not set a value

You can use the filePath type in a vNext VSTS/TFS task as shown below 

{
    "name": "settingsFile",
    "type": "filePath",
    "label": "Settings File",
    "defaultValue": "",
    "required": false,
    "helpMarkDown": "Path to single settings files to use (as opposed to files in project folders)",
    "groupName": "advanced"
}

to present a file picker dialog in the build editor that allows the person editing the build to pick a file or folder in the build’s source repository.

image

While doing some task development recently I found that this control did not behave as I had expected

  • If a value is explicitly set, then the full local path to the selected file or folder (on the build agent) is returned, e.g. c:\agent\_work\3\s\yourfolder\yourfile.txt – just as expected.
  • If you do not set a value, or set a value and then remove it when you edit the build, you don’t get an empty string as I had expected; you get the path to the BUILD_SOURCESDIRECTORY, e.g. c:\agent\_work\3\s – which makes sense when you think about it.

So if, as in my case, you want specific behaviour only when this value is set to something other than the repo root, you need to add some guard code:


# Treat the repo root (the value returned when nothing was set) as 'not set'
if ($settingsFile -eq $Env:BUILD_SOURCESDIRECTORY)
{
    $settingsFile = ""
}

Once I did this my task behaved as I needed, only running the code when the user had set an explicit value for the settings file.