BM-Bloggers

The blogs of Black Marble staff

Running WebTests as part of a VSTS VNext Release pipeline

Background

Most projects will have a range of tests

  • Unit tests (maybe using a mocking framework) running inside the build process
  • Integration/UX and load tests run as part of a release pipeline
  • and finally manual tests

In a recent project we were using WebTests to provide some integration tests (in addition to integration tests written using unit testing frameworks) as a means to test a REST/ODATA API, injecting data via the API, pausing while a backend Azure WebJob processed the injected data, then checking a second API to make sure the processed data was correctly presented. Basically mimicking user operations.

In past iterations we ran these tests via TFS Lab Management’s tooling, using the Test Agent that is deployed when an environment is created.

The problem is that we are migrating to VSTS/TFS 2015.2 Release Management. This uses the new Functional Testing Task, which uses the newer Test Agent that is deployed on demand as part of the release pipeline (not pre-installed), and this agent does not support running WebTests at present.

This meant my only option was to use MSTest if I wanted to continue using this form of WebTest. However, there is no out-of-the-box MSTest task for VSTS, so I needed to write a script to do the job that I could deploy as part of my build artifacts.

Now I could write a build/release task to make this nice and easy to use, but that is more work and I suspect I am not going to need this script too often in the future (I might be wrong here, only time will tell). I also hope that Microsoft will at some point provide an out-of-the-box task to do the job, either by providing an MSTest task or by adding WebTest support to the Functional Testing task.

This actually reflects my usual working practice for build tasks: get the script working locally first, use it as a PowerShell script in the build, and if I see enough reuse make it a task/extension.

So what did I actually need to do?

Preparation

  1. Install Visual Studio on the VM the tests will be run from. I needed to do this because, though MSTest was already present, it fails to run .webtest tests unless a suitable SKU of Visual Studio is installed.
  2. Set the solution configuration so that the projects containing the WebTests are not built; we only need the .webtest files copied to the drops location. If you build the projects, the files get duplicated into the bin folder, which we don’t want as we would then need to work out which copy to use.
  3. Make sure the solution contains a .testsettings file that switches on ‘Think Times’, and that this file is copied as a build artifact. This stalled me for ages; I could not work out why tests worked in Visual Studio but failed from the command line. Without this file there is no think time at all, so my background process never had time to run.

    image
  4. Write a script that finds all my .webtest files, and place the script in source control so that it is copied to the build’s drop location.
param
(
    $tool = "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\MSTest.exe",
    $path,
    $include = "*.webtest",
    $results,
    $testsettings
)

# Find all the .webtest files under the supplied path
$web_tests = Get-ChildItem -Path $path -Recurse -Include $include

# Build one /TestContainer argument per .webtest file found
$testContainers = @()
foreach ($item in $web_tests) {
    $testContainers += "/TestContainer:$item"
}

# Run MSTest against all the containers, writing a .TRX results file
& $tool $testContainers "/resultsfile:$results" "/testsettings:$testsettings"
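
For reference, running the script locally might look something like this (the example paths are purely illustrative):

.\RunMSTest.ps1 -path "C:\drops\MyBuild\src\WebtestsProject" `
                -results "C:\temp\webtests.trx" `
                -testsettings "C:\drops\MyBuild\src\webtest.testsettings"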

 

Build

Once the script and other settings were in place, I altered the build so that the .webtest files (including their associated JSON test data sub-folders), the script and the .testsettings file are all copied to the drops location.

 

image

 

Release

In the release pipeline I need to call my script with suitable parameters so it finds the tests, uses the .testsettings file and creates a .TRX results file. I then need to use the ‘Publish Test Results’ task to upload these MSTest-format results.

image

So for the PowerShell MSTest task I set the following

  • Script name is $(System.DefaultWorkingDirectory)/MyBuild\drop\Scripts\RunMSTest.ps1 
  • The argument is -path $(System.DefaultWorkingDirectory)\MyBuild\drop\Src\WebtestsProject -results $(System.DefaultWorkingDirectory)\webtests.trx -testsettings $(System.DefaultWorkingDirectory)\MyBuild\drop\src\webtest.testsettings

And for the Publish Test Results task:

  • Format – VSTest
  • Arguments - $(System.DefaultWorkingDirectory)\webtests.trx
  • I also set this task to always run, to make sure I got test results even if some tests failed

Once all this was done and the build/release had run, I got the test results I needed.

image

 

I can drill into my detailed test reports as needed

image

So I have a functioning release pipeline that can run all the various types of automated tests within my solution.

Building bridges - getting DevOps working through Devs and IT Pros talking and learning from each other

I was lucky enough to attend, and be on a panel at, yesterday’s WinOps London conference; it gave me a different and very interesting view on DevOps. I spend most of my time consulting with test and development teams, and with these teams it is very rare to come across one not using source control; they commonly have some form of automated build too. This means any DevOps discussion usually comes from the side of ‘how can I extend my build into deployment…’.

At the conference yesterday, where there seemed to be more IT Pro attendees than developers, this ‘post build’ view was not the norm. Much of the conference content was focused on the provisioning and configuration of infrastructure, getting the environment ‘ready for deployment of a build’. What surprised me most was how repeatedly speakers stressed the importance of using source control to manage scripts and hence control the version of the environments being provisioned.

So what does this tell us?

The obvious fact to me is that the bifurcation of our industry between Devs and IT Pros means there is huge scope for swapping each group’s best practices. What seems ingrained best practice for one role is new and interesting for the other. We can all learn from each other – assuming we communicate.

This goes to the core of DevOps, that it is not a tool but a process based around collaboration.

If you want to find out more about how we see DevOps at Black Marble we are running events and are out and about at user groups. Keep an eye on the Black Marble events site or drop me an email.

Migrating work items to VSTS with custom fields using TFS Integration Platform

If you wish to migrate work items from TFS to VSTS your options are limited. You can of course just pull over work items, without history, using Excel. If you have no work item customisation then OpsHub is an option, but if you have work item customisation then you are going to have to use TFS Integration Platform. And we all know what a lovely experience that is!

Note: TFS Integration Platform will cease to be supported by Microsoft at the end of May 2016. This does not mean the tool is going away, just that there will be no support via forums.

In this post I will show how you can use TFS Integration Platform to move custom fields over to VSTS, including the original TFS work item ID, thus enabling migrations with history as detailed in my MSDN article.

TFS Integration Platform Setup

Reference Assemblies

TFS Integration Platform, being a somewhat old tool designed for TFS 2010, does not directly support TFS 2015 or VSTS. You have to select the Dev11 connection option (TFS 2012, by its internal code name). However, this will still cause problems as it fails to find all the assemblies it expects.

The solution to this problem is provided in this post, the key being to add dummy registry entries:

  1. Install either Visual Studio or Team Explorer.
  2. Add the following registry key after you have installed Team Explorer (or equivalent); a PowerShell alternative is sketched after the .reg listing below.
    Windows Registry Editor Version 5.00

    [HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\VisualStudio\11.0\InstalledProducts\Team System Tools for Developers]
    @="#101"
    "LogoID"="#100"
    "Package"="{97d9322b-672f-42ab-b3cb-ca27aaedf09d}"
    "ProductDetails"="#102"
    "UseVsProductID"=dword:00000001

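    If you prefer to script this change, a rough PowerShell equivalent of the .reg file above (run from an elevated prompt) is:

    # Creates the dummy key and values; mirrors the .reg entries above
    $key = 'HKLM:\SOFTWARE\Wow6432Node\Microsoft\VisualStudio\11.0\InstalledProducts\Team System Tools for Developers'
    New-Item -Path $key -Value '#101' -Force | Out-Null
    New-ItemProperty -Path $key -Name 'LogoID' -Value '#100' -PropertyType String -Force | Out-Null
    New-ItemProperty -Path $key -Name 'Package' -Value '{97d9322b-672f-42ab-b3cb-ca27aaedf09d}' -PropertyType String -Force | Out-Null
    New-ItemProperty -Path $key -Name 'ProductDetails' -Value '#102' -PropertyType String -Force | Out-Null
    New-ItemProperty -Path $key -Name 'UseVsProductID' -Value 1 -PropertyType DWord -Force | Out-Null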

MSI

Once this is done the TFS Integration Tools installation should work.

Accept the default options; you will need to select a SQL server for the tool to use as a database to store its progress. The installer will create a DB called tfs_integrationplatform on the SQL instance.

Creating a Mappings File

TFS Integration platform needs a mapping file to work out which fields go where.

  1. We assume there is a local TFS server with the source to migrate from and a VSTS instance containing a team project using a reasonably compatible uncustomised process template
  2. Download the TFS Process Mapper and run it.
  3. You need to load the current work item configuration into the process mapper; the tool provides buttons to do this from XML files (exported with WITADMIN; see the example command after this list) or directly from the TFS/VSTS server.
  4. You should see a list of fields in both the source and target server definitions of the given work item type.
  5. Use the automap button to match the fields
  6. Any unmatched fields will be left in the left-hand column.

    image
  7. Some fields you may be able to match manually, e.g. handling name changes from ‘Area ID’ to ‘AreaID’.
  8. If you have local custom fields you can add matching fields on the VSTS instance; this is done using the process described on MSDN.
  9. Once you have added your custom fields I have found it best to clear the mapping tool and re-import the VSTS work item definitions. The new fields appear in the list and can be mapped manually to their old equivalents.
  10. I now exported my mappings file.
  11. The process described above is the same as manually editing the mapping file to add entries in the form:
    <MappedField MapFromSide="Left" LeftName="BM.Custom1" RightName="BMCustom1" />

    There is a good chance one of the fields you want is the old TFS server’s work item ID. If you add a mapping as above for System.Id you would expect it to work. However, it does not; the field is empty on the target system. I don’t think this is a bug, just an unexpected behaviour in the way the unique work item IDs are handled by the tool. As a workaround I found I had to use an aggregated field to force the System.Id to be transferred. In my process customisation on VSTS I created an integer OldID custom field. I then added the following to my mapping; it is important to note that I don’t use a MappedField line in the MappedFields block, I used an AggregatedField instead.
    <MappedFields>
        <!-- all the auto-generated mapping entries.
             This is where you would expect a line like the one below
             <MappedField MapFromSide="Left" LeftName="System.Id" RightName="OldID" /> -->
    </MappedFields>
    <AggregatedFields>
        <FieldsAggregationGroup MapFromSide="Left" TargetFieldName="OldID" Format="{0}">
            <SourceField Index="0" SourceFieldName="System.Id" valueMap=""/>
        </FieldsAggregationGroup>
    </AggregatedFields>
  12. I could now use my edited mappings file
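
For reference, exporting a work item type definition to XML with WITADMIN (as mentioned in step 3) looks roughly like this; the collection URL, project and work item type names are illustrative values:

witadmin exportwitd /collection:http://mytfs:8080/tfs/DefaultCollection /p:MyProject /n:Bug /f:Bug.xml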

Running TFS Integration Platform

I could now run the TFS Integration tools using the mappings file

  1. Load TFS Integration Platform
  2. Create a new configuration
  3. Select the option for work items with explicit mappings
  4. Select your source TFS server
  5. Select your target VSTS server
  6. Select the work item query that returns the items we wish to move
  7. Edit the mapping XML and paste in the edited block from the previous section. Note that if you are moving multiple work item types then you will be combining a number of these mapping sections.
  8. Save the mapping file; you are now ready to use it in TFS Integration Platform.

 

And hopefully the work item migration will progress as you hope. It might take some trial and error, but you should get there in the end.

But really……

This all said, I would still recommend just bringing over the active work item backlog and current source when moving to VSTS. It is easier, faster and gives you a chance to sort out structures without bringing in all your poor choices of the past.

New version of my VSTS Generate Release Notes extension - now supports Builds and Release

I am pleased to announce that I have just made public on the VSTS marketplace a new version of my VSTS Generate Release Notes extension.

This new version now supports both VSTS/TFS vNext Builds and vNext Releases. The previous versions only supported the generation of release notes as part of a build.

Adding support for releases has meant I have had to rethink the internals of how the template is processed, as well as the way templates are passed into the task and where results are stored:

  • You can now provide a template as a file (usually from source control) as before, but also as an inline property. The latter is really designed for Releases where there is usually no access to source control, only to build artifact drops (though you could put the template in one of these if you wanted)
  • With a build the obvious place to put the release notes file is in the drops location. For a release there is no such artifact drop location, so I just leave the release notes on the release agent; it is up to the user to get this file copied to a sensible location for their release process.

To find out more, check out the documentation on my GitHub repo and have a look at my sample templates to get you started generating release notes.

WSUS Non-Functional After KB3159706 Installed

Consider the following scenario:

  • You have WSUS installed on either Windows Server 2012 or 2012 R2
  • You install KB3159706

In this situation, WSUS fails to start correctly and thus fails to function.

There are additional steps that are required to configure this update once it is installed. The steps can be found in KB3159706.

Note: If the SUSDB is using database mirroring or is part of an AlwaysOn Availability Group, this must be undone before performing the actions described in KB3159706, as a schema update is required for the database.
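
As a rough sketch of the kind of post-install steps the KB describes (always check KB3159706 itself for the authoritative list), on the WSUS server you would run something along these lines from an elevated PowerShell prompt:

# Run the WSUS post-install/servicing step described in KB3159706
& "C:\Program Files\Update Services\Tools\wsusutil.exe" postinstall /servicing

# The KB also calls for the HTTP Activation feature and a restart of the WSUS service
Install-WindowsFeature -Name NET-WCF-HTTP-Activation45
Restart-Service -Name WsusService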

Putting a release process around my VSTS extension development

Updated 5th Aug 2016: added notes on PublisherID.


 

I have been developing a few VSTS/TFS build related extensions and have published a few in the VSTS marketplace. This has all been a somewhat manual process, a mixture of Gulp and PowerShell has helped a bit, but I decided it was time to try to do a more formal approach. To do this I have used Jesse Houwing’s VSTS Extension Tasks.

Even with this set of tasks I am not sure what I have is ‘best practice’, but it does work. The doubt is due to the way the marketplace handles revisions and preview flags. What I have works for me, but ‘your mileage may differ’

My Workflow

The core of my workflow is that I build the VSIX package twice: once as a private package and once as a public one. They both contain the same code and have the same version number; they differ only in their visibility flags.

I am not using the preview flag options at all; I have found they do not really help me. My workflow is to build the private package, upload it and test it by sharing it with a test VSTS instance. If all is good, I publish the matching public package on the marketplace. In this model there is no need to use a preview flag; it just adds complexity I don’t need.

This may not be true for everyone.

Build

The build’s job is to take the code, set the version number and package it into multiple VSIX packages.

  1. First I have the vNext build get my source from my GitHub repo.
  2. I add two build variables $(Major) and $(Minor) that I use to manually manage my version number
  3. I set my build number format to $(Major).$(Minor).$(rev:r), so the final part of the number is incremented on each build until I choose to increment the major or minor version.
  4. I then use one of Jesse’s tasks to package the extension multiple times using the extension tag model parameter. Each different package step uses different Visibility settings (circled in red). I also set the version, using the override options, to the $(Build.BuildNumber) (circled in green)

    image
  5. [Updated Aug 2016] Set the PublisherID and ExtensionID on the tasks; using a pair of build variables is a good idea here to avoid entering the strings twice. It is important that the PublisherID is entered with the correct case - it is case sensitive within the marketplace. Strange things happen if the PublisherID in a VSIX package differs from the one registered on the marketplace.
  6. As I am using the VSTS hosted build agent I also need to make sure I check the ‘Install Tfx-cli’ option in the global settings section (a roughly equivalent manual tfx command is shown after this list).
  7. I then add a second, identical packaging task, but this time there is no tag set and the visibility is set to public.
  8. Finally I use a ‘publish build artifacts’ task to copy the VSIX packages to a drop location
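
Jesse’s packaging tasks drive the tfx-cli tool under the hood. For reference, a roughly equivalent manual packaging command (the manifest name and version here are illustrative; check the tfx-cli help for the exact flags) would be:

# Package the extension, overriding the version and visibility in the manifest
tfx extension create --manifest-globs vss-extension.json --override '{"version": "1.2.3", "public": false}'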

Release

So now I have multiple VSIX packages I can use the same family of tasks to create a release pipeline.

I create a new release linked as a Continuous Deployment of the previously created build and set its release name format to Release-$(Build.BuildNumber).

My first environment uses three tasks, all using the option to work from a VSIX package.

Note: In all cases I am using the VSIX path in the format $(System.DefaultWorkingDirectory)/GenerateReleaseNotes.Master/vsix/<package name>-<tag>-$(Build.BuildNumber).vsix. I am including the build number variable in the path because I chose to put all the packages in a single folder, so path wildcards are not an option as the task would not know which package to use, unless I alter my build to put one VSIX package per folder.

My tasks for the first environment are

  1. Publish VSTS Extension – using my private package so it is added as a private package to the marketplace
  2. Share VSTS Extension – to my test VSTS account
  3. Install VSTS Extension – to my test VSTS account

For details on the usage of these tasks and setting up the link to the VSTS Marketplace, see Jesse’s wiki.

If I only intend an extension to ever be private this is enough. However, I want to make mine public, so I add a second environment that has a manual pre-approval (so I have to confirm the public release).

This environment only needs a single task:

  1. Publish VSTS Extension – using my public package so it is added as a public package to the marketplace

I can of course add other tasks to this environment, maybe sending a Tweet or email to publicise the new version’s release.

Summary

So now I have a formal way to release my extensions. The dual packaging model means I can publish two different versions at the same time, one private and the other public.

image

It is now just a case of moving all my extensions over to the new model.

Though I am still interested to hear what other people’s views are. Does this seem a reasonable process flow?

Upgrading BlogEngine to 3.3

I have just completed the upgrade of our blog server to BlogEngine 3.3. This upgrade is a bit more complex than usual, as between 3.2 and 3.3 there is a change to Razor views for all the widgets. This means you need to remove all the old widgets you have and re-add them using the new Razor equivalents.

As our blog is backed by SQL, this meant a SQL script to clear down the old widgets, then a manual add of the new versions on each blog we have on our server. One point to note: if using SQL you do need to get BlogEngine 3.3 from its GitHub repo (at the time of writing, I am sure this will change), as after the formal 3.3 release on CodePlex there is a fix for an issue that stopped the editing of widget properties.

So first experiences with 3.3?

It seems much more responsive, so all is looking good.

Azure Web Apps-Deploying Java Servlets to Azure App Service Web Apps

If you are considering moving to hosting your websites in Azure but either have a lot of legacy applications written in Java or your organisation is Java focused, then Azure App Service provides the option to host Java code (Java Servlets, JSPs etc.) in the same way that it can host .NET code (ASP.NET Web API, Forms, MVC etc.).

To test this, I took a pre-built WAR file containing a single Java Servlet and looked at how much effort was required to host it in an Azure Web App.

The approach to hosting Java is as follows:

1. Create the Web App.

2. Go into the Web App, enable the Java runtime and select your application server (Tomcat and Jetty are available).

image image

3. Upload your WAR file to the Web App. I chose FTP, but there are a number of options for publishing (a minimal FTP upload sketch is shown after the screenshot below). To reiterate, the process of publishing a Java Web App is exactly the same as if you were publishing a .NET Web App (except that you don’t have the option of using Visual Studio to publish). Note: put your WAR file in the “site\wwwroot\webapps” folder. This isn’t immediately obvious, and it can be one of two places depending on how the web app was provisioned. See this article for more information.

image
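
As an illustration of the FTP upload in step 3, a minimal PowerShell sketch might look like the following; the FTP host, user name and password are placeholders that would come from the Web App’s publish profile:

# Hypothetical values - take the real FTP host and deployment credentials from the Web App's publish profile
$ftpUrl = "ftp://waws-prod-xx-000.ftp.azurewebsites.windows.net/site/wwwroot/webapps/MyServlet.war"
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("mywebapp\deployuser", "<password>")
$client.UploadFile($ftpUrl, "C:\build\MyServlet.war")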

4. Confirm it is running.

That’s all there is to it.

Granted, this is a simple scenario, but Azure Web Apps have the capability to reach back into your on-premises network using things like site-to-site VPN, ExpressRoute or Hybrid Connections, giving you access to resources like databases, line-of-business systems etc. on your network.