BM-Bloggers

The blogs of Black Marble staff

Release Manager - New deployment is not allowed as an another deployment is in progress

Whilst working with a vNext Release Management pipeline I started seeing the error:

Microsoft.TeamFoundation.Release.Common.Helpers.OperationFailedException:
New deployment is not allowed as an another deployment is in progress.
Retry the deployment after sometime.

The problem was that I could not see any blocked or paused releases. All my Internet searches pointed at multiple pipelines that share components, but that was not the issue here.

Eventually I found the cause: my release pipeline included a step that ran CodedUI tests via TCM. A previous run of this template had triggered the tests via TCM, but they had stalled. I found this by looking in MTM.

image

Release Management simply reported that the release was rejected with the above error message, with no clue about the unfinished test run. Not that helpful.

You might have expected Release Management to return only after the test run had timed out, but that depends on whether you set the release pipeline to wait for the tests or not; I had set mine not to wait.

Once I stopped this test run via MTM all was OK.

Running nUnit and Jasmine.JS unit tests in TFS/VSO vNext build

This article was first published on Microsoft’s UK Developers site as Running nUnit and Jasmine.JS unit tests in TFS/VSO vNext build

With the advent of vNext build in TFS 2015 and Visual Studio Online, running unit tests that are not MSTest based within your build process is far more straightforward than it used to be. No longer do you have to use custom XAML build activities or tell all your TFS build controllers where the test runner assemblies are. The ‘out of the box’ vNext build Visual Studio Test task will automatically load any test adapters it finds in the path specified for test runners in its advanced properties, a path that can be populated via NuGet.

Running nUnit tests

All this means that to find and run MSTest and nUnit tests as part of your build, all you have to do is the following:

  1. Create a solution that contains a project with MSTest and nUnit tests; in my sample this is an MVC web application project with its automatically created MSTest unit test project.
  2. In the test project add some nUnit tests. Use NuGet to add the references to nUnit to the test project so it compiles (see the Package Manager Console example after this list).
  3. Historically, in your local Visual Studio instance you needed to install the nUnit Test Runner VSIX package from the Visual Studio Gallery – this allowed Visual Studio to discover your nUnit tests, as well as any MSTest ones, and run them via the built-in Test Explorer.

    image

    IMPORTANT Change –
    However, installing this VSIX package is no longer required. If you use NuGet to add the nUnit test runner to the solution, as well as the nUnit package itself, then Visual Studio can find the nUnit tests without the VSIX package. This is useful but not world changing on your development PC; on the build box, though, it means the NuGet restore will make sure the nUnit test adapter assemblies are pulled down onto the local build box’s file system and used to find tests with no extra work.

    Note: If you still want to install the VSIX package on your local Visual Studio instance you can; it is just that you don’t have to.
  4. Check your solution into TFS/VSO source control. It does not matter whether it is TFVC or Git based.
  5. Create a new vNext build using the Visual Studio template.
  6. You can leave most of the parameters at their default settings, but you do need to edit the Visual Studio Test task’s advanced settings to point at the NuGet packages folder for your solution (which will be populated by the NuGet restore) so the custom nUnit test adapter can be found, i.e. usually setting it to $(Build.SourcesDirectory)\packages

    image
  7. The build should run and find your tests: the MSTest ones because support for them is built in, and the nUnit ones because the custom test adapter was found thanks to the NuGet restore being done prior to the build. The test results can be found on the build summary page.

    image
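
As an aside, the nUnit packages mentioned in steps 2 and 3 can be added from the NuGet Package Manager Console inside Visual Studio. A minimal sketch (these were the package ids for nUnit 2.x at the time of writing; check the gallery for current versions):

Install-Package NUnit
Install-Package NUnitTestAdapter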


But what if you want to run Jasmine.JS tests?

If you want to run Jasmine JavaScript unit tests the process is basically the same. The only major difference is that you do still need to install the Chutzpah test runner in your local Visual Studio as a VSIX package to run the tests locally. There is a NuGet package for the Chutzpah test runner, so you can avoid having to manually unpack the VSIX and put it into source control to deploy it to the build host (unless you really want to follow that process), but this package does not currently enable Visual Studio to find the Jasmine tests without the VSIX extension being installed – or at least it didn’t for me.

Using the same solution as before:

  1. Use NuGet to add Jasmine.JS to the test project
  2. Add a test file to the test project e.g. mycode.tests.js (adding any JavaScript references needed to find any script code under test in the main WebApp project)
  3. Install the Chutzpah Test runner in your local Visual Studio as a VSIX extension, restart Visual Studio
  4. You should now be able to see and run the Jasmine tests in Test Explorer, alongside the MSTest and nUnit tests.

    image
  5. Add the NuGet package for the Chutzpah test runner to your solution; this is a solution-level package, so it does not need to be associated with any project (see the Package Manager Console example after this list).
  6. Check the revised code into source control
  7. In your vNext build add another Visual Studio Test task. Set the test assembly filter to match your JavaScript test naming convention, e.g. **\*.tests.js, and the path to the custom test adapter to $(Build.SourcesDirectory)\packages (as before).

    image
  8. Run the revised build.

    image
  9. You should see the two test tasks run and a pair of test results in the summary for the build.
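
Again the Chutzpah package can be added from the Package Manager Console, and you can run the console runner it delivers to sanity-check the tests locally. A sketch (the package id was Chutzpah at the time of writing; the version number and project path below are assumptions, so check your packages folder):

Install-Package Chutzpah
.\packages\Chutzpah.3.2.4\tools\chutzpah.console.exe .\WebApp.Tests\mycode.tests.js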

So hopefully you will now find this a more straightforward way to add testing to your vNext builds, allowing easy use of both your own build boxes and the hosted build service for VSO with testing frameworks they do not support ‘out of the box’.

Powershell to help plot graphs of how long TFS upgrades take

When doing TFS upgrades it is useful to know roughly how long they will take. The upgrade programs report a number of steps, but not all steps are equal: some are quick, some are slow. I have found it useful to graph past upgrades so I can get a feel for how long an upgrade will take given that it reached ‘step x in y minutes’. You can do this by hand, noting down the time as specific steps are reached, but for a long upgrade it usually means pulling the data out of the TFS TPC upgrade logs.

To make this process easier I put together this script to find the step completion rows in the log file and format them so that they are easy to graph in Excel:

param
(
    $logfile = "TPC_ApplyPatch.log",
    $outfile = "out.csv"
)


# A function to convert the start and end times to a number of minutes.
# We can't use a simple timespan as we only have the time portion, not the whole datetime,
# hence the hack of adding a day when the timestamp wraps past midnight
function CalcDuration
{
    param
    (
        $startTime,
        $endTime
    )

    $diff = [datetime]$endTime - $startTime
    if ([datetime]$endTime -lt $startTime)
    {
       $diff += New-TimeSpan -Days 1 # add a day as we passed midnight
    }

    [int]$diff.TotalMinutes
}

Write-Host "Importing $logfile for processing"
# pull out the lines we are interested in using a regular expression to extract the columns
# the (.{8}) groups handle the fixed-width columns; exact matches are used for the literal text
$lines = Get-Content -Path $logfile | Select-String "  Executing step:"  | Where{$_ -match "^(.)(.{8})(.{8})(Executing step:)(.{2})(.*)(')(.*)([(])(.*)([ ])([of])(.*)"} | ForEach{
    [PSCustomObject]@{
        'Step' = $Matches[10]
        'TimeStamp' = $Matches[2]
        'Action' = $Matches[6]
    }
}
 
# We assume the upgrade started at the timestamp of the 0th step
# Not true but very close
[DateTime]$start = $lines[0].TimeStamp

Write-Host "Writing results to $outfile"
# Work out the duration
$steps = $lines | ForEach{
    [PSCustomObject]@{
        'Step' = $_.Step
        'TimeStamp' = $_.TimeStamp
        'ElapsedTime' = CalcDuration -startTime $start -endTime $_.TimeStamp
        'Action' = $_.Action
    }
}
$steps | export-csv $outfile -NoTypeInformation

# and list to screen
$steps
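
For example, assuming the script is saved as Get-TfsUpgradeStepTimes.ps1 (a name of my choosing, nothing official):

.\Get-TfsUpgradeStepTimes.ps1 -logfile "C:\Logs\TPC_ApplyPatch.log" -outfile "upgrade.csv"

The resulting CSV has Step, TimeStamp, ElapsedTime and Action columns, so plotting ElapsedTime against Step in Excel gives the upgrade progress curve.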

Is the Microsoft Band any good for Triathlon? Training Yes, racing No

The title says it all. I have been using a Microsoft Band for a few months now and have found it a great tool for running and cycling, as long as you are going out for less than about 5 hours. I tried to use it in a triathlon race for the first time at the Leeds Triathlon over the weekend.

As it is not waterproof it was not an option for the swim (unlike my old Polar HR monitor), so I put it on in T1 (swim to bike) – I don’t think that wasted too much time! This is where I hit the first issue (or second, if you count that it is not waterproof): my finger was too wet to operate the touch screen. I have seen this issue on runs on rainy days. So I did not manage to switch it to cycle mode, and did not bother to try again whilst cycling after I had dried out – I had other things on my mind, like holding a good aero position and getting moving faster.

I did however manage to switch to run mode as I ran out of T2 (bike to run) and it worked OK there.

So my wish list:

  • Make it water proof, enough for open water swimming
  • Add a way to sequence different activities (swim, bike, run) and have a simple button that works with wet fingers to switch between them – maybe a dev project for myself
  • And of course better battery life

So I still think it is a good product, just not 100% perfect for me as yet.

DPM 2012 R2 UR7 Known Issue

There’s a known issue with Update Rollup 7 for System Center Data Protection Manager 2012 R2 that stops expired recovery points from being removed, eventually leading to DPM consuming all the disk space attached to it. This results in messages such as:

DPM does not have sufficient storage space available...

and

image

These mean that new recovery points are not being created and therefore changes are not being backed up.

The fix, which involves replacing the ‘pruneshadowcopiesDpm2010.ps1’ file with a corrected version, can be downloaded from https://www.microsoft.com/en-in/download/details.aspx?id=48694

The procedure is:

  1. Ensure that you are running DPM 2012 R2 UR7 (version 4.2.1338.0); you can check via the ‘About DPM’ menu item under the ‘Action’ menu.
  2. Download the revised pruneshadowcopiesDpm2010.ps1 file from the URL above.
  3. Copy the original file to another location (just in case!)
  4. Replace the original pruneshadowcopiesDpm2010.ps1 with the one downloaded from the URL above (a PowerShell sketch of steps 3 and 4 follows this list). On one of our servers (that was upgraded from 2012 to 2012 R2), this location was C:\Program Files\Microsoft System Center 2012\DPM\DPM\bin and on a new installation, this location was C:\Program Files\Microsoft System Center 2012 R2\DPM\DPM\bin.
  5. Allow the system to run the PowerShell script at midnight (the default time) and the old recovery points should be removed.
  6. You may need to shrink the disk space allocated to the recovery points if DPM has automatically grown it. To do this, right-click each protection group and click ‘Modify disk allocation’. Against each entry for the protection group, click ‘Shrink’; DPM will calculate the new volume size. Click OK to complete the process.

Note: Repeated small shrink operations cause free space fragmentation, so use with care.
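
A PowerShell sketch of steps 3 and 4 above (the backup folder and the location of the downloaded file are assumptions; use the bin path that matches your installation):

# keep a copy of the original script, just in case
New-Item -ItemType Directory -Path 'C:\DpmScriptBackup' -Force | Out-Null
$dpmBin = 'C:\Program Files\Microsoft System Center 2012 R2\DPM\DPM\bin'
Copy-Item (Join-Path $dpmBin 'pruneshadowcopiesDpm2010.ps1') -Destination 'C:\DpmScriptBackup'
# replace it with the corrected version downloaded from the URL above
Copy-Item '.\pruneshadowcopiesDpm2010.ps1' -Destination $dpmBin -Force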

Additional notes: UR7 was re-released to fix this issue, so if you updated your DPM system after August 25th, you should be okay. The original script looks like this:

image

The modified version looks like this:

image

Azure API Management - Securing a Web API hosted as an Azure Web App using client certificates

Azure API Management acts as a security proxy in front of one or more web services (hosted separately). The intention is that developers request resources via Azure API Management, which forwards the request on to the appropriate web API given the appropriate permissions. It is important that the underlying web service cannot be accessed directly by an end user (thereby bypassing the API Management security). To achieve this we are using a client certificate to validate that the request has come from the API Management site.

clip_image002

This post describes how to:

  1. Create a self signed certificate
  2. Configure certificates in Azure API Management
  3. Configure the Azure Web App to enable client certificates
  4. Add code to validate a certificate has been provided


1) Create a self signed certificate

Run the following example commands to create a self-signed certificate. Tweak the values as required:

makecert.exe -n "CN=Your Issuer Name" -r -sv TempCA.pvk TempCA.cer

makecert.exe -pe -ss My -sr CurrentUser -a sha1 -sky exchange -n "CN=Your subject Name" -eku 1.3.6.1.5.5.7.3.2 -sk SignedByCA -ic TempCA.cer -iv TempCA.pvk
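
API Management needs the client certificate, including its private key, as a password-protected .pfx file for upload. A minimal PowerShell sketch of exporting it (the thumbprint and password are placeholders; on older systems pvk2pfx.exe is an alternative):

$password = ConvertTo-SecureString -String 'P@ssw0rd!' -Force -AsPlainText
Get-Item 'Cert:\CurrentUser\My\<thumbprint>' | Export-PfxCertificate -FilePath .\ClientCert.pfx -Password $password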


2) Configure certificates in Azure API Management

-> Open the Azure API management portal
-> Click APIs –> Choose the appropriate API that you want to secure -> Security -> Manage Certificates
-> Upload the certificate


clip_image002

A policy should automatically have been added that intercepts requests and appends the appropriate certificate information before forwarding the request to the Web API. Check the policies section to confirm it has been added. The following screenshot shows the expected policy definition:

image

3) Configure the Azure Web App to enable client certificates

Given that the Web API is deployed as an Azure Web App, there is no direct access to IIS to enable client certificate security. Instead, the configuration must be done either using the Azure REST API or using the Azure Resource Explorer (preview).

A description of using the REST API is here.

To update it via Resource Explorer, follow these steps:

  • Go to https://resources.azure.com/ and log in as you would to the Azure portal
  • Find the relevant site, either using the search box or by navigating the tree
  • Switch the mode from ‘Read Only’ to ‘Read/Write’
  • Click the Edit button
  • Set "clientCertEnabled": true
  • Click the PUT button at the top
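
Alternatively the same property can be set from PowerShell. A hedged sketch using the AzureRM module (the resource group and site names are placeholders, and the parameter set of Set-AzureRmResource has varied between module versions, so check Get-Help before relying on it):

Login-AzureRmAccount
Set-AzureRmResource -ResourceGroupName 'MyResourceGroup' `
                    -ResourceType 'Microsoft.Web/sites' `
                    -ResourceName 'mywebapp' `
                    -ApiVersion '2015-08-01' `
                    -Properties @{ clientCertEnabled = $true } `
                    -Force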


4) Add some code to the web api to check the client certificate

This can be done in a number of ways; the following code performs these checks:

  • Check time validity of certificate
  • Check subject name of certificate
  • Check issuer name of certificate
  • Check thumbprint of certificate


public class BasicCertificateValidator : IValidateCertificates
{
    public bool IsValid(X509Certificate2 certificate)
    {
        if (certificate == null)
            return false;

        string issuerToMatch = "CN=Your Issuer Name";
        string subjectToMatch = "CN=Your subject Name";
        string certificateThumbprint = "thumbprintToIdentifyYourCertificate";

        // 1. Check time validity of certificate
        // (invalid if we are before its start date or after its end date)
        TimeZoneInfo myTimeZone = TimeZoneInfo.FindSystemTimeZoneById("GMT Standard Time");
        var now = TimeZoneInfo.ConvertTimeFromUtc(DateTime.UtcNow, myTimeZone);
        DateTime notBefore = certificate.NotBefore;
        DateTime notAfter = certificate.NotAfter;
        if (DateTime.Compare(now, notBefore) < 0 || DateTime.Compare(now, notAfter) > 0)
            return false;

        // 2. Check subject name of certificate
        if (!certificate.Subject.Contains(subjectToMatch))
            return false;

        // 3. Check issuer name of certificate
        if (!certificate.Issuer.Contains(issuerToMatch))
            return false;

        // 4. Check thumbprint of certificate
        if (!certificate.Thumbprint.Trim().Equals(certificateThumbprint, StringComparison.InvariantCultureIgnoreCase))
            return false;

        return true;
    }

    public IPrincipal GetPrincipal(X509Certificate2 certificate2)
    {
        return new GenericPrincipal(new GenericIdentity(certificate2.Subject), new[] { "User" });
    }
}

To check the certificate on each request to the Web API, add a custom DelegatingHandler: derive a class from System.Net.Http.DelegatingHandler and override the SendAsync method. To access the certificate information you can query the HttpRequestMessage:

public class CertificateAuthHandler : DelegatingHandler
{
    public IValidateCertificates CertificateValidator { get; set; }

    public CertificateAuthHandler()
    {
        CertificateValidator = new BasicCertificateValidator();
    }

    protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
    {
        X509Certificate2 certificate = request.GetClientCertificate();

        if (certificate == null || !CertificateValidator.IsValid(certificate))
        {
            return Task<HttpResponseMessage>.Factory.StartNew(() => request.CreateResponse(HttpStatusCode.Unauthorized));
        }

        Thread.CurrentPrincipal = CertificateValidator.GetPrincipal(certificate);
        return base.SendAsync(request, cancellationToken);
    }
}

To add the custom message handler to all new requests, add the following code to App_Start/WebApiConfig.cs:

GlobalConfiguration.Configuration.MessageHandlers.Add(new CertificateAuthHandler());


Happy Coding!

Jon

Running Typemock Isolator based tests in TFS vNext build

Updated 22 Mar 2016: This task is now available in the VSTS Marketplace.

Typemock Isolator provides a way to ‘mock the un-mockable’, such as sealed private classes in .NET, so it can be an invaluable tool in unit testing. To allow this mocking, Isolator interception has to be started before any unit tests are run and stopped when they complete. For a developer this is done automatically within the Visual Studio IDE, but on build systems you have to run something to do this as part of your build process. Typemock provide documentation and tools for common build systems such as MSBuild, Jenkins, Team City and TFS XAML builds. However, they don’t provide tools or documentation on getting it working with TFS vNext build, so I had to write my own vNext build task to do the job, wrapping the Tmockrunner.exe provided by Typemock, which handles the starting and stopping of mocking whilst calling any EXE of your choice:

tmockrunner <name of the test tool to run> <and parameters for the test tool>
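
For example, a hypothetical invocation wrapping the standard test runner (the assembly name is a placeholder):

tmockrunner vstest.console.exe MyApp.Tests.dll /logger:trx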

Microsoft provide a vNext build task to run vstest.console.exe. This task generates all the command line parameters needed, depending on the arguments provided for the build task. The source for this can be found on any build VM (in the [build agent folder]\tasks folder after a build has run) or on Microsoft’s vso agent GitHub repo. I decided to use this as my starting point, swapping the logic to generate the tmockrunner.exe command line as opposed to the one for vstest.console.exe. You can find my task on my GitHub. It has been developed in the same manner as the Microsoft-provided tasks, which means the process to build and use the task is:

  1. Clone the repo https://github.com/rfennell/vNextBuild.git
  2. In the root of the repo use gulp to build the task
  3. Use tfx to upload the task to your TFS or VSO instance

See http://realalm.com/2015/07/31/uploading-a-custom-build-vnext-task/ and http://blog.devmatter.com/custom-build-tasks-in-vso/ for good walkthroughs of building tasks; the process is the same for mine and Microsoft’s.
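
As a sketch, the upload with tfx-cli looks something like this (the service URL and task folder are placeholders, and the exact arguments depend on your tfx-cli version):

tfx login --service-url http://yourtfsserver:8080/tfs/DefaultCollection
tfx build tasks upload --task-path ./Task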

IMPORTANT NOTE: This task is only for on-premises TFS vNext build agents connected to either an on-premises TFS or VSO. At the time of writing, Typemock does not support VSO’s hosted build agents, because the registration of Typemock requires admin rights on the build agent, which you only get if you ‘own’ the build agent VM.

Once the task is installed on your TFS/VSO server you can use it in vNext builds. You will note that it takes all the same parameters as the standard VSTest task (it will usually be used as a replacement when there are Typemock Isolator based tests in a solution). The only additions are the three parameters for Typemock licensing and deployment location.

image

Using the task allows tests that require Typemock Isolator to pass. So tests that fail when run with the standard VSTest task:

image

pass when run with the new task:

image