But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Stray white space in a ‘path to custom test adaptors’ will cause tests to fail on VSO vNext build

If you are providing a path to a custom test adaptor, such as the nUnit or Chutzpah adaptors, for a TFS/VSO vNext build (e.g. $(Build.SourcesDirectory)\packages), make sure there is no leading whitespace in the data entry form.


If you do have a space you will see an error log like this; the adaptor cannot be found because the generated command line is malformed:

2015-07-13T16:11:32.8986514Z Executing the powershell script: C:\LR\MMS\Services\Mms\TaskAgentProvisioner\Tools\tasks\VSTest\1.0.16\VSTest.ps1
2015-07-13T16:11:33.0727047Z ##[debug]Calling Invoke-VSTest for all test assemblies
2015-07-13T16:11:33.0756512Z Working folder: C:\a\0549426d
2015-07-13T16:11:33.0777083Z Executing C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\CommonExtensions\Microsoft\TestWindow\vstest.console.exe "C:\a\0549426d\UnitTestDemo\WebApp.Tests\Scripts\mycode.tests.js"  /TestAdapterPath: C:\a\0549426d\UnitTestDemo\Chutzpah /logger:trx
2015-07-13T16:11:34.3495987Z Microsoft (R) Test Execution Command Line Tool Version 12.0.30723.0
2015-07-13T16:11:34.3505995Z Copyright (c) Microsoft Corporation.  All rights reserved.
2015-07-13T16:11:34.3896000Z ##[error]Error: The /TestAdapterPath parameter requires a value, which is path of a location containing custom test adapters. Example:  /TestAdapterPath:c:\MyCustomAdapters
2015-07-13T16:11:36.5808275Z ##[error]Error: The test source file "C:\a\0549426d\UnitTestDemo\Chutzpah" provided was not found.
2015-07-13T16:11:37.0004574Z ##[error]VSTest Test Run failed with exit code: 1
2015-07-13T16:11:37.0094570Z ##[warning]No results found to publish.
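
For comparison, once the stray whitespace is removed the generated call should look like this (reconstructed from the log above, with the space after /TestAdapterPath: gone):

Executing C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\CommonExtensions\Microsoft\TestWindow\vstest.console.exe "C:\a\0549426d\UnitTestDemo\WebApp.Tests\Scripts\mycode.tests.js" /TestAdapterPath:C:\a\0549426d\UnitTestDemo\Chutzpah /logger:trx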

    Cannot run Pester unit tests in Visual Studio but they work OK from the command prompt

    I have been using Pester for some PowerShell tests. From the command prompt all is good, but I kept getting the error ‘module cannot be loaded because running scripts is disabled on this system’ when I tried to run them via the Visual Studio Test Explorer.

     


    I found the solution on StackOverflow: I had forgotten that Visual Studio is 32-bit, so you need to set the 32-bit execution policy. Opening the default PowerShell command prompt and setting the policy only affects the 64-bit instance.

    1. Open C:\Windows\SysWOW64\WindowsPowerShell\v1.0\powershell.exe
    2. Run the command Set-ExecutionPolicy RemoteSigned
    3. My tests passed (without restarting Visual Studio)
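
    If you would rather script the change than open the 32-bit console interactively, a one-liner along these lines should work (a sketch only, not from the original post; the default LocalMachine scope needs an elevated prompt):

        # launch the 32-bit PowerShell host and set its execution policy, then echo it back to confirm
        & "$env:windir\SysWOW64\WindowsPowerShell\v1.0\powershell.exe" -NoProfile -Command "Set-ExecutionPolicy RemoteSigned; Get-ExecutionPolicy"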


    Generating MsTest wrappers for nUnit tests

    Recently, whilst at a client’s, one of our consultants came across an interesting issue. The client was using Selenium to write web tests; they wanted to trigger them both from Microsoft Test Manager (MTM) as local automated tests, and also run them using BrowserStack for multi-browser regression testing. The problem was that to import the tests into MTM they needed to be written in MsTest, and for BrowserStack in nUnit.

    As they did not want to duplicate each test, what could they do?

    After a bit of thought T4 templates came to the rescue; it was fairly easy to write a proof of concept T4 template that generates an MsTest wrapper for each nUnit test at compile time. This is what we did, and the gotchas we discovered.

    Prerequisites

    Process

    [To make life easier this code has all been made available on GitHub]

    1. Create a solution containing a class library with some nUnit tests as test data.
    2. Add an MsTest Unit Test project to this solution.
    3. Add a T4 ‘Text Template’ item to the MsTest project.

    4. Write the T4 template that uses reflection to find the nUnit tests in the solution and generates the MsTest wrappers. See the source for the template on GitHub.
    5. Once this is done, both the nUnit tests and the generated MsTest wrappers can be run inside Visual Studio.

    6. You can now add the tests to either MTM or BrowserStack as needed, each product using the unit tests it can see.

    The Gotcha – you have two build engines

    The main issues I had were due to me not realising the implications of the T4 template being processed in different ways between Visual Studio and MSBuild.

    By default the template is processed whenever the .TT file is edited in Visual Studio. For me this was not the behaviour required; I wanted the template processed every time the nUnit tests are altered. The easiest way to do this is to always regenerate the .CS file from the template on a compile. Oleg Sych provides great documentation on how to do this; you end up editing the .CSPROJ file.

    <!-- Include the T4 processing targets -->
    <Import Project="$(VSToolsPath)\TextTemplating\Microsoft.TextTemplating.targets" />
     
    <!-- Set parameters we want to access in the transform -->
    <ItemGroup>
       <T4ParameterValues Include="slnDir">
         <Value>$(MSBuildProjectDirectory)\..</Value>
         <Visible>false</Visible>
       </T4ParameterValues>
      </ItemGroup>

    <ItemGroup>
       <T4ParameterValues Include="configuration">
         <Value>$(Configuration)</Value>
         <Visible>false</Visible>
       </T4ParameterValues>
    </ItemGroup>

    <ItemGroup>
       <T4ParameterValues Include="projectName">
         <Value>$(MSBuildProjectName)</Value>
         <Visible>false</Visible>
       </T4ParameterValues>
    </ItemGroup>
     
    <!-- Tell the MSBuild T4 task to make the property available: -->
    <PropertyGroup>
       <!-- do the transform -->
       <TransformOnBuild>true</TransformOnBuild>
       <!-- Force a complete reprocess -->
       <TransformOutOfDateOnly>false</TransformOutOfDateOnly>
    </PropertyGroup>

    I thought that after editing my .CSPROJ file to call the required MSBuild targets and expose the properties I needed from MSBuild, all would be good. However I quickly found that, though building my solution with MSBuild from the command line was fine, a build inside Visual Studio failed. It turns out I had to make my template support both forms of building.

    This meant assuming in my .TT file that I was building with MSBuild and, if I got nulls for the required property values, switching to the Visual Studio way of working, e.g.

        // get the MSBuild variables if we can
        var configName = Host.ResolveParameterValue("-", "-", "configuration");

        if (String.IsNullOrEmpty(configName))
        {
            WriteLine("// Generated from Visual Studio");

            // no MSBuild parameter, so get the configuration from the Visual Studio DTE instance
            IServiceProvider serviceProvider = (IServiceProvider)this.Host;
            DTE dte = serviceProvider.GetService(typeof(DTE)) as DTE;
            configName = dte.Solution.SolutionBuild.ActiveConfiguration.Name;
        }
        else
        {
            WriteLine("// Generated from MSBuild");
        }

    Once this was done, I made sure I could get a successful build both inside Visual Studio and from a command prompt in the folder containing my .SLN file (in my case passing in the Visual Studio version, as I was using a VS2015 RC command prompt but only had the VS2013 SDKs installed) e.g.

    msbuild /p:VisualStudioVersion=12.0

    So where are we now?

    Now I have a nice little proof of concept on GitHub. To use it, add the GeneratedMstests project to your solution and, in this project, add references to any nUnit projects. Once this is done you should be able to generate wrappers for the nUnit tests.

    I am sure I could do a better job of test discovery and of adding references to assemblies, and it would be a good idea to make the sample code into a Visual Studio template, but it is a start; let’s see if it actually does what is needed.

    Errors running tests via TCM as part of a Release Management pipeline

    Whilst getting integration tests running as part of a Release Management pipeline within Lab Management, I hit a problem: TCM-triggered tests failed because the tool claimed it could not access the TFS build drops location, and no .TRX (test results) file was being produced. This was strange as it used to work (the RM system had worked when it was 2013.2; the issue seems to have started with 2013.3 and 2013.4, but this might be a coincidence).

    The issue was twofold.

    Permissions/Path Problems accessing the build drops location

    The build drops location is passed into the component using the argument $(PackageLocation). This is pulled from the component properties; it is the TFS-provided build drop path with a \ appended on the end.


    Note that the \ in the text box is there because the text box cannot be empty; it tells the component to use the root of the drops location. This is the issue: when you are in a network isolated environment and have had to use NET USE to authenticate with the TFS drops share, the trailing \ causes a permissions error (it might occur in other scenarios too; I have not tested it).

    Removing the slash, or adding a . (period) after the \, fixes the path issue, so:

    • \\server\Drops\Services.Release\Services.Release_1.0.227.19779        -  works
    • \\server\Drops\Services.Release\Services.Release_1.0.227.19779\      - fails 
    • \\server\Drops\Services.Release\Services.Release_1.0.227.19779\.     - works 

    So the answer is either to add a . (period) in the pipeline workflow component, so the build location is $(PackageLocation). as opposed to $(PackageLocation), or to edit the PS1 file that is run so it does some validation and strips out any trailing slashes. I chose the latter, making the edit below:

    if ([string]::IsNullOrEmpty($BuildDirectory))
    {
        $buildDirectoryParameter = [string]::Empty
    }
    else
    {
        # make sure we remove any trailing slashes as they cause permission issues
        $BuildDirectory = $BuildDirectory.Trim()
        while ($BuildDirectory.EndsWith("\"))
        {
            $BuildDirectory = $BuildDirectory.Substring(0, $BuildDirectory.Length - 1)
        }
        $buildDirectoryParameter = "/builddir:""$BuildDirectory"""
    }
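
    As an aside, the same clean-up could be written more concisely; this one-liner is my suggestion rather than part of the original script:

        # trim whitespace, then strip any trailing backslashes
        $BuildDirectory = $BuildDirectory.Trim().TrimEnd('\')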
       

    Cannot find the TRX file even though it is present

    Once the tests were running I still had an issue: even though TCM had run the tests, produced a .TRX file and published its contents back to TFS, the script claimed the file did not exist and so could not pass the test results back to Release Management.

    The issue was the call being used to check for the file’s existence.

    [System.IO.File]::Exists($testRunResultsTrxFileName)

    As soon as I swapped to the recommended PowerShell way to check for files

    Test-Path($testRunResultsTrxFileName)

    it all worked.
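
    The post does not say why the .NET call misbehaves, but a likely explanation (my assumption) is relative path resolution: [System.IO.File]::Exists resolves relative paths against the process working directory ([Environment]::CurrentDirectory), which PowerShell does not keep in sync with its own current location, whereas Test-Path resolves against the PowerShell provider location. A quick sketch of the difference, using a hypothetical folder and file name:

        Set-Location C:\SomeTestResultsFolder          # the PowerShell location changes...
        [Environment]::CurrentDirectory                # ...but the process working directory does not follow it
        [System.IO.File]::Exists("results.trx")        # resolved against the process directory, so can return false
        Test-Path "results.trx"                        # resolved against the PowerShell location, so returns true if the file is there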

    ‘Test run must be created with at least one test case’ error when using TCM

    I have been setting up some integration tests as part of a release pipeline. I am using TCM.EXE to trigger tests from the command line, something along the lines of:

    TCM.exe run /create /title:"EventTests" /collection:"http://myserver:8080/tfs" /teamproject:myteamproject /testenvironment:"Integration" /builddir:\\server\Drops\Build_1.0.226.1975 /include /planid:26989 /suiteid:27190 /configid:1

    I kept getting the error

    ‘A test run must be created with at least one test case’

    The strange thing was my test suite did contain a number of tests, and they were marked as active.

    The issue was actually the configid; it was wrong, and there is no easy way to check configuration IDs from the UI. Use the following command to get a list of valid IDs:

    TCM.exe configs /list /collection:"http://myserver:8080/tfs" /teamproject:myteamproject

    Id        Name
    --------- ----------------------------------------------------------------
    35        Windows 8.1 ARM
    36        Windows 8.1 64bit
    37        Windows 8.1 ATOM
    38        Default configuration created @ 11/03/2014 12:58:15
    39        Windows Phone 8.1

    You can now use the correct ID, not one you had to guess.
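
    For example, re-running the original command with one of the listed configuration IDs (the other IDs here are just the illustrative values used above):

    TCM.exe run /create /title:"EventTests" /collection:"http://myserver:8080/tfs" /teamproject:myteamproject /testenvironment:"Integration" /builddir:\\server\Drops\Build_1.0.226.1975 /include /planid:26989 /suiteid:27190 /configid:36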

    Review of ‘Software Testing using Visual Studio 2012’ from Packt Publishing

    I have just been reading Software Testing using Visual Studio 2012 by Subashni. S and Satheesh Kumar. N from Packt Publishing


    This book does what it says on the cover: it is a general introduction to the testing tools within the Visual Studio 2012 family. My comment is not about how well it is done, it is a clear enough introduction, but why produce a book that really just covers what is in MSDN, Channel 9, numerous podcasts, blogs and ALM Rangers documentation?

    I suppose this is a question of target audience; some people like to browse a physical book for ‘new’ technology, and I can see that (though I tried it on a Kindle, more on that later). This book certainly does cover the core areas, but sits strangely between a technology briefing for a manager or person who just needs an overview (it is all a bit long winded, listing all the features and flags of the tools) and not enough detail for the practitioner (the exercises do not go deep enough, unlike those provided by Microsoft in the Brian Keller VS/TFS demo VM series).

    Given this concern I wonder who the target audience really is?

    A real issue here is that Microsoft have gone to quarterly updates, so the product is always advancing, faster than any print book can manage (Microsoft’s own MSDN documentation has enough problems keeping up, and is frequently playing catch-up). For a book on testing this is a major problem as ‘test’ has been a key focus for the updates. This means that when the book’s contents are compared to Visual Studio/TFS 2012.3 (the current shipping version at the time of this review) there are major features missing, such as:

    • The improvements in Test Explorer to support non-Microsoft test frameworks, playlists etc.
    • SKU changes in licensing, MTM dropping down to Premium from Ultimate
    • Azure based load testing
    • The test experience in the web browser (as opposed to MTM)

    The list will always grow while Microsoft stick to their newer, faster release cycle. This was not too much of a problem when Microsoft shipped every couple of years (a new book opportunity each time), but how can any book keep up with a 12-week cycle?

    One option, you would think, is Kindle or eBooks in general, as at least the book can be updated. However there is still the issue of the extra effort for the authors and editors, so in general I find these updates are not that common. The authors will usually have moved on to their next project and not be focused on yet another unpaid update to a book they published last quarter.

    As to my experience on the Kindle, this was the first technical book I have read on one. I have used the Kindle app on a phone for a couple of years for my novel reading, but always felt the screen was too small for anything that might have a diagram in it. I recently bought a Kindle Paperwhite so thought I would give this book a go on it. I initially tried to email the book from the Packt site straight to my Kindle, but this failed (a file size issue, I am told by Packt customer support); a local copy via USB was fine.

    So how was the Kindle experience? OK, it did the job; everything was clear enough. It was not a super engaging reading experience, but it is a technical book, what do you expect? It was good enough that I certainly don’t see myself getting too many paper books going forward, whether they be novels or technical books.

    So in summary, was the book worth the effort to read? I always gauge this question on ‘did I learn something?’ and I did. There is always a nugget or two in books on subjects you think you know. However, ‘would I say it is a really useful/essential read for anyone who already has a working knowledge of this subject?’, probably not. I would say their time is better spent doing a hands-on lab or watching conference recordings on Channel 9.

    Leave this book to anyone who wants a general written introduction to the subject of Microsoft-specific testing tooling.

    TFS Test Agent cannot connect to Test Controller – Part 2

    I posted last week on the problems I had had getting the test agents and controller in a TFS2012 Standard environment talking to each other, and a workaround. Well, after a good few emails with various people at Microsoft and other consultants at Black Marble, I have a whole range of workarounds and solutions.

    First a reminder of my architecture; note that this could be part of the problem, as it is all running on a single Hyper-V host. Remember this is a demo rig to show the features of Standard Environments; I think it is unlikely that this problem will be seen in a more ‘realistic’ environment, i.e. one running on multiple boxes.

     


     

    The problem is that the test agent running on the Server2008 VM should ask the test controller (running on the VSTFS server) to call it back on either its 169.254.x.x address or on an address obtained via DHCP from the external virtual switch. However, it is requesting a callback on 127.0.0.1, as can be seen in the error log:

    Unable to connect to the controller on 'vstfs:6901'. The agent can connect to the controller but the controller cannot connect to the agent because of following reason: No connection could be made because the target machine actively refused it 127.0.0.1:6910. Make sure that the firewall on the test agent machine is not blocking the connection.

    The root cause

    It turns out the root cause of this problem was that I had edited the c:\windows\system32\drivers\etc\hosts file on the test server VM to add an entry to allow a URL used in CodedUI tests to be resolved to localhost:

    127.0.0.1   www.mytestsite.com

    Solution 1 – Edit the test agent config to bind to a specific address

    The first solution is the one I outlined in my previous post: tell the test agent to bind to a specific IP address. Edit

    C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\QTAgentService.exe.config

    and add a BindTo line with the correct address for the controller to call back to the agent:

    <appSettings>
         <!-- other settings … -->
         <add key="BindTo" value="169.254.1.1"/>
    </appSettings>

    The problem with this solution is that you need to remember to edit a config file; it all seems a bit complex!

    Solution 2 – Don’t resolve the test URL to localhost

    Change the hosts file entry used by the CodedUI test to resolve to the actual address of the test VM e.g.

    169.254.1.1   www.mytestsite.com

    The downside here is you need to know the test agent’s IP address, which, depending on the system in use, could change and will certainly be different on each test VM in an environment. Again it all seems a bit complex and prone to human error.

    Solution 3 – Add an actual loopback entry to the hosts file.

    The simplest workaround, which Robert Hancock at Black Marble came up with, was to add an explicit localhost loopback entry to the hosts file alongside the test URL entry:

    127.0.0.1   localhost
    127.0.0.1   www.mytestsite.com

    Once this was done the test agent could connect; I did not have to edit any agent config files, or know the address the agent needed to bind to. By far the best solution.

     

    So thanks to all who helped get to the bottom of this surprisingly complex issue.