Version 2.0.x of my Generate Release Notes VSTS Task has been released with release rollup support

I have just released a major update to my Generate Release Notes VSTS build extension. This V2 update adds the ability to look back through past releases to find the last successful deployment to a given stage/environment, and to create a rollup set of build artifacts, and hence commits/changesets and work items, in the release notes.

 

 

This has been a long-running request on GitHub for this extension, which I am pleased to have been able to address.

To aid backwards compatibility, the default behaviour of the build/release task is as it was before: it can be used in a build or in a release, and if used in a release it only considers the artifacts in the current release that ran the task.

If you want to use the new features you need to enable them; they are all in the advanced properties section.

 

image

 

You get new properties to enable scanning past releases until the task finds a successful deployment to, by default, the same stage/environment that is currently being released to. You can override this stage name to allow more complex usage e.g. generating the release notes for what has changed since the last release to production whilst in a UAT environment.

This change also means there is a new variable that can be accessed in templates, $Releases, which contains all the releases being used to get build artifacts. This can be used in release notes templates to show the releases being used e.g.

 

**Release notes for release $defname**
**Release Number**  : $($release.name)   
**Release completed** $("{0:dd/MM/yy HH:mm:ss}" -f [datetime]$release.modifiedOn) **Changes since last successful release to '$stagename'**  
**Including releases:**  
$(($releases | select-object -ExpandProperty name) -join ", " )  

 

This generates content such as:

 

Release notes for release Validate-ReleaseNotesTask.Master
Release Number : Release-69
Release completed 05/01/17 12:40:19
Changes since last successful release to 'Environment 2'
Including releases:
Release-69, Release-68, Release-67, Release-66

 

Hope you find this extension useful.

Transform tool for transferring TFS 2015.3 Release Templates to VSTS

If you are moving from on-premises TFS to VSTS you might hit the same problem I have just had. The structure of a VSTS release is changing; there is now the concept of multiple ‘Deployment Steps’ in an environment. This means you can use a number of different agents for a single environment – a good thing.

The downside is that if you export a TFS 2015.3 release process and try to import it to VSTS it will fail, saying the JSON format is incorrect.

Of course you can get around this with some copy typing, but I am lazy, so….

I have written a quick transform tool that converts the basic structure of the JSON to the new format. You can see the code as a GitHub Gist.

It is a command line tool; usage is as follows:

  1. In VSTS create a new empty release, and save it
  2. Use the drop down menu on the newly saved release in the release explorer and export the file. This is the template for the new format e.g. template.json
  3. On your old TFS system export the release process in the same way to get your source file e.g. source.json
  4. Run the command line tool providing the name of the template, source and output file

    RMTransform template.json source.json output.json

  5. On VSTS import the newly created JSON release file.
  6. A release process should be created, but it won’t be possible to save it until you have fixed a few things that are not transferred
    1. Associate each Deployment Step with an Agent Pool
    2. Set the user accounts who will do the pre- and post-approvals
    3. Any secret variables will need to be re-entered
      IMPORTANT – Make sure you save the imported process as soon as you can (i.e. straight after fixing anything that is stopping it being saved). If you don’t save and start clicking into artifacts or global variables it seems to lose everything and you need to re-import

image

It is not perfect, and you might find other issues that need fixing, but it saves a load of copy typing

Running Test Suites within a network Isolated Lab Management environment when using TFS vNext build and release tooling

Updated 27 Sep 2016: Added solutions to known issues

Background

As I have posted many times, we make use of TFS Lab Management to provide network isolated dev/test environments. Going forward I see us moving to Azure DevTest Labs and/or Azure Stack with ARM templates, but that isn’t going to help me today, especially when I have already made the investment in setting up Lab Management environments and they are ready to use.

One change we are making now is a move from the old TFS Release Management (2013 generation) to the new VSTS and TFS 2015.2 vNext release tools. This means I need to be able to trigger automated tests on VMs within Lab Management network isolated environments from a command inside my new build/release process. I have posted on how to do this with the older generation Release Management tools; it turns out it is in some ways a little simpler with the newer tooling, with no need to fiddle with shadow accounts et al.

My Setup

image

Constraints

The constraints are these

  • I need to be able to trigger tests on the Client VM in the network isolated lab environment. These tests are all defined in automated test suites within Microsoft Test Manager.
  • The network isolated lab already has a TFS Test Agent deployed on all the VMs in the environment, linked back to the TFS Test Controller on my corporate domain. These agents are automatically installed and managed, and are handling the ‘magic’ for the network isolation – we can’t fiddle with these without breaking the labs.
  • The new build/release tools assume that you will auto-deploy a 2015 generation Test Agent via a build task as part of the build/release process. This is a new Test Agent install, so it removes any already-installed Test Agent – we don’t want this as it breaks the existing agent/network isolation.
  • So my only option is to trigger the tests using TCM (as we did in the past) from some machine in the system. In the past (with the old tools) this had to be within the isolated network environment due to the limitation put in place by the use of shadow accounts.
  • However, TCM (as shipped with VS 2015) does not ‘understand’ vNext builds, so it can’t seem to find them by definition name/number – we have to find builds by their drop location, and I think this needs to be a UNC share, not a drop back onto the TFS server. So using TCM.EXE (and any wrapper scripts) probably is not going to deliver what I want i.e. the test run associated with a vNext build and/or release.
My Solution

    The solution I adopted was to write a PowerShell script that performs the same function as the TCMEXEC.PS1 script that used to be run within the network isolated Lab Environment by the older Release Management products.

    The difference is that the old script shelled out to run TCM.EXE; my new version makes calls to the new TFS REST API (and unfortunately also to the older C# API, as some features, notably those for Lab Management services, are not exposed via REST). This script can be run from anywhere. I chose to run it on the TFS vNext build agent, as this is easiest and that machine already had Visual Studio installed, so the TFS C# API was available.

    You can find this script on my VSTSPowerShell GitHub Repo.

    The usage of the script is

    TCMReplacement.ps1
        -Collectionuri http://tfsserver.domain.com:8080/tfs/defaultcollection/
        -Teamproject "My Project"
        -testplanname "My test plan"
        -testsuitename "Automated tests"
        -configurationname "Windows 8"
        -buildid 12345
        -environmentName "Lab V.2.0"
        -testsettingsname "Test Setting"
        -testrunname "Smoke Tests"
        -testcontroller "mytestcontroller.domain.com"
        -releaseUri "vstfs:///ReleaseManagement/Release/167"
        -releaseenvironmenturi "vstfs:///ReleaseManagement/Environment/247"

    Note

  • The last two parameters are optional; all the others are required. If the last two are not used the test results will not be associated with a release.
  • There is also a pollinginterval parameter which defaults to 10 seconds. The script starts a test run then polls on this interval to see if it has completed.
  • If there are any failed tests then the script calls Write-Error, so the TFS build process sees this as a failed step.
  • In some ways I think this script is an improvement over the TCMEXEC script. The old one needed you to know the IDs for many of the settings (loads of poking around in Microsoft Test Manager to find them); I allow the common names of settings to be passed in, which I then use to look up the required values via the APIs (this is where I needed to use the older C# API, as I could not find a way to get the Configuration ID, Environment ID or Test Settings ID via REST).
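The wait logic described above is just a start-then-poll pattern. The sketch below illustrates it; the Wait-ForTestRun helper name is mine for illustration, not code from the actual script:

```powershell
# Illustrative sketch only: poll a state-returning callback until the
# test run reports completion, checking every $PollingInterval seconds.
function Wait-ForTestRun {
    param(
        [scriptblock]$GetState,      # e.g. a REST call that returns the run state
        [int]$PollingInterval = 10   # matches the script's default of 10 seconds
    )
    do {
        Start-Sleep -Seconds $PollingInterval
        $state = & $GetState
    } while ($state -ne "Completed")
    return $state
}
```

In the real script the callback would be a call to the TFS REST API asking for the state of the run it has just started.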

    There is nothing stopping you running this script from the command line, but I think it is more likely to be made part of a release pipeline using the ‘PowerShell on local machine’ task in the build system. When used this way you can get many of the parameters from environment variables, so the command arguments become something like the following (and of course you can make all the string values build variables too if you want):

     

    -Collectionuri $(SYSTEM.TEAMFOUNDATIONCOLLECTIONURI)
    -Teamproject $(SYSTEM.TEAMPROJECT)
    -testplanname "My test plan"
    -testsuitename "Automated tests"
    -configurationname "Windows 8"
    -buildid $(BUILD.BUILDID)
    -environmentName "Lab V.2.0"
    -testsettingsname "Test Settings"
    -testrunname "Smoke Tests"
    -testcontroller "mytestcontroller.domain.com"
    -releaseUri $(RELEASE.RELEASEURI)
    -releaseenvironmenturi $(RELEASE.ENVIRONMENTURI)

     

    Obviously this script is potentially a good candidate for a TFS build/release task, but as per my usual practice I will make sure I am happy with its operation before wrapping it up into an extension.

    Known Issues

  • If you run the script from the command line targeting a completed build and release the tests run and are shown in the release report as well as on the test tab as we would expect.

    image

    However, if you trigger the test run from within a release pipeline, the tests run OK and you can see the results in the test tab (and MTM), but they are not associated with the release. My guess is this is because the release has not completed when the data update is made. I am investigating how to address this issue.

  • Previously I reported a known issue where the test results were associated with the build, but not the release. It turns out this was because the AD account the build/release agent was running as was missing rights on the TFS server. To fix the problem I made sure the account was configured as follows:

    Once this was done all the test results appeared where they should.

    So hopefully you will find this a useful tool if you are using network isolated environments and TFS build.

    Tidy up those VSTS release pipelines with meta-tasks

    Do you have repeating blocks in your VSTS release pipelines?

    I certainly do. A common one is to run a set of functional tests, so I need to repeatedly …

    1. Deploy some test files to a VM
    2. Deploy a test agent to the VM – IMPORTANT I had not realised you can only run one test run against this deployed agent. You need to redeploy it for the next run
    3. Run my tests
    4. … and repeat for next test type/configuration/test plan/DLL etc.

     

    In the past this led to a lot of repeated tasks in my release pipeline, all very messy.

    Now in VSTS we have the option of meta-tasks; these allow tasks to be grouped into, in effect, functions with their own properties.

     

    image

    In the screenshot above you can see I use a meta-task ‘Run Tests’ that wrappers the four tasks shown below.

    image

    Much neater, but as you might expect with something new, I have come across a few minor gotchas:

    • You cannot order the list of properties for the meta-task.
    • This is a problem as the first one is used to generate the instance name in the pipeline. Not a major problem, as you can always edit it.
    • Meta-task properties are auto-detected from any variables used within the meta-task’s tasks. The auto-detection mechanism is case sensitive, unlike the rest of VSTS variable handling, so be careful not to end up with duplicates.

    That all said, I think this is a big step forward in readability and reuse for release management.

    Running WebTests as part of a VSTS VNext Release pipeline

    Background

    Most projects will have a range of tests

    • Unit tests (maybe using a mocking framework) running inside the build process
    • Integration/UX and load tests run as part of a release pipeline
    • and finally manual tests

    In a recent project we were using WebTests to provide some integration tests (in addition to integration tests written using unit testing frameworks) as a means to test a REST/ODATA API, injecting data via the API, pausing while a backend Azure WebJob processed the injected data, then checking a second API to make sure the processed data was correctly presented. Basically mimicking user operations.

    In past iterations we ran these tests via TFS Lab Management’s tooling, using the Test Agent that is deployed when an environment is created.

    The problem was we are migrating to VSTS/TFS 2015.2 Release Management. This uses the new Functional Testing Task, which uses the newer Test Agent that is deployed on demand as part of the release pipeline (not pre-installed) and this agent does not support running WebTests at present.

    This meant my only option was to use MSTest if I wanted to continue using this form of webtest. However, there is no out-of-the-box MSTest task for VSTS, so I needed to write a script to do the job that I could deploy as part of my build artifacts.

    Now I could write a build/release task to make this nice and easy to use, but that is more work, and I suspect I am not going to need this script too often in the future (I might be wrong here, only time will tell). Also I hope that Microsoft will at some point provide an out-of-the-box task to do the job, either by providing an MSTest task or by adding webtest support to the functional test task.

    This actually reflects my usual work practice for build tasks: get the script working locally first, use it as a PowerShell script in the build, and if I see enough reuse make it a task/extension.

    So what did I actually need to do?

    Preparation

    1. Install Visual Studio on the VM where the tests will be run from. I need to do this because, though MSTest was already present, it fails to run .webtest tests unless a suitable SKU of Visual Studio is installed.
    2. Set the solution configuration so that the projects containing the webtests are not built; we only need the .webtest files copied to the drops location. If you build the project the files get duplicated into the bin folder, which we don’t need as we then need to work out which copy to use.
    3. Make sure the solution contains a .testsettings file that switches on ‘Think Times’, and that this file is copied as a build artifact. This stalled me for ages; I could not work out why tests worked in Visual Studio but failed from the command line. Without this file there is no think time at all, so my background process never had time to run.

      image

    4. Write a script that finds all my .webtest files, and place the script in source control such that it is copied to the build’s drop location.

    param
    (
        $tool = "C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\MSTest.exe",
        $path,
        $include = "*.webtest",
        $results,
        $testsettings
    )

    # Find all the .webtest files under the supplied path
    $web_tests = Get-ChildItem -Path $path -Recurse -Include $include

    # Build a /TestContainer argument for each test file found
    $testContainers = @()
    foreach ($item in $web_tests) {
        $testContainers += "/TestContainer:$item"
    }

    & $tool $testContainers /resultsfile:$results /testsettings:$testsettings
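Assuming the script is saved as RunMSTest.ps1 (the name used in the release step later), a local smoke-test invocation might look like the following; the paths here are illustrative only:

```powershell
# Paths are examples only - point them at your own drop location
.\RunMSTest.ps1 -path "C:\drops\Src\WebtestsProject" `
                -results "C:\temp\webtests.trx" `
                -testsettings "C:\drops\Src\webtest.testsettings"
```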

     

    Build

    Once the script and other settings were made, I altered the build so that the .webtests (including their associated JSON test data sub-folders), the script and the .testsettings files are all copied to the drops location

     

    image

     

    Release

    In the release pipeline I need to call my script with suitable parameters so it finds the tests, uses the .testsettings file and creates a .TRX results file. I then need to use the ‘Publish Test Results’ task to upload these MSTest-format results

    image

    So for the PowerShell MSTest task I set the following

    • Script name is $(System.DefaultWorkingDirectory)/MyBuild\drop\Scripts\RunMSTest.ps1 
    • The argument is -path $(System.DefaultWorkingDirectory)\MyBuild\drop\Src\WebtestsProject -results $(System.DefaultWorkingDirectory)\webtests.trx -testsettings $(System.DefaultWorkingDirectory)\MyBuild\drop\src\webtest.testsettings

    And for the publish test results task.

    • Format – VSTest
    • Arguments – $(System.DefaultWorkingDirectory)\webtests.trx
    • I also set this task to always run, to make sure I get test results even if some tests fail

    Once all this was done and the build/release run, I got the test results I needed

    image

     

    I can drill into my detailed test reports as needed

    image

    So I have a functioning release pipeline that can run all the various types of automated tests within my solution.

    Building bridges – getting DevOps working through Devs and IT Pros talking and learning from each other

    I was lucky enough to attend, and be on a panel at, yesterday’s WinOps London conference; it was a different and very interesting view of DevOps for me. I spend most of my time consulting with test and development teams. With these teams it is very rare to come across one not using source control, and they commonly have some form of automated build too. This means any DevOps discussion usually comes from the angle of ‘how can I extend my build into deployment…’.

    At the conference yesterday, where there seemed to be more IT Pro attendees than developers, this ‘post build’ view was not the norm. Much of the conference content was focused around the provisioning and configuration of infrastructure: getting the environment ‘ready for deployment of a build’. What surprised me most was how repeatedly speakers stressed the importance of using source control to manage scripts, and hence control the version of the environments being provisioned.

    So what does this tell us?

    The obvious fact to me is that the bifurcation of our industry between Devs and IT Pros means there is huge scope for swapping each group’s best practices. What seems ingrained best practice for one role is new and interesting for the other. We can all learn from each other – assuming we communicate.

    This goes to the core of DevOps: it is not a tool but a process based around collaboration.

    If you want to find out more about how we see DevOps at Black Marble we are running events and are out and about at user groups. Keep an eye on the Black Marble events site or drop me an email.

    New version of my VSTS Generate Release Notes extension – now supports Builds and Release

    I am pleased to announce that I have just made public on the VSTS marketplace a new version of my VSTS Generate Release Notes extension.

    This new version now supports both VSTS/TFS vNext Builds and vNext Releases. The previous versions only supported the generation of release notes as part of a build.

    Adding support for releases has meant I have had to rethink the internals of how the templates are processed, as well as the way templates are passed into the task and where results are stored

    • You can now provide a template as a file (usually from source control) as before, but also as an inline property. The latter is really designed for Releases where there is usually no access to source control, only to build artifact drops (though you could put the template in one of these if you wanted)
    • With a build the obvious place to put the release notes file is in the drops location. For a release there is no such artifact drop location, so I just leave the release notes on the release agent; it is up to the user to get this file copied to a sensible location for their release process.

    To find out more, check out the documentation on my GitHub repo and have a look at my sample templates to get you started generating release notes

    Putting a release process around my VSTS extension development

    Updated: 5th Aug 2016 – added notes on PublisherID


     

    I have been developing a few VSTS/TFS build related extensions and have published a few in the VSTS marketplace. This has all been a somewhat manual process; a mixture of Gulp and PowerShell has helped a bit, but I decided it was time to take a more formal approach. To do this I have used Jesse Houwing’s VSTS Extension Tasks.

    Even with this set of tasks I am not sure what I have is ‘best practice’, but it does work. The doubt is due to the way the marketplace handles revisions and preview flags. What I have works for me, but ‘your mileage may differ’

    My Workflow

    The core of my workflow is that I build the VSIX package twice, once as a private package and once as a public one. They both contain the same code and have the same version number; they differ only in their visibility flags

    I am not using the preview flag options at all; I have found they do not really help me. My workflow is to build the private package, upload it, and test it by sharing it with a test VSTS instance. If all is good, I publish the matched public package on the marketplace. In this model there is no need to use a preview; it just adds complexity I don’t need.

    This may not be true for everyone.

    Build

    The build’s job is to take the code, set the version number and package it into multiple VSIX packages.

    1. First I have the vNext build get my source from my GitHub repo.
    2. I add two build variables, $(Major) and $(Minor), that I use to manually manage my version number
    3. I set my build number format to $(Major).$(Minor).$(rev:r), so the final number is incremented until I choose to increment the major or minor version.
    4. I then use one of Jesse’s tasks to package the extension multiple times using the extension tag model parameter. Each packaging step uses different visibility settings (circled in red). I also set the version, using the override option, to $(Build.BuildNumber) (circled in green)image
    5. [Updated Aug 2016] Set the PublisherID and ExtensionID on the tasks; using a pair of build variables is a good idea here to avoid entering strings twice. It is important that the PublisherID is entered with the correct case – it is case sensitive within the marketplace. Strange things happen if the PublisherID in a VSIX package differs from the one registered on the marketplace
    6. As I am using the VSTS hosted build agent I also need to make sure I check the ‘install Tfx-cli’ option in the global settings section
    7. I then add a second identical packaging task, but this time there is no tag set and the visibility is set to public.
    8. Finally I use a ‘Publish Build Artifacts’ task to copy the VSIX packages to a drop location

    Release

    So now I have multiple VSIX packages I can use the same family of tasks to create a release pipeline.

    I create a new release linked as a Continuous Deployment of the previously created build, and set its release name format to Release-$(Build.BuildNumber)

    My first environment uses three tasks, all using the option to work from a VSIX package.

    Note In all cases I am using the VSIX path in the format $(System.DefaultWorkingDirectory)/GenerateReleaseNotes.Master/vsix/<package name>-<tag>-$(Build.BuildNumber).vsix. I am including the build number variable in the path as I chose to put all the packages in a single folder, so path wildcards are not an option as the task would not know which package to use unless I alter my build to put one VSIX package per folder.

    My tasks for the first environment are

    1. Publish VSTS Extension – using my private package so it is added as a private package to the marketplace
    2. Share VSTS Extension – to my test VSTS account
    3. Install VSTS Extension – to my test VSTS account

    For details on the usage of these tasks and setting up the link to the VSTS Marketplace, see Jesse’s wiki

    If I only intend an extension to ever be private this is enough. However, I want to make mine public, so I add a second environment that has manual pre-approval (so I have to confirm the public release)

    This environment needs only a single task

    1. Publish VSTS Extension – using my public package so it is added as a public package to the marketplace

    I can of course add other tasks to this environment, maybe sending a Tweet or email to publicise the new version’s release

    Summary

    So now I have a formal way to release my extensions. The dual packaging model means I can publish two different versions at the same time, one private and the other public

    image

    It is now just a case of moving all my extensions over to the new model.

    Though I am still interested to hear what other people’s views are. Does this seem a reasonable process flow?

    How to build a connection string from other parameters within MSDeploy packages to avoid repeating yourself in Release Management variables

    Whilst working with the new Release Management features in VSTS/TFS 2015.2 I found I needed to pass in configuration variables (server name, DB name, UID and password) to create a SQL server via an Azure Resource Manager template release step, and a connection string to the same SQL instance for a web site’s web.config, set using an MSDeploy release step with token replacement (as discussed in this post)

    Now I could just create RM configuration variables for both the connection string and ARM settings,

    image

     

    However, this seems wrong for a couple of reasons:

    1. You should not repeat yourself; it is too easy to get the two values out of step.
    2. I don’t really want to obfuscate the whole of a connection string in RM when only the password really needs to be hidden (note the connection string variable is not set as secure in the above screenshot).

    What did not work

    I first considered nesting the RM variables, e.g. setting the connection string variable to be equal to ‘Server=tcp:$(DatabaseServer).database.windows.net,1433;Database=$(DatabaseName)….’, but this does not give the desired result: the $(DatabaseServer) and $(DatabaseName) variables are not expanded at runtime, so you just get a string with the variable names in it.

    How I got what I was after…

    (In this post as a sample I am using the Fabrikam Fiber solution. This means I need to provide a value for the FabrikamFiber-Express connection string)

    I wanted to build the connection string from the other variables in the MSDeploy package. So to get the behaviour I want…

    1. In Visual Studio load the Fabrikam web site solution.
    2. In the web project, use the publish option to create a publish profile use the ‘WebDeploy package’ option.
    3. If you publish this package you end up with a setparameters.xml file containing the default connection string
      <setParameter name="FabrikamFiber-Express-Web.config Connection String" value="Your value"/>

      Where ‘Your value’ is the value you set in the publish wizard. To use this I would need to pass in a whole connection string, when I only want to pass in parts of the string

    4. To add bespoke parameters to an MSDeploy package you add a parameters.xml file to the project in Visual Studio (I wrote a Visual Studio extension that helps add this file, but you can create it by hand). My tool will create the parameters.xml file based on the AppSettings block of the project’s web.config. So if you have a web.config containing the following
      <appSettings>
          <add key="Location" value="DEVPC" />
        </appSettings>

      It will create a parameters.xml file as follows

      <?xml version="1.0" encoding="utf-8"?>
      <parameters>
        <parameter defaultValue="__LOCATION__" description="Description for Location" name="Location" tags="">
          <parameterentry kind="XmlFile" match="/configuration/appSettings/add[@key='Location']/@value" scope="\\web.config$" />
        </parameter>
      </parameters>
    5. If we publish at this point we will get a setparameters.xml file containing
      <?xml version="1.0" encoding="utf-8"?>
      <parameters>
        <setParameter name="IIS Web Application Name" value="__Sitename__" />
        <setParameter name="Location" value="__LOCATION__" />
        <setParameter name="FabrikamFiber-Express-Web.config Connection String" value="__FabrikamFiberWebContext__" />
      </parameters>

      This is assuming I used the publish wizard to set the site name to __SiteName__ and the DB connection string to __FabrikamFiberWebContext__

    6. The next step is to add my DB-related parameters to the parameters.xml file; this I do by hand, as my tool does not help here
      <?xml version="1.0" encoding="utf-8"?>
      <parameters>
        <parameter defaultValue="__LOCATION__" description="Description for Location" name="Location" tags="">
          <parameterentry kind="XmlFile" match="/configuration/appSettings/add[@key='Location']/@value" scope="\\web.config$" />
        </parameter>

        <parameter name="Database Server" defaultValue="__sqlservername__"></parameter>
        <parameter name="Database Name" defaultValue="__databasename__"></parameter>
        <parameter name="Database User" defaultValue="__SQLUser__"></parameter>
        <parameter name="Database Password" defaultValue="__SQLPassword__"></parameter>
      </parameters>
    7. If I publish again, this time the new variables also appear in the setparameters.xml file.
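Based on the token names already used above, the setparameters.xml at this point would be expected to contain entries like the following (illustrative, not captured from an actual publish):

```xml
<?xml version="1.0" encoding="utf-8"?>
<parameters>
  <setParameter name="IIS Web Application Name" value="__Sitename__" />
  <setParameter name="Location" value="__LOCATION__" />
  <setParameter name="Database Server" value="__sqlservername__" />
  <setParameter name="Database Name" value="__databasename__" />
  <setParameter name="Database User" value="__SQLUser__" />
  <setParameter name="Database Password" value="__SQLPassword__" />
  <setParameter name="FabrikamFiber-Express-Web.config Connection String" value="__FabrikamFiberWebContext__" />
</parameters>
```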
    8. Now I need to suppress the auto-generated connection string parameter, and replace it with a parameter that uses the other parameters to generate the connection string. You would think this was a case of adding more text to the parameters.xml file, but that does not work. If you add the block you would expect (making sure the name matches the auto-generated connection string name) as below
      <parameter 
        defaultValue="Server=tcp:{Database Server}.database.windows.net,1433;Database={Database Name};User ID={Database User}@{Database Server};Password={Database Password};Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"
        description="Enter the value for FabrikamFiber-Express connection string"
        name="FabrikamFiber-Express-Web.config Connection String"
        tags="">
        <parameterentry
          kind="XmlFile"
          match="/configuration/connectionStrings/add[@name='FabrikamFiber-Express']/@connectionString"
          scope="\\web.config$" />
      </parameter>

      It does add the entry to setparameters.xml, but this blocks successful operation at deployment. It seems that if a value needs to be generated from other variables there can be no entry for it in the setparameters.xml file. The documentation hints you can set the Tag to ‘Hidden’, but this does not appear to work.

      One option would be to let the setparameters.xml file be generated and then remove the offending line prior to deployment, but this feels wrong and prone to human error

    9. To get around this you need to add a file named <projectname>.wpp.targets to the same folder as the project (and add it to the project). In this file place the following
      <?xml version="1.0" encoding="utf-8"?>
      <Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
        <Target Name="DeclareCustomParameters" BeforeTargets="Package">
          <ItemGroup>
            <MsDeployDeclareParameters Include="FabrikamFiber-Express">
              <Kind>XmlFile</Kind>
              <Scope>Web.config</Scope>
              <Match>/configuration/connectionStrings/add[@name='FabrikamFiber-Express']/@connectionString</Match>
              <Description>Enter the value for FabrikamFiber-Express connection string</Description>
              <DefaultValue>Server=tcp:{Database Server}.database.windows.net,1433;Database={Database Name};User ID={Database User}@{Database Server};Password={Database Password};Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;</DefaultValue>
              <Tags></Tags>
              <ExcludeFromSetParameter>True</ExcludeFromSetParameter>
            </MsDeployDeclareParameters>
          </ItemGroup>
        </Target>
        <PropertyGroup>
          <AutoParameterizationWebConfigConnectionStrings>false</AutoParameterizationWebConfigConnectionStrings>
        </PropertyGroup>
      </Project>

      The first block declares the parameter I wish to use to build the connection string. Note the ‘ExcludeFromSetParameter’ setting, which keeps this parameter out of the setparameters.xml file; this is the setting you cannot apply via parameters.xml.

      The second block stops the auto-generation of the connection string. (Thanks to Sayed Ibrahim Hashimi for various posts on getting this working.)

    10. Once the edits are made, unload and reload the project, as the <project>.wpp.targets file is cached on loading by Visual Studio.
    11. Make sure the publish profile is not set to generate a connection string

      image

    12. Now when you publish the project, you should get a setparameters.xml file with only the four SQL variables, the AppSettings variables and the site name.
      (Note I have set the values for all of these to the format __NAME__, so that I can use token replacement in my release pipeline)
      <?xml version="1.0" encoding="utf-8"?>
      <parameters>
        <setParameter name="IIS Web Application Name" value="__Sitename__" />
        <setParameter name="Location" value="__LOCATION__" />
        <setParameter name="Database Server" value="__sqlservername__" />
        <setParameter name="Database Name" value="__databasename__" />
        <setParameter name="Database User" value="__SQLUser__" />
        <setParameter name="Database Password" value="__SQLPassword__" />
      </parameters>
    13. If you deploy the web site, the web.config should have your values from the setparameters.xml file in it
      <appSettings>
         <add key="Location" value="__LOCATION__" />
      </appSettings>
      <connectionStrings>
           <add name="FabrikamFiber-Express" connectionString="Server=tcp:__sqlservername__.database.windows.net,1433;Database=__databasename__;User ID=__SQLUser__@__sqlservername__;Password=__SQLPassword__;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;" providerName="System.Data.SqlClient" />
      </connectionStrings>
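To make the behaviour explicit, here is a minimal Python sketch of how the `{Parameter Name}` placeholders in the DefaultValue template get filled in from the other parameter values at deploy time. This is an illustration of the expansion, not MSDeploy's actual implementation, and the sample values are made up:

```python
# Illustration only: expand {Parameter Name} placeholders in a DefaultValue
# template using the values supplied for the other parameters, mimicking how
# the final connection string is assembled at deploy time.
import re

def expand(template, values):
    # Replace each {Name} with its supplied value; unknown names are left as-is.
    return re.sub(r"\{([^{}]+)\}",
                  lambda m: values.get(m.group(1), m.group(0)), template)

template = ("Server=tcp:{Database Server}.database.windows.net,1433;"
            "Database={Database Name};User ID={Database User}@{Database Server};"
            "Password={Database Password};Encrypt=True;")

values = {
    "Database Server": "myserver",
    "Database Name": "mydb",
    "Database User": "sqluser",
    "Database Password": "p@ssw0rd",
}

print(expand(template, values))
```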

    You are now in a position to manage the values in the setparameters.xml file however you wish. My choice is to use the ‘Replace Tokens’ build/release task from Colin’s ALM Corner Build & Release Tools extension, as this task correctly handles secure/encrypted RM variables as long as you use the ‘Secret Tokens’ option on the advanced menu.

    image


    Summary

    So yes, it all seems a bit too complex, but it does work, and I think it makes for a cleaner deployment solution, less prone to human error, which is what any DevOps solution must always strive for.

    Depending on the values you put in the <project>.wpp.targets file, you can parameterise the connection string however you need.

    Repost: What I learnt extending my VSTS Release Process to on-premises Lab Management Network Isolated Environments

    This is a repost of a guest article first posted on the Microsoft UK Developers Blog: How to extend a VSTS release process to on-premises

    Note that since I wrote the original post there have been some changes on VSTS and the release of TFS 2015.2 RC1. These mean there is no longer an option to pull build artifacts from an external TFS server as part of a release, invalidating some of the options this post discusses. I have struck out the outdated sections. The rest of the post is still valid, especially the section on where to update configuration settings. The release of TFS 2015.2 RC1 actually makes many of the options easier, as you don’t have to bridge between on-premises TFS and VSTS when both build and release features are on the same server.


     

    Background

    Visual Studio Team Services (VSTS) provides a completely new version of Release Management, replacing the version shipped with TFS 2013/2015. This new system is based on the same cross platform agent model as the new vNext build system shipped with TFS 2015 (and also available on VSTS). At present this new Release Management system is only available on VSTS, but the features timeline suggests we should see it on-premises in the upcoming 2015.2 update.

    You might immediately think that, as this feature is only available in VSTS at present, you cannot use this new release management system with on-premises services, but this is not true. The Release Management team have provided an excellent blog post on running an agent connected to your VSTS instance inside your on-premises network to enable hybrid scenarios.

    This works well for deploying to domain connected targets, especially if you are using Azure Active Directory Sync to sync your corporate domain and AAD to provide a directory backed VSTS instance. In this case you can use a single corporate domain account to connect to VSTS and to the domain services you wish to deploy to from the on-premises agent.

    However, I make extensive use of TFS Lab Management to provide isolated dev/test environments (linked to an on-premises TFS 2015.1 instance). If I want to deploy to these VMs it adds complexity to how I need to manage authentication, as I don’t want to have to place a VSTS build agent in each transiently created dev/test lab: one, because it is complex, and two, because there is a cost to having more than one self-provisioned vNext build agent.

    It is fair to say that deploying to an on-premises Lab Management environment from a VSTS instance is an edge case, but the same basic process will be needed when the new Release Management features become available on-premises.

    Now, I would be the first to say that there is a good case to look at a move away from Lab Management to using Azure Dev Labs which are currently in preview, but Dev Labs needs fuller Azure Resource Manager support before we can replicate the network isolated Lab Management environments I need.

    The Example

    So at this time, I still need to be able to use the new Release Management with my current Lab Management network isolated labs, but this raises some issues of authentication and just what is running where. So let us work through an example; say I want to deploy a SQL DB via a DACPAC and a web site via MSDeploy on the infrastructure shown below.

     

    image

    Both the target SQL and Web servers live inside the Lab Management isolated network on the proj.local domain, but have DHCP assigned addresses on the corporate LAN in the form vslm-[guid].corp.com (managed by Lab Management), so I can access them from the build agent with appropriate credentials (a login for the proj.local domain within the network isolated lab).

    The first step is to install a VSTS build agent linked to my VSTS instance; once this is done we can start to create our release pipeline. The first stage is to get the artifacts we need to deploy, i.e. the output of builds. These could be XAML or vNext builds on the VSTS instance, builds from the on-premises TFS instance, or Jenkins builds. Remember a single release can deploy any number of artifacts, e.g. the output of several builds. It is this fact that makes this setup not as strange as it initially appears: we are just using VSTS Release Management to orchestrate a deployment to on-premises systems.

    The problem we have is that though our release now has artifacts, we need to run some commands on the VM running the vNext Build Agent to do the actual deployment. VSTS provides a number of deployment tasks to help in this area. Unfortunately, at the time of writing, the list of deployment tasks in VSTS is somewhat Azure-focused, so not that much use to me.

    clip_image004

    This will change over time as more tasks get released, you can see what is being developed on the VSO Agent Task GitHub Repo (and of course you could install versions from this repo if you wish).

    So for now I need to use my own scripts; as we are on a Windows-based system (not Linux or Mac) this means some PowerShell scripts.

    The next choice becomes ‘do I run the script on the Build Agent VM or remotely on the target VM’ (within the network isolated environment). The answer is the age-old consultant’s answer: ‘it depends’. In the case of both DACPAC and MSDeploy deployments, there is the option to do a remote deployment, i.e. run the deployment command on the Build Agent VM and have it remotely connect to the target VMs in the network isolated environment. The problem with this way of working is that I would need to open more ports on the SQL and Web VMs to allow the remote connections, which I did not want to do.

    The alternative is to use PowerShell remoting. In this model I trigger the script on the Build Agent VM, but it uses PowerShell remoting to run the command on the target VM. For this I only need to enable remote PowerShell on the target VMs; this is done by running the following command on each target VM and following the prompts to set up the required services and open the correct ports on the target VM’s firewall.

    winrm quickconfig

    This is something we are starting to do as standard to allow remote management via PowerShell on all our VMs.

    So at this point it all seems fairly straight forward, run a couple of remote PowerShell scripts and all is good, but no. There is a problem.

    A key feature of Release Management is that you can provide different configurations for different environments e.g. the DB connection string is different for the QA lab as opposed to production. These values are stored securely in Release Management and applied as needed.

    clip_image006

    The way these variables are presented is as environment variables on the Build Agent VM, hence they can be accessed from PowerShell in the form $env:__DOMAIN__. IT IS IMPORTANT TO REMEMBER that they are not presented on any target VMs in the isolated lab network environment, or to these VMs via PowerShell remoting.

    So if we are intending to use remote PowerShell execution for our deployments, we can’t just access these environment variables within the scripts being run remotely; we have to pass them in as PowerShell command line arguments.

    This works OK for the DACPAC deployment, as we only need to pass in a few fixed arguments. For example, the PowerShell script arguments for the package name, target DB name and target server, using the Release Management variables in their $(variable) form, become:

    -DBPackage $(DBPACKAGE) -TargetDBName $(TARGETDBNAME) -TargetServer $(TARGETSERVERNAME)

    However, for the MSDeploy deployment there is no simple fixed list of parameters. This is because, as well as parameters like package names, we need to modify the setparameters.xml file at deployment time to inject values for our web.config from the release management system.

    The solution I have adopted is not to try to pass this potentially long list of arguments into a script to be run remotely; the command line just becomes hard to edit without making errors, and needs updating each time we add an extra variable.

    The alternative is to update the setparameters.xml file on the Build Agent VM before we attempt to run the deployment remotely. To this end I have written a custom build task to handle the process, which can be found on my GitHub repo. This updates a named setparameters.xml file using token replacement based on the environment variables set by Release Management. If you would rather automatically find a number of setparameters.xml files using wildcards (because you are deploying many sites/services) and update them all with a single set of tokens, have a look at Colin Dembovsky’s build task which does just that.
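The token replacement these tasks perform amounts to a simple substitution over the file. Here is a hedged Python sketch of the idea; the real tasks are PowerShell-based and read the agent's live environment, and the variable names here are just examples:

```python
# Sketch of the token-replacement idea: substitute __TOKEN__ placeholders in
# setparameters.xml content with values from a lookup, as Release Management
# presents its variables to the agent. Names and values here are examples.
import re

def replace_tokens(text, env):
    # Swap each __NAME__ for env["NAME"] when defined; leave unknown tokens alone.
    return re.sub(r"__([A-Za-z0-9_]+?)__",
                  lambda m: env.get(m.group(1), m.group(0)), text)

content = '<setParameter name="Database Server" value="__sqlservername__" />'
env = {"sqlservername": "qa-sqlbox"}  # on a real agent this would be os.environ

print(replace_tokens(content, env))
```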

    So given this technique my release steps become:

    1. Get the artifacts from the builds to the Build Agent VM.

    2. Update the setparameters.xml file using environment variables on the Build Agent VM.

    3. Copy the downloaded (and modified) artifacts to all the target machines in the environment.

    4. On the SQL VM run the sqlpackage.exe command to deploy the DACPAC using remote PowerShell execution.

    5. On the Web VM run the MSDeploy command using remote PowerShell execution.

    clip_image008

    The PowerShell scripts I run in the final two tasks are just simple wrappers around the underlying commands. The key fact is that because they are scripts, remote execution is possible. The targeting of the execution is done by associating each task with a target machine group, and filtering either by name or, in my case, role to target specific VMs.

    clip_image010

    In my machine group I have defined both my SQL and Web VMs using their names on the corporate LAN, assigning a role to each to make targeting easier. Note that it is here, in the machine group definition, that I provide the credentials required to access the VMs in my network isolated environment, i.e. a proj.local set of credentials.

    clip_image012

    Once I get all these settings in place I am able to build a product on my VSTS build system (or my on-premises TFS instance) and, using this VSTS-connected but on-premises Build Agent, deploy my DB and web site to a Lab Management network isolated test environment.

    There is no reason why I cannot add more tasks to this release pipeline to perform more actions such as run tests (remember the network isolated environment already has TFS Test Agents installed, but they are pointing to the on-premises TFS instance) or to deploy to other environments.

    Summary

    As I said before, this is an edge case, but I hope it shows how flexible the new build and release systems can be for both TFS and VSTS.