BM-Bloggers

The blogs of Black Marble staff

Can’t add users to a VSTS instance backed by an Azure Directory

I have a VSTS instance that is backed by an Azure Directory. This is a great way to help secure a VSTS instance: only users in the Azure Directory can be added to VSTS, not just any old MSAs (LiveIDs). The directory can be shared with any other Azure-based services such as O365, and centrally managed and linked to an on-premises Active Directory.

When I tried to add a user to VSTS, one that was a valid user in the Azure Directory, their account did not appear in the available users drop-down.


It turns out the problem was who I was logged in as. I have three Richard accounts in the VSTS instance (and Azure Directory): a couple of MSAs and a guest work account from another Azure Directory. I was logged in as the guest work account.

All three IDs are administrators in VSTS, but it turned out I needed to be logged in as the MSA that owns the Azure subscription containing the Azure Directory. As soon as I used this account the drop-down populated as expected and I could add users from the Azure Directory.


Debug your Bot with Visual Studio debugger

I previously posted about using ngrok to debug your Node/C# bot, and mentioned that you can also use the Visual Studio debugger under certain circumstances.

These circumstances are:

  • You are using the .NET Bot Builder
  • You are hosting on Azure, which supports remote debugging
  • You have a DLL that contains debugging information – typically this means a Debug build

If all these are true then you can download the Azure SDK and attach your local Visual Studio environment to the remote process.

See how in this short video:

Creating Website Slots and SQL Elastic Pools using Azure Resource Templates

Recently I have been helping a number of organisations automate the deployment of their applications to Azure, and came across a couple of scenarios that were not documented: deploying an App Services web site with slots and SQL connection string settings, and the creation of a SQL Elastic Pool. Of those, the SQL Elastic Pool I found to be written up already by Vincent-Philippe Lauzon, and all credit to him – my template draws on his excellent article.

The Web slots and configuration, however, I didn't find. There are templates that deploy a web site, and some that deploy configuration settings into that web site (indeed, creating a new Web+SQL template through Visual Studio does just that). However, I could find none that deployed slots and none that added the config to the slot.

You can find the full template in my GitHub repo. The template code to deploy a slot and associated config is shown below. This sits in the nested resources block within the website resource, for reference.

The trick with the config, as it turns out, is the resource type. If you examine the connectionStrings node within a slot through Resource Explorer you will see it reported as Microsoft.Web/sites/config. However, if you click the PowerShell tab for the same node you will see the type reported as Microsoft.Web/sites/slots/config. Make sure that the resource name matches the config section (i.e. connectionStrings, or appSettings, etc.).

{
  "apiVersion": "2015-08-01",
  "name": "[concat(variables('website').websiteName, '/', variables('website').slotName)]",
  "type": "Microsoft.Web/Sites/slots",
  "location": "[resourceGroup().location]",
  "dependsOn": [
    "[concat('Microsoft.Web/Sites/', variables('website').websiteName)]"
  ],
  "tags": {
    "displayName": "Slot"
  },
  "properties": {
  },
  "resources": [
    {
      "apiVersion": "2015-08-01",
      "name": "[concat(variables('website').websiteName, '/', variables('website').slotName, '/connectionStrings')]",
      "type": "Microsoft.Web/Sites/slots/config",
      "location": "[resourceGroup().location]",
      "dependsOn": [
        "[concat('Microsoft.Web/Sites/', variables('website').websiteName, '/slots/', variables('website').slotName)]"
      ],
      "tags": {
        "displayName": "SlotConnectionStrings"
      },
      "properties": {
        "DefaultConnection": {
          "value": "[concat('Data Source=tcp:', reference(concat('Microsoft.Sql/servers/', variables('sqlServer').name)).fullyQualifiedDomainName, ',1433;Initial Catalog=', variables('sqlServer').stagingDbname, ';User Id=', parameters('sqlAdminLogin'), '@', variables('sqlServer').name, ';Password=', parameters('sqlAdminPassword'), ';')]",
          "type": "SQLServer"
        }
      }
    }
  ]
}

Use NuGet with Azure Functions

Azure Functions are "serverless" pieces of functionality. You can take your existing C# or JavaScript code and it becomes a single unit of maintenance, upgrade, scaling, etc.
One of the key differences is the way that code is authored out of the box – although you can use an IDE like Visual Studio, you can also use the browser as your IDE.
There are a few nuances that you need to be aware of, such as adding NuGet packages.
It's easy once you know how, though! See how you can do this in this short video.


The configuration I pasted into my project.json is here:

{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Newtonsoft.Json": "9.0.1"
      }
    }
  }
}
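
Once the restore has run, the function can use the library directly. As a minimal sketch (the Run signature is the standard C# HTTP-trigger shape for run.csx; the echo logic is just mine, for illustration):

using System.Net;
using Newtonsoft.Json;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    // Deserialize the request body with the NuGet-restored Newtonsoft.Json,
    // then echo it back formatted - proving the package reference works
    dynamic body = JsonConvert.DeserializeObject(await req.Content.ReadAsStringAsync());
    log.Info("Request received");
    return req.CreateResponse(HttpStatusCode.OK, JsonConvert.SerializeObject(body, Formatting.Indented));
}

Saving project.json triggers the package restore, which you can watch in the log window.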

Debug your Bot with ngrok

Once you have deployed your Bot to Azure, what do you do if you need to debug or diagnose any issues with the Bot code?

If you are using the .NET Bot Builder you can use the Visual Studio remote debugger and attach your local debugger in Visual Studio to the remote process. Azure supports this, but other hosting providers may not; and of course your Bot needs to be .NET and you need to have debugging symbols available.

What do you do if:

  • you are using Node?
  • or do not have debugging symbols?
  • or are hosting your Bot application with a provider that doesn’t support the VS Remote Debugger?

Well, you are in luck, because you can use ngrok to deal with all these constraints.

Ngrok provides a secure tunnel to your local machine via a publicly accessible endpoint.

In this short video, see how you can use it to debug your Bot application locally. ngrok will work with any language and any technology that uses common transport protocols (like HTTP).

So, to summarise:

  1. Download ngrok from https://ngrok.com/
  2. Extract the zip file to a folder of your choosing.
  3. Open a command prompt in the above folder, and run
    ngrok http 80
    (change 80 to the local port you want to expose)
  4. Change your client application to point to the ngrok.io endpoint
  5. Test and debug!
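
For a .NET bot, the local process that ngrok is fronting is just a Web API controller. As a rough sketch (this assumes the standard Bot Builder v3 template; the echo reply is mine, for illustration):

using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Http;
using Microsoft.Bot.Connector;

[BotAuthentication]
public class MessagesController : ApiController
{
    // POST /api/messages - this is the endpoint the ngrok tunnel forwards to
    public async Task<HttpResponseMessage> Post([FromBody] Activity activity)
    {
        if (activity.Type == ActivityTypes.Message)
        {
            // Echo the incoming text straight back to the user
            var connector = new ConnectorClient(new Uri(activity.ServiceUrl));
            await connector.Conversations.ReplyToActivityAsync(activity.CreateReply("Echo: " + activity.Text));
        }
        return Request.CreateResponse(HttpStatusCode.Accepted);
    }
}

Point the bot's registered messaging endpoint at https://<your-subdomain>.ngrok.io/api/messages and traffic from the channel will arrive in your local debugging session.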

Version 2.0.x of my Generate Release Notes VSTS Task has been released with release rollup support

I have just released a major update to my Generate Release Notes VSTS Build extension. This V2 update adds support for looking back through past releases to find when there was last a successful release to a given stage/environment, and creates a rollup set of build artifacts, and hence commits/changesets and work items, in the release notes.


This has been a long-running request on GitHub for this extension, which I am pleased to have been able to address.

To aid backwards compatibility, the default behaviour of the build/release task is as it was before: it can be used in a build or in a release, and if in a release it only considers the artifacts in the current release that ran the task.

If you want to use the new features you need to enable them; these are all in the advanced properties.


You get new properties to enable scanning past releases until the task finds a successful deployment to, by default, the same stage/environment that is currently being released to. You can override this stage name to allow more complex usage, e.g. generating the release notes for what has changed since the last release to production whilst in a UAT environment.

This change also means there is a new variable that can be accessed in templates, $Releases, which contains all the releases being used to get build artifacts. This can be used in release notes to show the releases being used, e.g.


**Release notes for release $defname**
**Release Number**  : $($release.name)   
**Release completed** $("{0:dd/MM/yy HH:mm:ss}" -f [datetime]$release.modifiedOn) **Changes since last successful release to '$stagename'**  
**Including releases:**  
$(($releases | select-object -ExpandProperty name) -join ", " )  

This generates content like:

Release notes for release Validate-ReleaseNotesTask.Master
Release Number : Release-69 
Release completed 05/01/17 12:40:19
Changes since last successful release to 'Environment 2' 
Including releases: 
Release-69, Release-68, Release-67, Release-66 


I hope you find this extension useful.

A nice relaxing Christmas break (and by the way I migrated our on-premises TFS to VSTS as well)

Over the Christmas break I migrated our on-premises TFS 2015 instance to VSTS. The reasons for the migration were threefold:

  • We were blocked on moving to TFS 2017 as we could not easily upgrade our SQL cluster to SQL 2014
  • We wanted to be on the latest, greatest and newest features of VSTS/TFS
  • We wanted to get away from having to perform on-premises updates every few months

To do the migration we used the public preview of the TFS to VSTS Migrator.

So what did we learn?

The actual import was fairly quick, around 3 hours for just short of 200GB of TPC data. However, getting the data from our on-premises system up to Azure was much slower, constrained by the need to copy backups around our LAN and by our Internet bandwidth for getting the files to Azure storage – a grand total of more like 16 hours. But remember, this was mostly time spent watching various progress bars after running various commands, so I was free to enjoy the Christmas break; I was not a slave to a PC.

This all makes it sound easy, and to be honest the actual production migration was, but only because of the hard work done prior to the Christmas break during the dry run phase. During the dry run we:

  • Addressed the TFS customisations that needed to be altered/removed
  • Sorted the AD > AAD sync mappings for user accounts
  • Worked out the backup/restore/copy process to get the TPC data to somewhere VSTS could import it from
  • Did the actual dry run migration
  • Tested the dry run instance after the migration to get a list of what else needed addressing, and anything our staff would have to do to access the new VSTS instance
  • Documented (and scripted where possible) all the steps
  • Made sure we had fall back processes in place if the migration failed.

And, arguably most importantly, we discovered how long each step would take so we could set expectations. This was the prime reason for picking the Christmas break: we knew we could have a number of days when there should be no TFS activity (we close for an extended period), hence de-risking the process to a great degree. We knew we could get the migration done over a weekend, but a week's break was easier and more relaxed, so Christmas seemed a timely choice.

You might ask the question ‘what did not migrate?’

Well a better question might be ’what needed changing due to the migration?’

It was not so much that items did not migrate, just that they are handled a bit differently in VSTS. The areas we needed to address were:

  • User licensing – we needed to make sure our users’ MSDN subscriptions were mapped to their work IDs.
  • Build/release licensing – we needed to decide how many private build agents we really needed (not just spin up more on a whim as we had done with our on-premises TFS), as they cost money on VSTS.
  • Release pipelines – these don’t migrate as of the time of writing, but I wrote a quick tool to get 95% of their content moved. After using this tool we then also needed to edit the pipelines, re-entering ‘secrets’ (which are not exported), before retesting them.

Those were all the issues we had to address; everything else seemed to be fine, with users just changing the URL they connect to from on-premises to VSTS.

So if you think migrating your TFS to VSTS seems like a good idea, why not have a look at the blog post and video on the Microsoft ALM Blog about the migration tool. Remember that this is a Microsoft Gold DevOps Partner led process, so please get in touch with us at Black Marble, or with me directly via this blog, if you want a chat about migrations or the other DevOps services we offer.

My TFSAlertsDSL project has moved to GitHub and become VSTSServiceHookDsl

Introduction

A while ago I created the TFSAlertsDSL project to provide a means to script responses to TFS Alert SOAP messages using Python. The SOAP Alert technology has since been overtaken by the move to Service Hooks.

So I have taken the time to move this project over to the newer technology, which is supported on both TFS 2015 (onwards) and VSTS. I also took the chance to move from CodePlex to GitHub, and renamed the project to VSTSServiceHookDsl.

Note: If you need the older SOAP alert based model stick with the project on CodePlex, I don’t intend to update it, but all the source is there if you need it.

What I learnt in the migration

Supporting WCF and Service Hooks

I had intended to keep support for both SOAP Alerts and Service Hooks in the new project, but I quickly realised there was little point. You cannot even register SOAP-based alerts via the UI anymore, and keeping them added a lot of complexity. So I decided to remove all the WCF SOAP handling.

C# or REST TFS API

The SOAP Alert version used the older TFS C# API, hence you had to distribute these DLLs with the web site. Whilst refactoring I decided to swap all the TFS calls over to the new REST API. This provided a couple of advantages (see the sketch after this list):

  • I did not need to distribute the TFS DLLs
  • Many of the newer functions of VSTS/TFS are only available via the REST API
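
As a rough illustration of the pattern, here is a minimal sketch of one of the REST calls (the helper name, account URL parameter and PAT handling are mine, not necessarily how the project structures it):

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json.Linq;

static class WorkItemRest
{
    // Fetch a work item as a raw JObject, so customised work item types
    // need no pre-defined DTO shape
    public static async Task<JObject> GetWorkItemAsync(string accountUrl, int id, string pat)
    {
        using (var client = new HttpClient())
        {
            // A Personal Access Token is passed as basic auth with an empty username
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
                "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes(":" + pat)));
            var json = await client.GetStringAsync($"{accountUrl}/_apis/wit/workitems/{id}?api-version=1.0");
            return JObject.Parse(json);
        }
    }
}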

Exposing JObjects to Python

I revised the way that TFS data is handled in the Python scripts. In the past I hand-crafted data transfer objects for consumption within the Python scripts. The problem with this way of working is that it cannot handle custom objects; customised work items are a particular issue, as you don’t know their shape.

I found the best solution was to just return the Newtonsoft JObjects that I got from the C#-based REST calls. These are easily consumed in Python in the general form

    workitem["fields"]["System.State"] 
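
On the C# side, the hand-off is roughly the following (a simplified sketch assuming IronPython hosting; the variable and script file names are illustrative):

using IronPython.Hosting;
using Newtonsoft.Json.Linq;

static class ScriptRunner
{
    public static void Run(JObject workitem)
    {
        // Hand the raw JObject straight into the Python scope; scripts can
        // then index into any field without a fixed DTO shape
        var engine = Python.CreateEngine();
        var scope = engine.CreateScope();
        scope.SetVariable("workitem", workitem);
        engine.ExecuteFile("workitem.updated.py", scope);
    }
}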


The downside is that this change does mean that any scripts you had created for the old SOAP Alert version will need a bit of work when you transfer to the new Service Hook version.

Create a release pipeline

As with all good projects, I created a release pipeline for my internal test deployment. My process was as follows:

  • A VSTS build that takes the code from GitHub and
    • Compiles the code
    • Runs all the unit tests
    • Packages it as an MSDeploy package
  • Followed by a VSTS release that
    • Sets the web.config entries
    • Deploys the MSDeploy package to Azure
    • Then uses FTP to upload the DSL DLL to Azure, as it is not part of the package


Future Steps

Add support for more triggers

At the moment the Service Hook project supports the same trigger events as the old SOAP project, with the addition of support for Git Push triggers.

I need to add in handlers for the other trigger types supported in VSTS/TFS, specifically the release-related ones. I suspect these might be useful.

Create an ARM template

At the moment the deployment relies on the user creating the web site. It would be good to add an Azure Resource Manager (ARM) template to allow this site to be created automatically as part of the release process.

Summary

So we have a nice new Python and Service Hook based framework to help manage your responses to Service Hook triggers for TFS and VSTS.

If you think it might be useful to you, why not have a look at https://github.com/rfennell/VSTSServiceHookDsl.

I’d be interested to hear your feedback.

Transform tool for transferring TFS 2015.3 Release Templates to VSTS

If you are moving from on-premises TFS to VSTS you might hit the same problem I have just had. The structure of a VSTS release is changing: there is now the concept of multiple ‘Deployment Steps’ in an environment. This means you can use a number of different agents for a single environment – a good thing.

The downside is that if you export a TFS 2015.3 release process and try to import it to VSTS, it will fail, saying the JSON format is incorrect.

Of course you can get around this with some copy typing, but I am lazy, so….

I have written a quick transform tool that converts the basic structure of the JSON to the new format. You can see the code as a GitHub Gist.

It is a command-line tool; usage is as follows:

  1. In VSTS create a new empty release, and save it.
  2. Use the drop-down menu on the newly saved release in the release explorer and export the file. This is the template for the new format, e.g. template.json.
  3. On your old TFS system export the release process in the same way to get your source file, e.g. source.json.
  4. Run the command-line tool, providing the names of the template, source and output files

      RMTransform template.json source.json output.json

  5. On VSTS import the newly created JSON release file.
  6. A release process should be created, but it won’t be possible to save it until you have fixed a few things that are not transferred:
    1. Associate each Deployment step with an Agent Pool
    2. Set the user accounts who will do the pre- and post-approvals
    3. Any secret variables will need to be re-entered

  IMPORTANT – Make sure you save the imported process as soon as you can (i.e. straight after fixing anything that is stopping it being saved). If you don't save and start clicking into artifacts or global variables, it seems to lose everything and you need to re-import.


It is not perfect – you might find other issues that need fixing – but it saves a load of copy typing.

Deleting unwanted orphan XAML Build Controllers on a migrated VSTS instance

Whilst working with the VSTS Data Import Service I ended up migrating a TFS TPC up to VSTS that had an old XAML build controller defined. I did not need this XAML build controller; in fact I needed to remove it, because it was using my free private build controller slot. The problem was that I could not find a way to remove it via the VSTS (or Visual Studio Team Explorer) UI, and the VM that had been running the build controller was long gone.

The way I got rid of it in the end was via the TFS C# API and a quick command-line tool, as sketched below.

Note that you will need to delete any queued builds on the controller before you can delete it. You can do this via the VSTS browser interface.
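
The tool boils down to something like the following sketch (using the classic Microsoft.TeamFoundation.Build.Client XAML API; the argument handling and collection URL are illustrative):

using System;
using Microsoft.TeamFoundation.Build.Client;
using Microsoft.TeamFoundation.Client;

class Program
{
    static void Main(string[] args)
    {
        // args[0] = collection URL, e.g. https://yourinstance.visualstudio.com/DefaultCollection
        // args[1] = name of the orphaned XAML build controller to delete
        var tpc = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri(args[0]));
        var buildServer = tpc.GetService<IBuildServer>();

        foreach (var controller in buildServer.QueryBuildControllers())
        {
            Console.WriteLine("Found controller '{0}'", controller.Name);
            if (controller.Name == args[1])
            {
                // Deleting the service host removes the controller (and any agents) it hosts
                buildServer.DeleteBuildServiceHost(controller.ServiceHost.Uri);
                Console.WriteLine("Deleted");
            }
        }
    }
}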