Announcing a new VSTS Extension for Starting and Stopping Azure DevTest Labs VMs

Background

I have recently been posting on using Azure to host private VSTS build/release agents to avoid agent queue deadlocking issues with more complex release pipelines.

One of the areas discussed is reducing the cost of running a private agent in Azure by only running it within a limited time range, when you guess it might be needed. I have done this using the DevTest Labs Auto Start and Auto Stop features. This works, but surely it is better to start the agent VM only when it is actually needed, not when you guess it might be? I need this private agent only when working on my VSTS extensions, which is not something I do every day. Why waste CPU cycles that are never used?

New VSTS Extension

I had expected there would already be a VSTS extension to start and stop DevTest Labs VMs, but the Microsoft-provided extension for DevTest Labs only provides tasks for the creation and deletion of VMs within a lab.

So I am pleased to announce the release of my new DevTest Labs VSTS Extension to fill this gap, adding tasks to start and stop a DevTest Lab VM on demand from within a build or a release.

My Usage

I have been able to use the tasks in this extension to start my private Azure hosted agent only when I need it for functional tests within a release.

However, they could equally be used for a variety of testing scenarios where any pre-built/configured VM needs to be started or stopped, as opposed to the slower process of creating and deploying a new DevTest Labs VM.

In my case I added an extra agent phase to my release pipeline to start the VM prior to it being needed.


I could also have used another agent phase to stop the VM once the tests were completed. However, I made the call to leave the VM running and let DevTest Labs’ Auto Stop shut it down at the end of the day. The reason for this is that VM start-up and shutdown are still fairly slow, a minute or two, and I often find I need to run a set of functional tests a few times during my development cycle, so it is a bit more efficient to leave the VM running until the end of the day, only taking the start-up cost once.

You may of course have different needs, hence my providing both the Start and Stop tasks.

Development

This new extension aims to act as a supplement to the Microsoft-provided Azure DevTest Labs extension. Hence, to make development and adoption easier, it uses exactly the same source code structure and task parameters as the Microsoft-provided extension. The task parameters are:

  • Azure RM Subscription – Azure Resource Manager subscription to configure before running.
  • Source Lab VM ID – Resource ID of the source lab VM. The source lab VM must be in the selected lab, as the custom image will be created using its VHD file. You can use any variable such as $(labVMId), the output of calling Create Azure DevTest Labs VM, that contains a value in the form /subscriptions/{subId}/resourceGroups/{rgName}/providers/Microsoft.DevTestLab/labs/{labName}/virtualMachines/{vmName}.
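
If the VM you want to start or stop already exists in the lab, rather than being created earlier in the pipeline, you need to find its resource ID yourself. A minimal one-off lookup sketch using the AzureRM cmdlets (the lab and VM names are placeholders):

# One-off lookup of a DevTest Labs VM resource ID to paste into the 'Source Lab VM ID' parameter
# (requires an authenticated AzureRM session, e.g. via Login-AzureRmAccount)
$vm = Get-AzureRmResource -ResourceType 'Microsoft.DevTestLab/labs/virtualMachines' |
    Where-Object { $_.ResourceId -like '*/labs/MyLab/virtualMachines/MyAgentVM' }
$vm.ResourceId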

The issue I had was that the DevTest Labs PowerShell API did not provide a command to start or stop a VM in a lab. I needed to load the Azure PowerShell library so I could use the Invoke-AzureRmResourceAction command. This requires you to first call Login-AzureRmAccount to authenticate before making the actual Invoke-AzureRmResourceAction call, which in turn required a bit of extra code to get and reuse the AzureRM endpoint to find the authentication details.

# Get the parameters
$ConnectedServiceName = Get-VstsInput -Name "ConnectedServiceName"
# Get the end point from the name passed as a parameter
$Endpoint = Get-VstsEndpoint -Name $ConnectedServiceName -Require
# Get the authentication details
$clientID = $Endpoint.Auth.parameters.serviceprincipalid
$key = $Endpoint.Auth.parameters.serviceprincipalkey
$tenantId = $Endpoint.Auth.parameters.tenantid
$SecurePassword = $key | ConvertTo-SecureString -AsPlainText -Force
$cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $clientID, $SecurePassword
# Authenticate
Login-AzureRmAccount -Credential $cred -TenantId $tenantId -ServicePrincipal
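
With the authenticated context in place, the task can then invoke the start or stop action against the lab VM resource. A minimal sketch of that call is below; the LabVMId input name and the -Force switch are my illustrative choices rather than the extension’s exact code, but start and stop are the actions DevTest Labs VMs expose.

# Get the lab VM resource ID passed into the task (input name is illustrative)
$labVMId = Get-VstsInput -Name "LabVMId"
# Invoke the 'start' action on the DevTest Labs VM resource; the Stop task uses 'stop' instead
Invoke-AzureRmResourceAction -ResourceId $labVMId -Action "start" -Force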

It is important to note that for this code to work you have to set the task’s task.json to run PowerShell3 and package the PowerShell VSTS API module in with the task.

"execution": {
  "PowerShell3": {
     "target": "$(currentDirectory)\\StartVM.ps1",
     "argumentFormat": "",
     "workingDirectory": "$(currentDirectory)"
    }
  }

If the folder structure is correct, changing to PowerShell3 will automatically load the required module from the task’s ps_modules folder.

In Summary

I have certainly found this extension useful, and I have learnt more than I had expected I would about VSTS endpoints and Azure authentication.

Hope it is useful to you too.

Creating a VSTS build agent on an Azure DevLabs Windows Server VM with no GUI – Using Artifacts

In my last post I discussed creating a private VSTS build agent within an Azure DevTest Lab on a VM with no GUI. It was pointed out to me today by Rik Hepworth that I had overlooked an obvious alternative way to get the VSTS agent onto the VM, i.e. one that does not require a series of commands at an RDP-connected command prompt.

The alternative I missed is to use a DevTest Lab Artifact; in fact there is such an artifact available within the standard set in DevTest Labs. You just provide a few parameters and you are good to go.


Well you should be good to go, but there is an issue.

The PowerShell used to extract the downloaded Build Agent ZIP file does not work on a non-UI based Windows VM. The basic issue here is discussed in this post by my fellow ALM MVP Ricci Gian Maria. Luckily the fix is simple; I just used the same code to do the extraction of the ZIP file that I used in my previous post.
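
For reference, the working extraction uses the .NET ZipFile class rather than the shell-based approach that needs UI components; this is the same code as in my previous post (the ZIP path here is illustrative):

# Extract the agent ZIP using the .NET compression classes, which work on a GUI-less server
Add-Type -AssemblyName System.IO.Compression.FileSystem
[System.IO.Compression.ZipFile]::ExtractToDirectory("$HOME\Downloads\vsts-agent-win7-x64-2.124.0.zip", "$PWD")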

I have submitted this fix as a Pull Request to the DevTest Lab Team so hopefully the standard repository will have the fix soon and you won’t need to do a fork to create a private artifacts repo as I have.

Update 1st December 2017: The Pull Request to the DevTest Lab Team with the fixed code has been accepted and the fix is now in the master branch of the public artifact repo, so it is automatically available to all.

Creating a VSTS build agent on an Azure DevLabs Windows Server VM with no GUI

As I posted recently, I have been trying to add more functional tests to the VSTS-based release CI/CD pipeline for my VSTS extensions, and as I noted, depending on how you want to run your tests, e.g. triggering sub-builds, you can end up with scheduling deadlocks where a single build agent is both scheduling the release and trying to run a new build. The answer is to use a second build agent in a different agent pool, e.g. if the release is running on the hosted build agent, use a private build agent for the sub-build; or, of course, just pay for more hosted build instances.

The problem with a private build agent is where to run it. As my extensions are a personal project I don’t have a corporate Hyper-V server to run any extra private agents on, as I would have for company projects. My MVP MSDN Azure benefits are the obvious answer, but I want any agents to be cheap to run, so I don’t burn through all my MSDN credits for a single build agent.

To this end I created a Windows Server 2016 VM in DevLabs (I prefer to create my VMs in DevLabs as it makes tidying up my Azure account easier) using an A0-sized VM. This is tiny, so cheap; I don’t intend ever to do a build on this agent, just schedule releases, so I need to install few if any tools and the size should not be an issue. To further reduce costs I used the auto start and stop features on the VM so it is only running during the hours I might be working. So I get an admittedly slow and limited private build agent, but for less than $10 a month.

As the VM is small it makes sense not to run a GUI. This means when you RDP to the new VM you just get a command prompt. So how do you get the agent onto the VM and set up? You can’t just open a browser to VSTS or cut and paste a file via RDP, and I wanted to avoid the complexity of having to open up PowerShell remoting on the VM.

The process I used was as follows:

  1. In VSTS I created a new Agent Pool for my Azure hosted build agents
  2. In the Azure portal, in DevLabs, I created a new Windows Server 2016 (1709) VM
  3. I then RDP’d to my new Azure VM; in the open Command Prompt I ran PowerShell
    powershell
  4. As I was in my user’s home directory, I cd’d into the downloads folder
    cd downloads
  5. I then ran the following PowerShell command to download the agent (you can get the current URI for the agent from your VSTS Agent Pool ‘Download Agent’ feature, but an old version will do as it will auto-update)
    invoke-webrequest -UseBasicParsing -uri https://github.com/Microsoft/vsts-agent/releases/download/v2.124.0/vsts-agent-win7-x64-2.124.0.zip -OutFile vsts-agent-win7-x64-2.124.0.zip
  6. You can then follow the standard agent setup instructions from the VSTS Agent Pool ‘Download Agent’ feature
    mkdir \agent ; cd \agent
    Add-Type -AssemblyName System.IO.Compression.FileSystem ; [System.IO.Compression.ZipFile]::ExtractToDirectory("$HOME\Downloads\vsts-agent-win7-x64-2.124.0.zip", "$PWD")
  7. I then configured the agent to run as a service. I exited back to the command prompt to do this, so the commands were (see the sketch after this list for an unattended alternative)
    exit
    config.cmd
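
As an aside, the agent configuration can also be scripted rather than answered interactively. A minimal sketch of an unattended configuration run from PowerShell, assuming a PAT token and the agent pool created earlier (the account URL, token and names are placeholders):

# Unattended agent configuration; replace the URL, PAT and names with your own values
.\config.cmd --unattended --url https://youraccount.visualstudio.com --auth pat --token YOUR_PAT_HERE --pool "Azure Private Agents" --agent AzureA0Agent --runAsService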

I now had another build agent pool to use in my CI/CD pipelines at a reasonable cost, and the performance was not too bad either.


Moving BM-Bloggers from BlogEngine.NET to WordPress

BlogEngine.NET has served us well as a blogging platform for a good few years. However, it is no longer under active support, so it is time to move on; there is too much risk of future security issues to ignore the lack of support. After a bit of thought we decided on WordPress as a replacement. OK, it has had its own history of problems, but it has an active community, is well supported, and is available in the Azure Marketplace.

The process to move content from BlogEngine to WordPress requires a few steps, and the available community documentation is a bit out of date, mostly due to changes in multi-blog support in BlogEngine.NET. So these are the steps I followed.

Steps

Setting up a WordPress Network

The first step is to create a standard WordPress App Service site on Azure. I used the option to create a MySQL DB in the App Service instance.

Once this was created I needed to make some WordPress setting changes to enable multi blog (network) usage.

  • First run the standard WordPress setup to create a single site
  • Apply any upgrades available
  • Next I needed to update the settings; this involves editing text files on the instance, so I used FTP (FileZilla) and a text editor to edit the required files
  • First I needed to update the PHP execution timeout to give the content import enough time to run for our larger blogs, which means a custom config file. (Actually I am not sure this is 100% required, as the import retry discussed below would probably have been enough)
    • In the Azure portal set the AppSetting PHP_INI_SCAN_DIR to D:\home\site\wwwroot
    • In the root of the site create a text file phpconfig.ini (a minimal sketch of creating this file follows the web.config below)
    • In the file set max_execution_time = 600, then save and upload the file
    • As the linked post notes, you can check these edits have worked by checking with the phpinfo() function on a PHP page on the site.
  • Next create the WordPress network: edit wp-config.php and add define( 'WP_ALLOW_MULTISITE', true );
  • Once the site reloads you can now create the network. As the wizard completes it gives instructions to edit wp-config.php and web.config; however, I found that the web.config settings the documentation/wizard gives are missing one element, so I needed to use the following file
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="WordPress Rule 1 Identical" stopProcessing="true">
          <match url="^index\.php$" ignoreCase="false" />
          <action type="None" />
        </rule>
        <rule name="WordPress Rule 3 Identical" stopProcessing="true">
          <match url="^([_0-9a-zA-Z-]+/)?wp-admin$" ignoreCase="false" />
          <action type="Redirect" url="{R:1}wp-admin/" redirectType="Permanent" />
        </rule>
        <rule name="WordPress Rule 4 Identical" stopProcessing="true">
          <match url="^" ignoreCase="false" />
          <conditions logicalGrouping="MatchAny">
            <add input="{REQUEST_FILENAME}" matchType="IsFile" ignoreCase="false" />
            <add input="{REQUEST_FILENAME}" matchType="IsDirectory" ignoreCase="false" />
          </conditions>
          <action type="None" />
        </rule>
        <rule name="WordPress Rule 5 R2" stopProcessing="true">
          <match url="^([_0-9a-zA-Z-]+/)?(wp-(content|admin|includes).*)" ignoreCase="false" />
          <action type="Rewrite" url="{R:2}" />
        </rule>
        <rule name="WordPress Rule 6 Shorter" stopProcessing="true">
          <match url="^([_0-9a-zA-Z-]+/)?(.*\.php)$" ignoreCase="false" />
          <action type="Rewrite" url="{R:2}" />
        </rule>
        <rule name="WordPress Rule 7 Identical" stopProcessing="true">
          <match url="." ignoreCase="false" />
          <action type="Rewrite" url="index.php" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
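
For the PHP timeout step above, the custom settings file is tiny; a minimal sketch of creating phpconfig.ini locally with PowerShell before FTPing it to the site root:

# Create the custom PHP settings file referenced in the earlier step, then upload it to the site root via FTP
Set-Content -Path .\phpconfig.ini -Value 'max_execution_time = 600'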

Once this is all done you should have a WordPress network and can start to import the old BlogEngine content into sub sites.

Create Sites

The next step is to log in as a WordPress network admin (the original account you created in the wizard) and create a sub site.

When you do this a numeric folder will be created in the form /site/wwwroot/wp-content/uploads/sites/123. Make a note of this number, as you need it for fixing the import content in the next step; you might need to create a test post with an image to force this folder creation.

Import the BlogEngine Contents

Next we needed to import the content from our BlogEngine.NET instance. This is done using the basic process documented in this blog post, but it needs a few updates due to the post’s age and the fact we had a multi-blog setup.

The only export option in BlogEngine.NET is BlogML, and if you are running in multisite mode this appears to be broken. The fix is to edit /admin/themes/standard/sidebar.cshtml around line 90 to remove the if-test logic blocking the display of the export options in multisite mode. Once this is done you can log in to a sub blog and export its contents as a BlogML XML file.

Note: This is not without its problems. When you are logged into the sub site as an admin and select the Settings > Advanced > Export option, you get an error as it tries to load the page http://yoursite/blogml.axd; this is due to the simple hack used to enable the export features. You need to manually edit this URL to http://yoursite/[blogname]/blogml.axd and the export then works OK.

You now move the media files associated with the blog posts. The only difference from moving a single blog setup is that you need to place them under the /site/wwwroot/wp-content/uploads/sites/123 folder previously created. I suggest creating a folder for all the historic post media, e.g. /site/wwwroot/wp-content/uploads/sites/123/historic, and FTPing up all your old images from blogengine/App_Data/blogs/[name]/files.

I next hit the major issue, which is that the BlogML plugin (which you need to install as the WordPress network administrator) is 7 years old and won’t activate on current versions of WordPress. The issue is changes in the PHP language. The fix is to use the edit option for the plugin and replace all the references to break $parseBlock with break 1 in the file xpath.class.php. Once this is done the plugin activates at the network level, so it can be used in each sub site.

But before we try the import we need to edit the exported BlogML file as the blog post says. However, we can improve on the documented process. The blog post says the tags and categories are lost in the import process; this is true for tags, but it is possible to fix the categories. To do this, and fix the image paths, I have written some PowerShell to do the required updates. It is ugly, but it works, opening the file as text and as XML separately.

param
(
    $filein = "C:\Users\fez\Downloads\BlogML (5).xml",
    $outfile = "C:\Users\fez\Downloads\BlogML rfennell.xml",
    $blogname = "rfennell",
    $blogid = "2"
)

# Fix the image paths
[string]$document = Get-Content -Path $filein

Write-Output "Replacing image paths for blog $blogname with BlogId $blogid"

$oldstring = "http://blogs.blackmarble.co.uk/blogs/$blogname/image.axd?picture="
$document = $document -replace [regex]::Escape($oldstring), "/wp-content/uploads/sites/$blogid/historic/"

# The export seems to have some image URLs with a missing slash in the path, so take the chance to fix them too
$oldstring = "http://blogs.blackmarble.co.uk/blogs/$($blogname)image.axd?picture="
$document = $document -replace [regex]::Escape($oldstring), "/wp-content/uploads/sites/$blogid/historic/"

Set-Content -Value $document -Path $outfile

[xml]$XmlDocument = Get-Content -Path $outfile

# Fix the categories: the posts reference categories by GUID, so replace each GUID with the
# category title throughout the document (this fixes both the categories block and the
# category references on each post)
foreach ($item in $XmlDocument.blog.categories.category) {
    Write-Output "Setting $($item.id) to $($item.title.'#cdata-section')"
    [xml]$XmlDocument = $XmlDocument.OuterXml.Replace($item.id, $item.title.'#cdata-section')
}

$XmlDocument.Save($outfile)
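
To run it, save the script (the file name here is just an example) and pass the parameters for the blog being migrated:

# Example invocation; adjust the paths, blog name and site ID for the blog being migrated
.\Fix-BlogMLExport.ps1 -filein 'C:\Users\fez\Downloads\BlogML (5).xml' -outfile 'C:\Users\fez\Downloads\BlogML rfennell.xml' -blogname 'rfennell' -blogid '2'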

Make sure the parameters are correct for your blog export and target site, then process your BlogEngine export.

You can now log in to your newly created WordPress sub site and use the Tools > Import option to run the BlogML wizard. You are prompted to remap the authors of the posts as needed; I unified them to the site owner where needed, then started the import. Now, we did edit the PHP timeout, but I found that for my largest 7MB export file I still got Error 500 timeouts. The good news is that you can just rerun the import (maybe a few times) and it is clever enough to pick up where it left off and will eventually finish, with no duplications. There may be a different timeout you need to set, but I did not find it.

You should now have imported post content into your sub site. Unfortunately, you will have to handle static pages manually.

Finishing Up

You are now in realm of WordPress, so you can add users, plug-ins and themes as needed to style your set of blogs.

One plug-in I found very useful was WDS Multisite Aggregator, which allowed the root site to be an aggregation of all the sub sites, just the same as I had on BlogEngine.NET multisite.

Also, as I was running on Azure, I needed some special handling for email, using SMTP via the WP Mail plug-in. Once this change was done I could configure the network root site’s email to allow user password resets. For comments, each individual sub site’s email needs configuring.

I had concerns over links in old posts (as the URL structure had changed), but WordPress seems to sort most of this out; the remainder were sorted with redirection rules in the web.config.

Conclusion

This whole process took some experimentation, but once that was done the rest was a ‘handle turning’ process. Let’s hope WordPress works as well for us as BlogEngine.NET did in the past.