BM-Bloggers

The blogs of Black Marble staff

Steps Required to Configure WSUS to Distribute Windows 10 Version 1511 Upgrade

Microsoft recently made a hotfix available that patches WSUS on Windows Server 2012 and 2012 R2 to allow distribution of the Windows 10 version 1511 upgrade. Installing the update is not, however, the only step that is required…

  1. Install the hotfix. This can be downloaded from https://support.microsoft.com/en-us/kb/3095113. Ensure that you pick the appropriate hotfix for the version of Windows Server on which you’re running WSUS. Note that if you’re running Windows Server 2012 R2, there’s also a pre-requisite install.
  2. Once the hotfix is installed and you’ve restarted your WSUS server, open the ‘Products and Classifications’ option and, under the Classifications tab, ensure that the checkbox for Upgrades is selected. This is not selected automatically for you:
    [Screenshot: the Upgrades classification checkbox]
    Note that the upgrade files may take quite some time to download to your WSUS server at the next synchronisation.
  3. Add a MIME type of ‘application/octet-stream’ for the ‘.esd’ extension in IIS on the WSUS server (a scripted alternative is shown after this list). To do this:
    Open IIS Manager
    Select the server name
    From the ‘IIS’ area in the centre of IIS Manager, open ‘MIME Types’
    Click ‘Add…’
    Enter the information above:
    [Screenshot: adding the .esd MIME type]
    Click OK to close the dialog.
    Note: Without this step, clients will fail to download the upgrade with the following error:
    Installation Failure: Windows failed to install the following update with error 0x8024200D: Upgrade to Windows 10 [SKU], version 1511, 10586.
  4. Approve the Upgrade for the classes of computer in your organisation that you want to be upgraded.
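
As a scripted alternative to step 3, the MIME type can be added with the WebAdministration PowerShell module; a minimal sketch, to be run in an elevated session on the WSUS server:

 Import-Module WebAdministration
 # Add the .esd MIME type at the server level so clients can download the upgrade files
 Add-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' -Filter 'system.webServer/staticContent' -Name '.' -Value @{fileExtension='.esd'; mimeType='application/octet-stream'}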

Once all of the above steps are in place, computers that are targeted for the upgrade should be upgraded automatically at the next update cycle.

New books on VSTS/TFS ALM DevOps

It has been a while since I have mentioned any new books on TFS/VSTS, and just like buses, a couple come along together.

These two, one from Tarun Arora and the other from Mathias Olausson and Jakob Ehn, are both nicely on trend for the big area of interest for many of the companies I am working with at present: best-practice ‘cook book’ style guidance on how best to use the tools in an ALM process.

If you are working with TFS/VSTS, they are worth a look.

Exchange 2013 Cert Change - Unable to Support the STARTTLS SMTP Verb

I saw an issue recently on an Exchange server after the certificate used to secure SMTP and IIS services was changed because the old certificate was about to expire.

The original self-generated default certificate had been replaced with one from a certificate authority, which had been used for several years without issue. The Exchange server is configured in hybrid mode, with incoming e-mail routed through Microsoft’s Exchange Online Protection (EOP), and TLS is configured as required.

The actions taken were:

  1. Add the new certificate to the certificate store on the Exchange server. This made the new certificate available within the Exchange Admin Center on the server.
  2. Modify the services assigned to the new certificate to bind SMTP and IIS to the new certificate.
  3. Remove the original certificate from the server.
  4. Restart the Microsoft Exchange Transport Service on the server.

At this point, an error was thrown in the Application Event Log on the server and incoming mail from Exchange Online Protection stopped flowing. The error thrown was:

Log Name:      Application
Source:        MSExchangeFrontEndTransport
Date:          04/02/2016 12:17:20
Event ID:      12014
Task Category: TransportService
Level:         Error
Keywords:      Classic
User:          N/A
Computer:      <Exchange Server FQDN>
Description:
Microsoft Exchange could not find a certificate that contains the domain name <I><Cert Issuer Details><S><Cert Subject Details> in the personal store on the local computer. Therefore, it is unable to support the STARTTLS SMTP verb for the connector <Receive Connector Name> with a FQDN parameter of <I><Cert Issuer Details><S><Cert Subject Details>. If the connector's FQDN is not specified, the computer's FQDN is used. Verify the connector configuration and the installed certificates to make sure that there is a certificate with a domain name for that FQDN. If this certificate exists, run Enable-ExchangeCertificate -Services SMTP to make sure that the Microsoft Exchange Transport service has access to the certificate key.

I ran the suggested command to ensure that the Exchange Transport Service had access to the certificate key, but this didn’t help.

Restoring the soon-to-expire certificate to the certificate store on the server and restarting the Microsoft Exchange Transport Service fixed the error; however, the certificate in question was about to expire, and the use of expired certificates for TLS to EOP is no longer allowed, so this didn’t really help much.

While digging into the configuration for the receive connector specified in the error, I noticed something interesting. Despite the new certificate being supplied by the same certificate authority as the old one, the issuer specified on the certificate had changed slightly; the subject information was still the same. Sure enough, the properties of the receive connector in question still showed the old certificate details, even though Exchange had been configured with the new certificate for SMTP and IIS. The information on the receive connector can be found by issuing the following command:

Get-ReceiveConnector "<Receive Connector Name>" | fl

The property we’re interested in is TlsCertificateName.

To correct the error, the following steps were taken:

  1. Locate the issuer and subject information from the new certificate. This can be done by examining the certificate directly via the certificate store, or using PowerShell, e.g.
    $certs = Get-ExchangeCertificate
    Locate the certificate you want to use. The one we wanted was the first on the list.
  2. Assemble the new issuer and subject information in a suitable format for the Receive Connector configuration. Again this can be done by copying the required text from the certificate information, or using PowerShell, e.g.:
    $certinfo = "<I>" + $certs[0].issuer + "<S>" + $certs[0].subject
  3. Modify the Receive Connector configuration to include the new certificate information assembled above, e.g.:
    Set-ReceiveConnector "<Receive Connector Name>" -TlsCertificateName $certinfo
  4. Restart the Microsoft Exchange Transport Service.
  5. Remove the old certificate from the server.
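
Putting those steps together, a minimal sketch of the fix, assuming the new certificate is the first one returned by Get-ExchangeCertificate and using an illustrative connector name:

 # Build the <I>issuer<S>subject string the connector expects
 $certs = Get-ExchangeCertificate
 $certinfo = "<I>" + $certs[0].Issuer + "<S>" + $certs[0].Subject
 # Point the receive connector at the new certificate
 Set-ReceiveConnector "Default Frontend SERVERNAME" -TlsCertificateName $certinfo
 # Restart transport so the change takes effect
 Restart-Service MSExchangeTransport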

A quick test of incoming and outgoing mail indicated that everything was flowing as expected.

A vNext build task and PowerShell script to generate release notes as part of TFS vNext build.

Updated 22 Mar 2016: This task is now available as an extension in the VSTS marketplace

A common request I get from clients is how to create a custom set of release notes for a build. The standard TFS build report often includes the information required (work items and changesets/commits associated with the build), but not in a format that is easy to redistribute. So I decided to create a set of tools to try to help.

The tools are available on my GitHub account in two forms: a PowerShell script and a vNext build task.

Both generate a markdown release notes file based on a template passed into the tool. The output report looks something like the following:

Release notes for build SampleSolution.Master

Build Number: 20160229.3
Build started: 29/02/16 15:47:58
Source Branch: refs/heads/master

Associated work items
  • Task 60 [Assigned by: Bill <TYPHOONTFS\Bill>] Design WP8 client
Associated change sets/commits
  • ID bf9be94e61f71f87cb068353f58e860b982a2b4b Added a template
  • ID 8c3f8f9817606e48f37f8e6d25b5a212230d7a86 Start of the project

The Template

The use of a template allows the user to define the layout and fields shown in the release notes document. It is basically a markdown file with tags to denote the fields (the properties on the JSON response objects returned from the VSTS REST API) to be replaced when the tool generates the report file.

The only real change from standard markdown is the use of the @@TAG@@ blocks to denote areas that should be looped over, i.e. the points where we get the details of all the work items and commits associated with the build.

#Release notes for build $defname  
**Build Number**  : $($build.buildnumber)   
**Build started** : $("{0:dd/MM/yy HH:mm:ss}" -f [datetime]$build.startTime)    
**Source Branch** : $($build.sourceBranch) 
###Associated work items 
@@WILOOP@@ 
* **$($widetail.fields.'System.WorkItemType') $($widetail.id)** [Assigned by: $($widetail.fields.'System.AssignedTo')] $($widetail.fields.'System.Title') 
@@WILOOP@@ 
###Associated change sets/commits 
@@CSLOOP@@ 
* **ID $($csdetail.changesetid)$($csdetail.commitid)** $($csdetail.comment)   
@@CSLOOP@@  

Note 1: We can return the build’s startTime and/or finishTime. Remember that if you are running the template within an automated build, the build by definition has not finished, so the finishTime property is empty and can’t be parsed. This does not stop the generation of the release notes, but an error is logged in the build logs.

Note 2: We have some special handling in the @@CSLOOP@@ section: we include both the changesetid and the commitid values; only one of these will contain a value, the other is blank. This allows the template to work for both Git and TFVC builds.

Behind the scenes, each line of the template is evaluated as a line of PowerShell in turn, with the in-memory versions of the objects used to provide the runtime values (a minimal sketch of this idea is shown after the list below). The available objects to get data from at runtime are:

  • $build – the build details returned by the REST call Get Build Details
  • $workItems – the list of work items associated with the build returned by the REST call Build Work Items
  • $widetail – the details of a given work item inside the loop returned by the REST call Get Work Item
  • $changesets – the list of changesets/commits associated with the build, returned by the REST call Build Changes
  • $csdetail – the details of a given changeset/commit inside the loop, returned by the REST call to Changes or Commit depending on whether it is a Git or TFVC based build
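
To illustrate the expansion idea mentioned above, here is a minimal sketch (not the actual tool’s code); each template line is treated as a double-quoted PowerShell string, so the $(...) expressions are expanded against the in-memory objects:

 # $build would be populated from the REST API; a stub is shown here for illustration
 $build = @{ buildnumber = '20160229.3'; sourceBranch = 'refs/heads/master' }
 # Expand each template line against the objects currently in scope
 Get-Content 'template.md' | ForEach-Object {
     $ExecutionContext.InvokeCommand.ExpandString($_)
 } | Out-File 'releasenotes.md'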

There is a templatedump.md file in the repo that just dumps out all the available fields, to help you find all the available options.

Differences between the script and the task

The main difference between the PowerShell script and the build task is the way the connection is made to the REST API. Within the build task we pick up the access token from the build agent’s context. For the PowerShell script we need to pass credentials in some form or other, either via parameters or by using the default Windows credentials.
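
For reference, a REST call with a VSTS personal access token uses a basic-auth header built from the token; a minimal sketch, with placeholder URL and token:

 $pat = "yourpersonalaccesstoken"
 # A PAT is presented as basic auth with an empty username
 $encoded = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))
 $headers = @{ Authorization = "Basic $encoded" }
 Invoke-RestMethod -Uri "https://yoursite.visualstudio.com/defaultcollection/_apis/projects?api-version=1.0" -Headers $headers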

Usage

PowerShell

The script can be used in a number of ways

To generate a report for a specific build on VSTS

 .\Create-ReleaseNotes.ps1 -collectionUrl https://yoursite.visualstudio.com/defaultcollection -teamproject "Scrum Project" -defname "BuildTest" -outputfile "releasenotes.md" -templatefile "template.md" -buildnumber "yourbuildnum" -password yourpersonalaccesstoken

Or, for the last successful build, just leave out the buildnumber:

 .\Create-ReleaseNotes.ps1 -collectionUrl https://yoursite.visualstudio.com/defaultcollection -teamproject "Scrum Project" -defname "BuildTest" -outputfile "releasenotes.md" -templatefile "template.md" -password yourpersonalaccesstoken

Authentication options

  1. VSTS with a personal access token – just provide the token using the password parameter
  2. If you are using VSTS and want to use alternate credentials, just pass a username and password
  3. If you are using the script with an on-premises TFS, just leave off both the username and password and the Windows default credentials will be used

In all cases the debug output is something like the following:


VERBOSE: Getting details of build [BuildTest] from server [https://yoursite.visualstudio.com/defaultcollection/Scrum Project]
VERBOSE: Getting build number [20160228.2]
VERBOSE:    Get details of workitem 504
VERBOSE:    Get details of changeset/commit ba7e613388c06b8440c9e7601a8d6fa29d588051
VERBOSE:    Get details of changeset/commit 52570b2abb80b61a4a629dfd31c0ce071c487709
VERBOSE: Writing output file  for build [BuildTest] [20160228.2].

You should expect to get a report like the example shown at the start of this post.

Build Task

The build task needs to be built and uploaded as per the standard process detailed on my vNext Build’s Wiki (I am considering creating a build extensions package to make this easier; keep an eye on this blog).

Once the tool is uploaded to your TFS or VSTS server, it can be added to a build process.


The task takes two parameters

  • The output file name, which defaults to $(Build.ArtifactStagingDirectory)\releasenotes.md
  • The template file name, which should point to a file in source control.

There is no need to pass credentials; this is done automatically.

When run, you should expect to see build logs as below and a release notes file in your drops location.


Summary

So I hope some people find these tools useful in generating release notes. Let me know if they help and how they could be improved.

My Resource Templates from demos are now on GitHub

I’ve had a number of people ask me if I can share the templates I use in my Resource Template sessions at conferences. It’s taken me a while to find the time, but I have created a repo on GitHub and there is a new Visual Studio solution and deployment project with my code.

One very nice feature that this has enabled me to provide is the same ‘Deploy to Azure’ button as you’ll find in the Azure Quickstart Templates. This meant a few changes to the templates; it turns out, for example, that GitHub is case-sensitive for file requests whilst Azure Storage isn’t. The end result is that you can try out my templates in your own subscription directly from GitHub!
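
For anyone wanting to add the same button to their own repo, the usual pattern (as used by the Azure Quickstart Templates) is a markdown image link that passes the URL-encoded address of the raw azuredeploy.json file to the portal; the placeholder below stands in for your own template’s URL:

 [![Deploy to Azure](https://azuredeploy.net/deploybutton.png)](https://portal.azure.com/#create/Microsoft.Template/uri/<URL-encoded-link-to-raw-azuredeploy.json>)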

Build Invites

Today I started sending out emails inviting people to join in with Build Bites 2016. I'm hoping to make this year even bigger and better than last!!!

Using MSDeploy to deploy to nested virtual applications in Azure Web Apps

Azure provides many ways to scale and structure web sites and virtual applications. I recently needed to deploy the following structure, where each service endpoint was its own Visual Studio web application project built as an MSDeploy package:

  • http://demo.azurewebsites.net/api/service1
  • http://demo.azurewebsites.net/api/service2
  • http://demo.azurewebsites.net/api/service3

To do this in the Azure Portal I:

  1. Created a Web App for the site http://demo.azurewebsites.net. This pointed to the disk location \site\wwwroot; I disabled the folder as an application, as there is no application running at this level
  2. Created a virtual directory api pointing to \site\wwwroot\api, again disabling this folder as an application
  3. Created a virtual application for each of my services, each with its own folder


I knew from past experience that I could use MSDeploy to deploy to the root site or the api virtual directory. However, I found that when I tried to deploy to any of the service virtual applications, I got an error saying the web site could not be created. Now, I would not expect MSDeploy to create a directory, so I knew something was wrong at the Azure end.

The fix in the end was simple: it seems the service folders, e.g. \site\wwwroot\api\service1, had not been created by the Azure Portal when I created the virtual applications. I FTP’d onto the web application and created the folder \site\wwwroot\api\service1; once this was done, MSDeploy worked perfectly and I could build the structure I wanted.
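
For reference, a typical MSDeploy command line (run from a command prompt) for publishing a package to one of the nested virtual applications looks something like the sketch below; the site name, package name and password are illustrative, with the real values coming from the web app’s publish profile:

 msdeploy.exe -verb:sync -source:package="Service1.zip" -dest:auto,computerName="https://demo.scm.azurewebsites.net:443/msdeploy.axd?site=demo",userName="$demo",password="<publish-profile-password>",authType="Basic" -setParam:name="IIS Web Application Name",value="demo/api/service1"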

Running Pester PowerShell tests in the VSTS hosted build service

Updated 22 Mar 2016 This task is available in the VSTS Marketplace

If you are using Pester to unit test your PowerShell code then there is a good chance you will want to include it in your automated build process. To do this, you need to get Pester installed on your build machine. The usual options would be

If you own the build agent VM then any of these options are good; you can even write the NuGet restore into your build process itself. However, there is a problem: the first two options need administrative access, as they put the Pester module in a machine-level $env:PSModulePath folder (under ‘Program Files’), so they can’t be used on VSTS’s hosted build system, where you are not an administrator.

So this means you are left with copying the module (and associated functions folder) to some local working folder and running it manually; but do you really want to have to store the Pester module in your source repo?

My solution was to write a vNext build task to deploy the Pester files and run the Pester tests.
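
Conceptually, the core of the task is just an Invoke-Pester call that writes NUnit-format results; a rough sketch (parameter names per recent Pester releases, not the task’s exact code):

 # Run every *.tests.ps1 under the source folder and write NUnit-format results
 Import-Module "$PSScriptRoot\Pester\Pester.psm1"
 Invoke-Pester "$env:BUILD_SOURCESDIRECTORY" -OutputFile "$env:BUILD_SOURCESDIRECTORY\Test-Pester.XML" -OutputFormat NUnitXml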


The task takes two parameters

  • The root folder to look for test scripts with the naming convention  *.tests.ps1. Defaults to $(Build.SourcesDirectory)\*
  • The results file name, defaults to $(Build.SourcesDirectory)\Test-Pester.XML
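
For anyone new to Pester, a minimal test file matching that naming convention might look like the following; Get-Square is a hypothetical function used purely for illustration:

 # something.tests.ps1 – a minimal Pester test
 function Get-Square { param([int]$x) $x * $x }

 Describe 'Get-Square' {
     It 'squares a positive number' {
         Get-Square 4 | Should Be 16
     }
 }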

The Pester task does not in itself upload the test results; it just throws an error if tests fail. It relies on the standard test results upload task. Add this task and:

  • set it to look for nUnit format files
  • note that it already defaults to the correct file name pattern
  • IMPORTANT: as the Pester task will stop the build on an error, set ‘Always run’ to make sure the results are published


Once all this is added to your build, you can see your Pester test results in the build summary.


You can find the task in my vNextBuild repo

A vNext build task to get artifacts from a different TFS server

With the advent of the TFS 2015.2 RC (and the associated VSTS release) we have seen the short-term removal of the ‘External TFS Build’ option as a Release Management artifact source. This causes me a bit of a problem, as I wanted to try out the new on-premises vNext-based Release Management features in 2015.2 but don’t want to place the RC on my production server (though there is go-live support). Also, the ability to get artifacts from an on-premises TFS instance when using VSTS opens up a number of scenarios, something I know some of my clients have been investigating.

To get around this blocker I have written a vNext build task that gets a build artifact from the UNC drop. It supports both XAML and vNext builds, thus replacing the built-in artifact linking features.

Usage

To use the new task

  • Get the task from my vNextBuild repo (build using the instructions on the repo’s wiki) and install it on your TFS 2015.2 instance (also use the notes on the repo’s wiki).
  • In your build, disable the automatic fetching of artifacts for the environment (though in some scenarios you might choose to use both the built-in linking and my custom task)


  • Add the new task to your environment’s release process; the parameters are:
    • TFS Uri – the URI of the TFS server, including the team project collection name
    • Team Project – the project containing the source build
    • Build Definition name – name of the build (can be XAML or vNext)
    • Artifact name – the name of the build artifact (seems to be ‘drop’ if a XAML build)
    • Build Number – default is to get the latest successful completed build, but you can pass a specific build number
    • Username/Password – if you don’t want to use default credentials (the user the build agent is running as), these are the ones used. They are passed as basic auth, so can be used against an on-premises TFS (if basic auth is enabled in IIS) or VSTS (with alternate credentials enabled).


When the task runs it should drop artifacts in the same location as the standard mechanism, so they can be picked up by any other tasks in the release pipeline using a path similar to $(System.DefaultWorkingDirectory)\SABS.Master.CI\drop.
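
Conceptually, what the task does can be outlined as a couple of REST calls followed by a copy from the UNC drop; the sketch below is illustrative only (server, project and build names are placeholders), not the task’s actual code:

 # Find the latest successful build for a definition, then copy its UNC drop
 $tfsUri  = "https://tfs.example.com/tfs/DefaultCollection"
 $project = "MyProject"
 $cred    = Get-Credential   # or use -UseDefaultCredentials against an on-premises TFS

 $defs   = Invoke-RestMethod "$tfsUri/$project/_apis/build/definitions?name=MyBuild&api-version=2.0" -Credential $cred
 $builds = Invoke-RestMethod "$tfsUri/$project/_apis/build/builds?definitions=$($defs.value[0].id)&resultFilter=succeeded&`$top=1&api-version=2.0" -Credential $cred
 $arts   = Invoke-RestMethod "$tfsUri/$project/_apis/build/builds/$($builds.value[0].id)/artifacts?api-version=2.0" -Credential $cred

 # For a file-share drop, resource.data holds the UNC path of the artifact
 Copy-Item -Path $arts.value[0].resource.data -Destination "C:\drop" -Recurse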

Limitations

The task in its current form does not provide any linking of artifacts to the build reports, or allow the selection of build versions when the release is created; this removes some audit trail features.

However, it does provide a means to get a pair of TFS servers working together, so it can certainly enable some R&D scenarios while we await the 2015.2 RTM and/or the ‘official’ linking of external TFS builds as artifacts.

Azure - A really Useful Seminar


Last week I had the delight of helping to present a really useful seminar at the University of Hull alongside Peter Roberts. He has written a similar blog post where he also walks through how you can use Azure and a MySQL database to store the high scores of a video game.

Overall the presentation was a success, with a fair few people turning up, seemingly enjoying it, and looking forward to pursuing Azure in their own ventures.

We were also asked by Rob Miles to run a cloud workshop later in the semester, which we are now looking into :D