BM-Bloggers

The blogs of Black Marble staff

If I add a custom field to a VSTS work item type what is its name?

The process customisation options in VSTS are now fairly extensive. You can add fields, states and custom items, making VSTS a ‘very possible’ option for many more people.

As well as the obvious uses of this customisation, such as storing more data or matching your required process, it can also aid in migrating work items into VSTS from other VSTS instances or on-premises TFS.

Whether using TFS Integration (now with no support – beware) or Martin Hinshelwood’s vsts-data-bulk-editor (an active open source solution, so probably a much better choice for most people), as mentioned in my past post, you need to add a custom field on the target VSTS instance to contain the original work item ID. This field is commonly called ReflectedWorkItemId.

This can be added in VSTS as detailed on MSDN.


[Screenshot: adding a custom field in the VSTS process customisation UI]

Note: In the case of Martin’s tool the field needs to be a string, as it is going to contain a URL, not the simple integer you might expect.

The small issue you have when you add a custom field is that this UI does not make it clear what the full name of the field is. You need to remember that it is in the form <name of custom process>.<field name>, e.g. MigrateScrum.ReflectedWorkItemId.

If you forget this, you can always download the work item definition using the TFS Power Tools to have a look (yes, this even works on VSTS).

[Screenshot: the downloaded work item definition open in the TFS Power Tools]
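
In the downloaded XML, look for the FIELD element. For the example above it would look something like the following, with the refname attribute holding the full name you need:

<FIELD name="ReflectedWorkItemId" refname="MigrateScrum.ReflectedWorkItemId" type="String" />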

Offline Domain Join with Direct Access

I was recently in the position that I needed to rebuild a workstation at a remote location, but wanted to end up with it joined to the domain, and able to install software via the SCCM Software Center. Enter Offline Domain Join (djoin.exe)!

Offline Domain Join allows the creation of a machine account and the establishment of a trust relationship between a computer running Windows and a Domain. As part of the process, group policy information can also be transferred to the machine that will be joined to the domain.

Assuming Direct Access is available, the appropriate group policy information for Direct Access can be transferred as part of the process. This should then allow the remote machine to establish a connection to the domain, after which all remaining group policy information can be transferred, the Configuration Manager client installed, etc.

Information on ‘djoin.exe’ including examples for use can be found at https://technet.microsoft.com/en-us/library/offline-domain-join-djoin-step-by-step

My scenario was:

  • The machine account already existed in the correct OU and was a member of the appropriate groups for Direct Access (the machine name had already been used; this was a rebuild) and therefore I needed to use the ‘/reuse’ parameter.
  • The only group policy information I wanted to transfer to the remote machine was for Direct Access. I anticipated that all other group policy information would be transferred automatically once a Direct Access connection had been established.

In my case, the command I used on the provisioning server was:

djoin /provision /domain domain.com /machine MyWorkstation /savefile MyWorkstation-blob.txt /reuse /policynames "Direct Access Client"

The resultant blob should be transferred securely – take note of what the TechNet page says on the matter:

The base64-encoded metadata blob that is created by the provisioning command contains very sensitive data. It should be treated just as securely as a plaintext password. The blob contains the machine account password and other information about the domain, including the domain name, the name of a domain controller, the security ID (SID) of the domain, and so on. If the blob is being transported physically or over the network, care must be taken to transport it securely.

On the remote workstation, the command I used was:

djoin /requestODJ /loadfile MyWorkstation-blob.txt /windowspath %SystemRoot% /localos

At this point you’re prompted to reboot the workstation. Once the reboot was complete, I left the machine for a few minutes to allow it to establish a connection, then signed in. Everything worked as anticipated: I could log in as a domain user and a Direct Access connection was established. Following a group policy update, the Configuration Manager client was transferred and installed, and a short time later the Software Center became available and I could add software made available from SCCM.
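
If you want to confirm the Direct Access connection yourself before relying on it, something like the following from an elevated PowerShell prompt on the workstation should do the job (Get-DAConnectionStatus is part of the standard Windows 8.1/10 client cmdlets):

# Check the Direct Access connection state
Get-DAConnectionStatus

# Force a group policy refresh once connectivity is confirmed
gpupdate /force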

DPM Protection for Windows 10 Anniversary Edition

Attempting to add protection to a Windows 10 Anniversary Edition workstation recently failed with the DPM server showing the workstation as ‘unavailable’ when looking at the ‘Production Servers’ list in the console.

It appears that the upgrade to Anniversary Edition removes a file that the DPM agent relies on, ‘sisbkup.dll’, and that as a consequence the services cannot start on the protected workstation.

The resolution is to copy the ‘sisbkup.dll’ file from C:\Windows\System32 on an older version of Windows 10 into C:\Windows\System32 on the Anniversary Update machine, and then retry the connection from DPM.
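
Assuming an older Windows 10 machine is reachable over the network, something like the following from an elevated command prompt on the affected workstation does the job (the source machine name is illustrative):

copy \\OlderWin10PC\c$\Windows\System32\sisbkup.dll C:\Windows\System32\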


Typemock have released official VSTS build extension

Typemock have just released an official VSTS build extension to run Typemock Isolator based tests. Given there is now an official extension, I have decided to deprecate mine; it is still available in the Marketplace, but I would recommend using the official one.

The new Typemock extension includes two tasks:

SmartRunner Task

The SmartRunner is a unit test runner that can run NUnit and MSTest based tests. It handles the deployment of Typemock Isolator. SmartRunner can run on both shared and on-premises agents.

Typemock with VSTests

This task acts as a wrapper to enable Typemock Isolator and then run your tests via VSTest. This task can only be used with on-premises agents, as the build agent needs to be running with admin privileges.

Fix for my Docker image create dates being 8 hours in the past

I have been having a look at Docker for Windows recently, and have been experiencing a problem: when I create a new image, the created date/time (as shown by docker images) is 8 hours in the past.

[Screenshot: docker images output showing the incorrect create date]

Turns out the problem seems to be due to putting my Windows 10 laptop into sleep mode. The steps to reproduce the problem are:

  1. Create a new Docker image – the create date is correct, i.e. the current time
  2. Sleep the PC
  3. Wake up the PC
  4. Check the create date – it is now 8 hours in the past

Now the create date is not an issue in itself, but the fact that the time within the Docker images is also off by 8 hours can be, especially when trying to connect to cloud-based services, so I needed to sort it out.

Turns out the fix is simple: you need to stop and restart the Docker process (restarting the PC has the same effect, as this restarts the Docker process). Why the Docker process ends up 8 hours off, irrespective of the time the PC is slept, I don’t know. I’m just happy to have a quick fix.
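
For reference, restarting the Docker process amounted to something like the following from an elevated PowerShell prompt (the service name may differ between Docker for Windows releases; restarting via the whale icon in the notification area achieves the same thing):

# Restart the Docker for Windows service
Restart-Service com.docker.service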

I am speaking at Microsoft UK TechDays Online event on Azure DevTest Labs

The registration link for Microsoft UK TechDays Online is now live. This is a five-day event broadcast live from the Microsoft Campus in Reading. You will be able to view the sessions live at https://channel9.msdn.com/

The themes for each day are:

  • Monday, 12 September: Explore the world of Data Platform and BOTs
  • Tuesday, 13 September: DevOps in practice
  • Wednesday, 14 September: A day at the Office!
  • Thursday, 15 September: The inside track on Azure and UK Datacenter
  • Friday, 16 September: Find out more about Artificial Intelligence

I am doing a session on the Thursday on Azure DevTest Labs.

Hope you find time to watch some or all of the events. For more details, see the registration link.

Why have I got a ‘.NETCore50’ and a ‘netcore50’ folder in my NuGet package?

I recently posted on how we were versioning our NuGet packages as part of a release pipeline. In testing we noticed that the packages being produced by this process had an extra folder inside them.

[Screenshot: the package contents showing both netcore50 and .NETCore50 folders]

We expected there to be a netcore50 folder, but not a .NETCore50 folder. Strangely, if we built the package locally we only saw the expected netcore50 folder. The extra folder did not appear to be causing any problems, but I did want to find out why it had appeared, and to remove it as it was not needed.

Turns out the issue was the version of NuGet.exe: the automatically installed version on the on-premises TFS build agent was 3.2, while my local copy was 3.4. As soon as I upgraded the build box’s NuGet.exe to 3.4 the problem went away.
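
If you are unsure which version of NuGet.exe a build agent has, running the following on the agent reports the version on the first line of its output:

nuget help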

Experiences versioning related sets of NuGet packages within a VSTS build

Background

We are currently packaging up a set of UX libraries as NuGet packages to go on our internal NuGet server. The assemblies that make up the core of this framework are all in a single Visual Studio solution; however, it makes sense to distribute them as a set of NuGet packages, as you might not need all the parts in a given project. Hence we have a package structure as follows…

  • BM.UX.Common
  • BM.UX.Controls
  • BM.UX.Behaviours
  • etc…

There has been much thought on the versioning strategy for these packages. We did consider independent versioning of each of these fundamental packages, but decided it was not worth the effort: keeping their versions in sync was reasonable, i.e. the packages share the same version number and are released as a set.

Now this might not be the case for future ‘extension’ packages, but it is an OK assumption for now, especially as it makes the development cycle quicker/easier. This framework is young and rapidly changing; there are often changes in a control that need associated changes in the common assembly. It is hence good that a developer does not have to check in a change to the common package before they can make an associated change to the control package whilst debugging a control prior to release.

However, this all meant it was important to make sure the package dependencies and versions are set correctly.

Builds

We are using Git for this project (though this process is just as relevant for TFVC) with a development branch and a master branch. Each branch has its own CI-triggered build:

  • Development branch build …
    • Builds the solution
    • Runs Unit tests
    • Does SonarQube analysis
    • DOES NOT store any built artifacts
    • [Is used to validate Pull requests]
  • Master branch build …
    • Versions the code
    • Builds the solution
    • Runs Unit tests
    • Creates the NuGet Packages
    • Stores the created packages (to be picked up by a Release pipeline for publishing to our internal NuGet server)

Versioning

So within the Master build we need to do some versioning; this needs to be applied to a number of different files to make sure both the assemblies and the NuGet packages are ‘stamped’ with the build version.

We get this version from the build number variable, $(Build.BuildNumber); we use the format $(Major).$(Minor).$(Year:yy)$(DayOfYear).$(rev:r), e.g. 1.2.16123.3

Where

  • $(Major) and $(Minor) are build variables we manage (actually our release pipeline updates $(Minor) on every successful release to production using a VSTS task)
  • $(Year:yy)$(DayOfYear) gives a date in the form 16123
  • and $(rev:r) is a count of builds on a given day

We have chosen to use this number format to version both the assemblies and NuGet packages; if you have different plans, such as semantic versioning, you will need to modify this process a bit.

Assemblies

The assemblies themselves are easy to version; we just need to set the correct value in their AssemblyInfo.cs or AssemblyInfo.vb files. I used my Assembly versioning VSTS task to do this.
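
Under the covers a task like this is doing little more than a regular expression replace across the AssemblyInfo files. A minimal PowerShell sketch of the idea (not the task’s actual implementation) would be:

# Stamp every AssemblyInfo file with the build number, which the agent
# exposes as the BUILD_BUILDNUMBER environment variable
$version = $env:BUILD_BUILDNUMBER
Get-ChildItem -Recurse -Include AssemblyInfo.cs, AssemblyInfo.vb | ForEach-Object {
    (Get-Content $_.FullName) `
        -replace 'AssemblyVersion\("[0-9\.]+"\)', "AssemblyVersion(""$version"")" `
        -replace 'AssemblyFileVersion\("[0-9\.]+"\)', "AssemblyFileVersion(""$version"")" |
        Set-Content $_.FullName
}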

NuGet Packages

The packages turn out to be a bit more complex. Using the standard NuGet Packager task, there is a checkbox to say that the build number should be used as the version. This works just fine for versioning the actual package, adding the -Version flag to the pack command to override the value in the project’s .nuspec file. However, it does not help with managing the versions of any dependent packages in the solution, and here is why. In our build…

  1. AssemblyInfo files updated
  2. The solution is built, so we have version stamped DLLs
  3. We package the first ‘common’ NuGet package (which has no dependencies on other projects in the solution) and it is versioned using the -Version setting, not the value in its .nuspec file.
  4. We package the ‘next’ NuGet package; it picks up its version from the -Version flag (as needed), but it also needs to add a dependency on a specific version of the ‘common’ package. We pass the -IncludeReferencedProjects argument to make sure this occurs. However, NuGet.exe gets this version number from the ‘common’ package’s .nuspec file, NOT from the package actually built in the previous step. So we end up with a mismatch.

The bottom line is that we need to manage the version number in the .nuspec file of each package, so more custom VSTS extensions are needed.
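
To make the mismatch concrete: the .nuspec generated inside, say, the BM.UX.Controls package ends up with a dependency entry along these lines, and it is this version attribute that NuGet.exe reads from the ‘common’ project’s .nuspec file rather than from the freshly built package:

<dependencies>
  <dependency id="BM.UX.Common" version="1.2.16123.3" />
</dependencies>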

Initially I reused my Update XML file task, passing in some XPath to select the node to update, and this is a very valid approach if using semantic versioning, as it is a very flexible way to build the version number. However, in the end I added an extra task to my versioning VSTS extension for NuGet, to make my build neater and consistent with my other versioning steps.
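
If you do go the XML route, the node to target in a .nuspec file is package/metadata/version. A minimal PowerShell sketch (the file name is illustrative; PowerShell’s dot notation handles the .nuspec XML namespace for you) is:

# Load the .nuspec, overwrite the version element with the build number
# and save the file back
$path = (Resolve-Path 'mypackage.nuspec').Path
$nuspec = [xml](Get-Content $path)
$nuspec.package.metadata.version = $env:BUILD_BUILDNUMBER
$nuspec.Save($path)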

Once all the versioning was done I could create the packages. I ended up with a build process as shown below.

[Screenshot: the Master branch build process steps]

A few notes about the NuGet packaging

  • Each project I wish to create a NuGet package for has a .nuspec file with the same ‘root’ name in the same folder as the .csproj, e.g. mypackage.csproj and mypackage.nuspec. This file contains all the descriptions, copyright details etc.
  • I am building each package explicitly. I could use wildcards in the ‘Path/Pattern to nuspec files’ property, but I chose not to at this time, as I don’t want to build all the solution’s packages at this point.
  • IMPORTANT: I am passing in the .csproj file names, not the .nuspec file names, to the ‘Path/Pattern to nuspec files’ property. I found I had to do this, else the -IncludeReferencedProjects argument was ignored (see the example command after this list). The NuGet documentation seems to suggest that as long as the .csproj and .nuspec files have the same ‘root’ name you can reference the .nuspec file, but this was not my experience.
  • I still set the flag to use the build version to version the package – this is not actually needed, as the .nuspec file has already been updated.
  • I pass in the -IncludeReferencedProjects argument via the advanced parameters, to pick up the project dependencies.
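
Putting those notes together, the command the task effectively runs for each package is along these lines (version illustrative):

nuget pack mypackage.csproj -Version 1.2.16123.3 -IncludeReferencedProjects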

Summary

So now I have a reliable way to make sure my NuGet packages have consistent version numbers.

Tidy up those VSTS release pipelines with meta-tasks

Do you have repeating blocks in your VSTS release pipelines?

I certainly do. A common one is to run a set of functional tests, so I need to repeatedly…

  1. Deploy some test files to a VM
  2. Deploy a test agent to the VM – IMPORTANT: I had not realised you can only run one test run against a deployed agent; you need to redeploy it for the next run
  3. Run my tests
  4. … and repeat for the next test type/configuration/test plan/DLL etc.


In the past this led to a lot of repeated tasks in my release pipeline, all very messy.

Now in VSTS we have the option of meta-tasks; these allow tasks to be grouped into what are, in effect, functions with their own properties.


[Screenshot: the release pipeline using the ‘Run Tests’ meta-task]

In the screenshot above you can see I use a meta-task ‘Run Tests’ that wraps the four tasks shown below.

[Screenshot: the four tasks wrapped by the meta-task]

Much neater, but, as you might expect with something new, I have come across a few minor gotchas:

  • You cannot order the list of properties for the meta-task.
  • This is a problem, as the first one is used to generate the instance name in the pipeline. Not a major problem – you can always edit it.
  • Meta-task properties are auto-detected from any variables used within the meta-task’s tasks. The auto-detection mechanism is case sensitive, unlike the rest of VSTS variable handling, so be careful not to end up with duplicates.

That all said, I think this is a big step forward in readability and reuse for release management.