The blogs of Black Marble staff

Build 2016 - Day 0

So Day 0 of Build. A day to settle in, relax and register for Build. My mother, my sister and I found ourselves wandering through San Francisco and found our way back to the Microsoft Store…

We then headed off to our first interview with Sam Guckenheimer. I'll be writing a separate blog post about how that went, so please go and have a look.

It was a quick dash across San Francisco to make it to the Moscone Center for registration, where I got my Build badge and adorned it with the iconic Black Marble.

Our final stop for the day was a student panel close to the Moscone Center. Four other MSPs from around the world and I sat on this panel in front of Gartner, Forrester and a number of other large analyst firms and reporters. I was joined by two US Imagine Cup finalists and two other MSPs from across Europe.

We were asked questions such as what we find most exciting in technology. Many answers were given, such as the hardware, the rewards, or writing bug-free software. I personally believe that the most exciting thing in technology is seeing the difference it is making to the lives around us. As a small example, in the UK the BBC is pushing an initiative to get all children coding. To do this they have launched a microcomputer called the micro:bit. Every child aged 11 will receive one, and there are lots of resources to help both students and teachers get the most out of it. Things like that I personally find exciting in computing, and they make me so happy to be in the field myself.

Other questions included what we thought of the gender divide in computing and whether we found it prevalent in our own places of work and universities. On our panel of five it so happened that three of us were female, but I would say that gives a skewed perspective. In my year, out of close to 200 students, fewer than ten are female, and the other two MSPs there agreed that there is a similar spread for them too.

Over the hour we were asked many questions, and though at first it seemed like one of the most terrifying things I had ever done, I actually enjoyed it a lot - though the last question, about what we want to do after being students, scared most of us. It was also really interesting to hear the thoughts and opinions of like-minded, similarly aged people from around the world. A huge thank you to Jennifer Ritzinger for being so lovely on the day as well :)

With hunger setting in and the adrenalin wearing off, we found ourselves heading back out for food once more. This time it was The Cheesecake Factory, where the food portions are as big as your head. Unfortunately, because they don't take bookings, we didn't finish eating until close to ten. Definitely time for bed!


Build 2016 - Getting there

Many would say starting a nearly 24-hour journey with a storm is far from ideal, but I would say it added some mild fun to the whole affair. Sadly, storms mean planes cannot take off, which turned a one-hour flight into a one-hour-forty wait followed by an hour's flight. Alas, it meant plenty of time for snoozes.

Unfortunately our delay in Manchester meant that we had only half an hour to get off our plane, get through security in Paris and then run across the terminal to our gate. We luckily made it with seconds to spare; the gates were closed as soon as we walked through. As always with a long-haul flight, I found my options limited to either napping or watching whatever in-flight entertainment was on offer. There was only one choice...


After a long 11-hour flight we landed in joyously sunny San Francisco. Moments after leaving the airport I found myself removing my jacket and shedding layers thanks to the warm temperatures and bright sunshine. After arriving safely at our hotel there was the obvious question of where to stop first. There was a unanimous decision: the Microsoft Store.

As ever, all the latest tech was out, from phones to Surface Books, and I fawned over them all. Who doesn't like shiny new tech...

Sadly the phones were secured down - sigh - maybe next time. Considering that for us it was getting close to 4pm (midnight in the UK), it was decided that an early meal and then an early bedtime would be best.

Luckily there are some amazing restaurants nearby, including a Mexican one. After eating what I can only describe as my body weight in guacamole, it was time to call it a night and prepare for Build Day 0.

Azure Logic Apps–Parsing JSON message from service bus

What I want: When the Logic App trigger receives a JSON-formatted message from an Azure Service Bus topic, I want to send a notification to the address in the “email” field.  My sample message structure looks like this:


What happens: A message received on Service Bus doesn’t have a predefined format – it could be JSON, XML, or anything else – so Logic Apps doesn’t know the structure of the message.  In the designer, it looks like:


Which is great, but it just dumps out the entire object, and not the email field that I need.

How to fix it: Fortunately the fix is pretty easy. You need to:

1) Select the Content output (above); you are going to edit this value.

2) Switch over to ‘Code view’ and manually type the expression (below).

If you haven't used it before, code view can be found in the toolbar:


Once you are in code view, scroll down to the connector you are interested in. You will see the expression for the trigger body; this is essentially the entire message received from the trigger.


You need to modify this to parse the entire message using the ‘json’ function; then you can access its typed fields.

If you have ever used JSON.parse (or object deserialization in pretty much any language, for that matter) this concept should be familiar.  When I was done I ended up with:


I’ve broken the segment into two parts: a) parses the content, and b) accesses the ‘email’ field of the parsed JSON object.
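Since the screenshots are missing here, a minimal sketch of the before and after expressions (the exact shape depends on how your message was published; if the Service Bus content arrives base64-encoded you would also need to decode it before parsing):

```
@triggerBody()                   <- before: dumps the entire Content object
@json(triggerBody())['email']    <- after: a) parse the content, b) access 'email'
```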

Hope this helps someone!


Update: if you are seeing an error when trying to parse see my new blog post Azure Logic Apps-The template language function 'json' parameter is not valid.

In place upgrade times from TFS 2013 to 2015

There is no easy way to work out how long a TFS in-place upgrade will take; there are just too many factors for any calculation to be reasonable:

  • Start and end TFS version
  • Quality/Speed of hardware
  • Volume of source code
  • Volume of work items
  • Volume of work item attachments
  • The list goes on….

The best option I have found is to graph various upgrades I have done and try to make an estimate based on the shape of the curve. I did this for 2010 > 2013 upgrades, and now I think I have enough data from upgrades of sizeable TFS instances to do the same for 2013 > 2015.



Note: I extracted this data from the TFS logs using the script in this blog post; it is also in my Git repo.

So, as a rule of thumb: the upgrade process will pause around step 100 (the exact step number varies depending on your starting 2013.x release); time this pause, and expect the upgrade to complete in about 10x that period.

It is not 100% accurate, but it is close enough that you know whether you have time for a coffee, a meal, the pub, or bed for the night.
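The rule of thumb above is trivially scriptable; a minimal sketch (the 10x multiplier is the heuristic from this post, not an exact figure, so adjust it as you gather your own data points):

```python
# Rough estimator for a TFS 2013 -> 2015 in-place upgrade, based on the
# rule of thumb above: time the pause around step ~100, then expect the
# whole upgrade to take roughly 10x that pause.

def estimate_total_minutes(pause_minutes: float, multiplier: float = 10.0) -> float:
    """Return the estimated total upgrade duration in minutes."""
    return pause_minutes * multiplier

if __name__ == "__main__":
    # Example: a 45-minute pause at step ~100 suggests roughly a 7.5-hour upgrade.
    pause = 45.0
    print(f"Estimated upgrade time: {estimate_total_minutes(pause) / 60:.1f} hours")
```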

Announcing release of my vNext build tasks as extensions in the VSTS/TFS Marketplace

In the past I have posted about the vNext TFS build tasks I have made available via my GitHub repo. Over the past few weeks I have been making an effort to repackage these as extensions in the new VSTS/TFS Marketplace, thus making them easier to consume in VSTS, or in TFS 2015.2 using its new extension support.

This is an ongoing effort, but I am pleased to announce the release of the first set of extensions:

  • Generate Release Notes – generates a markdown release notes file based on work items associated with a build
  • Pester Test Runner – allows Pester based tests to be run in a build
  • StyleCop Runner – allows a StyleCop analysis to be made of files in a build
  • Typemock TMockRunner – uses TMockRunner to wrap MSTest, allowing Typemock tests to be run on a private build agent

To try to avoid people going down the wrong path I intend to go back through my older blog posts on these tasks to update them to point at new resources.

Hope you find these tasks useful. If you find any problems, please log an issue on GitHub.

Unblocking a stuck Lab Manager Environment (the hard way)

This is a post so I don’t forget how I fixed access to one of our environments yesterday, and hopefully it will be useful to some of you.

We have a good many pretty complex environments deployed to our lab Hyper-V servers, controlled by Lab Manager. Operations such as starting, stopping or repairing those environments can take a long, long time, but this time we had one that was quite definitely stuck. The lab view showed the many servers in the lab with green progress bars about halfway across, but after many hours we saw no progress. The trouble is, at this point you can’t issue any other commands to the environment from within the Lab Manager console – it’s impossible to cancel the operation and regain access to the environment.

Normally in these situations, stepping from Lab Manager to the SCVMM console can help. Stopping and restarting the VMs through SCVMM can often give lab manager the kick it needs to wake up. However, this time that had no effect. We then tried restarting the TFS servers to see if they’d got stuck, but that didn’t help either.

At this point we had no choice but to roll up our sleeves and look in the TFS database. You’d be surprised (or perhaps not) at how often we need to do that…

First of all we looked in the LabEnvironment table. That showed us our environment, and the State column contained a value of Repairing.

Next up, we looked in the LabOperation table. Searching for rows where the DataspaceId column value matched that of our environment in the LabEnvironment table showed a RepairVirtualEnvironment operation.

In the tbl_JobSchedule table we found an entry where the JobId column matched the JobGuid column from the LabOperation table. The interval on that was set to 15, from which we inferred that the repair job was being retried every fifteen minutes by the system. We found another entry for the same JobId in the tbl_JobDefinition table.

Starting to join the dots up, we finally looked in the LabObject database. Searching for all the rows with the same DataspaceId as earlier returned all the lab hosts, environments and machines that were associated with the Team Project containing the lab. In this table, our environment row had a PendingOperationId which matched that of the row in the LabOperation table we found earlier.
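The exploration above looked roughly like the following queries (a sketch only: run against the team project collection database, column names as described above; reading or modifying the TFS database directly is unsupported, so back up first and proceed at your own risk):

```sql
-- Find the stuck environment; its State column read 'Repairing'
SELECT Name, State, DataspaceId, PendingOperationId
FROM LabEnvironment;

-- Find the pending operation for that environment's dataspace
-- (ours was a RepairVirtualEnvironment operation)
SELECT LabOperationId, DataspaceId, JobGuid
FROM LabOperation
WHERE DataspaceId = @DataspaceId;

-- Find the scheduled retry for that job; Interval was 15 (minutes)
SELECT JobId, Interval
FROM tbl_JobSchedule
WHERE JobId = @JobGuid;
```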

We took the decision to attempt to revive our stuck environment by removing the stuck job. That would mean carefully working through all the tables we’d explored and deleting the rows, hopefully in the correct order. As the first part of that, we decided to change the value of the State column in the LabEnvironment table to Started, hoping to avoid crashing TFS should it try to parse all the information about the repair job we were about to slowly remove.

Imagine our surprise, then, when having made that one change, TFS itself cleaned up the database, removed all the table entries referring to the repair environment job and we were immediately able to issue commands to the environment again!

Net Writer: A great UWP blog editor

I came across Net Writer some months ago, when its creator, Ed Anderson, blogged about how he'd taken the newly released Open Live Writer code and used it in his just-started Universal Windows Platform (UWP) app for Windows 10. In January it only supported Blogger accounts, which meant that I was unable to use it. However, I checked again this weekend and discovered that it now supports a wide range of blog software, including the software that powers this blog.

I'm writing this post using the app. It's great for quick posts (there's no plugin support, so posting code snippets is tricky) and, most importantly, it works on my phone! That's the big win as far as I'm concerned. I've been hankering for the ability to easily manage my blog from my phone for a long time, and now I can.

You can find Net Writer in the Windows Store and learn more about it at Ed's blog.

Steps Required to Configure WSUS to Distribute Windows 10 Version 1511 Upgrade

Microsoft recently made a hotfix available that patches WSUS on Windows Server 2012 and 2012 R2 to allow distribution of the Windows 10 version 1511 upgrade. Installing the update is not, however, the only step that is required…

  1. Install the hotfix. Ensure that you pick the appropriate hotfix for the version of Windows Server on which you’re running WSUS. Note that if you’re running Windows Server 2012 R2, there’s also a pre-requisite install.
  2. Once the hotfix is installed and you’ve restarted your WSUS server, look in the ‘Products and Classifications’ option under the Classifications tab and ensure that the checkbox for Upgrades is selected. This is not selected automatically for you:
    Upgrades Option
    Note that the upgrade files may take quite some time to download to your WSUS server at the next synchronisation.
  3. Add a MIME type of ‘application/octet-stream’ for the ‘.esd’ file extension in IIS on the WSUS server. To do this:
    Open IIS Manager
    Select the server name
    From the ‘IIS’ area in the centre of IIS Manager, open ‘MIME Types’
    Click ‘Add…’
    Enter the information above:
    Esd MIME Type
    Click OK to close the dialog.
    Note: Without this step, clients will fail to download the upgrade with the following error:
    Installation Failure: Windows failed to install the following update with error 0x8024200D: Upgrade to Windows 10 [SKU], version 1511, 10586.
  4. Approve the Upgrade for the classes of computer in your organisation that you want to be upgraded.

Once all of the above steps are in place, computers that are targeted for the upgrade should have this happen automatically at the next update cycle.
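If you prefer to script step 3 rather than click through IIS Manager, an equivalent appcmd command is sketched below (run from an elevated prompt on the WSUS server; the path assumes a default IIS installation):

```shell
REM Add the .esd MIME type to the server-level IIS configuration
%windir%\system32\inetsrv\appcmd.exe set config /section:staticContent ^
  /+"[fileExtension='.esd',mimeType='application/octet-stream']"
```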

New books on VSTS/TFS ALM DevOps

It has been a while since I have mentioned any new books on TFS/VSTS, and just like buses, a couple have come along together.

These two, one from Tarun Arora and the other from Mathias Olausson and Jakob Ehn, are both nicely on trend for the big area of interest for many of the companies I am working with at present: best-practice ‘cookbook’-style guidance on how best to use the tools in an ALM process.



If you are working with TFS/VSTS, they are worth a look.