The blogs of Black Marble staff

Build 2016 - Day 2

Onto Day 2 and the second keynote. Once again we marvellous students were given our special seating at the front for the keynote. Having got off to a flying start the day before, Day 2's keynote had big shoes to fill. And boy did they deliver. Not far in, what were we gifted?

Why, it's only Xamarin of course! We already knew that Microsoft had recently acquired Xamarin, but the best part came when it was announced that it would be released free of charge to all Visual Studio customers. And when they say all Visual Studio customers, they really mean it!! Even those who use Visual Studio Community, the free edition, would have access to the whole Xamarin platform, which sent the crowd wild. As a student I don't think the news could've gotten any better. Though I personally use Visual Studio's Enterprise version, there are a lot of students who use the Community version as they are unable to gain access to the paid versions. With this they are able to build apps for so many platforms without having to pay a penny. The response I got from my peers at university was nothing short of amazing. Cries of joy and excitement and even a 'Darn, why didn't I have this before I started my final year project...' I've already decided to make use of Xamarin in mine. New technologies for the win! :D

Lots of cool and interesting stuff from Azure next, with a never-before-seen use of the Internet of Things...

I'm 'azure' that's a cool t-shirt.. Yeah? Yeah? Ok.....

We then had some cool videos and talks from the guys at both AccuWeather and BMW. AccuWeather is an American company that provides weather forecasting services across the world; they use Microsoft's Azure to serve more than 4 billion data requests daily using real-time data. BMW made use of IoT to allow our friend David to integrate his digital info across multiple devices, in this case his home and his car. Now, in terms of concept videos it couldn't have looked cooler. You wake up and your house helps you plan your day and route to work, before you get in your car to find that the route you just planned is already up and ready to go. You're driving along and your car warns you of rock slides and lets you hold conference calls with someone somewhere else in the world, just before you sync calendars so you can meet up for lunch when you're next in town. Oh, and that's all while you're still driving to work. I can't speak for everyone in the keynote but I would certainly love me some of that tech...

Ooh Calicowind spotted in the keynote :D
We were then given a number of announcements in terms of mobile and data services found within Azure. There were also some interesting additions to the Azure portal and Application Insights that I'm looking forward to having a go with.

All in all a very good keynote, though personally I would say it would have been tough to beat Day 1's :P
Next on Calicowind's agenda was the second day of luncheon talks for students. Instead, however, we were lucky enough to borrow some of Gabe Aul's time for an interview. Please read my following blog post about how that went :)

We were then straight off to our next interview with the lovely Craig Kitterman. We found him near the .NET Rocks booth at the back of the expo hall, and they were very keen to hear what my sister and I were going to say, even mentioning that they had heard of the 'girls running around with a camera'. They very kindly let us use their booth to record in and they were just so nice. From both myself and my sister: thanks for the Visual Studio socks!! Please read my upcoming blog post about how my interview with Craig Kitterman went :D

I saw some more great sessions run in the pop-up theatres on topics such as a breakdown of the new IoT kit, creating your own smart bot, and a really interesting one about Rendering Mars!!
Only one more day left; I can't wait :D

Build 2016 - Day 1

So finally it was here. The glorious first day of Build. The start to what could only be something spectacular. We were lucky as students to be given special seating at the front, next to the journalists, during the keynote. To ensure our seating we had to be there for 7:45. Which is fine. That's almost 4 in the afternoon in the UK, and as a student that's about the time I get up anyway. The keynote hall was huge. Considering it had to seat thousands, I'm not surprised...

We managed to get seats fairly close to the front compared to some; we were only about ten rows from the front. Funnily enough they seated all the students and Regional Directors together. I didn't mind though, as it meant I could sit with my father through the keynotes. The keynote was excellent and fun to watch. Bryan Roper did an amazing piece on new features coming to Windows 10 with inking and Cortana. Then we got the lovely announcement that every Xbox One could be turned into a developer kit, free of charge, from that moment on.

Then it was HoloLens news, with the announcement that they would begin shipping dev units out. I'm just biding my time until I can get my hands on one. Cool demos were shown on stage, and students studying medicine showed how HoloLens could be used to help their studies of anatomy and save time understanding the layout of, say, the brain. I think that this is a fabulous use of technology. Yes it's cool, it's fun, it's shiny, but it is being used to make a difference in our society. Then came HoloLens and NASA news, which as a keen astronomer and programmer I would argue was one of the best parts. The announcement of Destination Mars had everyone excited and cheering, especially as they walked through parts of how they did it, had Buzz Aldrin as a 3-D guide during the experience, and revealed that it would be available to try at the Kennedy Space Centre from the summer.

I was thrilled when I tweeted out saying I couldn't wait to have a go and I got a reply from Kennedy Space Centre saying I should visit and have a go. Now it would be just plain rude to turn down such an offer. I just need to convince my parents to take me.....

More from Cortana with more bad (amazing) jokes.

Then it was all about those bots, as we were introduced to these intelligent, machine-learning pieces of software that can be embedded in any app that requires one. On stage we saw a demo of a pizza being ordered and sent out for delivery, all through the use of bots. The bot could even be programmed and taught to understand slang words such as 'crib'. The most incredible part was when they showed this video. The applause and response from the crowd was outstanding. Microsoft have created something that can allow an otherwise blind man to see the world around him. Every single person left that keynote feeling pride in the fact that they worked with Microsoft, and it really couldn't have set a better tone for the rest of the conference.

Us lucky students then had a lunch talk put on for us by Scott Hanselman, whom I actually interviewed last year about the fun things around Build. He gave a delightful talk about what he does and some of the great things technology can do. Like a robot that can help you argue on the internet. I suppose it leaves you free to do other things? Student freebies in hand, we headed to the exhibit hall to have a look around. Compared to last year especially, there seemed to be so much more. Whereas last year the exhibitors were spread out in the open spaces on every floor, this year they were all in one large hall, leaving you free to explore. Towards the back of the hall they had set up mini stages where people gave 20-minute presentations on a variety of things. As I was running around meeting people I found this excellent, as I could still catch some sessions.

I listened to some interesting sessions, including one on Unity, where the presenter got a game up and running within his thirty-minute slot, and one about the Windows Insider Program. Seats were limited at these pop-up theatres though, which meant you were usually standing or sitting on the floor, especially at the more popular ones.

I was then lucky enough to receive one of the golden tickets

Ok, so it's more of an orange, but it's close enough. Located at the back of the expo hall was the HoloLens demo experience. Upon acquiring a ticket, and arriving with plenty of time to spare just in case, you join the back of a line to have a go at the HoloLens demo, Destination Mars. When it's finally your turn you get to watch a short intro video about what Destination Mars is about and how they got there. We went into the room in groups of eight and first had to measure the distance between our pupils to make sure that the experience was at its optimum for us. We were then taken to the HoloLens units and put them on. I must say I wasn't sure quite what to expect. I had some mild trouble, regardless of help, getting the device not to slip on my hair and not to occasionally knock my glasses, forcing me to readjust both them and the HoloLens.

That aside, once the device is on and secure - using a 3D model of Mars in the middle of the room to help you adjust the right place for your HoloLens - you are led into a large room where the experience really gets going. After a moment Buzz Aldrin pops out of one of the corners of the room and talks to the group about Mars before exclaiming 'Let's go to Mars!', at which point the landscape suddenly transforms into that of the red planet. He tells you to look around and see what you can see. The view with the HoloLens is only actually about the size of a postcard, so around the sides you are still able to see the room that you are in. As you move around, though, you are encouraged by Buzz to look out for any arrows that appear anywhere in your vision. Following the arrows leads you to an interesting part of the surface of Mars for you to learn more about. For example, I was looking up at one point when an arrow appeared encouraging me to look at the ground around me. There the ground started to transform to show me how erosion had taken place on the surface, with audio in my HoloLens explaining what was going on. We moved to another area of Mars where we saw a to-scale model of the Curiosity rover, and we got to hear more facts about that as well. The audio in the HoloLens worked marvellously, being loud enough that you could hear it very clearly but not so loud that you could hear other people's. After another couple of minutes the experience came to an end and we handed in our HoloLens units. Darn...

As I said before, the HoloLens kept slipping for me, which meant after some use I started getting a mild headache, but honestly I think I just needed to secure it better, as my sister had no problems with hers at all. Besides that, it was amazing to be able to explore Mars in the way I did. It was fun and just delightful to see how the HoloLens has come along. I can only bide my time till I get to have another go. On the way out they also let you pose with the HoloLens and send a photo of yourself. Needless to say we took full advantage of this....

The future looks bright and HoloLensy from where we stand. Onto Day 2!!!

Build 2016 - Sam Guckenheimer

On the Tuesday we were very lucky to be given the chance to interview Sam Guckenheimer. Sam is a Product Owner for Visual Studio Cloud Services, working with Team Foundation Server and Azure services such as Application Insights. Listen to me ask about some of the cool things coming to Visual Studio, and about his own favourites!!

Then watch Niamh ask him about how we can get more children into coding using Visual Studio :D

Forgive the authentic San Francisco traffic noises in the background! As is the way with our videos, we try to grab our participants wherever we can, but we hope you enjoy it regardless!!!

SPWakeUp for SharePoint 2016

If you use SharePoint, you’ll know that some mechanism to wake up the hosted sites after the application pools are recycled overnight is very helpful (essential even) for the end user experience.

I’ve compiled a version of SPWakeUp for SharePoint 2016, which can be downloaded from

If you want to compile this for yourself, this is the method I followed to get the above version:

  1. Grab a copy of the source code for SPWakeUp from and unpack it.
  2. Open the solution in Visual Studio (I used Visual Studio 2015) and allow the automatic upgrade.
  3. Replace the reference to the Microsoft.SharePoint.dll in the solution with one pointing to the SharePoint 2016 version. You’ll want to grab a copy from C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI on a SharePoint 2016 server.
  4. Modify the target framework for the application. I used 4.6.1 for the build above.
  5. Build either the debug or release version.
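
If you'd rather build from the command line, something along these lines should do it (a sketch, not taken from the original post; adjust the solution path and framework version to match your setup):

msbuild SPWakeUp.sln /p:Configuration=Release /p:TargetFrameworkVersion=v4.6.1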

First steps with Language Understanding Intelligent Service (beta)

LUIS is part of the suite of Microsoft Cognitive Services announced at Build. It allows you to process unstructured language queries and use them to interface with your applications. It has particular relevance to any business process which benefits from human interaction patterns (support, customer service etc.). Having looked at LUIS from the point of view of an absolute novice, I've put together a few points which may be useful to others.

I’ve broken this down into ‘things to do’ when getting started. To illustrate the steps, I’ll use the very simple example of ordering a sandwich.  A sandwich has a number of toppings and can also have sauce on it. 

The role of LUIS is to take plain English and parse it into a structure that my app can process.  When I say “can I order a bacon sandwich with brown sauce” I want LUIS to tell me that a) an order has been placed, b) the topping is bacon, c) the sauce is brown.  Once LUIS has provided that data then my app can act accordingly.

So, for LUIS to understand the sandwich ordering language, you need to define entities, define intents, and then train and test the model before using it. Read on to understand what I mean by these high-level statements.

1. Define your ‘entities’

These are the ‘things’ you are talking about.  For me, my entities are ‘sauce’, and ‘topping’.  I also have ‘extra’ which is a generalisation of any other unstructured information the customer might want to provide – so things like Gluten free, no butter etc.

2. Define your ‘intents’

These are the contexts in which the entities are used. For me, I only have one intent defined, which is 'order'.

3. Test/train the model – use ‘utterances’

After entities and intents have been defined you can begin to test the model. 

Initially, the untrained LUIS will be really bad at understanding you.  That’s the nature of machine learning. But as it is trained with more language patterns and told what these mean it will become increasingly accurate.

To begin the process you need to interact with LUIS. LUIS calls these interactions 'utterances'. An utterance is an unstructured sentence that hasn't been processed in any way.

In the portal you can enter an utterance to train or test the model. 

Here, I am adding the utterance “can I order a sausage sandwich with tomato sauce”.  I’ll select the entities that are part of that utterance (sausage and tomato sauce) and tell LUIS what they are.


You can repeat this process with as many variations of language as possible, for example "give me a bacon sandwich" and "I want a sausage sandwich with brown sauce". It's recommended to try this exercise with different people, as different people will say the same thing with unique speech patterns. The more trained variations the better, basically. You can, and will, come back to train it later though, so don't feel it has to be 100% at this stage.

Once you go live with the model, LUIS will come across patterns that it cannot fully process, and for these the feedback loop for training is very important. LUIS will log all the interactions it has had; you can access them using the publish button.


These logs are important as they give you insight into your customers' language. You should use this data to train LUIS and improve its future accuracy.

4. Use the model

Finally, and I guess most importantly, you need to use the model. If you look in the screenshot above there is a box where you can type in a query, aka an utterance. The query string will look something like this (the endpoint URL prefix is omitted here):

...<appid>&subscription-key=<subscriptionkey>&q=can%20i%20order%20a%20bacon%20sandwich%20with%20brown%20sauce

This basically issues an HTTP GET against the LUIS API and returns the processed result as a JSON object.
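
The screenshot of the response isn't reproduced here, but the returned JSON is shaped roughly like this (an illustrative sketch based on the beta response format; the exact fields and scores will vary):

{
  "query": "can i order a bacon sandwich with brown sauce",
  "intents": [
    { "intent": "order", "score": 0.93 }
  ],
  "entities": [
    { "entity": "bacon", "type": "topping", "score": 0.95 },
    { "entity": "brown", "type": "sauce", "score": 0.89 }
  ]
}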


In the returned data you can see:

A) the query supplied to LUIS.

B) the topping entity that it picked out.

C) the sauce entity that it picked out.

In addition to this, you will see other things such as the recognised intent, the confidence of the results, etc. I encourage you to explore the structure of this data. You can use this data in any application that can issue an HTTP GET, and process the results accordingly.
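
As an example, here is a minimal PowerShell sketch of calling the model (not from the original post, and the endpoint host is an assumption based on the beta at the time - check the portal for your app's actual URL):

# Hypothetical values - substitute your own application id and subscription key.
$appId = "<appid>"
$key   = "<subscriptionkey>"
$query = [uri]::EscapeDataString("can i order a bacon sandwich with brown sauce")

# Assumed beta endpoint; the portal shows the definitive URL for your app.
$uri = "https://api.projectoxford.ai/luis/v1/application?id=$appId&subscription-key=$key&q=$query"

# Invoke-RestMethod deserialises the JSON response into objects.
$result = Invoke-RestMethod -Uri $uri
$result.intents[0].intent                            # the recognised intent
$result.entities | Format-Table entity, type, score  # the entities it picked out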

I'll write later about the Bot Framework, which has a built-in forms engine to interface with LUIS models, enhancing the language capabilities with structured processing.

This is a simple example but hopefully it shows the potential use cases for this, and gives you some pointers to get started.

Notes from the field: Using Hyper-V NAT Switch in Windows 10

The new NAT virtual switch that can be created on Windows 10 for Hyper-V virtual machines is a wonderful thing if you're an on-the-go evangelist like myself. For more information on how to create one, see Thomas Maurer's post on the subject.

This post is not about creating a new NAT switch. It is, however, about recreating one, the pitfalls that occur, and how I now run my virtual environment with some hacky PowerShell and a useful DHCP server utility.

Problems Creating NAT Switch? Check Assigned IP Addresses

I spent a frustrating amount of time this week trying to recreate a NAT switch after deleting it. Try as I might, every time I executed the command to create the new switch it would die. After trial and error I found that the issue was down to the address range I was using. If I created a new switch with a new address range everything worked, but only that one time: If I deleted the switch and tried again, any address range that I'd used would fail.

This got me digging.

I created a new switch with a new address range. The first thing I noticed was that I had a very long routing table; Get-NetRoute showed routes for all the address ranges I had previously created. That led me to look at the network adapter created by the virtual switch. When you create a new NAT switch the resulting adapter gets the first IP address in the range bound to it (so a range of, say, results in an adapter IP of My adapter had an IP address for every single address range I'd created and then deleted.

Obviously, when the switch is removed the IP configuration is being stored by Windows somewhere. When a new switch is created all that old binding information is reapplied to the new switch. I'm not certain whether this is related to the interface index, the name, or something else, since when I remove and re-add the switch on my machine it always seems to get the same interface index.

A quick bit of PowerShell allowed me to rip all the IP addresses from the adapter at once. The commands below are straightforward. The first allows me to find the adapter by name (shown in the Network Connections section of control panel) - replace the relevant text with the name of your adapter. From that I can find the interface index, and the second command gets all the IPv4 addresses (only IPv4 seems to have the problem here) and removes them from the interface - again, swap your interface index in here. I can then use PowerShell to remove the VMswitch and associated NetNat object.

Get-NetAdapter -Name "vEthernet (NATSwitch)"
Get-NetIPAddress -InterfaceIndex 13 -AddressFamily IPv4 | Remove-NetIPAddress
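
Removing the switch and its associated NAT object looks something like this (a sketch assuming everything was created with the name NATSwitch; swap in your own names):

Remove-VMSwitch -Name "NATSwitch" -Force
Get-NetNat -Name "NATSwitch" | Remove-NetNat -Confirm:$false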

Once that's done I can happily create new virtual switches using NAT and an address range I've previously had.

Using DHCP on a NAT switch for ease

My next quest was for a solution to the conundrum we all have when running VMs: IP addresses. I could assign each VM a static address, but then I have to keep track of them. I also have a number of VMs in different environments that I want to run, and I need external DNS to work. DHCP is the answer, but Windows 10 doesn't have a DHCP server and I don't want to build a VM just to do that.

I was really pleased to find that somebody has already written what I need: DHCP Server for Windows. This is a great utility that can run as a service or as a tray app. It uses an ini file for configuration, and by editing the ini file you can manage things like address reservations. Importantly, you can choose which interface the service binds to, which means it can be run only against the virtual network and not cause issues elsewhere.

There's only one thing missing: DNS. Whilst the DHCP server can run its own DNS if you like, it still has a static configuration for the forwarder address. In a perfect world I'd like to be able to tell it to hand my PC's primary DNS address to clients requesting an IP.

Enter PowerShell, stage left...

Using my best Google-fu I tracked down a great post by Lee Holmes from a long time ago about using PowerShell to edit ini files through the old faithful Windows API calls for PrivateProfileString. I much prefer letting Windows deal with my config file than writing some complex PowerShell parser.

I took Lee's code and created a single PowerShell module with three functions as per his post, which I called Update-IniFiles.psm1. I then wrote another script that used those functions to edit the ini file for DHCP Server.
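
To give a flavour of what's in that module, the write half boils down to something like this (a condensed sketch of the approach from Lee's post, not the module verbatim):

# Expose the Win32 ini-file API to PowerShell.
$signature = @'
[DllImport("kernel32.dll", CharSet=CharSet.Unicode, SetLastError=true)]
public static extern bool WritePrivateProfileString(
    string section, string key, string value, string filePath);
'@
Add-Type -MemberDefinition $signature -Name IniFile -Namespace Win32Api

function Set-PrivateProfileString($file, $category, $key, $value)
{
    # Let Windows handle parsing and writing the ini file.
    [Win32Api.IniFile]::WritePrivateProfileString($category, $key, $value, $file) | Out-Null
}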

It's dirty and not tested on anything but my machine, but here it is:

import-module C:\src\Update-IniFiles.psm1

# Find the first DNS server for the interface that carries the default IPv4 route.
$dnsaddr = (Get-DnsClientServerAddress -InterfaceIndex (Get-NetRoute -DestinationPrefix "")[0].ifIndex -AddressFamily IPv4).ServerAddresses[0]

if ($dnsaddr.Length -gt 0)
{
    Set-PrivateProfileString "C:\Program Files\DHCPSrv\dhcpsrv.ini" GENERAL DNS_0 $dnsaddr
}
else
{
    # Failsafe: fall back to Google's public DNS server.
    Set-PrivateProfileString "C:\Program Files\DHCPSrv\dhcpsrv.ini" GENERAL DNS_0 ""
}

The second line is the one that may catch you out. It gets the DNS server information for the interface that is linked to the default IPv4 route. On my machine there are multiple entries returned by the get-netroute command, so I grab the first one from the array. Similarly, there are multiple DNS servers returned and I only want the first one of those, too. I should really expand the code and check what's returned, but this is only for my PC - edit as you need!

Just in case I get nothing back, I have a failsafe, which is to set the value to the Google public DNS server on

Now I run that script first, then start my DHCP server and all my VMs get valid IP information and can talk on whatever network I am connected to, be it physical or wireless.

Azure Logic Apps-Service Bus connector not automatically triggering?

By default Logic Apps will be set to trigger every 60 minutes which, if you are not aware, may lead you to think that your logic app isn't working at all!

As Logic Apps are in preview, there are some features that are not available through the designer yet, but you can do a lot through the Code view.

In this instance you can set the frequency to Second, Minute, Hour, Day, Week, Month or Year. For a frequency of every minute you need to be on a standard service plan or better. If your service plan doesn't allow the frequency, you will get an error as soon as you try to save the logic app. Here's what I set to have it run every minute.
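
The image of my settings isn't reproduced here, but the relevant recurrence section in Code view looks something like this (a sketch following the preview schema at the time; the surrounding trigger definition is omitted):

"recurrence": {
    "frequency": "Minute",
    "interval": 1
}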


More information can be found at

Build 2016 - Day 0

So, Day 0 of Build. A day to settle in, relax and register for Build. My mother, my sister and I found ourselves wandering through San Francisco, and found our way back to the Microsoft Store.....

We then headed off to our first interview with Sam Guckenheimer. I'll be writing a separate blog post talking about how that went, so please go have a look.

It was a quick dash across San Francisco to make it to the Moscone centre for registration where I got my Build badge!! And adorned it with the iconic Black Marble.

Our final stop for the day was a student panel close to the Moscone centre. Four other MSPs from around the world and I sat on this panel in front of Gartner, Forrester and a number of other large analyst firms and reporters. I was joined by two US Imagine Cup finalists and two other MSPs from across Europe. We were asked questions such as what we find most exciting in technology. Many answers were given, such as the hardware, or the reward of writing bug-free software. I personally believe that the most exciting thing in technology is seeing what a difference it is making to the lives around us. As a small example, in the UK the BBC are pushing an initiative where they want all children to be able to code. To do this they have launched a microcomputer called the micro:bit. Every child aged 11 will receive one, and there are lots of resources to help both students and teachers get the most out of it. Things like that I personally find exciting in computing, and they make me so happy to be in the field myself.

Other questions included what we thought of the gender divide in computing and whether we found it was prevalent in our own places of work and universities. On our panel of five it so happened that three of us were female, but I would say that gives a skewed perspective. In my year, out of close to 200 students, fewer than ten are female, and the other two MSPs there agreed that there is a similar spread for them also. Over the hour we were asked many questions and, though at first it seemed like one of the most terrifying things I had ever done, I actually enjoyed it lots. Though the last question, of what we want to do after being students, really scared most of us. It was really interesting to hear the thoughts and opinions of other like-minded and similarly aged people from around the world. A huge thank you to Jennifer Ritzinger for being so lovely on the day as well :)

Due to hunger, and the adrenalin wearing off, we found ourselves heading back for food once more. This time The Cheesecake Factory, where the food portions are as big as your head... Unfortunately, due to the fact they don't take bookings, we didn't finish eating until close to ten. Definitely time for bed!!


Build 2016 - Getting there

Many would say starting a nearly 24-hour journey with a storm is far from ideal, but I would say it added some mild fun to the whole affair. Sadly storms mean planes cannot take off, which turned a one-hour flight into a one-hour-forty wait and then an hour's flight. Still, it meant plenty of time for snoozes.

Unfortunately our delay in Manchester meant that we had only half an hour to get off our plane, get through security in Paris and then run across the terminal to get to our gate. We luckily made it with seconds to spare, and the gates were closed as soon as we walked through. As always with a long-haul flight, I found my options of what to do limited to either napping or watching whatever in-flight entertainment there was on offer. There was only one choice...


After a long 11-hour flight we landed in joyous sunny San Francisco. Moments after leaving the airport I found myself removing my jacket and shedding layers due to the warm temperatures and bright sunshine. After arriving safely at our hotel there was the obvious question of where to stop first. There was a unanimous decision to stop at the Microsoft Store.

As ever all the latest tech was out, from phones to Surface Books, and I fawned over them all. Who doesn't like shiny new tech...

Sadly the phones were secured down - 'sigh' - maybe next time. Considering that for us it was getting close to 4pm (midnight in the UK), it was decided that it would be best to have an early meal and then an early bedtime.

Luckily there are some amazing restaurants nearby, including a Mexican one. After eating what I can only describe as my body weight in guacamole, it was time to call it a night and prepare for Build Day 0.

Azure Logic Apps–Parsing JSON message from service bus

What I want: When the logic app trigger receives a JSON-formatted message from an Azure Service Bus topic, I want to send a notification to the "email" field. My sample message structure looks like this:
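
(The screenshot of the sample message isn't reproduced here; a minimal message with the relevant field would look something like the sketch below, where the address and any other fields are invented for illustration.)

{
    "email": "",
    "subject": "Hello from service bus"
}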


What happens: A message received on service bus doesn't have a predefined format – it could be JSON, XML, or anything else – so Logic Apps doesn't know the structure of the message. So in the designer, it looks like:


Which is great, but it just dumps out the entire object, and not the email field that I need.

How to fix it: Fortunately the fix is pretty easy; basically you need to:

1) Select the Content output (above); you are going to edit this value.

2) Switch over to ‘Code view’ and manually type the expression (below).

If you haven't used it before, code view can be found in the toolbar:


Once you are in the code view, scroll down to the connector you are interested in. You will see the expression for the trigger body. This is the entire message received from the trigger, basically.
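
It will be something along these lines (a sketch; the property it appears under depends on the action you are passing it to):

"body": "@triggerBody()"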


You need to modify this to parse the entire message using the 'json' function; then you can access its typed fields.

If you have ever used JSON.parse (or any object deserialization in pretty much any language for that matter) this concept should be familiar to you.  When I was done I ended up with:
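
The screenshot is missing here, but the expression was along these lines (a sketch - depending on the connector version the message content may sit under a property such as ContentData rather than directly on the trigger body):

"to": "@{json(triggerBody())['email']}"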


I've broken the entire segment into two parts: a) parses the content, and b) accesses the 'email' field of the parsed JSON object.

Hope this helps someone!


Update: if you are seeing an error when trying to parse, see my new blog post Azure Logic Apps-The template language function 'json' parameter is not valid.