The blogs of Black Marble staff

vNext Release Management and Network Isolation

If you are trying to use Release Management, or any deployment tool, with a network isolated Lab Management setup you will have authentication issues. Your isolated domain is not part of your production domain, so you have to provide credentials. In the past this meant Shadow Accounts or the simple expedient of running a NET USE at the start of your deployment script to provide a login to the drops location.

In Release Management 2013.4 we get a new option to address this issue if you are using DSC based deployment. This is Deploy from a build drop using a shared UNC path. In this model the Release Management server copies the contents of the drops folder to a known share and passes credentials to access it down to the DSC client (you set these as parameters on the server).

This is a nice formalisation of the tricks we had to pull by hand in the past, and something I had missed when Update 4 came out.

Thoughts in vNext deployment in Release Management

The DSC two step

When working with DSC a difficult concept can be that your desired state script is ‘compiled’ to a MOF file that is then ‘run’ by the desired state manager on the target machine. This is a two step affair and you have no real input on the second part. This is made more complex when Release Management is involved.

Colin Dembovsky did an excellent pair of posts on getting Release Management working with DSC based vNext templates. The core of the first post is that he made use of the Script DSC resource to run SQLPackage.EXE and MSDeploy to do the actual deployment of a system as well as using it to manage the transformation of the configuration files in his MSDeploy package.

This is where the two step issue raises its head. Even with his post I managed to get myself very confused.

The problem is that Release Management passes variables into the DSC script, and these are evaluated when the MOF is compiled. For most resources this is fine, but for the Script resource you have to be very aware that the string that makes up the actual script is treated as just a string, not code, so its variables are not evaluated until the MOF is run by the desired state manager, by which point the Release Management variables are long gone.

Colin provides an answer to this problem with his cScriptWithParams resource. This resource takes the Release Management provided properties and passes them as parameters into the MOF compilation, forcing their evaluation and neatly side stepping the problem. He uses this technique for the SetParameters.XML transform.
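The transform itself is just token replacement in the SetParameters.xml file. Stripped of the DSC plumbing, the same idea can be sketched in Python (the token names mirror the DSC example later in this post; the file content and values are illustrative):

```python
# Sketch of the SetParameters.xml token replacement done in the SetScript
# block. Token names (__DBContext__, __SiteName__) come from the DSC
# example below; the values would come from Release Management parameters.

def apply_tokens(content: str, params: dict) -> str:
    """Replace each __token__ placeholder with its supplied value."""
    for token, value in params.items():
        content = content.replace(token, value)
    return content

template = '<setParameter name="DBContext" value="__DBContext__" />'
result = apply_tokens(template, {"__DBContext__": "Server=sql01;Database=Wcf"})
```

The key point is that the replacement values must be real strings by the time this runs, which is exactly what forcing evaluation at MOF compilation achieves.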

This is all good, but it got me thinking: his post has a number of hard coded paths, and also copies the deployment files to a ‘known location’. Is all this really required if we can pass in the Release Management $ApplicationPath?

So I swapped all my Script resources to use the cScriptWithParams resource, passing in $ApplicationPath and thus removing the need to copy the files from their default location.

      cScriptWithParams SetConStringDeployParam
      {
            GetScript = { @{ Name = "SetDeployParams" } }
            TestScript = { $false }
            SetScript = {
                $paramFilePath = "$folder\_PublishedWebsites\WcfService_Package\WcfService.SetParameters.xml"
                $paramsToReplace = @{
                      "__DBContext__" = $context
                      "__SiteName__" = $siteName
                }
                $content = gc $paramFilePath
                $paramsToReplace.GetEnumerator() | % {
                    $content = $content.Replace($_.Key, $_.Value)
                }
                sc -Path $paramFilePath -Value $content
            }
            cParams =
            @{
                context = $context;
                siteName = $siteName;
                folder = $ApplicationPath;
            }
      }


        cScriptWithParams DeploySite
        {
            GetScript = { @{ Name = "DeploySite" } }
            TestScript = { $false }
            SetScript = {
                & "$folder\_PublishedWebsites\WcfService_Package\WcfService.deploy.cmd" /Y
            }
            cParams =
            @{
                folder = $ApplicationPath;
            }
            DependsOn = "[cScriptWithParams]SetConStringDeployParam"
        }

I think this gave an easier to follow script, though I do wonder about my naming convention; maybe I need to adopt a nomenclature for inner script variables as opposed to global ones.

Where are my parameters stored?

However, this does raise the question of where these ‘global’ parameters come from. We have two options:

  • A PowerShell Configuration data file (the standard DSC way)
  • Release management parameters

Either is valid. If you want to source control all your configuration, the first option is good. However, what happens if you need to store secrets? In that case the ability to store a value encrypted within Release Management is useful.
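For the first option, a minimal ConfigurationData file might look something like this (a .psd1 sketch in the standard DSC shape; the node and property names here are hypothetical, not from the deployment above):

```powershell
# Hypothetical ConfigurationData file, e.g. MyEnvironment.psd1.
# Node and property names are illustrative only.
@{
    AllNodes = @(
        @{
            NodeName = 'WebServer01'
            SiteName = 'WcfService'
            # Secrets such as connection strings are better held as
            # encrypted Release Management parameters than checked in here.
        }
    )
}
```

A configuration can then read these values via $AllNodes, while the secrets stay encrypted on the Release Management server.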

In reality I expect we will use a combination, maybe with everything bar secrets in the configuration file.

It’s beginning to look a lot like Christmas…

Day 931 in the House of Black Marble…

You know what that means don’t you…? It’s time for Christmas jumpers, Rockin’ Around the Christmas Tree, and hot chocolate. Because who doesn’t like snuggling into chunky knit, with liquid joy, and Brenda Lee crooning at you? No-one that’s who…

And to top it off why not try a couple of festive recipes to get you in the mood – all courtesy of our Christmas card this year: Black Marble Brigade and the Christmas Cookery Conundrum – coming to a desk or doormat near you. If you haven’t checked out all the goodies then you’re missing out - make sure you try the Stained Glass Biscuits.

We had a great time taste-testing all the creations that have gone into the card and on to the website, and they all have the Black Marble Seal of Approval – so get scoffing! And if you’d like to see what we’ve cooked up in previous years check out our Christmas website. Or for festive Black Marble pictures - take a peek at our Facebook page.

Here’s to having a Merry Christmas and a Happy New Year!


Lumia Denim

Don’t worry, Nokia are not planning to revive 70s nostalgia aftershave; Denim is the new update for Lumias, providing better low-level support of the hardware.

Denim offers several key updates:

Camera – for the 830/930/1520, updates specifically for HDR, Dynamic Flash and 4K video, plus speed fixes for the 1520. As a user you will see this surfaced in the new Lumia Camera application update.

Image Quality – Denim also comes with a new generation of image processing algorithms which “should” improve picture quality.

Cortana – for the 1520 and 930 there is a new “Hey Cortana” voice activation which will start Cortana.

Glance – Now has Bing weather, Bing Health and other applications

Availability, as ever, is network dependent, but as this is generally improving, hopefully you will catch it soon.

Check for your area here

I haven’t received it on my 930 yet; let’s hope Glance makes a sorely needed appearance there.


Azure Websites: Blocking access to the url

I’ve been setting up one of our services as the backend service for Azure API Management. As part of this process we have mapped a DNS domain to point to the service. As the service is hosted in Azure Websites, there are now two urls that can be used to access the service: the mapped domain and the default azurewebsites.net url. I wanted to stop a user from accessing the site using the azurewebsites.net url and only allow access via the mapped domain. This is easy to achieve and can be configured in the web.config file of the service.

In the <system.webServer> section add the following configuration

      <rewrite>
        <rules>
          <rule name="Block traffic to the raw azurewebsites url" patternSyntax="Wildcard" stopProcessing="true">
            <match url="*" />
            <conditions>
              <add input="{HTTP_HOST}" pattern="*.azurewebsites.net" />
            </conditions>
            <action type="CustomResponse" statusCode="403" statusReason="Forbidden"
              statusDescription="Site is not accessible" />
          </rule>
        </rules>
      </rewrite>

Now if I try to access my site through the azurewebsites.net url I get a 403 error, but accessing it through the mapped domain is fine.
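What the rule does boils down to a simple check on the Host header; here is the same logic sketched in Python (the hostnames are illustrative, not from my actual service):

```python
# Sketch of the rewrite rule's logic: requests arriving on the default
# *.azurewebsites.net hostname get a 403, the mapped domain is served.
# Hostnames below are hypothetical examples.

def check_host(host: str) -> int:
    """Return the HTTP status the rule would produce for this Host header."""
    if host.endswith(".azurewebsites.net"):
        return 403  # Forbidden: raw Azure Websites url is blocked
    return 200      # mapped custom domain is served normally

status_raw = check_host("myservice.azurewebsites.net")
status_mapped = check_host("api.example.com")
```

The real work, of course, is done by the IIS URL Rewrite module before the request ever reaches application code.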

Setting a build version in a JAR file from TFS build

Whilst helping a Java based team (part of a larger organisation that used many sets of both Microsoft and non-Microsoft tools) to migrate from Subversion to TFS, I had to tackle their Jenkins/Ant based builds.

They could have stayed on Jenkins and switched to the TFS source provider, but they wanted to at least look at how TFS build would better allow them to trace their builds against TFS work items.

All went well: we set up a build controller and agent specifically for their team and installed Java onto it, as well as the TFS build extensions. We were very quickly able to get our test Java project building on the new build system.

One feature of their old Ant scripts was to store the build name/number in the manifest of any JAR files created – a good plan, as it is always good to know where something came from.

When asked how to do this with TFS build, I thought ‘no problem, I will just use a TFS build environment variable’ and added something like the following

<property environment="env"/>

<target name="jar">
        <jar destfile="${basedir}/javasample.jar" basedir="${basedir}/bin">
                <manifest>
                        <attribute name="Implementation-Version" value="${env.TF_BUILD_BUILDNUMBER}" />
                </manifest>
        </jar>
</target>

But this did not work: I just saw the literal text ${env.TF_BUILD_BUILDNUMBER} in my manifest; basically the environment variable could not be resolved.

After a bit more thought I realised the problem: the Ant/Maven build extensions for TFS are based on TFS 2008 style builds, and the build environment variables are a TFS 2012 and later feature, so of course they are not set.
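Ant’s behaviour here – a property reference that cannot be resolved is left as literal text rather than failing the build – is what made the symptom so visible. It can be illustrated with a small sketch (this mimics the behaviour, it is not Ant’s implementation):

```python
import re

# Expand ${name} references from a property map, Ant-style: unknown names
# are left as literal text, which is why ${env.TF_BUILD_BUILDNUMBER}
# appeared verbatim in the manifest when the variable was never set.

def expand(template: str, props: dict) -> str:
    def repl(match):
        name = match.group(1)
        return props.get(name, match.group(0))  # unknown -> leave as-is
    return re.sub(r"\$\{([^}]+)\}", repl, template)

unset = expand("Implementation-Version: ${env.TF_BUILD_BUILDNUMBER}", {})
resolved = expand("Implementation-Version: ${BuildNumber}",
                  {"BuildNumber": "JavaSample.Ant.Manual_20141216.7"})
```

So the fix is not to change the Jar target’s behaviour but to make sure the property is actually defined when Ant runs.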

A quick look in the automatically generated TFSBuild.proj file for the build showed that the MSBuild $(BuildNumber) property was passed into the Ant script as a property, so it could be referenced in the Ant jar target (note the brackets change from () to {})

<target name="jar">
        <jar destfile="${basedir}/javasample.jar" basedir="${basedir}/bin">
                <manifest>
                        <attribute name="Implementation-Version" value="${BuildNumber}" />
                </manifest>
        </jar>
</target>


Once this change was made I got the manifest I expected, including the build number:

Manifest-Version: 1.0
Ant-Version: Apache Ant 1.9.4
Created-By: 1.8.0_25-b18 (Oracle Corporation)
Implementation-Version: JavaSample.Ant.Manual_20141216.7

Azure Media Services Live Media Streaming General Availability

Yesterday Scott Guthrie announced a number of enhancements to Microsoft Azure. One of the enhancements is the General Availability of Azure Media Services Live Media Streaming. This gives us the ability to stream live events on a service that has already been used to deliver big events such as the 2014 Sochi Winter Olympics and the 2014 FIFA World Cup.

I’ve looked at this for a couple of our projects and found it relatively fast and easy to set up a live media event, even from my laptop using its built in camera. There’s a good blog post that walks you through the process of setting up the Live Streaming service. I used this post and was quickly streaming both audio and video from my laptop.

The main piece of software that you need to install is a video/audio encoder that supports Smooth Streaming or RTMP. I used the Wirecast encoder as specified in the post. You can try out the encoder for 2 weeks as long as you don’t mind seeing the Wirecast logo on your video (which is removed if you buy a license). Media Services pricing can be found here

The Media Services team have provided a MPEG-DASH player to help you test your live streams.

It appears that once you have created a stream, it is still accessible on demand after the event has completed. Also, there is around a 20s delay when you receive the stream on your player.

Gadgeteer Traffic Lights

The other workshop I took part in at the Inspiring Students event yesterday was a Gadgeteer one with Paul Foster (Microsoft).

We made a set of traffic lights using the kits. When it was red, it beeped when you pressed a button.

There was a display with a Christmas tree on it! 

I really enjoyed it, I liked plugging together the different parts and writing code to make it work.


3D Printing myself in Carbonite

In the Inspiring Students event I went to yesterday, I took part in a workshop with David Grey from the University of Hull.

It was very interesting, and I loved the 3D printer.  I was scanned using Kinect, and then printed like I was in carbonite!

It was fun :)

Inspiring Students in Computing

Today I went along to an event run by a local software company, Black Marble, where they get professional speakers to speak to local A Level students, and try to get them to keep studying Computing.

It was an early start, but it was worth it; a really fun and interesting day.

First up was Robert Hogg, the MD of Black Marble.  He was talking about design for the Modern UI and the importance of a clean user experience.  He spoke about developing for different devices, and the Internet of Things.

Next up was Andrew Fryer, from Microsoft, and he was using Pharrell Williams' Happy to make his point :)

Will Smith from the University of York did a great talk on vision.

Lunch was excellent, everyone was hungry.  We had yummy sandwiches.  I had Cajun Chicken.

In the afternoon, I saw the two Garys.  Gary Pennington spoke about Operating systems, and Gary Short talked about the Maths behind Facebook. 

Everyone left with some swag, as well as a selection box :)