But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Publishing more than one Azure Cloud Service as part of a TFS build

Using the process in my previous post you can get a TFS build to create the .CSCFG and .CSPKG files needed to publish a Cloud Service. However, you hit a problem if your solution contains more than one Cloud Service project (a single cloud service project with multiple roles is not a problem).

The method outlined in the previous post drops the two files into a Packages folder under the drops location. The .CSPKG files are fine, as they have unique names. However, there is only one ServiceConfiguration.cscfg: whichever one was created last overwrites the others.

Looking in the cloud service projects I could find no way to rename the ServiceConfiguration file. It looks like it behaves like an app.config or web.config file, i.e. its name is hard coded.

The only solution I could find was to add a custom target that is set to run after the publish target. This was added to the end of each .CCPROJ file, using a text editor, just before the closing </Project> element

 <Target Name="CustomPostPublishActions" AfterTargets="Publish">
    <Exec Command="IF '$(BuildingInsideVisualStudio)'=='true' exit 0
    echo Post-PUBLISH event: Active configuration is: $(ConfigurationName) renaming the .cscfg file to avoid name clashes
    echo Renaming the .CSCFG file to match the project name $(ProjectName).cscfg
    ren $(OutDir)Packages\ServiceConfiguration.*.cscfg $(ProjectName).cscfg
    " />
  </Target>
   <PropertyGroup>
    <PostBuildEvent>echo NOTE: This project has a post publish event</PostBuildEvent>
  </PropertyGroup>

 

Using this I now get unique names for the .CSCFG files, as well as for the .CSPKG files, in my drops location, all ready for Release Management to pick up.
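For illustration, with two hypothetical cloud service projects called MyCloudServiceA and MyCloudServiceB, the Packages folder under the drops location ends up looking something like this:

    Packages\
        MyCloudServiceA.cspkg
        MyCloudServiceA.cscfg
        MyCloudServiceB.cspkg
        MyCloudServiceB.cscfg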

Notes:

  • I echo out a message in the post build event too, just as a reminder that I have added a custom target that cannot be seen in Visual Studio and so is hard to discover
  • I use an IF test to make sure the commands are only run on the TFS build box, not on a local build. The main reason for this is that the path names are different for local builds as opposed to TFS builds. If you do want a rename on a local build you need to change the $(OutDir)Packages path to $(OutDir)app.publish (see the sketch after this list). However, it seemed more sensible to let the default behaviour occur when running locally
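If you did want the rename on local builds too, a minimal sketch (my illustration, not part of the original fix, so test it against your own project layout) would be a second target conditioned the other way around, using the app.publish path:

  <Target Name="CustomPostPublishActionsLocal" AfterTargets="Publish" Condition="'$(BuildingInsideVisualStudio)'=='true'">
    <!-- Local publishes put the files in app.publish rather than Packages -->
    <Exec Command="ren $(OutDir)app.publish\ServiceConfiguration.*.cscfg $(ProjectName).cscfg" />
  </Target>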

Getting the correct path and name for a project to pass as an MSBuild argument in TFS Build

I have been sorting out some builds for use with Release Management that include Azure Cloud Services. To get the correct packages built by TFS I have followed the process in my past blog post. The problem was that I kept getting the build error

The target "Azure Packages\BlackMarble.Win8AppBuilder.AzureApi" does not exist in the project.

The issue was that I could not get the solution folder/project name right for the MSBuild target parameter. Was it the spaces in the folder name? I just did not know.

The solution was to check the project file that was actually being run by MSBuild. As you may know, a .SLN file is not in MSBuild format, so you can't just open it in Notepad and look (unlike .CSPROJ or .VBPROJ files); the MSBuild project it represents is created by MSBuild on the fly. To see this generated code, at a developer's command prompt, run the following commands

cd c:\mysolutionroot
Set MSBuildEmitSolution=1
msbuild

When the MSBuild command is run, whether the build works or not, a mysolution.sln.metaproj file should be created. If you look in this file you will see the actual targets MSBuild thinks it is dealing with.
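If you don't fancy reading the whole generated file, a quick way to pick out just the target names is something like this (assuming the solution is called mysolution.sln):

    findstr /C:"<Target Name" mysolution.sln.metaproj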

In my case I could see

<Target Name="Azure Packages\BlackMarble_Win8AppBuilder_AzureApi:Publish">

So the first issue was that my dots (.) had been replaced by underscores (_).

I changed my MSBuild target argument to that shown in the file, but still had a problem. However, once I changed the space in the solution folder name to %20, all was OK. So my final MSBuild argument was

/t:Azure%20Packages\BlackMarble_Win8AppBuilder_AzureApi:Publish

[Screenshot: the MSBuild arguments set in the build definition]

Building Azure Cloud Applications on TFS

If you are doing any work with Azure Cloud Applications there is a very good chance you will want your automated build process to produce the .CSPKG deployment file; you might even want it to do the deployment too.

On our TFS build system, it turns out this is not as straightforward as you might hope. The problem is that the MSBuild publish target that creates the files puts them in the $(build agent working folder)\source\myproject\bin\debug folder, unlike the output of the build target, which goes into the $(build agent working folder)\binaries\ folder that is copied to the build drops location. Hence, though the files are created, they are not accessible to the team alongside the rest of the built items.

I have battled to sort this out for a while, trying to avoid the need to edit our customised TFS build process template. This is something we try to avoid where possible, favouring environment variables and MSBuild arguments where we can get away with it. There is no point denying that editing build process templates is a pain point on TFS.

The solution – editing the process template

Turns out a colleague had fixed the same problem a few projects ago and the functionality was already hidden in our standard TFS build process template. The problem was that it was not documented; a lesson for all of us that it is a very good idea to put customisation information in a searchable location, so others can find customisations that are not immediately obvious. Frankly, this is one of the main purposes of this blog: somewhere I can find what I did years later, as I won't remember the details.

Anyway, the key is to make sure the publish target for MSBuild uses the correct location to create the files. This is done using a pair of MSBuild arguments in the advanced section of the build configuration

  • /t:MyCloudApp:Publish - this tells MSBuild to perform the publish action for just the project MyCloudApp. You might be able to just use /t:Publish if only one project in your solution has a Publish target
  • /p:PublishDir=$(OutDir) - this is the magic. We pass in the placeholder value $(OutDir). At this point we don't know the target binary location, as it is build agent/instance specific; a customisation in the TFS build process template converts this placeholder to the correct path. The two arguments are combined as shown below
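Put together, the MSBuild arguments box contains something like the following (MyCloudApp being a stand-in for your own project name):

    /t:MyCloudApp:Publish /p:PublishDir=$(OutDir)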

In the build process template, in the Initialize Variable sequence within Run on Agent, add an If activity.

[Screenshot: the If activity added to the build process template]

  • Set the condition to MSBuildArguments.Contains("$(OutDir)")
  • Within the true branch add an Assignment activity for the MSBuildArguments variable to MSBuildArguments.Replace("$(OutDir)", String.Format("{0}\{1}\\", BinariesDirectory, "Packages"))

This will swap the $(OutDir) placeholder for the correct TFS binaries location within that build.
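As a worked illustration (the path is hypothetical; it varies by agent and build), if BinariesDirectory for the running build were C:\Builds\1\MyTeamProject\MyBuild\Binaries, then the argument actually passed to MSBuild becomes something like

    /p:PublishDir=C:\Builds\1\MyTeamProject\MyBuild\Binaries\Packages\

so the publish output lands under the binaries folder and is copied to the drops location with everything else.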

After that it all just works as expected: the .CSPKG file etc. ends up in the drops location.

Other things that did not work (prior to TFS 2013)

I had also looked at running a PowerShell script at the end of the build process, and at adding an AfterPublish target within the MSBuild process (by adding it to the project file manually) that did a file copy. Both these methods suffered from the problem that when the MSBuild command ran it did not know the location to drop the files into. Hence the need for the customisation above.

Now I should point out that though we are running TFS 2013, this project was targeting the TFS 2012 build tools, so I had to use the solution outlined above: a process template edit. However, if we had been using the TFS 2013 process template as our base for customisation then we would have had another way to get around the problem.

TFS 2013 exposes the current build settings as environment variables. This would allow us to use an AfterPublish MSBuild target something like

<Target Name="CustomPostPublishActions" AfterTargets="AfterPublish" Condition="'$(TF_BUILD_DROPLOCATION)' != ''">
  <Exec Command="echo Post-PUBLISH event: Copying published files to: $(TF_BUILD_DROPLOCATION)" />
  <Exec Command="xcopy &quot;$(ProjectDir)bin\$(ConfigurationName)\app.publish&quot; &quot;$(TF_BUILD_DROPLOCATION)\app.publish&quot; /y " />
</Target>

So maybe a simpler option for the future?

The moral of the story: document your customisations and let your whole team know they exist.

Black Marble is hosting the Yorkshire Chapter of the Global Windows Azure Bootcamp on the 27th of April

Black Marble is hosting the Yorkshire Chapter of the Global Windows Azure Bootcamp, taking place in several locations globally on April 27th, 2013. This free, community organised event is a one day deep dive class that will get you up to speed on developing for Windows Azure. The class includes a trainer with deep real world experience of Windows Azure, as well as a series of labs so you can practice what you have just learned.

Black Marble’s event will be run by Robert Hogg (Microsoft Integration MVP) and Steve Spencer (Windows Azure MVP). Come along and join the global Azure event of the year!

Check out the prerequisites you need to install on your PC, and sign up here.

[Image: Global Windows Azure Bootcamp logo]

A fix for my problems logging in to TFSpreview.com

I use a number of site collections on the Azure hosted Team Foundation Service (http://tfspreview.com). I have just solved a problem where I could not login to one of them via Visual Studio (2010, Dev11 or TEE 11, I tried them all), but could login to my other collections. I could also access the problem collection if I logged in via a browser, just not with VS; all very good for work item management, but not much help for source code check-ins.

The Problem

The problem was that when I loaded Visual Studio and tried to select the collection https://mycollection.tfspreview.com in Team Explorer, the ‘Sign into Team Foundation Server’ form loaded and reloaded a few times whilst trying to redirect to an authentication provider. I then ended up with a TF31003 error. A retry, or the use of different credentials, did not help.

[Screenshot: the TF31003 error in the ‘Sign into Team Foundation Server’ form]

If I deleted the server from the list and tried to re-add it I got similar results, ending up at the LiveID sign-in screen, but with just an error message and no means to enter details.

[Screenshot: the LiveID sign-in screen showing only an error message]

The Solution

The problem was due to cached LiveID credentials. It was suggested that I clear my IE9 cookies, but this did not help. In the end I found the solution in the Credential Manager (Control Panel > User Accounts > Manage Users > Advanced > Manage Passwords).
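If you prefer the command line, the built-in cmdkey tool can list and remove stored credentials. These are illustrative commands; the exact target name of the cached LiveID entry will vary, so check the output of /list before deleting anything:

    cmdkey /list
    cmdkey /delete:TargetNameOfTheCachedLiveID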

I had recently installed SkyDrive on my PC. This had stored a cached LiveID, and it seems this cached SkyDrive LiveID was being used to access TFSpreview. Unfortunately this was my personal LiveID, not my work one. This personal LiveID had no rights to access the problem site collection, but I could get into the other collections because both my personal and work LiveIDs had access.

So I deleted the offending cached LiveID and tried Team Explorer again. This time I was prompted for a LiveID (though the user name field contained the wrong LiveID, I could correct it) and I could login.

[Screenshot: the LiveID prompt, now allowing credentials to be entered]

I then loaded SkyDrive (which I had exited) and it prompted me to re-enter my credentials. It recreated its cached credentials and seemed happy.

Interestingly, the recreated credentials did not seem to cause a problem this time; maybe it is an entry order issue?

I need to keep an eye on it.

PDC 2010 thoughts - the next morning

I sat in the office yesterday with a beer in my hand watching the PDC2010 keynote. I have to say I preferred this to the option of a flight, jet lag and a less than comfortable seat in a usually overly cooled conference hall. With the Silverlight streaming the experience was excellent, especially as we connected an Acer 1420P to our projector/audio via a single HDMI cable and it just worked.

So what do you lose by not flying out? Well, the obvious is the ‘free’ Windows Phone 7 the attendees got; too many people, IMHO, get hung up on the swag at conferences - you go for knowledge, not toys. They also forget that they (or their company) paid for the item anyway in their conference fee. More seriously, you miss out on the chats between the sessions and, as the conference is on campus, the easier access to Microsoft staff. Also, the act of travelling to a conference isolates you from the day to day interruptions of the office; the online experience does not, and you will have to stay up late to view sessions live due to timezones. The whole travelling experience still cannot be replaced by the online experience, no matter how good the streaming.

However, even though I don’t get the ‘conference corridor experience’, it does not mean I cannot check out the sessions; it is great to see they are all available free and live, or as immediately available recordings if I don’t want to stay up.

The keynote was pretty much as I had expected. There were new announcements but nothing ground breaking, just good vNext steps. I thought the best place to start for me was the session “Lessons learned from moving Team Foundation Server to the cloud”; this was on TFS, an obvious area of interest for me, but more importantly a real world experience of moving a complex application to Azure. This is something that is going to affect all of us if Microsoft’s bet on the cloud is correct. It seems, though there are many gotchas, the process was not as bad as you would expect. For me the most interesting point was that the port to Azure caused changes to the codebase that actually improved the original implementation, either in manageability or performance. Also, many of the major stumbling blocks were business/charging models, not technology. This is going to affect us all as we move to service platforms like Azure, or even internally hosted equivalents like AppFabric.

So one session watched, what to watch next?

PDC Keynote Day 1 thoughts

So the PDC2009 day 1 keynote is over and what was the story? Well, it is more of a vision thing, but then again this is a PDC not a TechEd, so what do you expect? For me the two major themes were

  • Dallas – a centralised data service that allows unified access to both public and private data via subscriptions, thus allowing core data to be used for any purpose the user requires within the EULA of the data in question. It will be interesting to see what gets published in this manner; is there a market for a centralised data clearing house? Only time will tell.
  • AppFabric – basically taking the operating model of the Azure services and allowing a company to have a similar model in their own IT systems, thus allowing code to be written that can work on the corporate system or the Azure cloud without alteration. This I see as being big.

So what was not mentioned? Well, mobile. The only comment was a ‘come to Mix in the spring for stuff about the next mobile offering’. Whatever is shown there is going to have to be very good to address the momentum of the iPhone. I think a good bet is that leveraging the Azure fabric might be important for the mobile offering.