The blogs of Black Marble staff

Using StyleCop in TFS Team Build

The recent release of the MSBuild Extension Pack includes a task for StyleCop 4.3. I have been trying to get this integrated into our TFS Team Build; I think it is a far preferable way to do it than editing the various project files in our solution to link in StyleCop, as you had to do in 4.2.

There are a good few steps I had to follow to get it working:

  • Install the StyleCop 4.3 MSI
  • Install the MSBuild Extension Pack MSI
  • Now we have to make some fixes and changes. First, copy MSBuild.ExtensionPack.StyleCop.dll from C:\Program Files\MSBuild\ExtensionPack to C:\Program Files\MSBuild\Microsoft\StyleCop\v4.3. We need to do this because the StyleCop DLLs are not found automagically (you could fix this using a search path, I suppose)
  • Next we need to modify the C:\Program Files\MSBuild\ExtensionPack\MSBuild.ExtensionPack.tasks file to fix a typo that is a known issue. The StyleCop line at the end of the file should read

    <UsingTask AssemblyFile="$(MSBuildExtensionsPath)\Microsoft\StyleCop\v4.3\MSBuild.ExtensionPack.StyleCop.dll" TaskName="MSBuild.ExtensionPack.CodeQuality.StyleCop"/>

  • Now edit your Team Build tfsbuild.proj file; import the extension tasks

      <Import Project="$(MSBuildExtensionsPath)\ExtensionPack\MSBuild.ExtensionPack.tasks"/>
  • Now you need to edit or add the AfterCompile target, something like that shown below. I have added comments for each block.

<Target Name="AfterCompile">

   <!-- Put up the start-processing message; we clear it later -->
   <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
              Message="StyleCop step is executing.">
     <Output TaskParameter="Id" PropertyName="StyleCopStep" />
   </BuildStep>

   <!-- Create a collection of files to scan; the ** means include subdirectories -->
   <CreateItem Include="$(SolutionRoot)\MyTeamProject\MySolution\**\*.cs">
     <Output TaskParameter="Include" ItemName="StyleCopFiles"/>
   </CreateItem>

   <!-- Run the StyleCop MSBuild task using the settings file in the same
        directory as the .sln file (and also stored in TFS) -->
   <MSBuild.ExtensionPack.CodeQuality.StyleCop
       SourceFiles="@(StyleCopFiles)"
       SettingsFile="$(SolutionRoot)\MyTeamProject\MySolution\Settings.StyleCop"
       ShowOutput="true"
       ForceFullAnalysis="true"
       CacheResults="false"
       LogFile="$(DropLocation)\$(BuildNumber)\StyleCopLog.txt">
     <Output TaskParameter="Succeeded" PropertyName="AllPassed"/>
     <Output TaskParameter="ViolationCount" PropertyName="Violations"/>
     <Output TaskParameter="FailedFiles" ItemName="Failures"/>
   </MSBuild.ExtensionPack.CodeQuality.StyleCop>

   <!-- Log a summary of the results -->
   <Message Text="StyleCop Succeeded: $(AllPassed), Violations: $(Violations)"/>

   <!-- Each item in the FailedFiles list has the form:
           <FailedFile Include="filename">
               <CheckId>SA rule number</CheckId>
               <RuleDescription>Rule description</RuleDescription>
               <RuleName>Rule name</RuleName>
               <LineNumber>Line the violation appears on</LineNumber>
               <Message>SA violation message</Message>
           </FailedFile>
   -->

   <!-- Log the details of any violations -->
   <Warning Text="%(Failures.Identity) - Failed on line %(Failures.LineNumber). %(Failures.CheckId): %(Failures.Message)"/>

   <!-- The StyleCop task does not throw an error if the analysis fails,
        so we need to check the return value; if we choose to treat violations
        as errors we set the error state, which causes a jump to the failure target -->
   <Error Text="StyleCop analysis warnings occurred" Condition="'$(AllPassed)' == 'False'" />

   <!-- List out the issues. We only need this if we are not forcing the error above;
        if we are, we never get here. You would normally have either this OR the error
        line uncommented, as when there are no errors the following line can generate
        an empty failure line -->
   <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
              Message="%(Failures.Identity) - Failed on line %(Failures.LineNumber). %(Failures.CheckId): %(Failures.Message)"/>

   <!-- Complete the StyleCop step; we only get here if we have not thrown an error -->
   <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
              Id="$(StyleCopStep)"
              Status="Succeeded"
              Message="StyleCop Succeeded: $(AllPassed), Violations: $(Violations)"/>

   <!-- If an error has been raised we need to call the failure target.
        You might have thought you could get the same effect as the error line a few
        lines above by adding a condition to the OnError, as shown commented out below.
        However, this does not work: the OnError condition is not evaluated unless an
        error has already occurred in a task; the condition clause is secondary -->
   <OnError ExecuteTargets="FailTheBuild" />
   <!--<OnError ExecuteTargets="FailTheBuild" Condition="'$(AllPassed)' == 'False'" />-->

</Target>


<Target Name="FailTheBuild">
  <!-- We are failing the build due to StyleCop violations -->
  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
             Id="$(StyleCopStep)"
             Status="Failed"
             Message="StyleCop Failed: $(AllPassed), Violations: $(Violations) [See $(DropLocation)\$(BuildNumber)\StyleCopLog.txt]"/>

  <!-- List out the issues -->
  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
             Message="%(Failures.Identity) - Failed on line %(Failures.LineNumber). %(Failures.CheckId): %(Failures.Message)"/>

</Target>


So this runs StyleCop as a separate build step, and by commenting out one line you have the option of failing the build or not, depending on your view of StyleCop violations. Either way, violations will be listed as rows in the list of build steps.
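As an aside, the reason each violation appears as its own row is MSBuild item batching: any task whose parameters reference %(ItemName.Metadata) is executed once per item (or once per unique metadata value) rather than once in total. A minimal, standalone sketch of the idea follows; the item contents and file names here are made up purely for illustration.

```xml
<!-- demo.proj (hypothetical): shows %(...) batching in isolation -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="Demo">
  <ItemGroup>
    <Failures Include="Class1.cs">
      <LineNumber>12</LineNumber>
    </Failures>
    <Failures Include="Class2.cs">
      <LineNumber>34</LineNumber>
    </Failures>
  </ItemGroup>
  <Target Name="Demo">
    <!-- Because Text references item metadata, MSBuild batches this task:
         it runs once per Failures item, producing one output line each -->
    <Message Text="%(Failures.Identity) - line %(Failures.LineNumber)"/>
  </Target>
</Project>
```

The Warning and BuildStep tasks in the sample above batch over the Failures list in exactly the same way, which is what turns one task declaration into a row per violation.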

I had really wanted to get the number of violations added to either the total errors or warnings listed in the Result Details at the end of the build, and I had also wanted a simple way to access the StyleCopLog.txt created in the drops directory. However, I have not worked out this final step yet. If I manage to work it out I will blog the solution; or if you know how, please post a comment.

Updated 17 & 21 Oct 08: added some extra details to the comments in the XML sample and removed hard-coded paths; the sample now uses Team Build parameters so everything is referenced via the build agent's root location.

Vista on the Dell Mini 9: Redux

I've been chipping away at this today, and I've learned a few things along the way:

  • When Vista suggests installing battery drivers for the system, don't. The zip file it suggested I install broke power management.
  • Patch the system fully as an admin user before logging in as a restricted user. It will save you hours of time.
  • Sysinternals Diskmon doesn't work on Vista without elevation - you need to run it as admin, and that certainly isn't an option for my restricted users.
  • Vista when hibernating just shows a black screen. That's not very helpful the first time you try it, on a silent machine with no disk activity lights at all.
  • I think Vista takes longer to hibernate and come back from hibernation than XP, although coming back from sleep is much quicker than on its older sibling.

Overall, I'm still happy. I have Vista, Office 2007 and Live Writer, and 3.5Gb of disk space free. With no serious hacking the Dell runs at around 50% memory usage with a browser and Live Writer running. I can live with that. Battery life appears OK. It's 10:30 and I've been using the Dell since 8pm, thrashing the disk (as much as there is one) and the wi-fi, and I'm at 38% battery. That puts me on track for about four hours or so, and I can live with that.

Ironically, having been using the Aspire One all weekend, the Dell keyboard is more annoying than I found it before I got the Acer. Comparing the two, however, the Dell is a good inch narrower and a little lighter. If portability is critical then the Dell has the edge, although I'm starting to favour the Acer for ergonomics.

Top tips, then:

  • Try to use slipstream media - it saves a bit of tidying up.
  • Turn off system restore to save a bit of disk space initially and quite a bit in the longer term.
  • Keep the cruft at a minimum - additional windows components take up disk space, but shoving lots of apps on gobbles memory, which is quite tight with 1Gb of memory.
  • I tried to install from a USB memory stick with Vista installation media on it and it didn't work - a USB optical drive is the easiest way.
  • Make sure you copy the drivers folder off the Mini 9 before wiping the disk for Vista - you'll need the drivers folder to install the appropriate system drivers before Windows can work its Update magic.
  • The 16Gb SSD isn't that big when you try installing Vista and Office. I've not reduced Vista's footprint with tools like vlite, but they might help. Certainly, being hard on yourself is important - do you really need Microsoft Access on a netbook?

The next step is to run the new Mini 9 with Vista in parallel with the XP Pro install on Richard's and see which is the better long-term bet. Watch this space...

Getting Vista on the Dell Mini 9

Our second Mini 9 arrived in the office today. This one is for Andy and myself to use whilst out of the office. Richard has successfully upgraded his to XP Professional, so we had to try to push the bar out a little further – we’re running Vista Business.

I have not spent any time tweaking or prodding yet. I used install media with SP1 included and obliterated the partition on the SSD, then installed the drivers from Dell where necessary, and a driver for the battery hardware that Vista itself suggested rather than the Dell solution.

I have disabled System Restore to claw back some disk space, but even so, prior to installing the bits of Office 2007 we need (Word, Excel, PowerPoint, OneNote) I had around 7Gb free. That’s enough – we’re going to be using this for note taking and document writing, not heavy lifting.

Performance-wise, it’s quite nippy. Aero is disabled, and the sidebar is off (it’s a small screen, why waste bits of it?) and I have well over half the system ram available at idle. Right now I have no complaints at all.

Depending on performance I may well look at upgrading the RAM to 2Gb, but in the short term I’m sure I have an SD card kicking around somewhere I could use for ReadyBoost, should the need arise.

Why Vista? UAC. Shocked? I probably would have been a few months ago, but in all seriousness, UAC means that it’s much easier to run with limited-rights user accounts and still be able to do admin-stuff if the need arises. You can’t do that with XP Pro.

Christmas Houses

At the moment I'm working on the Christmas cards - the backgrounds of these always take the longest. Somehow if a house doesn't look right it matters more than if a character doesn't.

This is one of the houses I drew today. It's loosely based on a Susie-style etching from her time in Portland.


Installation of SCVMM 2008 beta disables non-admin access to remote machines via Hyper-V manager

Yesterday I finally got around to installing SCVMM 2008 beta onto a virtual machine (mainly to help us with some virtual machine migrations we've got coming up).  I must say that I think SCVMM 2008 beta is very nice indeed!

On my Vista machine I use Tore Lervik's Hyper-V Monitor Gadget for Windows Sidebar, and have done for some time.  With the number of virtual machines we run, I have found it an invaluable addition to my sidebar.

This morning however, when I tried to connect to one of the virtual machines listed by the gadget, I got an error message 'An error occurred trying to find the virtual machine <GUID> on the server <servername>'.  In addition, when I tried to use Hyper-V manager, I received the error 'The virtual machine management service is not available'.

We thought for a while that it was related to remote rights (WMI/DCOM) on the servers in question (well, technically it is...) and I spent a while trawling through John Howard's articles relating to the required rights for remote management (well worth a read by the way).  Unfortunately even working through the articles didn't solve my problem.

After a little more rummaging, it turns out that installation of the SCVMM agent onto the servers hosting the virtual machines I want to remotely manage is what is causing the problem.  Anyone who is a local admin on the servers in question can freely manage the remote virtual machines; if you're not a local admin, you can't.  There are two potential solutions to the problem:

  1. Uninstall the SCVMM agent from the servers in question (which would no longer allow us to manage them from SCVMM)
  2. Make anyone who needs to remotely manage virtual machines a local administrator on the servers in question

Let's be honest, neither option is entirely appealing (it's not that we don't trust the people who need to remotely manage specific machines; I just always prefer to work from a 'minimum rights necessary' point of view), but as we have some migrations coming soon for which SCVMM is going to really help, we've gone for the latter.

I hope that this is something that is corrected in the RTM version of SCVMM 2008!

DDD7 Agenda Published

Well the votes are in and my proposed session for DDD7 on automated testing did not make the cut, but thanks to anyone who voted for it. I can’t say I am surprised that I am not on the list given the larger number of very interesting sessions proposed.

However, I am a little disappointed that even though a good percentage of the proposed sessions were on testing-related subjects, only Ian's on TDD and Ben's on Pex made it through; I had really expected to see Gojko's on Fitnesse.Net on the list.

Given that much of the last Alt.Net conference was focused on acceptance testing, the relative lack of testing-related sessions surprised me. There seems to be a difference in interests between the DDD voting community and the Alt.Net attendees. Does this mean that the average person attending (or at least voting for) DDD sessions does not care about testing, or thinks they have nothing to learn? Or have I missed something about the nature of the two events?

Microsoft CRM 4.0 adapter for BizTalk 2006

Microsoft have just released the Microsoft BizTalk® Server 2006 adapter for Microsoft Dynamics® CRM 4.0. The adapter has the same functionality as the 3.0 adapter, plus:

  • Support for both 32-bit and 64-bit deployments of Microsoft Dynamics CRM 4.0
  • Support for form-based authentication
  • Support for the joy that is multi-tenancy

Best of all, the adapter is free.


Download Here


Enigma, Bletchley Park and the Battle of the Atlantic

I attended a very interesting BCS talk last night hosted by the West Yorkshire Branch about Enigma, Bletchley Park and the Battle of the Atlantic.

Dr Mark Baldwin is a superb speaker; he spoke about the Enigma machine itself, the decoding efforts started by the Poles in the early 1930s, subsequent wartime efforts to break the codes, the machines used to aid in this process, the effects that code breaking had on the Battle of the Atlantic, and Bletchley Park itself, for two hours without any notes! At the end of the talk there was also the opportunity to examine a rare 4-rotor Enigma machine that Dr Baldwin had brought with him.

I was particularly intrigued to hear that the Germans thought that the sheer number of possible combinations that the Enigma machine allowed for (3 × 10^114, a number significantly larger than the number of atoms in the observable universe!) precluded anyone being able to decode their messages; an assumption that remained until many years after the war. The rotors used with the Enigma machine were also not rewired at any point during the war. In addition, because they assumed that nobody could read the messages produced by such a system, they made very little effort to break the codes produced by our Typex system!
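For the curious, a figure of that order falls out of the usual counting argument: unknown rotor wirings, the reflector wiring, ring settings, rotor start positions, and the plugboard. The sketch below is my own reconstruction of that standard calculation, not something taken from the talk.

```python
from math import factorial

# Theoretical Enigma keyspace, on the assumption (which the Germans relied on)
# that the attacker does not even know how the rotors are wired internally.

# Three rotors, each of which could be wired as any permutation of 26 letters
rotor_wirings = factorial(26) ** 3

# The reflector pairs the 26 letters off: 26! / (13! * 2^13) possible pairings
reflector_wirings = factorial(26) // (factorial(13) * 2 ** 13)

# Ring settings (only two of the three rings affect the stepping pattern),
# plus the starting position of each of the three rotors
ring_settings = 26 ** 2
start_positions = 26 ** 3

# Plugboard: sum over p = 0..13 cables of the ways to pair up 2p letters
plugboard = sum(
    factorial(26) // (factorial(26 - 2 * p) * factorial(p) * 2 ** p)
    for p in range(14)
)

total = (rotor_wirings * reflector_wirings * ring_settings
         * start_positions * plugboard)
print(f"about {total:.3e}")
```

This lands at roughly 3 × 10^114. Note that almost all of it comes from treating the rotor wirings as unknown; once the wirings had been recovered, the daily key space the codebreakers actually faced was vastly smaller, which is why the assumption proved so costly.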

I was also saddened to see the state that Bletchley Park is now in. Many of the huts where so much incredibly important work was carried out are in a very poor state; some have already been destroyed. Bletchley Park receives no external funding and has been deemed ineligible for Heritage Lottery funding. I would urge you to sign the petition in the hope that the government will do something to help save this crucial piece of British history.

On a happier note, Robert mentioned to me that Black Marble does sponsor Bletchley Park! I look forward to being able to visit in the near future.