The blogs of Black Marble staff

Vista on Dell Mini 9: Using junctions to move files off the SSD

Flushed with my earlier success getting apps installed on the SD card (now mounted as 'c:\SD Program Files'), I installed a few things. I then hit a snag.

When you install apps using an MSI, the installation files get cached by Windows Installer. Steadily, c:\windows\installer gets bigger and bigger, so whilst my apps were no longer taking up space, the install files were (and some of those are quite large).

I now have an image of the Vista install so I'm becoming more cavalier. I wondered if I could move the Installer folder from c:\windows onto my SD card by copying it into the mount point, but still get Windows Installer to work as though nothing had changed.

So, first I used Robocopy with the /SEC option to copy the installer folder tree over. Next, I deleted c:\windows\installer and created a directory junction which allows the moved content to still be accessed as c:\windows\installer.

To create a directory junction, use the mklink command with the /J switch to create the installer folder in c:\windows and point it at 'c:\sd program files\installer'.
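For reference, the commands look something like this, run from an elevated command prompt (the /E switch to include subfolders is my assumption - the one the post hinges on is /SEC, which preserves the security info):

    robocopy "c:\windows\installer" "c:\sd program files\installer" /E /SEC
    rd /s /q "c:\windows\installer"
    mklink /J "c:\windows\installer" "c:\sd program files\installer"

mklink /J creates the junction at the first path, pointing at the second, so anything that opens c:\windows\installer now lands on the SD card.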

So far, it's working fine. I need to try a few more installs to be sure, though...

Vista on the Dell Mini 9: Installing applications on an SD card

I’m still trying new things with the Mini 9. I now have an image file that I can restore to the Mini which has my base install after running sysprep. The problem I have is storage space – the SSD isn’t _quite_ big enough.

So, Richard wandered in this morning and handed me a 4Gb SD card to experiment with. The question: can we use the SD card and install apps onto it?

Our initial finding was a big fat no. Visual Studio setup refused to install to ‘removable media’. Bearing in mind that all we’d done at this point was stuff an SD card into the card slot, I wasn’t too surprised.

I fired up computer manager and went into disk admin. The partition on the SD card was FAT32, so we replaced that with an NTFS partition for a start.

I then went into the drive properties and changed the Policy settings. By default the SD reader is set to allow instant removal. Changing that to enable write caching means I will have to use the tray icon to ‘Safely Remove Hardware’, but I think that’s the key setting to allow me to install apps. However, I wanted to make things a little more integrated, so I created a folder on the C: drive called ‘SD Program Files’ and used it as a mount point for the SD card.
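If you’d rather script the formatting and mounting than click through Computer Manager, diskpart can do both. A sketch - the volume number here is an example, so check the ‘list volume’ output carefully before formatting anything, and note the mount folder must already exist and be empty:

    diskpart
    DISKPART> list volume
    DISKPART> select volume 5
    DISKPART> format fs=ntfs quick
    DISKPART> assign mount="c:\SD Program Files"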

Visual Studio is now happily installing to the SD card.

As a 16Gb SD card is around twenty-five quid, this makes a reasonable approach to increasing storage space, assuming apps run reasonably quickly from the drive.

That’s the next test…

TFS Iterations not appearing in IterationPath

I have been working on a site that has had to do a disaster recovery of their TFS application tier (AT) due to hardware failure. For a short period they have had to use a spare PC as their AT. Due to the haste required to get the developers working, this AT was only configured for source control and work item editing.

So I was onsite to put the proper replacement AT in place. All seemed to go OK until we added a new Iteration to a team project. It did not appear in the IterationPath field for work items.

This problem actually manifested itself for us as the inability to add a new sprint from inside eScrum. Unlike most team process templates, the eScrum front end creates sprints by creating an iteration and an associated work item (to hold extra information) all in one go. This was failing because, after the iteration was created, its creation was not propagated to allow a work item to be associated with it.

After checking the AT’s event log we saw TF53010 and TF51338 errors. I then ran the TFS Best Practice Analyser (BPA), which showed two issues:

  • the MyDomain\TFSService account was not in the TFS [Server]\Service Accounts group. I think this was due to the fact that the temporary AT system had used different accounts, and the installation of the new AT had left some of them behind.
  • because of this, the TFS Scheduler was not running reliably, which would explain why the new iterations were not being propagated.

We fixed this by using the tfssecurity /g command to add the MyDomain\TFSService account to the TFS [Server]\Service Accounts group, and then restarted the server.
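The command was along these lines (the server URL is just an example - point it at your own AT; the n: prefix tells tfssecurity the member being added is a Windows account):

    tfssecurity /g+ "Service Accounts" n:MyDomain\TFSService /server:http://tfsserver:8080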

Once this was done we checked the configuration was right using the BPA again, and finally checked we could create sprints in eScrum.

TFS 2008 SP1 resets service accounts

I installed the TFS 2008 SP1 on a site that was using custom accounts for the identities that run the application pools for the WSS instance and Report Services.

These user accounts got reset back to Network Service when the service pack was installed; I had not seen this occur on any site I had upgraded previously. This meant you could not start WSS or Reporting Services.

Manually resetting them back to their old correct accounts fixed the problem.

Using StyleCop in TFS Team Build

The recent release of the MSBuild Extension Pack includes a task for StyleCop 4.3. I have been trying to get this integrated into our TFS Team Build; I think it is a far preferable way to do it than editing the various project files in our solution to link in StyleCop, as you had to do in 4.2.

There are a good few steps I had to follow to get it working:

  • Install the StyleCop 4.3 Msi
  • Install the MSBuild Extensions Msi
  • Now we have to make some fixes/changes. First, copy MSBuild.ExtensionPack.StyleCop.dll from C:\Program Files\MSBuild\ExtensionPack to C:\Program Files\MSBuild\Microsoft\StyleCop\v4.3. We need to do this as the StyleCop DLLs are not automagically found (you could fix this using a search path, I suppose)
  • Next we need to modify the C:\Program Files\MSBuild\ExtensionPack\MSBuild.ExtensionPack.tasks file to fix a typo that is a known issue. The StyleCop line at the end of the file should read:

    <UsingTask AssemblyFile="$(MSBuildExtensionsPath)\Microsoft\StyleCop\v4.3\MSBuild.ExtensionPack.StyleCop.dll" TaskName="MSBuild.ExtensionPack.CodeQuality.StyleCop"/>

  • Now edit your Team Build tfsbuild.proj file; import the extension tasks

      <Import Project="$(MSBuildExtensionsPath)\ExtensionPack\MSBuild.ExtensionPack.tasks"/>
  • Now you need to edit or add the AfterCompile target, something like that shown below. I have added comments for each block.

<Target Name="AfterCompile">

  <!-- Put up the start processing message - we complete it later -->
  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
             BuildUri="$(BuildUri)"
             Message="StyleCop step is executing.">
    <Output TaskParameter="Id" PropertyName="StyleCopStep" />
  </BuildStep>

  <!-- Create a collection of files to scan; the ** means subdirectories -->
  <CreateItem Include="$(SolutionRoot)\MyTeamProject\MySolution\**\*.cs">
    <Output TaskParameter="Include" ItemName="StyleCopFiles"/>
  </CreateItem>

  <!-- Run the StyleCop MSBuild task using the settings file in the same
       directory as the sln file (and so also stored in TFS); adjust the
       paths to suit your own solution -->
  <MSBuild.ExtensionPack.CodeQuality.StyleCop
      TaskAction="Scan"
      SourceFiles="@(StyleCopFiles)"
      SettingsFile="$(SolutionRoot)\MyTeamProject\MySolution\Settings.StyleCop"
      LogFile="$(DropLocation)\$(BuildNumber)\StyleCopLog.txt">
    <Output TaskParameter="Succeeded" PropertyName="AllPassed"/>
    <Output TaskParameter="ViolationCount" PropertyName="Violations"/>
    <Output TaskParameter="FailedFiles" ItemName="Failures"/>
  </MSBuild.ExtensionPack.CodeQuality.StyleCop>

  <!-- Log the summary of the results -->
  <Message Text="StyleCop Succeeded: $(AllPassed), Violations: $(Violations)"/>

  <!-- The FailedFile item format is:
          <FailedFile Include="filename">
              <CheckId>SA Rule Number</CheckId>
              <RuleDescription>Rule Description</RuleDescription>
              <RuleName>Rule Name</RuleName>
              <LineNumber>Line the violation appears on</LineNumber>
              <Message>SA violation message</Message>
          </FailedFile> -->

  <!-- Log the details of any violations -->
  <Warning Text="%(Failures.Identity) - Failed on Line %(Failures.LineNumber). %(Failures.CheckId): %(Failures.Message)"/>

  <!-- The StyleCop task does not throw an error if the analysis failed,
       so we need to check the return value ourselves; if we choose to treat
       violations as errors we need to set the error state, which will cause
       us to jump to the failure target -->
  <Error Text="StyleCop analysis warnings occurred" Condition="'$(AllPassed)' == 'False'" />

  <!-- List out the issues; we only need this if we are not forcing the error
       above, as if we are, we never get here. You would normally have either
       this OR the error line uncommented, as if there are no errors this
       following line can generate an empty failure line -->
  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
             BuildUri="$(BuildUri)"
             Message="%(Failures.Identity) - Failed on Line %(Failures.LineNumber). %(Failures.CheckId): %(Failures.Message)"/>

  <!-- Complete the StyleCop step; we get here if we have not thrown an error -->
  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
             BuildUri="$(BuildUri)"
             Id="$(StyleCopStep)"
             Status="Succeeded"
             Message="StyleCop Succeeded: $(AllPassed), Violations: $(Violations)"/>

  <!-- If an error has been raised we need to call the failure target.
       You might have thought you could get the same effect as the error line
       a few lines above by adding a condition to the OnError, as shown
       commented out below. However, this does not work: the OnError condition
       is not evaluated unless an error has previously occurred in a task;
       the condition clause is secondary -->
  <OnError ExecuteTargets="FailTheBuild" />
  <!--<OnError ExecuteTargets="FailTheBuild" Condition="'$(AllPassed)' == 'False'" />-->

</Target>

<Target Name="FailTheBuild">
  <!-- We are failing the build due to StyleCop issues -->
  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
             BuildUri="$(BuildUri)"
             Id="$(StyleCopStep)"
             Status="Failed"
             Message="StyleCop Failed: $(AllPassed), Violations: $(Violations) [See $(DropLocation)\$(BuildNumber)\StyleCopLog.txt]"/>

  <!-- List out the issues -->
  <BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
             BuildUri="$(BuildUri)"
             Message="%(Failures.Identity) - Failed on Line %(Failures.LineNumber). %(Failures.CheckId): %(Failures.Message)"/>
</Target>


So this will run StyleCop as a separate build step, and you have the option to fail the build or not, depending on your view of StyleCop violations, by commenting out one line. Either way, violations will be listed as rows in the list of build steps.

I had really wanted to get the number of violations added to either the total errors or warnings listed in the Result Details at the end of the build; I had also wanted a simple way to access the StyleCopLog.txt created in the drops directory. However, I have not worked out how to do this final step yet. If I manage to work it out I will blog the solution – or if you know how to do it, please post a comment.

Updated 17 & 21 Oct 08 – Added some extra details to the comments in the XML sample and removed hard-coded paths; everything is now referenced via team build parameters relative to the build agent’s root location.

Vista on the Dell Mini 9: Redux

I've been chipping away at this for a while today, and I've learned a few things along the way:

  • When Vista suggests installing battery drivers for the system, don't. The zip file it suggested I install broke power management.
  • Patch the system fully as an admin user before logging in as a restricted user. It will save you hours of time.
  • Sysinternals Diskmon doesn't work with Vista unless you run it as admin, and that certainly isn't an option for my restricted users.
  • Vista when hibernating just shows a black screen. That's not very helpful the first time you try it, on a silent machine with no disk activity lights at all.
  • I think Vista takes longer to hibernate and come back from hibernation than XP, although coming back from sleep is much quicker than on its older sibling.

Overall, I'm still happy. I have Vista, Office 2007 and Live Writer, and 3.5Gb of disk space free. With no serious hacking the Dell runs at around 50% memory usage with a browser and Live Writer running. I can live with that. Battery life appears OK. It's 10:30 and I've been using the Dell since 8pm, thrashing the disk (as much as there is one) and the wi-fi, and I'm at 38% battery. That puts me on track for about four hours or so, and I can live with that.

Ironically, having been using the Aspire One all weekend, the Dell keyboard is more annoying than I found it before I got the Acer. Comparing the two, however, the Dell is a good inch narrower and a little lighter. If portability is critical then the Dell has the edge, although I'm starting to favour the Acer for ergonomics.

Top tips, then:

  • Try to use slipstream media - it saves a bit of tidying up.
  • Turn off system restore to save a bit of disk space initially and quite a bit in the longer term.
  • Keep the cruft at a minimum - additional windows components take up disk space, but shoving lots of apps on gobbles memory, which is quite tight with 1Gb of memory.
  • I tried to install from a USB memory stick with Vista installation media on it and it didn't work - a USB optical drive is the easiest way.
  • Make sure you copy the drivers folder off the Mini 9 before wiping the disk for Vista - you'll need it to install the appropriate system drivers before Windows can work its Update magic.
  • The 16Gb SSD isn't that big when you try installing Vista and Office. I've not reduced Vista's footprint with tools like vlite, but they might help. Certainly, being hard on yourself is important - do you really need Microsoft Access on a netbook?

The next step is to run the new Mini 9 with Vista in parallel with the XP Pro install on Richard's and see which is the better long-term bet. Watch this space...

Getting Vista on the Dell Mini 9

Our second Mini 9 arrived in the office today. This one is for Andy and myself to use whilst out of the office. Richard has successfully upgraded his to XP Professional, so we had to try to push the bar out a little further – we’re running Vista Business.

I have not spent any time tweaking or prodding yet. I used install media with SP1 included and obliterated the partition on the SSD, then installed the drivers from Dell where necessary, and a driver for the battery hardware that Vista itself suggested rather than the Dell solution.

I have disabled System Restore to claw back some disk space, but even so, prior to installing the bits of Office 2007 we need (Word, Excel, PowerPoint, OneNote) I had around 7Gb free. That’s enough – we’re going to be using this for note taking and document writing, not heavy lifting.

Performance-wise, it’s quite nippy. Aero is disabled, and the sidebar is off (it’s a small screen, why waste bits of it?) and I have well over half the system ram available at idle. Right now I have no complaints at all.

Depending on performance I may well look at upgrading the RAM to 2Gb, but in the short term I’m sure I have an SD card kicking around somewhere I could use for ReadyBoost, should the need arise.

Why Vista? UAC. Shocked? I probably would have been a few months ago, but in all seriousness, UAC means that it’s much easier to run with limited-rights user accounts and still be able to do admin-stuff if the need arises. You can’t do that with XP Pro.

Christmas Houses

At the moment I'm working on the Christmas Cards - the backgrounds of these always take the longest. Somehow if a house doesn't look right it matters more than if a character doesn't.

This is one of the houses I drew today. It's loosely based on a Susie-style etching from her time in Portland.


Installation of SCVMM 2008 beta disables non-admin access to remote machines via Hyper-V manager

Yesterday I finally got around to installing SCVMM 2008 beta onto a virtual machine (mainly to help us with some virtual machine migrations we've got coming up).  I must say that I think SCVMM 2008 beta is very nice indeed!

On my Vista machine I use Tore Lervik's Hyper-V Monitor Gadget for Windows Sidebar, and have done for some time.  With the number of virtual machines we run, I have found it an invaluable addition to my sidebar.

This morning however, when I tried to connect to one of the virtual machines listed by the gadget, I got an error message 'An error occurred trying to find the virtual machine <GUID> on the server <servername>'.  In addition, when I tried to use Hyper-V manager, I received the error 'The virtual machine management service is not available'.

We thought for a while that it was related to remote rights (WMI/DCOM) on the servers in question (well, technically it is...) and I spent a while trawling through John Howard's articles relating to the required rights for remote management (well worth a read by the way).  Unfortunately even working through the articles didn't solve my problem.

After a little more rummaging, it turns out that installation of the SCVMM agent onto the servers hosting the virtual machines I want to remotely manage is what is causing the problem.  Anyone who is a local admin on the servers in question can freely manage the remote virtual machines; if you're not a local admin, you can't.  There are two potential solutions to the problem:

  1. Uninstall the SCVMM agent from the servers in question (which would no longer allow us to manage them from SCVMM)
  2. Make anyone who needs to remotely manage virtual machines a local administrator on the servers in question

Let's be honest, neither option is entirely appealing (it's not that we don't trust the people who need to remotely manage specific machines; I would just always prefer to work from a 'minimum rights necessary' point of view), but as we have some migrations coming soon for which SCVMM is going to really help, we've gone for the latter.

I hope that this is something that is corrected in the RTM version of SCVMM 2008!