But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

The battle of the Lenovo W520 and projectors

My Lenovo W520 is the best laptop I have owned, but I have had one major issue with it: external projectors. The problem is that it does not like to duplicate the laptop screen output to a projector; it works fine when extending the desktop, just not when duplicating.

Every time I have tried to use it with a projector I either end up showing only on the projector (and looking over my shoulder), or fiddling for ages until it suddenly works, usually at a low resolution. At that point I don't know what I did to get it working, so I don't dare fiddle any more and just use it as-is. This is a bit of a problem given the number of presentations I do. A quick search shows I am not alone with this problem.

The issue, it seems, is down to the fact that the Lenovo has two graphics systems: an integrated (Intel) one and a discrete (Nvidia) one. The drivers in Windows 7 allow it to switch dynamically between the two to save power. This is called Nvidia Optimus switching.

The answer to the problem is to disable this Optimus feature in the BIOS. This comes at the cost of some battery life, but it is better to have a system that works as I need (and has to be plugged in) than one that does not work at most client sites.

So, to make the change:

  1. Reboot into BIOS (press the ThinkVantage button)
  2. Select the Discrete graphics option (the Nvidia 1000M)
  3. Disable the Optimus features
  4. Save and Reboot
  5. Windows 7 re-detects all the graphics drivers and then all seems OK (so far…)

One more point worth noting: I again fell foul of the fact that, as my Windows 7 partition is BitLockered, you have to enter your recovery key if you change anything in the BIOS; see my past post for details of how to fix this issue. I was a bit surprised by this, as I thought BitLocker would only care about changes to the master boot record, but you live and learn.

Error 0x80070490 when trying to make any purchase on WP7 MarketPlace

Recently I have had a problem with my LG E900 Windows Phone 7 running Mango. Whenever I tried to make a purchase on Marketplace I got the error "There has been a problem completing your request. Try again later" along with the error code 0x80070490. A search on the web, asking around everyone I thought might have an answer, and placing a question on Microsoft Answers got me nowhere.

The phone had been working fine until a few days ago. The problem started when I tried to run the Amazon Kindle app (now my primary platform for reading; yes, I decided not to buy an actual Kindle at this point). This failed to start, just kept returning to the phone home page. A power cycle of the phone had no effect. I have seen this before and fixed it with a remove and re-install of the app. However, though the remove was fine, whenever I tried to reinstall I got the 0x80070490 error.

I tried installing another WP7 application (not a reinstall) but I got the same error.

As this is a development phone I was able to try deploying a XAP file of an app I created from my PC. This worked without a problem.

I checked my account in Zune; I could log in and see the applications I had purchased in the past, so I suspected the issue was corruption of the local catalogue on the phone, but I had no way to prove it.

At this point I was out of ideas, so I did a reset to factory settings on the phone. This was a bit of a pain, as my phone is one of the ones from the PDC last year, which Microsoft sourced in Germany. So it was off to Google Translate to help me through enough German screens to set the language to English. But on the plus side I have learnt 'Notruf' is German for 'emergency call'.

So I had to:

  • Sync with Zune to get my data off the phone
  • Factory reset (Settings|About)
  • Set to English
  • Reinstall apps I had previously purchased
  • Re-sync with Zune and put back any music, podcasts etc.
  • Set the APN (Settings|Mobile Network), as with Vodafone UK the phone does not seem to pick this up automatically
  • Set things like ring tones and screen locks
  • And I am sure there are things I will notice I missed over the next few days…

So it took about 30 minutes to get my phone back to something like my settings. Not a great owner experience, but we repave our PCs regularly to get rid of the accumulated rubbish, so why not our phones?

When you forget to save a Word document

We have all done it: opened Word, typed all morning without bothering to save the file as we go along, and then for some mad reason exited Word saying we did not want to save. So you lose the morning's work.

Now, we know that Word does an auto-save, but if you are daft enough to confirm the exit without saving, how do you get the auto-backup file? Does Word even keep a backup if you never saved the file in the first place?

This is just the problem I had recently.

It used to be that to get back an auto recovery file you were hunting around in the

C:\Users\[user]\AppData\Roaming\Microsoft\Word\ auto recover

folder (or wherever it was set in the Word options). Hopefully Word would do this for you, but remember Word will not look for these files if it exited without error; it only tries to recover files if it crashed.

What I did not know was that there was a way to hunt for these files via the menus in Word 2010.

  1. In Word click the "File" menu, and select the option for "Recent."

  2. Click the option for "Recover Unsaved Documents."

  3. You should get a dialog and your file(s) should be listed.


Isn't it amazing how many features there are in products you use every day that you don't know about? This one saved me a good few hours this week!

My experiences moving to BlogEngine.NET


I have recently moved this blog server from using Community Server 2007 (CS2007) to BlogEngine.NET.

We started blogging in 2004 using .Text, moving through the free early versions of Community Server, then purchased Community Server Small Business edition in 2007. This cost a few hundred pounds. We recently decided that we had to bring this service up to date, if for no other reason than to patch the underlying ASP.NET system. We checked how much it would cost to bring Community Server to the current version and were shocked by the price: many thousands of dollars. Telligent, the developers, have moved to servicing only enterprise customers; they have no small business offering. So we needed to find a new platform.

Being a SharePoint house, we considered SharePoint as the blog host. However, we have always had the policy that systems with external content creation (i.e. where anyone can post a comment) should not be on our primary business servers. As we did not want to install a dedicated SharePoint farm just for the blogs, we decided to use another platform, remembering we needed one that could support multiple blogs that we could aggregate to provide a BM-Bloggers shared service.

We looked at what appears to be the market leader, WordPress, but to host this we needed a MySQL DB, which we did not want to install; we don't need another DB technology on our LAN to support. So we settled on BlogEngine.NET, the open source .NET 4 blogging platform that can use many different storage technologies; we chose SQL Server 2008 to use our existing SQL Server investment.


So we did a default install of BlogEngine.NET. We did it manually, as I knew we were going to use a custom build of the code, but we could have used the Web Platform Installer.

We customised a blog as a template and then used this to create all the child blogs we needed. If we were not bringing over old content we would have been finished here; it really would have been quick and simple.

Content Migration

To migrate our data we used BlogML. This allowed us to export CS2007 content as XML files which we then imported to BlogEngine.NET.

BlogEngine.NET provides support for BlogML out of the box, but we had to install a plug-in for CS2007.

This was all fairly straightforward; we exported each blog and imported it to the new platform, but as you would expect we did find a few issues.

Fixing Image Path (Do this prior to import)

The images within blog posts are hard-coded as URLs in the export file. If you copy over the image files (that are stored on the blog platform) from the old platform to the new server, on matching URLs, then there should be no problems.

However, I decided I wanted the images in the location they are meant to be in, i.e. the [blog]\files folder, using BlogEngine.NET's image.axd handler to load them. It was easiest to fix these in the BlogML XML file prior to importing it. The basic edit was to change each hard-coded image URL into its image.axd equivalent.




I did these edits with simple find and replace in a text editor, but you could use regular expressions.
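The find and replace can be scripted if you have a lot of posts. The sketch below is a Python illustration only (the real URL patterns will depend on your own server names, so the hostname and blog name here are hypothetical placeholders); it rewrites hard-coded Community Server image URLs, including the LiveWriter-style ones, into image.axd form.

```python
import re

def fix_image_paths(blogml_xml, old_host, blog_name):
    """Rewrite hard-coded image URLs in a BlogML export so they point
    at BlogEngine.NET's image.axd handler instead of the old server."""
    pattern = re.compile(
        rf"http://{re.escape(old_host)}/blogs/{re.escape(blog_name)}/"
        r"(?:livewriter/[^/'\"]+/)?"      # optional LiveWriter subfolder
        r"([^/'\"]+\.(?:png|jpg|gif))"    # the image file name itself
    )
    return pattern.sub(r"/image.axd?picture=\1", blogml_xml)
```

Run it over the exported XML before importing, then copy the image files themselves into the new blog's files folder.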

Remember also that the images need to be copied from the old server (…\blogs\rfennell\image_file.png) to the new server (…\App_Data\blogs\rfennell\files\image_file.png).

We also had posts written with older versions of LiveWriter. This placed images in a folder structure (e.g. …\blogs\rfennell\livewriter\postname\image_file.png). We also needed to move these to the new platform and fix the paths appropriately.

Post Ownership

All the imported posts were shown with an owner ID rather than the author's name, e.g. 2103 as opposed to Richard. The simplest fix for this was a SQL update after import, e.g.

update [BlogEngine].[dbo].[be_Posts] set [Author] = 'Richard' where [Author]='2103'

The name set should match the name of a user account created on the blog.

Comment Ownership

Due to issues with spam we had forced all users to register on CS2007 to post a comment. These external accounts were not pulled over in the export. BlogEngine.NET did not seem that bothered by this, but no icons for these users were shown.


These icons should be rendered using websnapr.com as an image of the commenter's homepage, but this was failing. It turned out this was due to their recent API changes: you now need to pass a key. As an immediate solution I just removed the code that calls websnapr, so the default noavatar.jpg image is shown. I intend to look at this again when the next release of BlogEngine.NET appears, as I am sure it will have a solution to the websnapr API change.

There was also a problem with many of the comment author hyperlinks; they all seemed to be just http://. To fix the worst of this I ran a SQL query:

update be_PostComment set author = 'Anon' where Author = 'http://'

I am sure I could have done a better job with a bit more SQL, but our blog has few comments, so I felt I could get away with this basic fix.


Categories and Tags

CS2007 displays tag clouds that are based on categories. BlogEngine.NET does the more obvious thing and uses categories as categories and tags as tags.

To allow BlogEngine.NET to show tag clouds, the following SQL can be used to duplicate categories as tags:

insert into be_PostTag (BlogID, PostID, Tag)
select be_PostCategory.BlogID, PostID, CategoryName
from be_PostCategory, be_Categories
where be_PostCategory.CategoryID = be_Categories.CategoryID
  and be_PostCategory.BlogID = '[a guid from be_blogs table]'

A workaround for what could not be exported

Where we had a major problem was with the posts that were made to the original .Text site which was upgraded to Community Server; these were posts from 2004 to 2007.

Unlike all the other blogs, these posts would not export via the CS BlogML exporter; we just got a zero-byte XML file. I suspect some flag/property was missing on these posts, so the CS2007 internal API was having problems, throwing an internal exception and stopping.

To get around this I had to use the BlogML SDK and some raw SQL queries against the CS2007 database. There was a good bit of trial and error here, but by looking at the source of the BlogML CS2007 exporter and swapping the API calls for my best guess at the equivalent SQL, I got the posts and comments out. It was a bit rough, but am I really that worried about posts more than five years old?

Blog Relationships

Parent/Child Relationship

When a child blog is created, an existing blog is copied as a template. This includes all its pages, posts and users. For this reason it is a really good idea to keep a 'clean' template blog that has as many of the settings correct as possible, so that when a new child blog is created you basically only have to create new user accounts and set its name/template.

Remember, no user accounts are shared between blogs, so the admin on the parent is not the admin on the child; each blog has its own users.

Content Aggregation

A major problem for Black Marble was the lack of aggregation of child blogs. At present BlogEngine.NET allows child blogs, but has no built-in way to roll up the content to the parent. This is a feature that I understand the developers plan to add in a future release.

To get around this problem, I looked at whether it would be easy to modify the FillPosts methods to return all posts irrespective of the blog. This would, in my opinion, have taken too much hacking/editing due to the reliance on the current context to refer to the current blog, so I decided on a more simplistic fix:

  1. I created a custom template for the parent site that removes all the page/post lists and menu options
  2. Replaced the link to the existing syndication.axd with a hand-crafted syndication.ashx
  3. Added the Rssdotnet.com open source project to the solution and used this to aggregate the RSS feeds of each child blog in the syndication.ashx handler
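The aggregation itself is straightforward once you have each child blog's feed. The sketch below shows the idea in Python rather than the Rssdotnet.com/ASP.NET code we actually used: it merges the item elements of several RSS 2.0 documents into one feed, newest first (fetching each syndication.axd over HTTP is omitted).

```python
from email.utils import parsedate_to_datetime
import xml.etree.ElementTree as ET

def aggregate_rss(feeds, title="Aggregated feed"):
    """Merge the <item> elements of several RSS 2.0 feed documents
    into a single feed, sorted newest first on pubDate."""
    items = []
    for feed_xml in feeds:
        channel = ET.fromstring(feed_xml).find("channel")
        items.extend(channel.findall("item"))
    # pubDate is RFC 822 format, e.g. "Mon, 01 Aug 2011 10:00:00 GMT"
    items.sort(key=lambda i: parsedate_to_datetime(i.findtext("pubDate")),
               reverse=True)
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    channel.extend(items)
    return ET.tostring(rss, encoding="unicode")
```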

This solution will be reviewed at each new release of BlogEngine.NET in case it is no longer required.


So how was the process? Not as bad as I expected; frankly, other than our pre-2007 content, it all moved without any major issues.

It is a good feeling to now be on a platform we can modify as we need, but which has the backing of an active community.

Windows Phone 7 not synchronising Outlook

I had a problem with my LG E900 WP7 phone over the weekend: Outlook stopped synchronising with our office Exchange server.

It started when I got back from a trip to Ireland. My phone switched back from roaming and started to use 3G for data again, as opposed to WiFi. Also over the weekend we had a connectivity problem from the office to the Internet, so for a while I could not connect to any of our services from any device. However, even after both these things were sorted my Outlook still failed to sync; it said it was in sync but showed no new email since Friday, when it was disconnected from my Irish ISP-based MiFi. No errors were shown. I waited until I got back to the office and tried a sync via our internal WiFi, all to no effect.

The fix was simple, and obvious: delete the Outlook account on the phone and recreate it once in the office. The problem is I still have no idea why the issue occurred.

So that is 2 issues in about 6 months, much better than my previous few phones!

Working with Hyper-V, VLAN tags and TFS 2010 Lab Management

I did a post at the start of the year about Lab Management and VLAN tags: how they are not supported, but how you can work around the problems. Over the past few months we have split our old Hyper-V cluster into one for production and one for test/lab development. This gave our IT team a chance to look at the VLAN problem again.

So a quick reminder of the issue: the deployment tools in Lab Management that create environments provide no means to set a VLAN tag for any network connections they create. Once an environment is created you can manually set a VLAN tag, but it is all a bit of a pain and certainly unsupported.

The solution our IT team have come up with to avoid the problem is to set the default VLAN tag on the physical port of the Ethernet switch. Hence any VMs/environments on the new test/lab Hyper-V host don't have to worry about VLANs at all; they are all automatically, in our case, on subnet 200. This works for TFS Lab Management and also means our developers need no knowledge of the IP routing setup to deploy a VM/environment. Our production Hyper-V box, which runs much of our business systems, still uses manually set VLAN tagging as before, but as there is no auto deployment involved on this system there are no problems.

There is one gotcha though…

If you try to use a VM created on our old setup, which was previously set with the VLAN tag of 200, it cannot see the LAN, even though it has what you would think is the correct VLAN tag. This is because setting a VLAN tag of 200 in Hyper-V is not the same as setting no VLAN tag and letting the Ethernet switch default the port to VLAN 200. So you have to let the switch manage the VLAN tag; the VM needs to know nothing about it.


So once this is all set you have your routed network, but also a fully supported Lab Management setup.

More experiences upgrading my Media Center to receive Freeview HD

In my post Experiences upgrading my Media Center to receive Freeview HD I said I thought the reason my Windows 7 Media Center was hanging at the "TV signal configuration" step was down to using mixed tuner cards. Well, my second PCTV nanoStick T2 arrived yesterday, so I was able to try the same process with a pair of identical USB T2 tuners.

Guess what? I got the same problem!

However, being USB devices meant I could test the tuners on my laptop, a Lenovo W520 (Core i7, 16Gb, Windows 7). So I plugged them both in, they found drivers from the web automatically, I ran Media Center, selected TV signal setup and… it worked! A few worrying pauses here and there, but it got there in about an hour.

So why did it work on a laptop and not on my Media Center PC?

I considered performance, but it seemed unlikely; the Media Center is a Core 2 Duo based system about three years old and has had no performance problems to date. So the only difference was that the laptop had never seen a TV tuner before, and the Media Center had.

Unused drivers

So I wondered if the old Hauppauge drivers were causing the problem. Remember that in Windows, if you remove an adapter card the drivers are not removed automatically. If the driver was added automatically (as opposed to you running a setup.exe) then there is no obvious way to remove it. The way to do it is detailed in this Microsoft Answers post. When you load Device Manager this way you can see the Hauppauge devices and uninstall their drivers.

And it made no difference to the problem.

Media Center Guide Data and Tuner setup

Using Task Manager I could see that when the Media Center TV setup appeared to hang, the mcupdate.exe program was running and using a lot of CPU. I had seen this on the Lenovo, but there it passed within 30 seconds or so; on my three-year-old Intel-based Media Center PC I would expect it to be a bit slower, but I left it overnight and it did not move on. So it is not just performance.

mcupdate.exe is the tool that updates the TV guide data for Media Center. It runs on a regular basis and also during setup. So the issue, as far as I can see, is one of the following:

  1. There is corrupt guide data, so it cannot update the channel guide
  2. There is data about a non-existent tuner that locks the process
  3. There is just too much data to update in the time allowed (but you would expect leaving it overnight to fix this)
  4. There is an internet problem getting the guide (which I doubt; too much of a coincidence that it happens only when I upgrade a tuner)

Simply put, I think that when the TV setup gets to the point where it needs to access this data, it gets into a race condition with the mcupdate.exe process, which is trying to update the guide.

This Hack7MC blog post suggests the problem can be addressed by clearing down the guide data and tuner setup, and provides a process to do this. However, I thought I would try to avoid that, as I did not want to lose my series recording settings if I could help it.

So I loaded Media Center and selected update guide from the Tasks menu. This started the mcupdate process, which caused a 50% CPU load and showed no sign of stopping; again pointing to one of the issues listed above. So I unloaded Media Center, but mcupdate.exe was still running, as was the tool tray notification application. Again I left this a while to no effect, so I used Task Manager to kill mcupdate.exe and the ehtray.exe application.

I had at this point intended to run the process from the Hack7MC post, so I stopped all the Media Center services, but thought I would give the setup one more try. When I ran the TV signal setup I got a message along the lines of 'guide data corrupt, will reload' and then the setup proceeded exactly as it should have done in the first place. I ended up with all my channels, both HD and non-HD, accessible from both tuners, and all my series recording settings intact.

So a success. I am still not clear which step fixed the issue, but I am sure it was down to needing to fully clear down the guide data and tuner settings.

Access denied when running a command with InvokeProcess in a TFS team build

When you try to run a command line tool via the InvokeProcess activity in a TFS 2010 team build you might see the somewhat confusing 'Access denied' error, with no more detail in the log.

I have found that this is usually down to a typo in the Filename property of the activity.

It should be set to something like

“c:\my tools\tool.exe”

but is actually set to

“c:\my tools”

i.e. it is set to the folder, not the filename. An easy mistake to make when cutting and pasting paths in from batch files.

You cannot execute a folder, hence the access denied error. Simple but easy to miss.

No error detail when using VMPrep

When using VMPrep to set up a VM for use in a Lab Management system I got the error cross at the bottom of the dialog.


Not too much help; usually there is a link to the log file or a message.

If you look in the log file at c:\users\[name]\AppData\Roaming\LMInstaller.txt you will see that the path to the Patches folder is invalid.

This is fixed by editing the VMPrepTool\VMPrepToolLibrary\Applications.XML file and correcting the path (in which I had made a typo).

Using NuGet and TFS Build 2010

At one of our recent events I was asked if I had any experience using NuGet within a TFS 2010 build. At the time I had not, but I thought it worth a look.

For those of you who don't know, NuGet is a package manager that gives a developer a way to manage assembly references in a project for assemblies that are not within their solution. It is most commonly used to manage commonly used external assemblies such as NHibernate or jQuery, but you can also use it to manage your own internal shared libraries.

The issue the questioner had was that they had added references via NuGet to a project.


Their project then contained a packages.config file that listed the NuGet dependencies. This was in the project root alongside the <project>.csproj file.

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Iesi.Collections" version="" />
  <package id="NHibernate" version="" />
</packages>

This packages.config file is part of the Visual Studio project, so when the project was put under source control, so was it.

However, when they created a TFS build for this solution all seemed OK until the build ran, when they got build errors along the lines of:

Form1.cs (16): The type or namespace name 'NHibernate' could not be found (are you missing a using directive or an assembly reference?)
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets (1490): Could not resolve this reference. Could not locate the assembly "Iesi.Collections". Check to make sure the assembly exists on disk. If this reference is required by your code, you may get compilation errors.
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets (1490): Could not resolve this reference. Could not locate the assembly "NHibernate". Check to make sure the assembly exists on disk. If this reference is required by your code, you may get compilation errors.

Basically, the solution builds locally but not on the build box; the assemblies referenced by NuGet are missing. A quick look at the directory structure shows why. NuGet stores the assemblies it references in the solution folder, so you end up with:

Solution Directory
      Packages – the root of the local cache of assemblies created by Nuget
      Project Directory

If you look in the <project>.csproj file you will see a hint path pointing back up this folder structure, so that the project builds locally:

<Reference Include="NHibernate">
  <HintPath>..\packages\NHibernate.[version]\lib\NHibernate.dll</HintPath>
</Reference>

The problem is that this folder structure is not known to the solution (just to NuGet), which means that when you add the solution to source control this structure is not added; hence the files are not there for the build box to use.

To fix this issue there are two options:

  1. Add the folder to source control manually
  2. Make the build process aware of NuGet and allow it to get the files it needs as required.

For now let's just use the first option, which I like, as in general I do want to build my projects against a known version of standard assemblies, so putting the assemblies under source control is not an issue for me. It allows me to easily go back to a specific build if I have to.

(A quick search with your search engine of choice will help with the second option; basically, using the nuget.exe command line is the core of the solution.)

To add the files to source control, I went into Visual Studio > Team Explorer > Source Control and navigated to the correct folder. I then pressed the Add Files button and added the whole Packages folder. This is where I think my questioner might have gone wrong: when you add a whole folder structure, the default is to exclude .dll (and .exe) files.


If you don't specifically add these files you will still get the missing references on the build, but could easily be left thinking 'but I just added them!'. It is an easy mistake to make; I know I did it.

Once ALL the correct files are under source control the build works as expected.