But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

My experiences moving to BlogEngine.NET


I have recently moved this blog server from using Community Server 2007 (CS2007) to BlogEngine.NET.

We started blogging in 2004 using .Text, moving through the free early versions of Community Server, then purchased Community Server Small Business edition in 2007. This cost a few hundred pounds. We recently decided that we had to bring this service up to date, if for no other reason than to patch the underlying ASP.NET system. We checked how much it would cost to bring Community Server to the current version and were shocked by the cost: many thousands of dollars. Telligent, the developers, have moved to servicing only enterprise customers; they have no small business offering. So we needed to find a new platform.

Being a SharePoint house, we considered SharePoint as the blog host. However, we have always had the policy that systems which accept external content creation (i.e. anyone can post a comment) should not be on our primary business servers. As we did not want to install a dedicated SharePoint farm for just the blogs, we decided to use another platform, remembering we needed one that could support multiple blogs that we could aggregate to provide the BM-Bloggers shared service.

We looked at what appears to be the market leader, WordPress, but to host this we needed a MySQL database, which we did not want to install; we don’t need another database technology on our LAN to support. So we settled on BlogEngine.NET, the open source .NET 4 blogging platform that can use many different storage technologies; we chose SQL Server 2008 to reuse our existing SQL Server investment.


So we did a default install of BlogEngine.NET. We did it manually as I knew we were going to use a custom build of the code, but we could have used the Web Platform Installer.

We customised a blog as a template and then used this to create all the child blogs we needed. If we were not bringing over old content we would have been finished here. It really would have been quick and simple.

Content Migration

To migrate our data we used BlogML. This allowed us to export CS2007 content as XML files which we then imported to BlogEngine.NET.

BlogEngine.NET provides support for BlogML out of the box, but we had to install a plug-in for CS2007.

This was all fairly straightforward: we exported each blog and imported it into the new platform, but as you would expect we did find a few issues.

Fixing Image Paths (Do this prior to import)

The images within blog posts are hard-coded as URLs in the export file. If you copied the image files (that are stored on the blog platform) from the old platform to the new server at matching URLs, then there should be no problems.

However, I decided I wanted the images in the location they are meant to be in, i.e. the [blog]\files folder, using BlogEngine.NET’s image.axd handler to load them. It was easiest to fix these in the BlogML XML file prior to importing it. The basic edit was to change the hard-coded CS2007 image URLs into BlogEngine.NET image.axd style URLs.

I did these edits with simple find and replace in a text editor, but you could use regular expressions.
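
For example, this kind of rewrite can be scripted in a few lines of C#. This is a minimal sketch only: the old-server URL pattern and the file name are made-up placeholders, not our real values.

using System.IO;
using System.Text.RegularExpressions;

class FixBlogMLImagePaths
{
    static void Main()
    {
        // Load the exported BlogML file (file name is illustrative).
        var xml = File.ReadAllText(@"C:\export\rfennell.xml");

        // Rewrite hard-coded CS2007 image URLs into BlogEngine.NET
        // image.axd style URLs; the host name and paths are assumptions.
        xml = Regex.Replace(
            xml,
            @"http://oldserver/blogs/(\w+)/([\w\.\-]+\.(?:png|gif|jpg))",
            "image.axd?picture=/blogs/$1/files/$2");

        File.WriteAllText(@"C:\export\rfennell.xml", xml);
    }
}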

Remember also that the images need to be copied from the old server (…\blogs\rfennell\image_file.png) to the new server (…\App_Data\blogs\rfennell\files\image_file.png).

We also had posts written with older versions of LiveWriter. This placed images in a folder structure (e.g. ..\blogs\rfennell\livewriter\postsname\image_file.png). These also needed to be moved to the new platform and their paths fixed appropriately.

Post Ownership

All the imported posts were shown with an owner ID, not the author’s name, e.g. 2103 as opposed to Richard. The simplest fix for this was a SQL update after import, e.g.

update [BlogEngine].[dbo].[be_Posts] set [Author] = 'Richard' where [Author]='2103'

The name set should match the name of a user account created on the blog.

Comment Ownership

Due to issues with spam we had forced all users to register on CS2007 to post a comment. These external accounts were not pulled over in the export. However, BlogEngine.NET did not seem that bothered by this.

However, no icons for these users were shown.


These icons should be rendered using websnapr.com as an image of the commenter’s homepage, but this was failing. This, it turned out, was due to their recent API changes: you now need to pass a key. As an immediate solution I just removed the code that calls websnapr so the default noavatar.jpg image is shown. I intend to look at this when the next release of BlogEngine.NET appears, as I am sure it will have a solution to the websnapr API change.
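
The change itself was trivial. Here is a minimal sketch of the kind of edit, using an illustrative helper rather than the actual BlogEngine.NET theme code (the default image path is also an assumption):

using System.Web;

static class AvatarHelper
{
    // Illustrative helper; in BlogEngine.NET the real logic lives in the
    // theme's comment view code.
    public static string GetCommenterAvatarUrl(string commenterSiteUrl)
    {
        // Old behaviour: ask websnapr.com for a thumbnail of the commenter's
        // site. This now fails as the websnapr API requires a key.
        // return "http://images.websnapr.com/?url=" + HttpUtility.UrlEncode(commenterSiteUrl);

        // Interim fix: always fall back to the bundled default image.
        return VirtualPathUtility.ToAbsolute("~/pics/noavatar.jpg");
    }
}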

There was also a problem with many of the comment author hyperlinks: they all seemed to be just http://. To fix the worst of this I ran a SQL query.

update be_PostComment set author = 'Anon' where Author = 'http://'

I am sure I could have done a better job with a bit more SQL, but our blog has few comments so I felt I could get away with this basic fix.


Tags and Categories

CS2007 displays tag clouds that are based on categories. BlogEngine.NET does the more obvious thing and uses categories as categories and tags as tags.

To allow BlogEngine.NET to show tag clouds, the following SQL can be used to duplicate categories as tags.

insert into be_PostTag (BlogID, PostID, Tag)
select be_PostCategory.BlogID, PostID, CategoryName
from be_PostCategory, be_Categories
where be_PostCategory.CategoryID = be_Categories.CategoryID
and be_PostCategory.BlogID = '[a guid from be_blogs table]'

A workaround for what could not be exported

Where we had a major problem was with the posts made to the original .Text site that was upgraded to Community Server; these were posts from 2004 to 2007.

Unlike all the other blogs these posts would not export via the CS BlogML exporter. We just got a zero byte XML file. I suspect the issue was some flag/property was missing on these posts so the CS2007 internal API was having problems, throwing an internal exception and stopping.

To get around this I had to use the BlogML SDK and some raw SQL queries into the CS2007 database. There was a good bit of trial and error here, but by looking at the source of the BlogML CS2007 exporter and swapping its API calls for my best guess at the equivalent SQL, I got the posts and comments out. It was a bit rough, but am I really that worried about posts that are more than five years old?
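
To give a flavour of the approach, here is a heavily simplified sketch, not the actual code: the cs_Posts table and its column names are my best guess at the CS2007 schema, the connection string and section ID are placeholders, and the BlogML written is cut down to the bare bones.

using System;
using System.Data.SqlClient;
using System.Xml;

class ExportLegacyPosts
{
    static void Main()
    {
        using (var conn = new SqlConnection(@"Server=.;Database=CommunityServer;Integrated Security=true"))
        using (var writer = XmlWriter.Create(@"C:\export\legacy.xml",
            new XmlWriterSettings { Indent = true }))
        {
            conn.Open();
            writer.WriteStartElement("blog", "http://www.blogml.com/2006/09/BlogML");
            writer.WriteStartElement("posts");

            // Guessed schema: pull the raw posts for one blog section.
            var cmd = new SqlCommand(
                "SELECT PostID, Subject, Body, PostDate FROM cs_Posts WHERE SectionID = @sectionId",
                conn);
            cmd.Parameters.AddWithValue("@sectionId", 1); // the legacy blog's section

            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    writer.WriteStartElement("post");
                    writer.WriteAttributeString("id", reader["PostID"].ToString());
                    writer.WriteAttributeString("date-created",
                        ((DateTime)reader["PostDate"]).ToString("s"));
                    writer.WriteElementString("title", (string)reader["Subject"]);
                    writer.WriteElementString("content", (string)reader["Body"]);
                    writer.WriteEndElement(); // post
                }
            }

            writer.WriteEndElement(); // posts
            writer.WriteEndElement(); // blog
        }
    }
}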

Blog Relationships

Parent/Child Relationship

When a child blog is created an existing blog is copied as a template. This includes all its pages, posts and users. For this reason it is a really good idea to keep a ‘clean’ template blog that has as many of the settings correct as possible. Then when a new child blog is created you basically only have to create new user accounts and set its name/template.

Remember, no user accounts are shared between blogs, so the admin on the parent is not the admin on the child; each blog has its own users.

Content Aggregation

A major problem for Black Marble was the lack of aggregation of child blogs. At present BlogEngine.NET allows child blogs, but has no built-in way to roll up the content to the parent. This is a feature that I understand the developers plan to add in a future release.

To get around this problem, I looked to see if it was easy to modify the FillPosts method to return all posts irrespective of the blog. This would, in my opinion, have taken too much hacking/editing due to the reliance on the current context to identify the current blog, so I decided on a simpler fix:

  1. I created a custom template for the parent site that removes all the page/post lists and menu options
  2. Replaced the link to the existing syndication.axd with a hand-crafted syndication.ashx
  3. Added the Rssdotnet.com open source project to the solution and used this to aggregate the RSS feeds of each child blog in the syndication.ashx page (a sketch of this kind of handler is shown below)
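
For illustration, here is a minimal sketch of what such an aggregation handler can look like. Note this sketch uses the .NET 4 built-in System.ServiceModel.Syndication classes rather than the Rssdotnet.com library we actually used, and the child feed URLs are made up.

using System;
using System.Linq;
using System.ServiceModel.Syndication;
using System.Web;
using System.Xml;

public class AggregatedFeed : IHttpHandler
{
    // Made-up child blog feed URLs; in reality these would be read from config.
    private static readonly string[] ChildFeedUrls =
    {
        "http://blogs.example.com/rfennell/syndication.axd",
        "http://blogs.example.com/another/syndication.axd"
    };

    public void ProcessRequest(HttpContext context)
    {
        // Pull every child feed and merge the items, newest first.
        var allItems = ChildFeedUrls
            .SelectMany(url => SyndicationFeed.Load(XmlReader.Create(url)).Items)
            .OrderByDescending(item => item.PublishDate)
            .ToList();

        var merged = new SyndicationFeed("BM-Bloggers", "Aggregated child blog posts",
            new Uri(context.Request.Url, "/")) { Items = allItems };

        context.Response.ContentType = "text/xml";
        using (var writer = XmlWriter.Create(context.Response.Output))
        {
            new Rss20FeedFormatter(merged).WriteTo(writer);
        }
    }

    public bool IsReusable { get { return true; } }
}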

This solution will be reviewed on each new release of BlogEngine.NET in case it is no longer required.


So how was the process? Not as bad as I expected; frankly, other than our pre-2007 content, it all moved without any major issues.

It is a good feeling to now be on a platform we can modify as we need, but which has the backing of an active community.

Windows Phone 7 not synchronising Outlook

I had a problem with my LG E900 WP7 phone over the weekend, Outlook stopped synchronising with our office Exchange server.

It started when I got back from a trip to Ireland. My phone switched back from roaming and started to use 3G for data again, as opposed to WiFi. Also over the weekend we had a connectivity problem from the office to the Internet, so for a while I could not connect to any of our services from any device. However, even after both these things were sorted my Outlook still failed to sync: it said it was in sync but showed no new email since Friday, when it was disconnected from my Irish ISP-based MiFi. No errors were shown. I waited until I got back to the office and tried a sync via our internal WiFi, all to no effect.

The fix was simple, and obvious: delete the Outlook account on the phone and recreate it when I was in the office. The problem is I still have no idea why this issue occurred.

So that is 2 issues in about 6 months, much better than my previous few phones!

Working with Hyper-V, VLAN tags and TFS 2010 Lab Management

I did a post at the start of the year about Lab Management and VLAN tags: how they are not supported, but how you can work around the problems. Over the past few months we have split our old Hyper-V cluster into one for production and one for test/lab development. This gave our IT team a chance to look at the VLAN problem again.

So a quick reminder of the issue – the deployment tools in Lab Management that create environments provide no means to set a VLAN tag for any network connections they create. Once an environment is created you can manually set a VLAN tag, but it is all a bit of a pain and certainly unsupported.

The solution our IT team have come up with to avoid the problem is to set the default VLAN tag on the physical port on the Ethernet switch. Hence any VMs/environments on the new test/lab Hyper-V don’t have to worry about VLANs at all; they are all automatically, in our case, on subnet 200. This works for TFS Lab Management and also means our developers need no knowledge of the IP routing setup to deploy a VM/environment. Our production Hyper-V box, which runs much of our business systems, still uses manually set VLAN tagging as before, but as there is no auto deployment involved on this system there are no problems.

There is one gotcha though…

If you try to use a VM created on our old setup, one that was previously set with the VLAN tag of 200, it cannot see the LAN, even though it has what you would think is the correct VLAN tag. This is because setting a VLAN tag of 200 in Hyper-V is not the same as setting no VLAN tag and letting the Ethernet switch default the port to VLAN 200. So you have to let the switch manage the VLAN tag; the VM needs to know nothing about it, as shown below.


So once this is all set you have your routed network, but also a fully supported Lab Management setup.

More experiences upgrading my Media Center to receive Freeview HD

In my post experiences upgrading my Media Center to receive Freeview HD I said I thought the reason my Windows 7 Media Center was hanging at the “TV signal configuration” step was down to using mixed tuner cards. Well, my second PCTV nanoStick T2 arrived yesterday, so I was able to try the same process with a pair of identical USB T2 tuners.

Guess what? I got the same problem!

However, being USB devices it meant I could test the tuners on my laptop, a Lenovo W520 (Core i7, 16Gb, Windows 7). So I plugged them both in, they found drivers from the web automatically, I ran Media Center, selected set up the TV signal and… it worked! A few worrying pauses here and there, but it got there in about an hour.

So why did it work on a laptop and not on my Media Center PC?

I considered performance, but it seemed unlikely; the Media Center is a Core 2 Duo based system about 3 years old and has had no performance problems to date. So the only difference was that the laptop had never seen a TV tuner before, whereas the Media Center had.

Unused drivers

So I wondered if the old Hauppauge drivers were causing the problem. Remember that in Windows, if you remove an adaptor card the drivers are not removed automatically. If the driver was automatically added (as opposed to you running a setup.exe) then there is no obvious way to remove the drivers. The way to do it is detailed in this Microsoft Answers post. When you load Device Manager this way you see the old Hauppauge devices and you can uninstall their drivers.
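
For reference, the usual trick (which I believe is what the Answers post describes) is to launch Device Manager with the devmgr_show_nonpresent_devices environment variable set; a minimal C# sketch of doing that, though you could equally set the variable in a console and run devmgmt.msc from there:

using System.Diagnostics;

class ShowHiddenDevices
{
    static void Main()
    {
        // With this variable set, Device Manager will also list devices that
        // are no longer attached once View > Show hidden devices is ticked.
        var psi = new ProcessStartInfo("mmc.exe", "devmgmt.msc")
        {
            UseShellExecute = false // required so the variable below is inherited
        };
        psi.EnvironmentVariables["devmgr_show_nonpresent_devices"] = "1";
        Process.Start(psi);
    }
}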

And it made no difference to the problem.

Media Center Guide Data and Tuner setup

Using Task Manager I could see that when the Media Center TV setup appeared to hang, the mcupdate.exe program was running and using a lot of CPU. I had seen this on the Lenovo, but there it passed within 30 seconds or so; on my 3-year-old Intel based Media Center PC I would expect it to be a bit slower, but I left it overnight and it did not move on. So it is not just performance.

mcupdate.exe is the tool that updates the TV guide data for Media Center. It is run on a regular basis and also during the setup. So it seems the issue, as far as I can see, is one of the following:

  1. There is corrupt guide data so that it cannot update the channel guide
  2. There is data about a non-existent tuner that locks the process
  3. There is just too much data to update in the time allowed (but you would expect leaving it overnight would fix this)
  4. There is an internet problem getting the guide (which I doubt, too much of a coincidence that it happens only when I upgrade a tuner)

Simply put, I think that when the TV setup gets to the point where it needs to access this data, it gets into a race condition with the mcupdate.exe process, which is trying to update the guide.

The Hack7MC blog post suggests the problem can be addressed by clearing down the guide data and tuner setup, and provides a process to do this. However, I thought I would try to avoid this as I did not want to lose the series recording settings I had if I could avoid it.

So I loaded Media Center and selected update guide from the Tasks menu. This started the mcupdate process, which caused a 50% CPU load and showed no sign of stopping, again pointing to probably one of the issues listed above. So I closed Media Center, but mcupdate.exe was still running, as was the tool tray notification application. Again I left this a while to no effect, so I used Task Manager to kill mcupdate.exe and the ehtray.exe application.

I had at this point intended to run the process from the Hack7MC post, so I stopped all the Media Center services, but I thought I would give the setup one more try. When I ran the TV signal setup I got a message along the lines of ‘guide data corrupt, will reload’ and then the setup proceeded exactly as it should have done in the first place. I ended up with all my channels, both HD and non-HD, accessible from both tuners, and all my series recording settings intact.

So, a success. I am still not clear which step fixed the issue, but I am sure it was down to needing to fully clear down the guide data and tuner settings.

Access denied when running a command with InvokeProcess in a TFS team build

When you are trying to run a command line tool via the InvokeProcess activity in a TFS 2010 Team build you might see the somewhat confusing ‘Access denied’ error. There appears to be no more detail in the log.

I have found that this is usually down to a typo in the FileName property of the activity.

It should be set to something like

“c:\my tools\tool.exe”

but is actually set to

“c:\my tools”

i.e. it is set to the folder, not the filename. An easy mistake to make when cutting and pasting paths in from batch files.

You cannot execute a folder, hence the access denied error. Simple but easy to miss.
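
You can reproduce the underlying failure outside the build with a couple of lines of C# (the folder path is illustrative):

using System.Diagnostics;

class Repro
{
    static void Main()
    {
        // Pointing Process at a folder rather than an executable, just as the
        // misconfigured FileName property does, throws a Win32Exception with
        // the message "Access is denied".
        Process.Start(new ProcessStartInfo(@"c:\my tools")
        {
            UseShellExecute = false
        });
    }
}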

No error detail when using VMPrep

When using VMPrep to set up a VM for use in a Lab Management system I got an error cross at the bottom of the dialog.


Not too much help; usually there is a link to the log file or a message.

If you look in the log file at C:\Users\[name]\AppData\Roaming\LMInstaller.txt you see that the path to the Patches folder is invalid.

This is fixed by editing the VMPrepTool\VMPrepToolLibrary\Applications.XML file and correcting the path (which I had made a typo in).

Using NuGet and TFS Build 2010

At one of our recent events I was asked if I had any experience using NuGet within a TFS 2010 build. At the time I had not, but I thought it worth a look.

For those of you who don’t know, NuGet is a package manager that provides a developer with a way to manage assembly references in a project for assemblies that are not within their solution. It is most commonly used to manage external, commonly used assemblies such as NHibernate or jQuery, but you can also use it to manage your own internal shared libraries.

The issue the questioner had was that they had added references via NuGet to a project.


Their project then contained a packages.config file that listed the NuGet dependencies. This was in the project root with the <project>.csproj file.

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Iesi.Collections" version="" />
  <package id="NHibernate" version="" />
</packages>

This packages.config is part of the Visual Studio project, and so when the project was put under source control so was it.

However, when they created a TFS build to build this solution all seemed OK until the build ran, when they got build errors along the lines of:

Form1.cs (16): The type or namespace name 'NHibernate' could not be found (are you missing a using directive or an assembly reference?)
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets (1490): Could not resolve this reference. Could not locate the assembly "Iesi.Collections". Check to make sure the assembly exists on disk. If this reference is required by your code, you may get compilation errors.
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets (1490): Could not resolve this reference. Could not locate the assembly "NHibernate". Check to make sure the assembly exists on disk. If this reference is required by your code, you may get compilation errors.

Basically the solution builds locally but not on the build box; the assemblies referenced by NuGet are missing. A quick look at the directory structure shows why. NuGet stores the assemblies it references in the solution folder, so you end up with

Solution Directory
      Packages – the root of the local cache of assemblies created by NuGet
      Project Directory

If you look in the <project>.csproj file you will see a hint path pointing back up into this folder structure so that the project builds locally; something like the following (the exact package folder name will vary with the version installed):

<Reference Include="NHibernate">
  <HintPath>..\Packages\NHibernate.<version>\lib\NHibernate.dll</HintPath>
</Reference>

The problem is that this folder structure is not known to the solution (just to NuGet), so when you add the solution to source control this structure is not added, hence the files are not there for the build box to use.

To fix this issue there are two options:

  1. Add the folder to source control manually
  2. Make the build process aware of NuGet and allow it to get the files it needs as required.

For now let’s just use the first option, which I like, as in general I do want to build my projects against a known version of standard assemblies, so putting the assemblies under source control is not an issue for me. It allows me to easily go back to a specific build if I have to.

(A quick search with your search engine of choice will help with the second option; basically, using the nuget.exe command line is the core of the solution, e.g. a build step that runs something along the lines of nuget.exe install <project>\packages.config -OutputDirectory Packages.)

To add the files to source control, I went into Visual Studio > Team Explorer > Source Control and navigated to the correct folder. I then pressed the add files button and added the whole Packages folder. This is where I think my questioner might have gone wrong: when you add the whole folder structure the default is to exclude .DLLs (and .EXEs).


If you don’t specifically add these files you will still get the missing references on the build, but could easily be thinking ‘but I just added them!’. An easy mistake to make; I know I did it.

Once ALL the correct files are under source control the build works as expected.

Seeing loads of ‘cannot load assemblies’ errors when editing a TFS 2010 build process workflow

I have been following the process in the ALM Rangers build guide and in the Community Build Extensions to edit a build process workflow. Now, I am sure this process was working until recently on my PC (but we all say that, don’t we!), but of late I have found that when the .XAML workflow is loaded into Visual Studio I see loads of warning icons. If I check the list of imported namespaces, many of them also have warning icons which, when clicked, say the assembly cannot be found.


Now, all these errors did not stop the editing process working. What I found was that if I made an edit in the graphical designer for the workflow, or edited a property of an activity, then my Visual Studio instance locked up for about 20 seconds (whilst there was loads of disk activity) and then it was fine. I also noticed I got no IntelliSense when setting properties. Not a great position to be in, but at least I could make some edits, if only slowly.

Using Process Monitor I could see that Visual Studio was scanning folders for the files when loading the XAML workflow, but not finding them.

The fix is actually simple. In the project that is being used as a container for the workflow being edited, make sure you reference the missing assemblies. These can be found in one of the following folders:

  • C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE
  • C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies
  • C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\ReferenceAssemblies\v2.0

On my PC most of the assemblies were in the ReferenceAssemblies folder, not the first two, but on checking another PC at my office they were present in the PrivateAssemblies folder (which VS does scan).

I am not sure why this stopped working, or what removed the files from my PrivateAssemblies folder; the only thing I can think that I did was to install the Dev11 preview, but I can’t see how this should have any effect.

Empty groups not being expanded in a combobox for a TFS work item

A common work item type (WIT) edit in TFS is to limit the list of names shown in a combobox to the users assigned to the project, i.e. the members of the Team Project’s Contributors and Project Administrators groups.

This is done by editing the WIT, either via your favourite XML editor or the Process Template Editor (part of the power tools). You edit the AllowedValues rule for the field you wish to limit, such as Assigned To, as shown below.


Which gives the following XML behind the scenes (for those using XML editors):

<ALLOWEDVALUES expanditems="true" filteritems="excludegroups">
  <LISTITEM value="[Project]\Contributors" />
  <LISTITEM value="[Project]\Project Administrators" />
  <LISTITEM value="Unassigned" />
</ALLOWEDVALUES>

Notice that Expand Items and Exclude Groups are checked. This means that the first two lines in the list will be expanded to contain the names in the groups, not the group names themselves.

A small gotcha here is that if either of the groups is empty you do see the group name in the combobox list, even with Exclude Groups checked. Team Explorer does not expand an empty group to a list with no entries; it shows the group name. So you would see in the combobox something like:

  • [MyProject]\Contributors
  • John
  • Fred
  • Unassigned

where John and Fred are project administrators and the [MyProject]\Contributors group is empty.

This should not be a serious issue, as in most cases why would you have a Team Project with no contributors or administrators? However, it is conceivable that with more complex security models you might see this issue. If so, make sure each group in the list has at least one member; then again, if it does not have any members, do you really need it in the list?

Stupid gotchas on SQL 2008 Reporting Services, or why I cannot see the Report Builder button

There is a good chance, if you are using TFS, that you will want to create some custom reports. You can write these for Reporting Services via BI Studio or Excel, but I wanted to use Report Builder, yet could not see the Report Builder button on this Reporting Services menu.


The problem was multi-levelled.

First I had to give the user access to Report Builder. This is done using the folder security properties. I chose to give this right to the user (along with Browser rights) from the root of the Reporting Services site.


But still no button. Forums and blog posts then talk about changing options on the ‘Site Settings’ menu; the above screenshot shows that this is also missing from the top right.

To get this menu option back, I had to run my browser as administrator, and then this option appeared. It turns out that the TFS setup user I was using had not been made a Reporting Services site administrator, just a content administrator.

But still this was not enough; I also had to add the users as System Users to allow the Report Builder button to appear. So my final Site Settings > Security options were as shown below.


Once all this was done I got my Report Builder button and I could start to write reports.