When software attacks!

Thoughts and musings on anything that comes to mind

Living with the Acer Aspire 1420P

This blog has been a very quiet place for a long time now, reflecting somewhat how busy I have been elsewhere. During this period of heavy work I have found a new friend in my Aspire 1420P. In some ways it’s sad – my trusty and reliable Dell Mini 9 has been neglected in favour of a younger, sexier model.

tablet mode

The 1420P is the production model Acer convertible tablet, a variant of which was given to all Microsoft PDC conference attendees last year. We have quite a few in the office; sadly I am the only person to have paid for theirs. However, I benefit greatly from the fact that mine has a UK keyboard with all the right keys in their correct and proper places.

Personally, I think you get quite a lot for your money. For about £400 I have a lightweight, highly portable machine with ample power to perform the daily chores I give to it. I will admit that the first thing I did upon taking it from its box was to add a further 2Gb of RAM to its shipping quota of 2Gb, but many will not find the need to do so.

I thought long and hard before purchasing the 1420. I already have my workhorse laptop – the excellent TravelMate 6593 – which runs the things I need for the more technical aspects of my working life. However, its 15.4” frame weighs heavy when doing light work on the sofa in an evening, and it’s not great for casual web browsing.

I was finding myself more and more using my iPhone for casual web browsing, email reading and research. It was far easier than having a laptop on my knee, browsing with the touchpad and keyboard. I was seriously considering an iPad – the slate form factor and extreme usability were attractive. I had an eye on the HP Slate so loudly trumpeted by Steve Ballmer before it vanished frustratingly from view.

There were two problems with the iPad approach: Firstly, being a Yorkshireman, I found the price a little steep for an iPhone on steroids; secondly, and not unrelated to my opinion of the price, it was not as functional and flexible as I wanted.

The 1420P meets my needs ably. For casual web browsing, research and email it spends most of its life in tablet form, running portrait mode as I browse the web using nothing but the touch screen. When I use it for document writing or bits of sysadmin work it turns easily back to a traditional notebook form factor.

slate mode

It’s not perfect. The techy in me wishes that the touch screen was more than a mere 2-point variety; it would be nice if the display was a higher resolution than the now ubiquitous 1366x768, but that’s possibly because I am spoiled by the magnificent 1680x1050 of my TravelMate; the lack of a docking station connector makes it less convenient for use as a workhorse office computer; and it suffers in comparison with the iPad in terms of user interface for touch alone (this isn’t really the fault of the hardware, I suppose).

Its qualities far outweigh the shortfalls, however. It weighs almost nothing, and I can comfortably get eight hours from a full charge, which means that the charger (itself small and lightweight) becomes an optional extra for short trips. The glossy screen is bright and clear (although like all glossy screens it suffers in bright light) and does not suffer like the Dell Mini when browsing the web; the keyboard is comfortable and responsive to use and causes me no trouble when working on long documents; finally, and my favourite part, when in tablet mode it is comfortable to hold and natural to use.

Which brings me to something I find really significant about convertibles. When I’m in a meeting I hate using a laptop. I find that the screen immediately forms a barrier between participants and I hate thinking that behind that barrier the person could be doing something other than concentrating on the meeting. I prefer to use a pen and paper as a result, but that means I need to transcribe notes later. The convertible 1420 allows me to switch to tablet mode and use OneNote and the stylus. I have all the benefits of a computer in front of me so I can access documents, email and other resources on demand, but the computer does not come between me and the other attendees. OneNote also allows me to quickly generate notes, tasks, actions and more without leaving the application.


  • Size: 285 x 208.9 x 28.5 mm
  • Weight: 1.72kg
  • Screen resolution: 1366x768
  • Multi-touch: 2-point
  • Processor: Intel Celeron SU2300
  • Memory: 2Gb (upgraded to 4Gb)
  • Hard drive: 160Gb
  • Price: approx. £400

Powershell script to rename files for use as SharePoint 2010 User Profile thumbnails

User profile photos have changed in SharePoint 2010 in that they are now stored in a single image library in the MySite Host root site collection. They have also changed in that when you change the profile photo, SharePoint takes the file and creates three new images at specific sizes, then discards the file you gave it. These files have specific names to link them to the user account and come in small, medium and large flavours.
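To make the naming convention concrete, here’s a quick sketch (the account name is invented, and the SThumb/MThumb suffixes for the small and medium sizes are my assumption for illustration – only the large LThumb name matters for what follows):

```powershell
# Hypothetical user CONTOSO\jbloggs uploads jbloggs.jpg as a profile photo.
# SharePoint 2010 then stores three thumbnails in the MySite Host image library:
$domain = "CONTOSO"; $sam = "jbloggs"
"SThumb", "MThumb", "LThumb" | ForEach-Object {
    "{0}_{1}_{2}.jpg" -f $domain, $sam, $_
}
# Produces: CONTOSO_jbloggs_SThumb.jpg, CONTOSO_jbloggs_MThumb.jpg, CONTOSO_jbloggs_LThumb.jpg
```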

We’ve just delivered a solution to a customer that involved heavy customisation of the Profile page for users. This also involved replacing the large thumbnail version of the profile picture with one which met our size requirements.

The customer had a large group of image files all named in the pattern <firstname surname>.<extension> which had been loaded in to SharePoint as profile pictures. We wanted a quick way to replace the large thumbnail with our own version.

Enter Powershell, stage left. The script below is a little rough and ready but works great. It gets a directory listing, splits the filename and then looks up in AD to see if there’s a user that matches the filename (sans extension). If it finds a match it renames the file to match the pattern <domain>_<SamAccountName>_LThumb.<extension> to match the profile picture naming convention.

As I said, it’s a little rough and ready but I place it here for the greater good. You need the ActiveDirectory powershell module to use this. It’s available on Server 2008 and above, and Windows 7 if you install the remote management tools. Remember to set $domain to the NetBIOS name of your domain before running it.

The Active Directory Powershell Blog is a great resource for this stuff!

import-module ActiveDirectory
if (-not (get-module -name activedirectory)) {
    write-host "This script requires the ActiveDirectory powershell module to run"
    return
}

# Set this to the NetBIOS name of your domain
$domain = "MYDOMAIN"
$filesuffix = "LThumb"
$files = get-childitem

foreach ($file in $files) {
    # Take everything before the last dot as the name, so filenames containing dots still work
    $fullname = [System.IO.Path]::GetFileNameWithoutExtension($file.Name)
    $fileext = $file.Extension.TrimStart(".")
    write-host "Searching for:" $fullname
    $user = get-aduser -Filter { Name -eq $fullname }
    if ($user -eq $null) {
        write-host "Not Found!"
    } else {
        $newfilename = $domain+"_"+$user.SamAccountName+"_"+$filesuffix+"."+$fileext
        write-host "Renaming:" $file.Name "New name:" $newfilename
        rename-item $file.Name $newfilename
    }
}

Thoughts on the BCS EGM

Stepping along the path ploughed by Richard and Robert, I thought I’d try to order my thoughts on the BCS EGM through a blog post. Like Richard, I am (as I begin writing) uncertain as to my final leaning on this, although I have clear views on some of the issues.

Democracy In Action

One of the most important, in my view, is one which might be missed by many. Should the membership vote in favour of the Board of Trustees, they are also strongly encouraged to change the bye-laws of the Royal Charter to stop this happening again.

For me, that is an atrocity and should not be allowed. That’s a strong word, so let me explain why.

The membership of the BCS is being swelled through the push for a higher profile. That’s a good thing, no doubt. However, two percent of an ever increasing membership base quickly becomes a large number of people. The time and effort involved in trying to marshal that many people to raise an objection to how the BCS is working effectively means that it will never happen. I believe that to be incredibly undemocratic.

Furthermore, the kind of member who is likely to pay enough attention to the actions of the BCS to raise an objection is much more likely to have attained a higher level of membership, such as Fellow. There aren’t many of those about, and I’ll wager that there certainly aren’t enough to amount to two percent!

Arguably what we are currently experiencing is a good thing. A group of highly committed members have used the mechanisms embedded in the charter of our professional body to put the brakes on a process which they believe requires greater scrutiny by the whole membership. Should the EGM vote go against those members, they should still be commended for having the courage and commitment to the BCS that they fought to initiate the process at all.

For the good of the members

The purpose of a professional body such as the BCS is to provide those outside of our industry with a recognisable ‘Kite Mark’ of quality when it comes to engaging the services of IT practitioners. Everything else that the organisation does should revolve around that most important premise.

If we follow the line of reasoning which identifies the activities which must flow from the above aim we will quickly find ourselves in the heart of the current argument.

Affirmation of Qualifications

Other professional bodies are extremely careful about how their members qualify for the professional qualifications they offer. This is a critical matter, in that it underpins the level of trust the outside world places on the body in question and its assertions as to the professionalism and trustworthiness of its members.

It worries me, therefore, that the current implication is that the CITP is a qualification along the lines of a Reader’s Digest competition – fill in the form, everybody must win!

It worries me even more because, as someone who is not a developer, the more rigorous path of CEng is not open to me. For IT professionals like me, the CITP must be a thorough assessment of the skill and integrity of the bearer or it becomes worthless.

I say this as somebody who has achieved CITP status. When I went through the process the level of detail I had to provide was actually quite high, and references were needed from other members of the BCS who were already of Chartered status (CITP or CEng) or higher. Had I not worked in the industry for so long, with such a varied wealth of experience from different roles, I am not sure I would have made the grade. That is absolutely how it should be.

The very fact that there are those within the BCS who cast doubts as to the validity of the CITP status inherently means that there are doubts as to that validity and it is therefore of far less value. This is a rapidly accelerating downward spiral which has important ramifications for the body.

Promotion of the Body and its role

There is no point having a professional body which underwrites the quality of practice in its industry if nobody is aware of it. For many years, working in IT, I dismissed the BCS as a group of fusty academics who were not in touch with the rapidly moving industry that I loved to be a part of. That the current BCS management have been striving to change that is to be applauded.

IT is an still immature industry. With such immaturity and rapid change there will inevitably be crises. It is the role of the BCS to wade into all these and advise, mediate and in some cases dictate in line with the levels of professionalism it seeks to underwrite in the industry. It cannot perform that role if nobody is aware of its existence.

At the same time, however, the BCS is, in reality, somewhat of a toothless tiger. We do not work in an industry where lack of professional qualifications is a barrier to practice. Perhaps that is wrong, perhaps not; there are very clear arguments to be made in favour of both views.

Sacrificed on the altar of our own success

IT as an industry has a problem. Some aspects of it, one can argue, fall into a similar professional services area to those of the legal and accounting professions. You would never hire an unqualified and unregulated accountant, so why should you use IT professional services that are not similarly regulated?

Many IT projects, particularly for large organisations and functions where lives are at stake, are held up as abject failures and stain the reputation of our industry. Would the threat of being cast out of the BCS and therefore being unable to continue to practice improve the level of conduct and professionalism of those involved in such projects? Who can say?

At the same time, however, ours is an area of extreme innovation at a pace so rapid as to be frightening. A prescriptive professional body might prevent such innovation (or at least force it outside the UK, which helps nobody). In areas such as the web, technology advances faster than any regulations could cope with, however responsive to change they might be.

That second situation demands a body more in line with the other engineering disciplines. They are looked upon to provide a guarantee of skill, knowledge, approach and practice to give confidence in those consuming the services of their practitioners.

Ultimately, the BCS has now reached a point where it does none of the above:

  • There is no regulation of our industry, so the BCS is not an institution which safeguards quality of practice.
  • The CITP has little value in the eyes of many because they perceive it to be too easy to achieve and too little scrutinised.
  • If the CITP is too easy, what does that say for the CEng awarded by the BCS?
  • What good is self promotion if it merely promotes your own inadequacies?

Where does this leave the BCS? I would seem to be approaching a bleak conclusion!


It is clear that careful examination of the issues driving the actions which have led to the EGM takes us down an existential rabbit-hole. Let us then zoom out and ask some simple questions which might help us (I am not going to answer them – you must answer them for yourself and let that guide your vote):

  1. Can we find information that tells us what the current management are doing, not just in broad strokes that outline a strategy but in more detail as to the implementation of that strategy?
    If the answer is yes, then we have the transparency which those who have called the EGM have implied does not exist. If not, then the arguments of the dissenters have obvious validity.
  2. Are we comfortable with management that, faced with a situation which is uncomfortable for them (the EGM), wishes to change the constitution of the organisation to prevent the situation ever occurring again?


I think I have come to a conclusion during the course of writing this. What will happen after the vote? If the outcome does not meet my own convictions, should I look to leave the BCS? If, as I appear to have concluded, I have low confidence in the CITP qualification I hold, should I remain a member of the body which awarded it?

Oddly, my opinion on this is clear. Yes. The pain within the BCS reflects the pain within the IT industry. That there should be a professional body within IT is clear. That the BCS is currently the only game in town is also clear. We should therefore continue to strive to make the BCS what it must be – the guarantor of quality and trust within the industry.

You can only effect change from within.

It works! 8Gb RAM in my Acer TravelMate 6593

I thought I’d post this because so many like me might benefit from my experiment. We have a number of Acer TravelMate 6593 laptops here at Black Marble. They’re great machines – plenty of grunt, a lovely screen and most of the toys you could need in a laptop that’s used for a mix of IT admin, dev and technical sales (including demos). The only downside is that they only ship with up to 4Gb of memory, and Acer say it won’t take more.

I’ve wondered about that for a while – all the documentation I could find said that the system supported 4Gb SODIMMs in the two memory slots, and the Intel chipset inside supports more than 4Gb of memory.

The cost of two 4Gb SODIMMs for experimentation always stopped me. However, the now lower price, combined with an absolute requirement to run the SharePoint 2010 IW virtual machines (which need 8Gb to fire up!) made me take the plunge.

The good news is that two DDR3 4Gb SODIMMs from Crucial arrived, were installed and worked first time. The system booted with no errors, Windows 7 (x64) recognised 8Gb of memory and a quick bit of partition shuffling later I had a dual-boot Windows 7/Server 2008 R2 + Hyper-V laptop. Marvellous.

Obviously, your TravelMate might not be as accommodating as mine, so make sure you check the returns policy on your RAM!

Solving a mystery: Windows 7 games won’t work on HP TouchSmart TX2

This one has been nagging at me for a long time. My grandmother has an HP TouchSmart TX2 tablet. It was bought with Windows Vista, but as with her main computer, I upgraded it to Windows 7.

It was a good plan – Windows 7 should make it perform better, and the touch capabilities of 7 are better than Vista. There was, however, a small matter of the N-Trig digitiser drivers not being great at point of release – something which would lead me down the wrong path over the problems I encountered.

Windows 7 went onto the TX2 with no problems, except for the phone call I got soon after the rebuild – Mah Jong wouldn’t load, and Tinker (courtesy of Live) was crashing on startup.

Weirdly, when I looked at the system, they all ran when I was logged in as an admin user. However, my standard user-level grandmother got errors. I played with UAC and discovered that having switched it off, rebooted, run the games, switched UAC back on and rebooted again, they worked.

I told myself that it was something to do with the recently-installed N-Trig drivers not having configured things right (the last change to the system) and went away. Except things weren’t working…

The next time, I spent hours examining the system using Process Monitor and Process Explorer. I was thinking that file rights or registry rights would be the culprit, as the games still worked for the admin user. Sadly, I found no errors, no access denied messages, no failures at all. Still things didn’t work.

I’d largely given it up as a bad job, until today, when I installed the Touch Pack. I thought that the additional games might be fun for my Grandmother to play. Had they worked… The newly installed games failed in the same way as Tinker – a shiny ‘program has stopped working’ message and nothing more.

When I tried Bing Maps 3D, however, I got a different error. There in front of me was a message about being unable to initialise the Direct3D system, and so Maps 3D couldn’t load.

Aha! I thought. I downloaded the ATI graphics drivers for the Mobility Radeon 3200 and installed the latest set. No difference.

So I resorted to the hive-mind of the web again. This time I found a thread on a Microsoft forum talking about a problem with ATI drivers properly recognising the hardware at install time on the TX2. That sounded promising, and led me to the AMD support article. Unfortunately, installing the hotfix drivers still didn’t work.

I then found another article on a Microsoft forum talking about a similar issue, fixed with a BIOS update. I hadn’t thought about a BIOS update for the TX2 – I tend not to think about that kind of thing when it’s not my PC. Sure enough, the TX2 had an older BIOS (version F.03) than the latest on the HP site – F.25.

Updating the BIOS still didn’t fix things, but I then reinstalled the ATI drivers supplied by the hotfix article and that did it. All the games worked, Tinker fired into life, and Bing Maps 3D started without a problem. Before the driver reinstall I got a slightly more informative error from the games saying ‘A problem has occurred with the 3D driver’.

So, if you get the same problem, here is a quick summary.

Symptoms:

  • Windows 7-included games fail to load. Click the icon to fire them up and nothing happens.
  • Microsoft Tinker dies on startup.
  • Microsoft Touch Pack games die on startup.
  • Bing Maps 3D says it failed to initialise Direct3D.

Hardware:

  • HP TouchSmart TX2 tablet, model 1015ea

Fix:

  • Update the BIOS to the latest version from the HP site (F.25 at the time of writing), then install (or reinstall) the ATI hotfix drivers from the AMD support article.

Unable to remote control Hyper-V VM after installing SharePoint 2010 on Windows 7

True to form, you only discover something isn’t working when you’re in a desperate hurry. We use lots of Hyper-V VMs here at Black Marble and they are mostly running on our four node cluster. I use Failover Cluster Manager and this morning I couldn’t connect remotely to any of the Hyper-V VMs. I kept getting an error:

Virtual Machine Connection:
A connection will not be made because credentials may not be sent to the remote computer. For assistance, contact your system administrator.
Would you like to try connecting again?

A quick search suggested that the credssp settings on the host servers were broken. A quick test showed that they weren’t – the problem was local to my machine.

The only thing I had changed recently (try yesterday!) was to install SharePoint 2010 on my workstation. OK, I’ll be fair – that means a whole load of pre-requisites, so it’s not that simple!

I decided to check my machine and look at the settings which had been suggested as being wrong on the hyper-v servers. Sure enough, my workstation now had the credssp elements and sure enough, they didn’t match the example I’d found.
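If you want to eyeball the same settings on your own machine, a couple of lines of Powershell will list them. The registry path below is where the credssp policy defaults live on my Windows 7 machine – treat it as a starting point rather than gospel:

```powershell
# List each CredSSP policy default subkey and the value names it contains (read-only)
$key = "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System\CredSSP\PolicyDefaults"
Get-ChildItem $key | ForEach-Object {
    Write-Host $_.PSChildName ":" ($_.GetValueNames() -join ", ")
}
```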

So if you get the same problem, copy the text below into a .reg file and import it into your registry. It should fix the problem. The important part is that each of the Allow* subkeys under CredSSP\PolicyDefaults contains a “Hyper-V” value with the data shown; if your key names differ, compare against a working machine.

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System\CredSSP\PolicyDefaults\AllowDefaultCredentials]
"Hyper-V"="Microsoft Virtual Console Service/*"

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System\CredSSP\PolicyDefaults\AllowDefaultCredentialsDomain]
"Hyper-V"="Microsoft Virtual Console Service/*"

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System\CredSSP\PolicyDefaults\AllowDefCredentialsWhenNTLMOnly]
"Hyper-V"="Microsoft Virtual Console Service/*"

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System\CredSSP\PolicyDefaults\AllowFreshCredentials]
"Hyper-V"="Microsoft Virtual Console Service/*"

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System\CredSSP\PolicyDefaults\AllowFreshCredentialsDomain]
"Hyper-V"="Microsoft Virtual Console Service/*"

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System\CredSSP\PolicyDefaults\AllowFreshCredentialsWhenNTLMOnly]
"Hyper-V"="Microsoft Virtual Console Service/*"

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System\CredSSP\PolicyDefaults\AllowSavedCredentials]
"Hyper-V"="Microsoft Virtual Console Service/*"

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System\CredSSP\PolicyDefaults\AllowSavedCredentialsDomain]
"Hyper-V"="Microsoft Virtual Console Service/*"

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System\CredSSP\PolicyDefaults\AllowSavedCredentialsWhenNTLMOnly]
"Hyper-V"="Microsoft Virtual Console Service/*"

Fixing SharePoint 2007 IIS WAMREG DCOM 10016 activation errors on Server 2008 R2

Anybody who works with SharePoint will grumble if you mention DCOM activation permissions. No matter how hard we try, how many patches we install (or how hard we try to ignore it), granting activation and launch permissions to the SharePoint service accounts is like plugging a dike with water-soluble filler.

On Server 2008 R2 our job is made that much harder by the fact that, by default, even administrators can’t edit the security settings for the IIS WAMREG service (GUID {61738644-F196-11D0-9953-00C04FD919C1}, for when you see it in your application event log).

The fix is to change the default permissions on a registry key, which you can only do by taking ownership of the key. My only comment would be that those permissions were locked down for a good reason in Server 2008 R2 and it’s somewhat frustrating that we need to do this.

Anyway, the key you are looking for is:

HKEY_CLASSES_ROOT\AppID\{61738644-F196-11D0-9953-00C04FD919C1}
To change the ownership you need to click the Advanced button in the Permissions tab of the properties dialog, then select the Owner tab. I’d recommend changing the owner to the Administrators group rather than a specific user, and make sure the permissions for TrustedInstaller are the same after you have finished as they were before you started.
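If you’d rather check what you’re about to change before touching anything, a read-only bit of Powershell will show the current owner of the key (the path matches the GUID mentioned above):

```powershell
# Show the current owner of the IIS WAMREG AppID registry key (read-only)
$path = "Registry::HKEY_CLASSES_ROOT\AppID\{61738644-F196-11D0-9953-00C04FD919C1}"
$acl = Get-Acl -Path $path
Write-Host "Owner:" $acl.Owner
```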

Once done, you can edit the DCOM permissions for the IIS WAMREG service in the same way as on other versions of Server 2008.

Social Networking: The double-edged sword of maintaining an online presence

Exploring the new frontier

I’m writing this post whilst watching my Windows Home Server slowly copy data onto an external drive. I mention that not because of its pertinence, but to indicate why I found myself having time to join Facebook.

The other reason was the excellent session given by Eileen Brown at our most recent event. After Eileen had finished admonishing me for not taking my online presence (and therefore reputation) seriously enough I took the step of installing the Twitter Notify plugin for Live Writer so I could connect two of my online personas together.

But that wasn’t enough. I’ve had an online profile on LinkedIn for some time now, which I find very useful for business contacts. Ping.fm offered a very useful service of allowing effective cross-posting of status updates between my online services, so I signed up (on Elaine’s most excellent advice) and could then amplify the volume of my random thoughts across multiple networks.

Perhaps foolishly, however, I didn’t stop there. I now have a Facebook profile. This has turned out to be almost my making and undoing, all at once. Suddenly I can see why people I know lose hours of their lives hooked into their online circle of friends. At the same time though, there are so many people out there on Facebook that I haven’t seen or spoken to in years and suddenly I have a mechanism which allows me to reconnect with them (with varying degrees of passive- or activeness, depending on both sides’ level of enthusiasm).

The Twitter Notify plugin has now been replaced by xPollinate – a Ping.fm plugin for Live Writer. Once more, projecting my voice across the vastness of cyberspace.

And now I find myself wondering whether I’ve done the right thing. The cat is most forcefully out of the bag and no amount of persuasion will force it back in. I must now engage with these networks, spending time which I’m not certain I have commenting and posting and updating or my online personas will wither and die and fall back into the ocean of neglected accounts, blogs and other internet detritus.

I remember when this was all fields

Sadly, I really am old enough to remember the internet before the web. I’m old enough to remember Compuserve being the big online realm. When I was an undergraduate at University, suddenly email was a fantastic way of communicating with my friends at other Universities – all connected to JANET (the UK Joint Academic Network, which itself connected to the Internet).

Back then we couldn’t share much. Sure, you could attach things to emails, but you didn’t have much space in your mailbox and, frankly, there wasn’t much to send. We bounced messages back and forth to arrange meetings and social gatherings, and it was an invaluable tool for coursework!

Whilst we had USENET (internet news groups, for those who haven’t encountered them) to allow online discussion, we didn’t have anything like the Blogs of today, which offer anybody a platform from which to voice their opinions.

The web, when it came, was exciting and fresh. Where I worked, at the University of Bradford, we had one of the first web sites in the UK, thanks to the enthusiasm of my colleagues in the Computer Centre. Over time, academics embraced the new tool as a way to push academic content out to their students.

Certainly, you could lose hours of your life to these things, but there wasn’t the necessity to post stuff because, frankly, the internet wasn’t very big and most of the people on it were academics at other Universities.

The power of the web to promote yourself became apparent when I began to be involved in creating content for the web at the University. At that time, many of the sources of knowledge I was learning from were influential bloggers – using the new medium to put forward their ideas on how the web should be built. Many of them are still around today, but interestingly, many do not post with the frequency that they used to.

The trap of influence

It seems that the more you post, providing what you have to say is not complete rubbish, then the more people ask you to post more. I have seen many people for whom I have the utmost respect slowly fade away, citing pressures of time or growing workload. The problem is, our online voice is what builds our reputation and if we silence that voice our reputation fades along with it.

This is a conundrum for me. Frankly, I don’t post enough, either to this blog or any of my other online personas. I’d like to post more; I have lots to say (and some of it is more pertinent than this current stream of consciousness). In order to help build the reputation of Black Marble, I need to post more about the cool stuff we do and the great things we achieve as a company. The problem is, I also have a wife, and a life outside what I do for a living (which is already tightly combined with most of my hobbies and interests). How much of my time must I devote to activities connected to my work, even if some of those activities merge into my personal life (like Facebook) or are simply fun?

Passive Engagement

Interestingly, Twitter really has connected me more with some of my friends. Nick Smith, a man for whom I have only respect, persuaded me at the @Media conference in London last year that Twitter was a great way of keeping in contact with people. The most interesting thing about his argument was that it was an almost entirely passive means of communication, by which he meant that I could listen to his stream of tweets and thereby know what he was up to and choose to comment if I wished.

If you think about it, that’s pretty revelatory. I can’t think of any other means of keeping in touch which doesn’t involve effort from both parties, or risk upset if only one side makes an effort (such as letter writing, at which I was always appalling). To me, Twitter is a great informer, keeping me abreast of what my friends are doing, however remote.

Facebook, by way of contrast, would seem to be something that is almost more demanding of my time and commitment than any of the pre-internet communication channels we had (telephone, letter, meeting down the pub), and provides such a rapid stream of communication with a hugely varying signal-to-noise ratio that I’m struggling to keep up already…

No answers, only questions…

I have no panacea for this. To be honest, this post is more an open question to anybody who reads my blog or notices my twittering or has found me on Facebook or LinkedIn: How do you do it? What advice can we offer one another in coping with the deluge of information of modern life and striking the balance between the demands of maintaining our online profile and enjoying the time with the friends it connects us to? Am I making a point which strikes a chord, or am I talking rubbish? You decide. Deluge my Facebook profile with comments; I can only try to keep up.

New and coming Microsoft technologies you need to look at

Yesterday was the annual Black Marble Tech Update event, where we try to cover every product in the Microsoft arsenal in half a day, telling local businesses what’s coming and what deserves attention.

Writing up the content of the presentations would be almost as exhausting as the research required to create them, but following a few conversations during breaks yesterday I decided that a short blog post on some of the technologies that deserve a closer look was merited.

Rather than hit you with lots, all at once, I’ll probably do a few posts, each with a small list of ‘homework’ for you.

So, the first few, in no particular order…

Direct Access

This is a game-changer when it comes to enabling anywhere-access for mobile workers, and ties in nicely with my recent remote access post. In brief, the question behind this is “why should I trust my corporate network any more than the internet?” Once you’ve realised that the answer to that question should be a loud “I shouldn’t!” then Direct Access is the logical answer. In short, it assumes all networks are untrusted and therefore demands a secure connection between all computers at the protocol level (using IPSec). The anywhere access comes from using IPv6, which means that when I fire up my laptop in a hotel I can securely work just like I do in the office, including access to stuff like file shares.


Unified Access Gateway (the latest version of IAG) builds on DirectAccess, making it easier to configure and manage. It also provides secure remote access for machines which you don’t trust. When you combine UAG with DirectAccess you end up with a comprehensive universal access solution for your infrastructure.

SharePoint 2010

There’s already a great deal of buzz around this. Architectural changes are great, but I firmly believe that the real game-changer is the way that social networking technologies have been absorbed into a business solution in such a way that they can seriously benefit the way we store, use and find information. You just need to overcome the natural business fear that social networking means worker time-wasting, and embrace the possibilities.

Office 2010

One of my biggest issues with Office 2007, and the one I hear most often as a barrier to adoption, was not the ribbon but the fact that the interface was not consistent across all of the applications. Office 2010 fixes that, making your transition much less painful when it comes to training. Couple that with the new web versions and the excellent business functionality when combined with SharePoint and it becomes quite compelling. That’s without even mentioning the improvements in Outlook, like the new conversation view. You’ll prise Outlook 2010 out of my cold, dead hands, I can tell you.

Forefront ‘Stirling wave’

The big benefit, in my opinion, of the new codename ‘Stirling’ wave of Forefront products is that they can be integrated with a control layer which allows behaviour seen by one to trigger remedial action by another (e.g. triggering an AV scan of a desktop PC that is sending lots of emails). That hands-off, rapid containment of potential issues is something which I think could be invaluable to large organisations.
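As a rough sketch of that idea (hypothetical names throughout – this bears no resemblance to Forefront’s real interfaces), one product reports suspect behaviour to a shared control layer, which maps it to a remedial action for another product to carry out:

```python
# Toy sketch of a control layer correlating behaviour with remediation.
# All names are invented for illustration, not real Forefront APIs.

RESPONSES = {
    # behaviour observed       -> remedial action to trigger
    "outbound_email_spike":      "full_av_scan",
    "repeated_failed_logons":    "disable_account",
}

actions_taken = []

def report(host: str, behaviour: str) -> None:
    """Called by any product in the wave when it sees suspect behaviour;
    unknown behaviours are simply ignored."""
    action = RESPONSES.get(behaviour)
    if action:
        # In the real thing another product would act; here we just record it.
        actions_taken.append((host, action))

# The mail filter notices a desktop sending lots of email...
report("desktop-042", "outbound_email_spike")
print(actions_taken)  # [('desktop-042', 'full_av_scan')]
```

The point is the hands-off loop: detection by one component, containment by another, with no administrator in the critical path.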

Remote working solutions (or how I learned to stop worrying and love the snow)

We lost remarkably few days of productivity to the bad weather at Black Marble. That wasn’t because we were all intrepid, hardy types who made it into the office regardless. Far from it – some of us live in areas where they don’t grit very often and couldn’t make it to the main roads.

As you may have guessed from the title, the reason we came through the bad weather so well was our ability to work remotely. I thought I’d write a post about what we do – not because we have any wonderfully clever solution, but because lost time is lost money, and many people dismiss remote access out of hand.

Keep it simple

I come at this from two sides: Firstly, complex solutions are hard to manage and are more likely to fail. Secondly, users don’t want to have to remember some peculiar incantation to access their stuff just because they are somewhere other than their desk.

I have a simple approach: anything the users do to access stuff on our company network should be exactly what they do to access it when they aren’t on the company network. If I don’t allow remote access to a system (and I can’t think of any of those off the top of my head) then they should get some kind of access denied message; otherwise, they should be asked to authenticate and carry on.

Pick a protocol. Don’t pick lots.

To be fair, I’m in a strong position with this because of the portfolio of services I run. I don’t profess to be a network security ninja so I have very few rules in our firewall. Only one protocol is allowed in for remote access: https.

How can I do that? Well, SharePoint, Project Server and CRM are all very obviously web-based. Exchange has OWA, and Outlook can connect using https as well. Even our remote desktop access is published over https, using Terminal Services Gateway. Since I’m using https outside the LAN, I use it inside as well. Why? Well, why trust my own network any more than the internet, and why make users remember a different URL when they’re outside?
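If you want to sanity-check that your edge really does answer only on https, a small helper like this (a hypothetical snippet of mine, not part of any product) will tell you whether anything is listening on a given port:

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if something accepts a TCP connection on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical usage against your own gateway (hostname is made up):
#   port_open("gateway.example.com", 443)  -> should be True
#   port_open("gateway.example.com", 3389) -> should be False if only
#                                             https is allowed through
```

Run it against each port you believe is closed – it’s a crude check, but it catches the firewall rule you forgot about.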

A short list of the stuff we use

ISA Server 2006 sits at the edge of our network. I use it to publish out the various services. It’s very easy to manage and works beautifully. It’s about to be replaced, however, by Forefront Threat Management Gateway (TMG). My own plan is to move towards using DirectAccess and Unified Access Gateway (UAG) in the near future.

Our SharePoint, Project Server and CRM systems all run on IIS. We have a wildcard certificate, which I would recommend to any small organisation wanting to publish web systems securely, as wildcards offer a much lower-cost approach than buying specific certs for all the different URLs.
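The reason one wildcard cert covers all those URLs comes down to the matching rule: the ‘*’ stands in for exactly one left-most DNS label. Real TLS stacks do this check for you; this little sketch (using a made-up example.com domain) just illustrates the rule:

```python
# Illustration only - in practice your TLS library performs this check.
# The '*' in a wildcard certificate matches exactly one left-most label,
# so *.example.com covers crm.example.com but not a.b.example.com.

def wildcard_match(pattern: str, hostname: str) -> bool:
    """Match a hostname against a certificate name, where '*' stands
    in for exactly one left-most DNS label."""
    p, h = pattern.lower().split("."), hostname.lower().split(".")
    if len(p) != len(h):          # '*' covers one label, never several
        return False
    return (p[0] == "*" and p[1:] == h[1:]) or p == h

print(wildcard_match("*.example.com", "sharepoint.example.com"))  # True
print(wildcard_match("*.example.com", "a.b.example.com"))         # False
```

That single-label rule is worth remembering: if you ever publish something at a deeper subdomain, the wildcard cert won’t cover it.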

Our Visual Studio Team Foundation Server (TFS), in both 2008 and 2010 flavours, also works quite happily over https and can be published out securely.

Terminal Services Gateway allows me to connect to appropriate systems securely using RDP over HTTPS.

What don’t we publish?

Perhaps unsurprisingly, none of our file shares are accessible from the outside world. However, since all our business data is in SharePoint or CRM (including documents), the stuff on the file shares is not needed and is mostly stuff like ISOs of software.

How easy is it?

If you keep things simple, remote access can be delivered securely and easily. ISA Server takes only a short time to install and configure if you stick to a very limited and straightforward ruleset.

I would, however, urge you not to simply rush out and allow access to your systems without thinking: Security is essential and that means putting some thought into what you want to publish outside your corporate LAN and how you manage access and auditing.

The bottom line, though, is the effect that incidents like the recent bad weather can have on the company’s bottom line. Being able to work remotely doesn’t mean that your staff can do so on a whim, but it means that should they need to, they can do all the things they would normally do in the office without penalty. If you haven’t considered remote access solutions yet, perhaps now is the time to do so – before next winter arrives and your workforce is stuck at home…