Social Networking: The double-edged sword of maintaining an online presence

Exploring the new frontier

I’m writing this post whilst watching my Windows Home Server slowly copy data onto an external drive. I mention that not because of its pertinence, but to indicate why I found myself having time to join Facebook.

The other reason was the excellent session given by Eileen Brown at our most recent event. After Eileen had finished admonishing me for not taking my online presence (and therefore reputation) seriously enough I took the step of installing the Twitter Notify plugin for Live Writer so I could connect two of my online personas together.

But that wasn’t enough. I’ve had an online profile on LinkedIn for some time now, which I find very useful for business contacts. Another service offered effective cross-posting of status updates between my online services, so I signed up (on Elaine’s most excellent advice) and could then amplify the volume of my random thoughts across multiple networks.

Perhaps foolishly, however, I didn’t stop there. I now have a Facebook profile. This has turned out to be almost my making and undoing, all at once. Suddenly I can see why people I know lose hours of their lives hooked into their online circle of friends. At the same time, though, there are so many people out there on Facebook that I haven’t seen or spoken to in years, and suddenly I have a mechanism which allows me to reconnect with them (with varying degrees of passiveness or activeness, depending on both sides’ level of enthusiasm).

The Twitter Notify plugin has now been replaced by xPollinate – a plugin for Live Writer. Once more, projecting my voice across the vastness of cyberspace.

And now I find myself wondering whether I’ve done the right thing. The cat is most forcefully out of the bag and no amount of persuasion will force it back in. I must now engage with these networks, spending time which I’m not certain I have commenting and posting and updating or my online personas will wither and die and fall back into the ocean of neglected accounts, blogs and other internet detritus.

I remember when this was all fields

Sadly, I really am old enough to remember the internet before the web. I’m old enough to remember Compuserve being the big online realm. When I was an undergraduate at University, suddenly email was a fantastic way of communicating with my friends at other Universities – all connected to JANET (the UK Joint Academic Network, which itself connected to the Internet).

Back then we couldn’t share much. Sure, you could attach things to emails, but you didn’t have much space in your mailbox and, frankly, there wasn’t much to send. We bounced messages back and forth to arrange meetings and social gatherings, and it was an invaluable tool for coursework!

Whilst we had USENET (internet news groups, for those who haven’t encountered them) to allow online discussion, we didn’t have anything like the Blogs of today, which offer anybody a platform from which to voice their opinions.

The web, when it came, was exciting and fresh. Where I worked, at the University of Bradford, we had one of the first web sites in the UK, thanks to the enthusiasm of my colleagues in the Computer Centre. Over time, academics embraced the new tool as a way to push academic content out to their students.

Certainly, you could lose hours of your life to these things, but there wasn’t the necessity to post stuff because, frankly, the internet wasn’t very big and most of the people on it were academics at other Universities.

The power of the web to promote yourself became apparent when I began to be involved in creating content for the web at the University. At that time, many of the sources of knowledge I was learning from were influential bloggers – using the new medium to put forward their ideas on how the web should be built. Many of them are still around today, but interestingly, many do not post with the frequency that they used to.

The trap of influence

It seems that the more you post, providing what you have to say is not complete rubbish, then the more people ask you to post more. I have seen many people for whom I have the utmost respect slowly fade away, citing pressures of time or growing workload. The problem is, our online voice is what builds our reputation and if we silence that voice our reputation fades along with it.

This is a conundrum for me. Frankly, I don’t post enough, either to this blog or any of my other online personas. I’d like to post more; I have lots to say (and some of it is more pertinent than this current stream of consciousness). In order to help build the reputation of Black Marble, I need to post more about the cool stuff we do and the great things we achieve as a company. The problem is, I also have a wife, and a life outside what I do for a living (which is already tightly combined with most of my hobbies and interests). How much of my time must I devote to activities connected to my work, even if some of those activities merge into my personal life (like Facebook) or are simply fun?

Passive Engagement

Interestingly, Twitter really has connected me more with some of my friends. Nick Smith, a man for whom I have only respect, persuaded me at the @Media conference in London last year that Twitter was a great way of keeping in contact with people. The most interesting thing about his argument was that it was an almost entirely passive means of communication, by which he meant that I could listen to his stream of tweets and thereby know what he was up to and choose to comment if I wished.

If you think about it, that’s pretty revelatory. I can’t think of any other means of keeping in touch which doesn’t involve effort from both parties, or risk upset if only one side makes an effort (such as letter writing, at which I was always appalling). To me, Twitter is a great informer, keeping me abreast of what my friends are doing, however remote.

Facebook, by way of contrast, would seem to be something that is almost more demanding of my time and commitment than any of the pre-internet communication channels we had (telephone, letter, meeting down the pub), and provides such a rapid stream of communication with a hugely varying signal-to-noise ratio that I’m struggling to keep up already…

No answers, only questions…

I have no panacea for this. To be honest, this post is more an open question to anybody who reads my blog or notices my twittering or has found me on Facebook or LinkedIn: How do you do it? What advice can we offer one another in coping with the deluge of information of modern life and striking the balance between the demands of maintaining our online profile and enjoying the time with the friends it connects us to? Am I making a point which strikes a chord, or am I talking rubbish? You decide. Deluge my Facebook profile with comments; I can only try to keep up.

New and coming Microsoft technologies you need to look at

Yesterday was the annual Black Marble Tech Update event, where we try to cover every product in the Microsoft arsenal in half a day, telling local businesses what’s coming and what deserves attention.

Writing up the content of the presentations would be almost as exhausting as the research required to create them, but following a few conversations during breaks yesterday I decided that a short blog post on some of the technologies that deserve a closer look was merited.

Rather than hit you with lots, all at once, I’ll probably do a few posts, each with a small list of ‘homework’ for you.

So, the first few, in no particular order…

Direct Access

This is a game-changer when it comes to enabling anywhere-access for mobile workers, and ties in nicely with my recent remote access post. In brief, the question behind this is “why should I trust my corporate network any more than the internet?” Once you’ve realised that the answer to that question should be a loud “I shouldn’t!” then Direct Access is the logical answer. In short, it assumes all networks are untrusted and therefore demands a secure connection between all computers at the protocol level (using IPSec). The anywhere access comes from using IPv6, which means that when I fire up my laptop in a hotel I can securely work just like I do in the office, including access to stuff like file shares.


Unified Access Gateway (the latest version of IAG) builds on DirectAccess, making it easier to configure and manage. It also provides secure remote access for machines which you don’t trust. When you combine UAG with DirectAccess you end up with a comprehensive universal access solution for your infrastructure.

SharePoint 2010

There’s already a great deal of buzz around this. Architectural changes are great, but I firmly believe that the real game-changer is the way that social networking technologies have been absorbed into a business solution in such a way that it can seriously benefit the way we store, use and find information. You just need to overcome your natural businessman fear of social networking and worker time-wasting and embrace the possibilities.

Office 2010

One of my biggest issues with Office 2007, and the one I hear most often as a barrier to adoption, was not the ribbon, but that the interface was not consistent across all of the applications. Office 2010 fixes that, making your transition much less painful when it comes to training. Couple that with the new web versions and excellent business functionality when combined with SharePoint and it becomes quite compelling. Of course, that’s without mentioning the improvements in Outlook like the new conversation view. You’ll prise Outlook 2010 out of my cold, dead hands, I can tell you.

Forefront ‘Stirling wave’

The big benefit, in my opinion, of the new codename Stirling wave of Forefront products is that they can be integrated with a control layer which allows behaviour seen by one to trigger remedial action by another (e.g. triggering an AV scan of a desktop PC sending lots of emails). That hands-off rapid containment of potential issues is something which I think could be invaluable to large organisations.

Remote working solutions (or how I learned to stop worrying and love the snow)

We lost remarkably few days of productivity to the bad weather at Black Marble. That wasn’t because we were all intrepid, hardy types and all made it into the office. Far from it – some of us live in areas where they don’t grit very often and can’t make it to the main roads.

As you guessed from the title, the reason we came through the bad weather so well was because of our ability to work remotely. I thought I’d write a post about what we do – not because we have any wonderfully clever solution, but because lost time is lost money, and many people discard remote access out of hand.

Keep it simple

I come at this from two sides: Firstly, complex solutions are hard to manage and are more likely to fail. Secondly, users don’t want to have to remember some peculiar incantation to access their stuff just because they are somewhere other than their desk.

I have a simple approach: whatever users do to access stuff on our company network should be exactly what they do when they aren’t on it. If I don’t allow remote access to a system (and I can’t think of any of those off the top of my head) then they should get some kind of access denied message; otherwise, they should be asked to authenticate and carry on.

Pick a protocol. Don’t pick lots.

To be fair, I’m in a strong position with this because of the portfolio of services I run. I don’t profess to be a network security ninja so I have very few rules in our firewall. Only one protocol is allowed in for remote access: https.

How can I do that? Well, SharePoint, Project Server and CRM are all very obviously web-based. Exchange has OWA, and Outlook can connect using https as well. Even our remote desktop access is published using https, using Terminal Services Gateway. Since I’m using https outside the LAN, I use it inside as well. Why? Well, why trust my own network any more than the internet, and why make users remember a different URL when outside?

A short list of the stuff we use

ISA Server 2006 sits at the edge of our network. I use it to publish out the various services. It’s very easy to manage and works beautifully. It’s about to be replaced, however, by Forefront Threat Management Gateway (TMG). My own plan is to move towards using DirectAccess and Unified Access Gateway (UAG) in the near future.

Our SharePoint, Project Server and CRM systems all run on IIS. We have a wildcard certificate, which I would recommend to any small organisation wanting to publish web systems securely, as it offers a much lower-cost approach than getting specific certs for all the different URLs.

Our Visual Studio Team Foundation Server (TFS), in both 2008 and 2010 flavours, also works quite happily over https, and can be published out securely.

Terminal Services Gateway allows me to connect to appropriate systems securely using RDP over HTTPS.

What don’t we publish?

Perhaps unsurprisingly, none of our file shares are accessible from the outside world. However, since all our business data is in SharePoint or CRM (including documents), the stuff on the file shares is not needed and is mostly stuff like ISOs of software.

How easy is it?

If you keep things simple, remote access can be delivered securely and easily. ISA Server takes only a short time to install and configure if you stick to a very limited and straightforward ruleset.

I would, however, urge you not to simply rush out and allow access to your systems without thinking: Security is essential and that means putting some thought into what you want to publish outside your corporate LAN and how you manage access and auditing.

The bottom line, though, is the effect that incidents like the recent bad weather can have on the company’s bottom line. Being able to work remotely doesn’t mean that your staff can do so on a whim, but it means that should they need to, they can do all the things they would normally do in the office without penalty. If you haven’t considered remote access solutions yet, perhaps now is the time to do so – before next winter arrives and your workforce is stuck at home…

Twitter clients: Twinbox and Tweetz

Anybody who follows me on Twitter will know that @rikhepworth is by no means a prolific tweeter. However, I do follow a number of people around the planet, and in addition to the ubiquitous Tweetie2 on my iPhone, I have found two clients to be useful and reliable.

The first is Tweetz, from Blue Onion Software. This is a great gadget for the Windows 7 desktop (or Vista Sidebar). The UI is simple and extremely usable (I love the way I can scroll the history for older tweets) and it makes posting a breeze.

The second reflects just how much I live by Outlook and the resulting ability to search and collate unread mails, blog posts and now tweets. Twinbox from TechHit allows you to tweet directly from Outlook and incoming tweets are collated by sender. No integration with the Office 2010 fluent UI but the add-in works, and there is a 64-bit version available as well.

Solve ‘pending reboot’ setup show stopper for CRM 4 Client (with Update Rollup 7)

I’ve been extremely busy over the past week creating demo systems and updating our own internal Black Marble systems. Part of that long list of tasks was to get around to testing the CRM 4 Outlook client with Outlook 2010.

For those who don’t know, you need the Update Rollup 7 client if you want to use Outlook 2010 (and only x86 Office need apply). You can download a slipstreamed client installer from Microsoft.

However, you may find that the client steadfastly refuses to install, telling you that it is unable to proceed due to a pending restart.

The solution to the problem can be found on the Microsoft forums:

Look in the registry, in the Current User hive (HKEY_CURRENT_USER), for the user you’re trying to run setupclient.exe as. You will find a key in HKCU\Software\Microsoft named MSCRMClient. Create a new DWORD value (32-bit if you’re on Windows 7 x64) called IgnoreChecks and set the value to 1.
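If you’d rather not click through regedit, the same value can be applied by importing a .reg file; a sketch of the file contents, assuming the key path above (note that the dword data is in hex, so 00000001 is 1):

```reg
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\MSCRMClient]
"IgnoreChecks"=dword:00000001
```

This writes under the current user’s hive, so import it as the user who will be running setupclient.exe.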

This fixed it for me. Hopefully it will fix it for you too.

Reassigning the correct SSL certificate to SharePoint 2010 Web Services IIS Site


This post is about assigning an SSL certificate to an IIS 7.5-hosted website which is not located in the Personal Certificate store. The steps shown are not SharePoint-specific, however. Hopefully this post will save you the large amount of time I spent hunting down the information on how to do this.

The usual background

I’ve been installing and configuring a SharePoint 2010 system that we can use here at Black Marble for our demo sessions. I hit a nasty wall just after lunch which turned out to be caused by the SSL certificate being used by the IIS web site hosting the SharePoint web services.

I’d spent a while carefully wiring up the user profile service to our AD, getting synchronisation working and dealing with the creation of a new MySite host. That in itself is a fairly involved process right now, so when I hit errors I naturally assumed it was related to my work on the user profile service.

When trying to manage the User Profile Service I was seeing errors that Central Administration could not access the service.

The automatic Health Analyzer in SharePoint was telling me there was an error with the Security Token Service:

The Security Token Service is not available.
The Security Token Service is not issuing tokens. The service could be malfunctioning or in a bad state.
SPSecurityTokenService (SecurityTokenService)

In the Application Event Log I was seeing EventID 8306: An exception occurred when trying to issue security token: Could not establish trust relationship for the SSL/TLS secure channel with authority 'localhost:32844'.

Naturally, I checked the bindings through IIS Manager to see what certificate was in use. An IIS self-issued certificate for the server was listed, which I thought should have been valid…

I looked in the Local Computer certificate store using the MMC snap-in and discovered a folder called SharePoint which had three certificates in it, all issued by the SharePoint Root Authority:

  • SharePoint Security Token Service
  • SharePoint Security Token Service Encryption
  • SharePoint Services

That sounded interesting – perhaps one of these was the certificate which should be used and the configuration had got changed. The trouble now was how to assign those certificates. IIS Manager only shows you the certificates in the Personal store – I couldn’t select the certificate I needed anywhere.

Being one to tinker before turning to the web I looked in applicationhost.config – the XML file which contains the configuration details for the IIS sites. It listed the protocol bindings but not the certificate. So I turned to Bing.

The first site of note was (of course) on IIS.Net – How to setup SSL on IIS 7.0

This listed a whole heap of things to do in order to set up SSL, but none of it told me how to assign a certificate from a specific store, at least without turning to WMI (and that wasn’t clear).

I then found a detailed MSDN How To: Configure a Port with an SSL Certificate

This was really useful (if hard to find). It detailed how to configure a certificate using netsh. This required a key bit of information which I didn’t have – the certificate hash. However, the article linked to another, telling me that the hash is in fact the Thumbprint attribute, accessible through the certificate MMC snap-in (MSDN – How To: Retrieve the Thumbprint of a Certificate).

I tried the appropriate netsh command and it failed. I then realised that when I queried the SSL bindings the certificate store name was listed, showing where the cert was. There was no information in that article on how to specify this.

Bing to the rescue again: a non-MS site listed the parameters of the netsh add sslcert command.

The actual solution

In an elevated command prompt enter the following command to list the current SSL bindings:

netsh http show sslcert

You’ll get something that looks suspiciously like the image below. Note that there may be more than one binding listed; note also that the details below are for a working web services site.

Output from netsh http show sslcert

You need to get some information for the SSL binding on port 32844, used by the SharePoint Web Services. The relevant section, as shown above, will list the IP:port as 0.0.0.0:32844; mark and copy the Application ID GUID. Interestingly, I’ve checked two different SharePoint 2010 installs on different servers and the Application ID is the same for both.
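If you have a lot of bindings, a short script can dig the two values out of the saved netsh output for you. A minimal sketch, assuming the English-language field names from netsh http show sslcert (IP:port, Certificate Hash, Application ID); the certificate hash in the sample is made up, though the Application ID is the one from my servers:

```python
import re

def find_binding(netsh_output, port):
    """Return (certificate hash, application ID) for the binding on the given port."""
    # netsh separates each binding with a blank line; find the block for our port.
    for block in re.split(r"\n\s*\n", netsh_output):
        if re.search(r"IP:port\s*:\s*\S+:%d\b" % port, block):
            cert = re.search(r"Certificate Hash\s*:\s*([0-9a-fA-F]+)", block)
            appid = re.search(r"Application ID\s*:\s*(\{[0-9a-fA-F-]+\})", block)
            if cert and appid:
                return cert.group(1), appid.group(1)
    return None

# Hypothetical sample output; the hash below is an example, not a real thumbprint.
sample = """\
IP:port                 : 0.0.0.0:32844
Certificate Hash        : 0123456789abcdef0123456789abcdef01234567
Application ID          : {4dc3e181-e14b-4a21-b022-59fc669b0914}
Certificate Store Name  : SharePoint
"""
print(find_binding(sample, 32844))
```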

You also need to find the certificate hash (thumbprint) for the SharePoint Services Certificate. Load up MMC and add the certificate snapin, connecting to the Local Computer store. You should see a store named SharePoint with three certificates in, as per the image below:

Certificate console showing SharePoint store

Double-click the SharePoint Services certificate and select the Details tab. Scroll down and find the Thumbprint property and copy its contents to the clipboard.

Certificate properties showing Thumbprint

Paste the text into notepad and trim out the spaces before you use it in the commands below.
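The trimming can be done with a line of script rather than by hand. A minimal sketch (the thumbprint below is a made-up example) which keeps only hex digits, so it also drops any invisible characters that sometimes come along for the ride when copying from the certificate dialog:

```python
# Strip everything that isn't a hex digit from a thumbprint copied out of
# the MMC snap-in, leaving a value suitable for netsh's certhash parameter.
raw = "c0 11 f9 8a 12 34 56 78 9a bc de f0 01 23 45 67 89 ab cd ef"  # example only
certhash = "".join(ch for ch in raw if ch in "0123456789abcdefABCDEF")
print(certhash)  # c011f98a123456789abcdef00123456789abcdef
```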

I removed the SSL binding first using the command below, although I’m not sure if this step is necessary:

netsh http delete sslcert ipport=0.0.0.0:32844

Once that’s done, enter the command below, using the thumbprint from your certificate and (if it’s different) the correct appid for your website.

netsh http add sslcert ipport=0.0.0.0:32844 certhash=<thumbprint> appid={4dc3e181-e14b-4a21-b022-59fc669b0914} certstorename=SharePoint

Finish off with another netsh http show sslcert to make sure the changes have been made, and then perform an iisreset, just to be sure.

The annoying bit

When you’ve done all this, don’t be fooled when you examine the bindings in IIS Manager. If the certificate isn’t in the Personal store (i.e. IIS Manager doesn’t show it in the list) then the certificate is listed as Not Selected, which is very misleading. One to poke the guys in the IIS team about, I think.