Balancing customer needs against forward motion: IE8

I’ve watched the debate with interest but not posted anything until now. The news that Internet Explorer 8 will keep its new rendering engine to itself unless you tell it otherwise caused a strong outpouring of opinion around the web.

I must admit, my initial reaction mirrored that of many others – that it’s just plain wrong (although my good friend Nick’s posting took some concentration to work out his thoughts!). Why hold back improved support for CSS; why hide the fact that the engine now passes Acid2?

Then I thought for a bit, and tempered my view with the knowledge that the coming of IE7 caused much angst amongst companies because what worked in IE6 failed in IE7. Perhaps an additional switch to toggle this new rendering marvel on and off was a good idea. But surely, you’d want it to default to the shiny new engine… wouldn’t you?

I have now changed my mind. Why? Because at a recent event, after presenting for a while on upcoming Microsoft technologies including IE8, one of the attendees came up to chat. He worked for a major financial organisation and was pressing for more information on the new browser. Would it really keep the same rendering behaviour as previous versions by default? If so, that was great! Why? Because he was faced with many different divisions within his organisation, all of which had web-based applications and all of which cried foul over IE7 breaking their systems. This was still causing headaches with the rollout of IE7, and he was very keen to be able to convince his stakeholders that if they would just shoulder the pain of the version 6 to 7 transition, he could guarantee there would be no more pain with future upgrades. That would mean the IT department could push out newer, more secure browsers without a battle.

There are many large organisations like that around the globe. Their strength in terms of buying power and opinion is what has led Microsoft to the solution we now see with IE8. Whilst purists may hate it, the truth is that IT Managers around the planet are smiling.

Which would you rather see – massive companies sticking to insecure browsers on their desktops because the investment in internal systems would be too large to allow movement, or a steady push forward in versions safe in the knowledge that there will be zero impact on existing investment?
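
For context, the opt-in mechanism Microsoft has described is a meta tag (or equivalent HTTP header) declaring which rendering engine a page targets. The values below are those published at the time of writing and may yet change before release, so treat this as a sketch rather than gospel:

```html
<!-- Opt a page in to the new IE8 standards engine -->
<meta http-equiv="X-UA-Compatible" content="IE=8">

<!-- Or always request the most recent engine the browser has -->
<meta http-equiv="X-UA-Compatible" content="IE=edge">
```

Pages that don’t include the tag carry on rendering as IE7 does – which is exactly the default behaviour those IT managers are smiling about.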

If you’ve managed to avoid this issue entirely thus far, the ever thoughtful and tactful Eric Meyer has some excellent posts discussing the matter.

In the Mix

Well, it’s just after 3pm on day one of Mix:UK 07. I’m taking a break with a coffee so I thought I’d post.

It’s a mixed bag down here (sorry – no pun intended). The technology is fantastic – the stuff that can be achieved with WPF and Silverlight is excellent. I’m still a little concerned that usability has been sacrificed on the altar of bling, however. To be fair, that says more about the rapid-development nature of conference demos, where the wow factor is what matters, but I think it’s a very, very significant issue which should not be allowed to get lost in the excitement.

So, the keynote was good, but a little patchy, with lots of people showing off their latest and greatest examples of WPF or Silverlight. The first session was really useful for me. I’ve done some XAML, but watching a guy who really knows his way around Blend helped gel things in my mind.

More interesting still, however, was the next session, where a great guy called Nathan Buggia from Live Search talked about SEO. It was a good session, with a lot of straight talk from someone who actually works at a search engine, nicely pointing out some of the less honourable practices of SEO sharks. Overall his message was what I’ve said all along – build good, semantic pages with informative content and you’ll get good rankings. There’s a bit more to it than that, obviously, but that’s broadly it.

What I did discover during that session, which I really ought to have seen before (I may even have seen it but not had it register), was the XML sitemap format, detailed at sitemaps.org. This can be pushed to the search engines to give them prior information, if you like. It doesn’t let you ‘fix’ your results, but it can be used to give helpful hints to the search engine, particularly on refresh rates for changing pages, or even just to give them the nod that things have changed. I will research this more thoroughly now – I may even manage a post on what I find.
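
To give a flavour of the format, a minimal sitemap file looks something like this (the URL and dates are made up for illustration; see sitemaps.org for the full schema):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the engines to know about -->
  <url>
    <loc>http://www.example.com/news/</loc>
    <lastmod>2007-09-11</lastmod>
    <!-- These are hints only; the engines are free to ignore them -->
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

You then point the engines at the file, and they use it as a starting hint for crawling rather than a guarantee of anything.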

Anyway, I will sign off with an apology – sorry Nick, I’m in London and I haven’t called. Next time, I promise!

Web site development: University of Bradford Part 1

One of the last projects I was involved in before I left the University of Bradford to join Black Marble was a new design for the external web site of the institution. I’d pretty much finished the construction of the page layouts and styles before I left, but it’s only now that the site is about to go live. I’ve threatened a few people with a series of posts on how the site is constructed and although I’m not there any more it seems topical.

In this post I’ll give some background, describe the project and run through why things were done in a certain way. Over the next few posts I’ll cover the construction in more detail – what styling problems I hit and how they were fixed, and how the site tries to make use of things like microformats and opensearch.

A Brand Refresh; A Whole New Look

The University of Bradford old website

The University’s external web site hasn’t really changed much in years. Having said that, in spite of not necessarily being the snappiest dresser on the block, it was always extremely easy to find what you were after. Back in early 2006 the marketing department were engaged in a ‘brand refresh’ which to you and me means fiddling with the logo and corporate colours. Also to be included in the spruce-up was the web site.

The University of Bradford website

For those of you who don’t know, my role at the University expanded to take in the web when one of my colleagues, who ran the web servers, left the organisation. I’ve always been passionate about web development (and I use that term advisedly) and I spent a fair amount of my time trying to expand the level of knowledge and appreciation of web standards, issues and technology throughout the university. It was because of this that I was asked if I could assist with the development of the new web site.

University internal page new design

The design for the site was done by the same agency responsible for the brand refresh. It is extremely striking, while still in keeping with trying to make the site as navigable as possible. A meeting was held between the designer, the University’s Web Officer, the Head of Marketing and myself. In that meeting we agreed that the University would build the site itself from the designs created by the agency. This would allow us to make sure that we met our legal obligations in terms of accessibility, and also ensure that there was knowledge and understanding within the organisation of how the site was built.

A series of laudable aims

It was agreed that the site should meet a series of requirements from a technical perspective:

  • It should be a fully fluid design – not a thin sliver down the middle of your monitor but able to flow and take up as much space as allowed.
  • It should work in all modern browsers, including mobile browsers such as Opera, and text-only browsers such as Lynx.
  • It should be as accessible as possible, using accepted best-practice for ensuring users of assistive technologies would be able to get the most out of the site.
  • It should attempt to include new technologies such as OpenSearch and Microformats if and where appropriate.
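
By way of illustration only – this isn’t the University’s actual stylesheet, and the selectors are invented – a fluid design of the kind described boils down to sizing things in percentages rather than pixels, so the layout flows with the browser window:

```css
/* Illustrative sketch of a fluid two-column layout */
#wrapper {
  width: 90%;        /* flows with the viewport rather than a fixed pixel width */
  margin: 0 auto;
}
#content {
  float: left;
  width: 70%;        /* columns sized as percentages of the wrapper */
}
#sidebar {
  float: right;
  width: 28%;        /* the remaining 2% acts as a gutter */
}
```

The hard part, as we’ll see, is making every browser agree on what those percentages mean.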

Assigning roles

There were a number of areas that required work to make the new web site a reality. It was agreed that I would build the external homepage and a template for the content pages. I would not deal with site structure or content – those would be managed by the Web Officer and the marketing team.

Starting Out

I started out with a series of visual comps given to me in PDF format. I began with the homepage and started to work out how to tackle taking the design and building the underlying HTML structure.

I’m a bit of a luddite at heart, so I printed all the comps out at A3, got some large sheets of tracing paper and traced my initial wireframe, labelling the parts as I went.

Once I’d got a basic structure I then made some scribbled notes about how certain elements should function – using remote rollovers, for example.
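
A remote rollover, for anyone unfamiliar, changes an element elsewhere on the page when you hover over a link. Done in CSS alone, that generally means putting both pieces inside the same anchor and positioning one of them away from the link text. A minimal sketch (class names and content invented for illustration, not lifted from the actual site):

```html
<!-- Hovering the link reveals the 'remote' preview elsewhere on the page -->
<style>
  .promo { position: relative; }
  .promo .preview { display: none; }
  .promo:hover .preview {
    display: block;
    position: absolute;
    left: 200px;   /* the revealed element appears away from the link itself */
    top: 0;
  }
</style>
<a href="/courses/" class="promo">Courses
  <span class="preview">Find out about our courses</span>
</a>
```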

After that, I pulled the comps up in my bitmap editor (Corel PhotoPaint, if you care) and took some dimensions to inform the initial styling, and lifted the colour values from the design element to feed into the stylesheets.

Once I had my trusty paper notes to work from, I started to tackle the creation of the site. I code by hand – I hate GUI editors – so I did most of the work in HTML-Kit from Chami.com. I now tend to use Expression Web, although I dip into Dreamweaver occasionally, and I suspect I will use Visual Studio 2008 more as the projects I work on at Black Marble tend to involve ASP.Net coders as well.

In my next post I’ll run through how the homepage was built and what hurdles the web browsers threw into my path along the way!

Web development helpers: Redux

After posting yesterday about useful tools for development I stumbled across another little gem of a utility. IE7Pro is much more of a usability-enhancing tool but it has a wonderfully handy tool nestling within – Save Current Tab As Image. If you need to do grabs of pages for documentation or presentations and the page is more than a single screen in length this will transform your life – no more cropping and stitching!

IE7Pro also has a raft of features such as adblocking and mouse gestures, which I will admit to switching off immediately. However, its inline search (not quite Find As You Type, but pretty close) is jolly useful.

Get IE7Pro

Web development little helpers

As web development gets more and more complex having the right tools to help you figure out what’s going on is essential. I thought I’d do a quick post on the ones I find most useful. In no particular order, then, here they are.

  1. Virtual PC
    This one is a godsend, because as we all know, running multiple versions of Internet Explorer is hard. VPC, now available as a free download from Microsoft, allows me to run the numerous variants of IE our clients require me to test against.
    If you just want IE6, Microsoft have a handy downloadable pre-built VPC:
    Download Virtual PC
    Download the Internet Explorer Compatibility VPC Image
     
  2. Firebug for Firefox
    Now imitated for other browsers, Firebug is fantastic. A clear and straightforward way to identify the bugs in your pages or styles, it allows you to easily see which stylesheet rules are being applied and in what order, and to hack ’em on the fly as you test your fixes. Add to that the ability to mangle the page and debug JavaScript and we have a winner.
    Download Firebug
    Firebug running in Firefox
  3. Chris Pederick’s Developer Toolbar for Firefox
    Even though Firebug is great, I still use Chris Pederick’s trusty developer toolbar for enabling and disabling styles, accessing the W3C validator and other stuff. Couldn’t live without it, in fact.
    Get Developer Toolbar
  4. Nikhil Kothari’s Web Development Helper for IE
    Broadly offering the same level of information as Firebug, but without the ability to hack on the fly, this is a handy way of seeing what IE is doing with your page under the hood.
    Get Web Development Helper
    Webhelper in IE
    The Webhelper DOM Inspector
  5. Inspector for Safari (for Windows)
    I have a trusty Mac Mini that I use for checking Safari as well, but the advent of Safari for Windows has made my life easier, I must admit. How excited was I, then, to find that you can get the Inspector working with the Windows version. Again, loads of info about the page, although no hacking on the fly. Instructions courtesy of David Barkol’s blog. A note – as I write this the latest nightly crashes horribly, so I am using the nightly from the 21st of June, which works well. At some point I will try later builds, but right now a stable platform that I can enable easily and consistently is more important.
    Enable Web Inspector for Safari on Windows
    Inspector in Safari for Windows
    The Inspector Information Window

I’d love to hear from anybody who uses other cool tools that I may not have come across. I’m particularly interested in these kinds of things for Opera.