But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

Welcome to the past of software development

I was at an interesting meeting at my local BCS branch tonight: ‘Opening The Black Box: An Introduction to Quality Driven Development’ by Tim Hunter. I had heard of TDD, DDD et al., but QDD was new to me.

What we got was an hour framed by the basic premise that ‘Waterfall is good - Agile is bad’ (or ‘progressive methods’, as the speaker called anything that was not waterfall). As another attendee pointed out in the Q&A, this tone tended to cloud the presentation’s more balanced points, and got the backs up of a good few attendees given the speaker’s seeming lack of understanding of good agile practices. He seemed to see agile as developers messing around with no documentation, testing or general engineering discipline. He argued that without waterfall, and specifically quality gates, we could not write quality systems. This is not the Agile I know.

Agile, if adopted properly, is very constraining from an engineering point of view. We have detailed specification by example, open reporting practices, regular re-estimation of remaining work, test driven development, pair programming, automated builds, and regular potentially shippable products with quality gates to move products between states of publication, so we don’t just release everything we build. The list goes on and on; OK, no team is going to use it all, but the tools are there in the toolbox. A team can choose where on the agility spectrum they sit.

I agree with the session’s premise that quality gates are important, but not that waterfall is the only way to enforce them. You can put the whole methodology choice aside and frame the discussion as: how do we get staff who take pride in their work and are empowered, via their working environment, to produce quality products? I would argue there is more hope for this in an agile framework where the whole team buys into the ethos of software craftsmanship, as opposed to any methodology where an onerous procedure is imposed; a system must be habitable, as Alistair Cockburn puts it.

I felt the session was too pessimistic about the quality of people in our industry. The speaker wanted to make rules because he perceived people to be of low quality, needing to be forced to do a halfway decent job. OK, I am a bit pessimistic myself, not too bad a trait for a developer or tester, but we have to hope for more, to strive for more. This is something I think the agile community does do: they are trying to write better software and become better craftsmen every day. They care.

For me the key question is how we can bring more people along with us, especially the people who have given up and just turn up to do their IT-related job while avoiding as much hassle as possible. They are the ones who don’t turn up to the BCS, community conferences or any user groups, or even read a book or blog on the subject. What can we do for them?

The setup story for TFS 2010

I have been looking at the various install and upgrade stories for the TFS 2010 Beta. I have to say they are very nice compared to the older TFS versions. You now have a SharePoint-like model where you install the product, then use a separate configuration tool to upgrade or set up the features required. There are plenty of places to verify your settings as you go along, greatly reducing the potential for mistakes.

One side effect of this model is that it is vital to get all your prerequisites in place. The lack of these has been the cause of the only upgrade scenario I have tried that has failed. This was on a VPC I used for TFS 2008 demos. This VPC used a differencing VHD in the older 2004 format, which had a 16GB limit, and the disk was virtually full. To upgrade to TFS 2010 I needed to upgrade SQL to 2008, and this in turn needed Visual Studio 2008 patched to SP1, which needed over 6GB of free space; that was never going to happen on that VHD. So my upgrade failed, but that said, this is not a realistic scenario: who has servers with just 16GB of disk these days!

Every time I have to use Typemock I need to ask: does my code stink?

OK, a bit sweeping, but I think there is truth in this: if you have to resort to a mocking framework (such as Typemock, the one I use) I think it is vital to ask ‘why am I using this tool?’ I think there are three possible answers:

  1. I have to mock some black box that is so huge and messy that if I don’t mock it, any isolated testing is impossible, e.g. SharePoint
  2. I have to mock a complex object. I could write it all by hand, but it is quicker to use an auto-mocking framework. Why do loads of typing when a tool can generate it for me? (The same argument as for why refactoring tools are good: they are faster than me typing and make fewer mistakes.)
  3. My own code is badly designed and the only way to test it is to use a mocking framework to swap out functional units via ‘magic’ at runtime.

If the bit of code I am testing falls into either of the first two categories it is OK, but if it is in the third I know I must seriously consider some refactoring. OK, this is not always possible for technical or budgetary reasons, but I should at least consider it. Actually you could consider category 1 as a special case of category 3: a more testable design may be possible, but it is out of your control.

So given this I looked at the new Typemock feature with interest: the ability to fake out DateTime.Now. This is something you have not been able to do in the past due to the DateTime class’s deep location in the .NET framework. OK, it is a really cool feature, but that is certainly not a good enough reason to use it. I have to ask: if I need to mock this call out, does my code stink?

Historically I would have defined an interface for a date service and used it to pass in a test or production implementation using dependency injection, e.g.

public interface IDateProvider
{
    DateTime GetCurrentDate();
}

public class MyApplication
{
    public MyApplication(IDateProvider dateProvider)
    {
        // so we use
        DateTime date1 = dateProvider.GetCurrentDate();
        // as opposed to
        DateTime date2 = DateTime.Now;
    }
}
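
To make this concrete, here is a minimal sketch of the production side of that interface; SystemDateProvider is a name I have invented for illustration, not anything from the .NET framework:

// Hypothetical production implementation that simply wraps DateTime.Now
public class SystemDateProvider : IDateProvider
{
    public DateTime GetCurrentDate()
    {
        return DateTime.Now;
    }
}

// Production code wires in the real provider
MyApplication app = new MyApplication(new SystemDateProvider());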

So in the new world with the new Typemock feature I have three options:

  1. Just call DateTime.Now in my code, because now I know I can use Typemock to intercept the call and return the value I want for test purposes
  2. Write my own date provider and use dependency injection to swap in different versions (or, if I want to be really flexible, use an IoC framework like Castle Windsor) – see the sketch after this list
  3. Write my own date provider class with a static GetDate method, but not use dependency injection; just call the method directly wherever I would have called DateTime.Now and use Typemock to intercept calls to this static method in tests (the old way to get round the limitation that Typemock cannot mock classes from MSCORLIB)
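
For option 2, the test-side implementation need only be a few lines of hand-written code; FixedDateProvider is again a hypothetical name for illustration:

// Hand-written test double returning a fixed, known date
public class FixedDateProvider : IDateProvider
{
    private readonly DateTime fixedDate;

    public FixedDateProvider(DateTime fixedDate)
    {
        this.fixedDate = fixedDate;
    }

    public DateTime GetCurrentDate()
    {
        return this.fixedDate;
    }
}

// A test can then pin ‘now’ to a known value
MyApplication app = new MyApplication(new FixedDateProvider(new DateTime(2009, 1, 1)));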

I think this brings me back full circle to my first question: does the fact I use the new feature of Typemock to mock out DateTime.Now mean my code stinks? Well, after a bit of thought, I think it does. I would always favour putting in some design patterns to aid testing, so in this case some dependency injection would appear the best option. Like all services, it would allow me to centralise all date functions in one place, a good SOA pattern. With all my date services in one place I can make a sensible choice about how I want to mock them out, manually or via an auto-mocking framework.

So in summary, in mocking, as in so many things in life, just because you can do something is no reason why you should. If you can, it is better to address a code smell with good design as opposed to a clever tool.

A day at the Architect Insight Conference

I was at the Architect Insight Conference yesterday, so the big question is: do I better understand the role of the architect in the development process? I have to say no. Don’t get me wrong, the event was interesting; I especially enjoyed the interactive group discussion sessions, one of which I chaired, if that is the right term. As I have said about other conferences, I tend to find I get more from the discussions with other delegates than from the more traditional presentation sessions.

For me the role of the architect is very fluid. There are many different ways to run a project and a company. Some define a role for the architect, usually those with more formal structures; for others the role is actually an emergent virtual role that the team as a whole performs, usually as part of an agile planning process. There is no single silver bullet solution for all project types; recognising this is probably the big insight of the conference.

Given this, why does it seem that people aspire to being an architect? What do they think the role entails that makes it appeal so much?

A very noticeable comment in our interactive session was that recent computer science graduates did not seem to have done much programming as part of their courses. They all seemed to be focused on the business/analysis aspects of the industry. Is this driving people to the perceived glamour/rock star role of architect? More than one delegate went as far as to say they were now looking at A Level students to fill junior developer roles. Graduates were either not interested or lacked the skills companies would expect after completing a degree course; it was easier to train up suitable 18 year olds. In our industry a keen enquiring mind is more important than a degree, something that seems to be beaten out of many people at university.

This harks back to my formative years: I was a thin-sandwich course student, mixing 6 months at university followed by 6 months in industry (which I hasten to add I would not recommend; better a couple of years’ study and then a year out in industry). Many people I worked with were not student/graduate engineers but HND students, in my opinion an educational route sadly underused given the current government target of 50% of people going to university. HNDs aimed to turn out good technicians, people who knew the job of making and testing the product, but without the theoretical grounding a graduate would have. Very much a craftsmanship point of view, where staff are trained up within the team, not arriving from university as the finished article.

So the key takeaway from the conference for me? Software development is a people/communication process. It is key to get everyone involved in all stages of the process. Whatever else an architect is, they should not be a person in an ivory tower lobbing out huge specification tomes to the minions below.

Virtual PC on Windows 7

You can run Virtual PC 2007 on Windows 7, but Windows 7 also includes a new version of Virtual PC as part of the operating system, which is good.

The problem I have, as will many others, is that though my 64-bit Acer 8210’s Intel processor has hardware virtualization support, Acer for some bizarre reason chose to disable it in the BIOS; though it was enabled in the 32-bit 8200 series, and is enabled in the later Travelmate equivalents. Acer are not alone in this choice. This means that many people with fairly recent PCs will not be able to run the newer version of Virtual PC.

If at all possible I think Microsoft need to provide support for host PCs with no hardware virtualization support, or that lack the option to enable it in the BIOS, as my Acer does. However I wonder, as hardware virtualisation is a prerequisite for Hyper-V, whether this new version of Virtual PC shares technology with Hyper-V, thus inheriting the same hardware requirements?

Holiday is when you catch up…..

I got round to listening to the latest Radio TFS podcast today whilst out for a run: Adopting Team System with Steve Borg. If you are looking at adopting TFS, or even just critically looking at your development life cycle with a view to improving it (irrespective of the tools you use), then this podcast is well worth the time to listen to. It actually covers a lot of the points I was discussing at the Agile Yorkshire user group this week in my session on Crystal Clear. By now I would usually have put my slide stack up for all to download, but in this case, as my session was in essence a book review, I would rather you read the original Crystal Clear by Alistair Cockburn.

In my opinion, the key point they both raise is that it is important to have a process that provides:

  • Safety – provides a framework that means the project can safely be delivered
  • Efficiency – development should proceed in an efficient manner
  • Habitable – that the team can live with the process (if they can’t the process will be avoided/subverted)

Or to put it another way (quoting here from the Crystal Clear book): “a little methodology does a lot of good, after that weight is costly”.

A point raised at the user group in the chat after my session was how to get senior people (such as the CEO, CFO etc.) to buy into the ‘new’ development process (a critical factor for success). Too often one hears “I don’t care if you are agile or not, I just want it delivered”, and no support is provided from the business beyond the actual coding team. A good discussion of this type of problem is in Gojko Adzic’s book Bridging the Communication Gap: Specification by Example and Agile Acceptance Testing. It is written for non software developers and discusses how to make sure that the whole business is involved in the development process, thus enabling the project to deliver what the business really needs, not what people think they need. I would say this book is essential for anyone involved in the software specification process – and that should be everyone on an agile project!

Intent is the key - thoughts on the way home from Software Craftsmanship 2009

Today has been interesting. I have been to conferences where you sit and listen, such as DDD, TechEd etc., and I have been to conferences where everyone is encouraged to talk, open spaces style, such as Alt.Net; today has fallen between the two.

The Software Craftsmanship 2009 conference has been more of a workshop style; most sessions started with a short presentation to set the scene, then the attendees split to form small groups to do some exercise or chat, reporting back later in the session. A sort of led open spaces feel, if you want.

As usual with events you need to let what you heard sink in, but I think it will be useful; not so much in the ‘I must do X to fix project Y’ sense, but in the general approach to development issues. This was a conference on craftsmanship, best practice in general, not magic bullets. A good example was the session on responsibility-driven design with mock objects, where a good deal of time was spent discussing the importance of variable/object names in the design. From this session you should not take away that ‘View’ is a bad name and ‘Display’ is a good one, but that the choice of name is important to how you will view the intent of the test and the code you are writing.
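
As a purely invented illustration of that point (this example, its names and the NUnit usage are mine, not from the session), compare how much intent a well-chosen test name can carry:

using NUnit.Framework;

[TestFixture]
public class DiscountTests
{
    // A name like ‘TestDiscount1’ tells a maintainer nothing;
    // this name states the business rule the test protects
    [Test]
    public void Orders_over_one_hundred_pounds_receive_a_ten_percent_discount()
    {
        decimal discountedTotal = ApplyDiscount(150m);

        Assert.AreEqual(135m, discountedTotal);
    }

    // Hypothetical system under test, inlined to keep the sketch self-contained
    private static decimal ApplyDiscount(decimal orderTotal)
    {
        return orderTotal > 100m ? orderTotal * 0.9m : orderTotal;
    }
}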

I suppose this was the theme for the day: in development, intent is key; why you do something is more critical than how. It is only through a clear understanding of the intent of the business users that a developer can hope to design the best system. So often what the client asks for is based on what they think can be done, and unless this requirement is challenged to get at the underlying intent, the best solution (whatever best means to the project) will be missed. The same holds true with writing tests: it is vital that the test conveys the intent of what is being tested, else there is little hope for any future maintenance work when all the original staff have moved on. This means, to me, that the most important part of the user story is the ‘so that they can’ clause at the end; it is so often the window onto the real intent behind the story.

So an excellent day all round. Thanks to Jason Gorman and everyone else who helped to organise the event. I look forward to next year’s, and so should you if you are interested in your craft....

First thoughts on Windows 7 Beta

I had the PDC CTP on my Netbook and that was OK, so I had not expected any major issues. That said, it has not been without problems, but all the issues I have logged as part of the beta program have been related to hardware detection (missing base stations and ignored physical WiFi switch state) on my Acer laptop. However, these issues can be worked around (i.e. don’t use sleep or hibernate), so they have not stopped me using the beta on my primary PC.

As to using Windows 7, I like it. I find the revised UI easy to use, and it certainly seems faster than my Vista build on the same PC, but this might just be down to the fact it is a fresh install on a formatted disk.

I will report more when I have used it for a few days in the real world.

Steve Ballmer’s MVP Live Search Challenge

At the last MVP Summit Steve Ballmer said “I’m going to ask you one week switch your default [search engine], one week. At the end of the week…I’ll want feedback, how was your week, what happened, what did you like, what didn’t you like … Can I make that deal with you? (Cheers and applause.) That’s the deal.”

Well the week was last week, and how did I find Live Search?

I have to say it is vastly improved, in the past I just assumed Live Search would find nothing of use, especially if I was after something I would expect to find on a Microsoft site like TechNet.

This week I have found that though it does not return exactly the same results as Google, it is just as useful; in fact the two are fairly complementary. For most searches it does not now seem to matter which one I use, but when really digging, one might turn up something the other does not.

So am I going to move back to Google? Well, I am just not sure it matters for day-to-day searching. I certainly don’t now feel the need to change my default search engine to Google immediately when I set up a PC, as I used to.
