Wednesday, August 15, 2012

Dog Show Quality

tl;dr – I'm ranting about arbitrary judgement

I've been enjoying the occasional horse ballet this Olympics. Back in my dim and distant, I learnt to distinguish some of the more obvious of the tiny communications and balances necessary to keep half a tonne of twitchy muscle in shape. At this remove, and especially when watching experts through my lo-res crystal bucket, I now see very little, but one can only admire the quality of the riding, the riders, and the ridden.

Dressage is exacting, it gives joy, and it's an extraordinary skill gained through dedication and talent. Once upon a time there might have been a working reason to get your horse to trot in goosestep, but in the arena those practicalities are subverted into competition. Criteria are set out, judges judge, competitors are measured and ranked. Everyone's an adult, everyone's there by choice, the horses love it – so what's the harm?

None at all, at a guess.

It's not quite the same in all competitions. For instance, competition requirements which set out strict parameters for dog breeds have led to a variety of unpleasant canine complaints as certain breeds of dog have become caricatures of their ideals, or as unmeasured and unexpected emergent properties have popped up (or out, as the case may be). Preset breed standards make it relatively straightforward to judge objectively – but the social act of judging the quality of a dog against those criteria can drive the breed as a whole, indeed the community of breeders as a whole, into unexpected and unwanted unpleasantness. But hey, if that's your bag, go to it.


My interest here is in software and systems and the people who make them. I'm using the situations above as a leaping-off point for an extended metaphor concerning software development and development processes. Be forewarned: I'm not about to respond to commentary from lovers of dressage or of dogs.


Occasionally, I'll be on site, or in a lecture, or talking with a colleague, or reading a paper, and the words "Dog Show Quality"* will pop into my head.

Dog Show Quality is where "Quality" is its own goal. A measured characteristic, where the measurement is made against an arbitrary set of criteria which don't have any necessary relevance outside the narrow limits the criteria themselves describe. You can measure it objectively, and you can use it as a goal for incremental change – but that goal may worsen your situation over time.

Something that is judged to have good Dog Show Quality is not necessarily rubbish. Indeed, it may in itself be best of breed. It's the measurement that is rubbish, and in being rubbish, leads to rubbish. It's not a bad thing to say "Every line of this software has a test", but "Every line of all our software must have a test" is no guarantee of goodness, and defining "Quality" as "Every line has a test" is just a small and noxious step away.
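To make the coverage point concrete, here's a minimal, hypothetical Python sketch (the function and test are invented for illustration): a test can execute every line of the code under test – 100% line coverage, "Quality" achieved by that measure – while verifying almost nothing about whether the code is actually right.

```python
def apply_discount(price, rate):
    """Apply a fractional discount to a price."""
    if rate < 0 or rate > 1:
        raise ValueError("rate must be between 0 and 1")
    return price * (1 - rate)

def test_apply_discount():
    # Executes every line of apply_discount, including the error
    # branch -- so a line-coverage tool reports 100%.
    try:
        apply_discount(100, 2)      # hits the validation branch
    except ValueError:
        pass
    result = apply_discount(100, 0.2)
    # The only assertion is so weak it would pass even if the
    # arithmetic were completely wrong. Every line "has a test";
    # correctness remains untested.
    assert result is not None

test_apply_discount()
```

The metric ("every line is executed by a test") is objectively measurable and trivially gamed, which is exactly the character of a Dog Show criterion.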

I tend to think Dog Show Quality most often when confronted with ossified software development process. Where signoff matters more than assessment, where knowing where the documents are is more important than what's in them, where spelling is more important than sense. When I talk to the Divisional Champion of Quality Assurance, and they care far more about adherence to process than whether their rotten product creaks, crashes, destroys data and chews the faces off their users' children, I'll think Dog Show Quality.

Dog Show Quality is mostly pointless and sometimes harmful. Making working systems is an exacting and skilled pleasure, but it's also done for money and directly affects real people. If, in the pursuit of quality, your judgement relies on measurements and you continue to measure the relatively unimportant, you're indulging yourself. Get over it, or get a green blazer with brass buttons and hand out rosettes. Your choice.



* There's nothing original under the sun, and this phrase has popped into my head unbidden for years. You may have read or written something similar. If you think I've simply written a cover version of your work without attribution, then I'm really sorry. I've searched, and I've not found your stuff. Please let me know, and if I've stood upon your shoulders to write this, I'll make a clear acknowledgement and link to your ideas.

3 comments:

  1. Well done! Excellent article!

    Now whenever I hear someone mindlessly spout off on "Best Practices", I'll inevitably translate that in my head to "Best of Show" and think of this article.

  2. I think this is a great post, but haven't been able to comment before now (since I feel I need to add something).

    You have definitely not done a cover version, but Michael Bolton uses "dog and pony show" to describe acceptance testing done as a ceremony: http://www.developsense.com/presentations/2007-10-PNSQC-UserAcceptanceTesting.pdf

    Thought you wanted to know,
    Rikard

    1. Rikard - thank you! And an apology; I think you made this comment weeks ago, and I've only just approved it to go here. In my defence, I've been busy*...

      I'll quote the appropriate passage from Michael's paper: "This brings us to an observation about expertise that might be surprising: for this kind of dog and pony show, the expert tester shows his expertise by never finding a bug. For example, when the Queen inspects the troops, does anyone expect her to perform an actual inspection? Does she behave like a drill sergeant, checking for errant facial hairs? Does she ask a soldier to disassemble his gun so that she can look down the barrel of it? In this circumstance, the inspection is ceremonial. It’s not a fact-finding mission; it’s a stroll. We might call that kind of inspection a formality, or pro forma, or ceremonial, or perfunctory, or ritual; the point is that it’s not an investigation at all."

      For those interested in the idiom, Wikipedia http://en.wikipedia.org/wiki/Dog_and_pony_show for once has a more useful entry than Wiktionary http://en.wiktionary.org/wiki/dog_and_pony_show .

      * my first child was born a few weeks ago...
