However, the interconnected world allows us all to read / see what the book's about. Here's a fine blog post of his summarising the talk. For those more visually/aurally oriented, here's the talk itself.
Answer to the question above: Clay says that each year, the people in the USA spend 2,000 times more time watching TV than has been needed, so far, to construct Wikipedia. Except that he says it better. Read the article.
Let's work these numbers a tad to get a feel for their reasonableness. If there are 500,000,000 people in the US, then the work done on Wikipedia so far is equivalent to the TV-watching habits of 250,000 people. That's round about the population of Boise (Idaho) or Daytona Beach. For those of us closer to GMT, think Belfast. If the average citizen spends 10% of their waking time watching TV (the NYT says it's over 4 hours, which is closer to 30%, but let's make life simple and not cut out TV entirely - I assume that many have the TV on while eating, talking, making macramé wallhangings), then it would take the full waking time of 25,000 people for around a year. English-only Wikipedia has 2.5 million articles, so that's around one article every three or four days per person - I'd have said one a day or fewer, so we would seem to be around the right kind of numbers.
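For the sceptical, the whole chain of guesses above fits in a few lines of Python (every figure is a round number from the text, not measured data):

```python
# Back-of-the-envelope check of the Wikipedia-vs-TV numbers above.
# All inputs are the rough round figures from the text, not census data.

us_population = 500_000_000        # generous round figure
tv_to_wikipedia_ratio = 2_000      # Clay's claim: yearly US TV time vs Wikipedia effort
tv_fraction_of_waking = 0.10       # assume 10% of waking time spent on TV

# Wikipedia so far = one year of TV-watching for this many people...
tv_equivalent_people = us_population / tv_to_wikipedia_ratio

# ...or one year of *all* waking time for this many people.
full_time_people = tv_equivalent_people * tv_fraction_of_waking

articles = 2_500_000               # English-only Wikipedia, around the time of writing
articles_per_person_per_day = articles / (full_time_people * 365)

print(int(tv_equivalent_people))               # 250000 - roughly Boise
print(int(full_time_people))                   # 25000
print(round(articles_per_person_per_day, 2))   # 0.27 - one every few days
```

Change any of the assumed inputs and the conclusion shifts proportionally, which is rather the point of doing it this way.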
Of course, this may be a circular analysis if Clay started with 2.5 million articles at around 4 hours apiece to write.
A question for you: Do you know anyone who actually understands the numbers, i.e. can apply them in their everyday lives? Who understands fundamentally how large numbers of people and small commitments/risks/expenditures actually add up? It doesn't exactly come up in conversation much, but I don't think I know anyone who has a clear feel for this. Perhaps it's my generation. Then again, I know plenty of people who have a fine handle on atomic measurements and cosmological time - perhaps I should get to know some social scientists.
I'm aware that, as humans, we're bad at things outside our direct experience - and have to either put things into scales we can understand, or manipulate the numbers directly. My problem is, perhaps, that we're bad at viscerally understanding how large (small) a thousand (th), a million (th), or a billion (th) actually is - and so when we combine a tiny with a huge we're bad at understanding what that means. In our actual lives, being out by a factor of two is plenty - but when dealing with things beyond our ken, it's much harder to spot. Both the following are out, by a bit - but by how much? 1) A trillion (US) dollars spent on the Iraq war. 2) One in ten million chance of winning the (UK) lottery. To get light-headed, what if the war spend had been aimed at the lottery? Fifty thousand winners, you say, currency conversion being on your mind. Let's say you'd only want one winning ticket a week, if you could help it - that's a winning lottery ticket every week all the way from the battle of Hastings to sometime around my unborn children's late middle age.
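The light-headed lottery sum works out as follows (the exchange rate and the odds are the deliberately rounded figures from the text, not the true values - both are "out, by a bit"):

```python
# The war-spend-on-lottery-tickets sum, using the text's rounded figures.

war_spend_usd = 1_000_000_000_000   # a trillion US dollars
usd_per_gbp = 2.0                   # rough exchange rate of the period
ticket_price_gbp = 1.0
jackpot_odds = 10_000_000           # the rounded one-in-ten-million figure above

tickets = war_spend_usd / usd_per_gbp / ticket_price_gbp   # 500 billion tickets
winners = tickets / jackpot_odds
years_of_weekly_wins = winners / 52

print(int(winners))                  # 50000
print(round(years_of_weekly_wins))   # 962 - Hastings to roughly now, one win a week
```

Being "out by a bit" on either input moves the answer by centuries, which is exactly the kind of error the paragraph above says we can't feel.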
Hmm. Let's get back on theme. For the testers among you, another question: How are we sizing our beta tests? At what point might a beta test outweigh a local test team? At what point might a beta test reasonably be expected to have found problems that could surface in the first month of general use?
Justified finger-in-the-air estimates preferred to unsubstantiated formulae.
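In that spirit, here's my own finger in the air for the first question - every figure below is an illustrative guess, not a recommendation, and the discount factor in particular is pure hand-waving:

```python
# Crude finger-in-the-air: how many beta users roughly match a small test
# team's weekly hands-on hours? All numbers are illustrative guesses.

testers = 4
tester_hours_per_week = 30        # assume focused, hands-on testing time
beta_user_minutes_per_day = 15    # assume casual daily use
beta_overlap_discount = 0.2       # guess: only ~20% of beta use covers new ground

team_hours = testers * tester_hours_per_week
beta_hours_per_user = beta_user_minutes_per_day / 60 * 7 * beta_overlap_discount

users_to_match_team = team_hours / beta_hours_per_user
print(round(users_to_match_team))   # a few hundred beta users for parity, on these guesses
```

On these guesses it takes a few hundred casual beta users to match one small team's week - though beta users exercise different paths than testers do, so "match" is doing a lot of work in that sentence.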