What crowdsourcing can learn from Who Wants to be a Millionaire

I’ve been reading a new book called Smart Swarm recently. It’s one of those Tipping Point / Wikinomics-type pop social science books that doles out huge hunks of insightful wisdom. In Smart Swarm’s case, the book looks at how humans can learn from the collectively intelligent behaviour of animals – in particular, from the swarm behaviour of animals like bees and termites: i.e. how they manage to self-organise in such huge numbers.

The book’s full of great nuggets of awesomeness, but one bit I found particularly interesting was the discussion around harnessing the wisdom of crowds – what a lot of people call crowdsourcing. I particularly liked the author’s simple mathematical demonstration of how the crowd is more intelligent than the individual. The point the book makes is that even if the majority of people in a crowd don’t know the answer to a question, between them they’ll come up with the correct answer.

The book uses Who Wants to be a Millionaire as an example of why you should always listen to the crowd – pointing out that the audience is correct 91% of the time, and the experts only 65%. He goes on to demonstrate the power of the crowd through a simple mathematical model as follows:

A contestant on Millionaire is asked the following question, with the 4 options for answers:

Which person from the following list was not a member of the Monkees?

(a) Peter Tork
(b) Davy Jones
(c) Roger Noll
(d) Michael Nesmith

Not knowing the answer, they “ask the audience”. Assuming an audience of 100, the following is a rough analysis of how the crowd manages to select the correct answer (which is “c”, Roger Noll, by the way!):


7 people are Monkees fans and know Roger Noll was not a member of the Monkees (he’s an economist at Stanford)
10 people recognise 2 of the names on the list as being in the Monkees, leaving Noll and one other. Assuming they pick randomly between the two, this gives Noll another 5 votes.
15 people recognise only 1 of the names, leaving Noll and two others. Picking randomly between the three gives Noll another 5 votes.
The final 68 people have no clue at all, so pick randomly, splitting their votes evenly across the four names and giving Noll another 17 votes.

Adding up all these votes gives Noll 34 in total, with the guesses spreading the remaining 66 votes roughly evenly – around 22 each – across the other three names.
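The tally above can be sketched in a few lines of code – a minimal sketch of the book’s model, assuming (as the book does) that each group splits its votes evenly among the options it can’t rule out:

```python
# Expected "Ask the Audience" votes for Noll under the book's simple model:
# each group of audience members splits its votes evenly across however
# many options it cannot rule out.
from fractions import Fraction

AUDIENCE = 100

# (group size, number of options still in play for that group)
groups = [
    (7, 1),   # Monkees fans: know Noll is the answer
    (10, 2),  # can rule out two names, leaving Noll and one other
    (15, 3),  # can rule out one name, leaving Noll and two others
    (68, 4),  # no clue at all: guess among all four
]

votes_for_noll = sum(Fraction(size, remaining) for size, remaining in groups)
print(votes_for_noll)             # 34 votes for Noll
print(AUDIENCE - votes_for_noll)  # 66 votes split across the other three (~22 each)
```

The key point the model illustrates is that the guessers’ votes cancel out across the wrong answers, while every scrap of partial knowledge pushes votes towards the right one.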

This means that even though 93 per cent of the audience didn’t know the answer and were essentially guessing, the audience as a whole still picks the right answer.

The book then goes on to point out that this idea of a group being able to solve a problem even though its individual members don’t know the answer isn’t at all new – it’s something Aristotle pointed out in his Politics.

Put nicely, the trick behind crowdsourcing answers to problems is this: “there is no mystery here. Mistakes cancel one another out, and correct answers, like cream, rise to the surface.”

So the question has to be: if contestants on game shows playing for £1 million choose to “ask the audience”, why don’t businesses and governments “ask the crowd” more often?


One Response to What crowdsourcing can learn from Who Wants to be a Millionaire

  1. Dan says:

    So, the question has to be: if contestants on game shows playing for £1 million choose to “ask the audience” why don’t business and governments “ask the crowd” more often?

    Because game show questions like that seldom come up in political issues. Like guessing the size of a cow – a large crowd will generally average to near as damn it the right answer; but how to achieve peace in the Middle East is a tougher one to crowd source in this way (although no one told Saatchi and Saatchi this – http://www.theimpossiblebrief.com/). Haven’t the Govt also tried to ‘ask the audience’ about how to save money (and the UK economy in the process) – http://www.hm-treasury.gov.uk/spend_index.htm – but again this isn’t going to yield a ‘correct answer’ in the way the Ask the Audience or weigh a cow examples appear to.

    These more sophisticated forms of crowd-sourcing require heavy filtering and judgement – not the mere determination of a mean. This isn’t to say it’s not right to consult, just to recognise the difference between the much vaunted ‘Wisdom of the Crowd’ and the wisdom in the crowd.

    /pompousness
