James Surowiecki’s 2004 book The Wisdom of Crowds offers a somewhat contrarian view of what is typically seen as a widespread bias: the human tendency to follow the crowd and go along with what the majority says is true. Surowiecki argues that in many cases, this is actually a reasonable thing to do, as crowds and groups are often much smarter and more accurate than even their smartest members.
Of course there are many well-known circumstances in which crowds have done poorly, and much of Surowiecki’s book attempts to tease out the various factors that affect the accuracy of the group consensus. But he begins by setting out examples which demonstrate his thesis. Many experiments were performed over the course of the 20th century in which individuals and groups were asked to make various estimations and predictions, and in which the averaged prediction turned out to be highly accurate, often more accurate than even the best individual prediction in the group. Classic examples include traditional "how many beans in the jar" guessing contests. On these kinds of straightforward factual questions there is ample evidence that groups can perform very well.
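To make the effect concrete, here is a minimal simulation of a bean-jar contest (my own illustration, not one of the cited experiments). The true count, noise level, and group size are arbitrary assumptions, chosen so that individual guesses are noisy but roughly unbiased:

```python
import random

# A sketch of a "beans in the jar" contest: each guess is the true count plus
# independent, roughly unbiased noise -- the setting where averaging shines.
# TRUE_COUNT, NOISE_SD, and NUM_GUESSERS are illustrative assumptions.

TRUE_COUNT = 850
NUM_GUESSERS = 100
NOISE_SD = 200            # individual guesses often miss by a couple hundred

random.seed(42)
guesses = [random.gauss(TRUE_COUNT, NOISE_SD) for _ in range(NUM_GUESSERS)]

crowd_average = sum(guesses) / len(guesses)
crowd_error = abs(crowd_average - TRUE_COUNT)
individual_errors = [abs(g - TRUE_COUNT) for g in guesses]
beat_the_crowd = sum(1 for e in individual_errors if e < crowd_error)

print(f"True count:    {TRUE_COUNT}")
print(f"Crowd average: {crowd_average:.0f}  (error {crowd_error:.0f})")
print(f"Guessers individually closer than the average: {beat_the_crowd} of {NUM_GUESSERS}")
```

Under these assumptions only a small minority of guessers typically come closer to the truth than the group average does, since the independent errors largely cancel out.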
(I should note that Surowiecki’s book is broader than it is deep, and presents much of its information anecdotally. It wasn’t until I found the Notes at the back that I was able to see how well founded many of his claims are. In fact I might almost recommend reading just the 20-page Notes and only dipping into the main text if you want to enjoy the story that he builds around the results.)
Surowiecki also cites prediction markets, including sports betting as well as the newer election markets, where the consensus odds have proven to be highly accurate. Google’s PageRank is also invoked as an example of successfully exploiting the wisdom of crowds.
But as that last example illustrates, success depends crucially on how the collective information is acquired and evaluated. PageRank is a highly artificial construction, and one could imagine many alternative algorithms that would not have worked as well. Infrastructure and institutions make the difference between a crowd which can bring its diverse knowledge to bear accurately, and one which only manages to exaggerate its own biases.
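For the curious, here is a sketch of the core PageRank idea as it is usually described: a page matters if pages that matter link to it, computed by repeatedly propagating scores over the link graph ("random surfer" model). The toy graph and the damping factor of 0.85 are illustrative assumptions, not Google's actual data or production algorithm:

```python
# Minimal PageRank by power iteration over a toy link graph.
# Not Google's production system; graph and damping factor are made up.

links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["A", "C"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:  # a page with no outlinks spreads its weight evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

The point is how much of the result hangs on these design choices: the damping factor, the handling of dangling pages, and the decision to count links as votes at all are exactly the kind of "infrastructure" that separates useful aggregation from bias amplification.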
This last process is described as an "information cascade," and it is one of the most common traps that crowds fall into. The problem is that recognizing the wisdom of crowds involves a paradox. The crowd can only be wise if the information and insights from all its members are incorporated. But if each person believes that the crowd is wiser than he is (as would typically be correct), then he will only echo back what he thinks is the crowd consensus, leading to "groupthink" and runaway feedback. This is one way of explaining well-known mob behavior such as investment bubbles: each person revises his beliefs about prices when he sees the crowd consensus, producing positive feedback and driving prices to unsustainable levels.
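One standard way to model this runaway (not necessarily Surowiecki's own example) is the sequential-decision cascade associated with Bikhchandani, Hirshleifer, and Welch: each agent sees a noisy private signal plus everyone's earlier choices, and once the public record leans two actions to one side, private signals stop mattering. The sketch below is my illustration; the signal accuracy, group size, and trial count are assumptions:

```python
import random

# A sketch of a sequential information cascade. "Buy" is the correct action,
# and each private signal points the right way 70% of the time, yet some
# groups still lock onto the wrong choice after a bad start.

SIGNAL_ACCURACY = 0.7
GROUP_SIZE = 30
TRIALS = 10_000

def run_group():
    """One group decides in sequence; True means 'buy', the correct call."""
    decisions = []
    for _ in range(GROUP_SIZE):
        signal_says_buy = random.random() < SIGNAL_ACCURACY
        lead = decisions.count(True) - decisions.count(False)
        if lead >= 2:            # public record swamps the private signal
            decision = True
        elif lead <= -2:
            decision = False
        else:                    # otherwise, follow your own signal
            decision = signal_says_buy
        decisions.append(decision)
    return decisions[-1]         # did the group end up on the right action?

random.seed(0)
correct = sum(run_group() for _ in range(TRIALS))
print(f"Groups ending on the correct action: {correct / TRIALS:.1%}")
print("The rest ended on the wrong action, typically locked in by an early wrong cascade.")
```

Each individual signal is fairly accurate, yet a noticeable fraction of groups go wrong, because once a cascade starts the later, mostly-correct signals never enter the public record.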
Surowiecki therefore advises that collective judgements are best made by first eliciting an estimate from each member before he has been exposed to the group consensus. In some circumstances it may then be appropriate to have group discussion or interaction, in order to try to reconcile conflicting views. But he also cites cases where these kinds of interactions have been shown to be harmful, pushing the group toward more extreme positions. In the end, aside from the clear evidence that some institutional arrangements work better than others, and the identification of a few common failure modes, it remains an open problem how best to benefit from collective judgement without falling prey to information cascades and other failures.
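As a rough illustration of why eliciting estimates independently matters (my own sketch, not an experiment from the book), the code below compares two hypothetical procedures: everyone guesses privately, versus everyone speaks in turn and partly anchors on the running consensus. The true value, noise level, anchoring weight, and group size are all assumptions:

```python
import random
import statistics

# Compare averaging independent estimates against averaging estimates made in
# sequence, where later speakers partly defer to the running consensus.

TRUE_VALUE = 100.0
NOISE_SD = 25.0
GROUP_SIZE = 50
ANCHOR_WEIGHT = 0.7      # how strongly later speakers defer to the running average
TRIALS = 2_000

def group_error(anchored):
    reports = []
    for _ in range(GROUP_SIZE):
        own_estimate = random.gauss(TRUE_VALUE, NOISE_SD)
        if anchored and reports:
            consensus = statistics.mean(reports)
            reports.append(ANCHOR_WEIGHT * consensus + (1 - ANCHOR_WEIGHT) * own_estimate)
        else:
            reports.append(own_estimate)
    return abs(statistics.mean(reports) - TRUE_VALUE)

random.seed(1)
independent = statistics.mean(group_error(anchored=False) for _ in range(TRIALS))
anchored = statistics.mean(group_error(anchored=True) for _ in range(TRIALS))
print(f"Mean error with independent estimates:      {independent:5.2f}")
print(f"Mean error when anchoring on the consensus: {anchored:5.2f}")
```

In the anchored version the earliest speakers carry disproportionate weight in the final average, so the group's answer is typically much farther from the truth even though every individual has private information of identical quality.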
It’s also clear that the public holds a number of false beliefs on specific issues, as discussed here earlier. Such errors appear to have a cause other than information cascades, although no doubt people are influenced by the fact that their self-selected peers share their opinions on these matters. It’s tempting to say that people are wrong about, say, the U.S. foreign aid budget because it doesn’t matter to them whether they are right or wrong, since they have effectively no influence over policy. However, in many of the classic laboratory experiments people are quite successful at making collective judgements about matters where being wrong would be equally unimportant.
When to believe the crowd, when to believe experts, how to cut through the maze of differing opinions and achieve a realistic approximation to the truth: these are all questions that I struggle with. I know many people who are smart, and whose solution is to try to become an "instant expert" on every issue, determining the truth by thinking for themselves and weighing all the arguments on each side. To me, this is a spectacularly unlikely route to success! The mere fact that the process produces widely differing answers among different practitioners is a very bad sign. Further, if you delve deeply enough into most subjects of controversy, you find that the arguments and evidence turn out to be subtle and complex, and the superficial gloss which is the best a layman can hope to achieve is likely to miss the real issues.
Rather, I hope to find rules of thumb by which an extremely resource-constrained truth-seeker can judge which side is correct on various factual questions. Following the mob may well be the best thing to do in many circumstances. Surowiecki’s book offers tantalizing clues about when this can work, but there are still many open questions.