Strategy mapping avoids the peril of evidence overload

Evidence overload? Can too much evidence be a bad thing?

Well, it turns out that too much evidence can indeed be a bad thing. Thanks go to David Perell, whose newsletter linked to a post by Matt Mullenweg, which featured a long quote from Adam Robinson published in Tim Ferriss’s book Tribe of Mentors, describing an experiment undertaken by Paul Slovic in the 1970s. Don’t you just love how these connections pop into your life?

Here’s the experiment.

Eight professional horse handicappers were recruited to predict which horses would win races when given different amounts of information on each horse: 5, 10, 20 or 40 facts drawn from past performance charts. All of the races had 10 horses competing, so the probability of picking a winner by random selection was 0.1. The handicappers picked winners with a probability of 0.17, a 70% improvement over random selection. Their accuracy didn’t change as more or less information on past performance became available. What did change, however, was the confidence they placed in their predictions, as shown in Paul Slovic’s original graph (p24 of this pdf), copied below.

With more evidence, people get over-confident about the accuracy of their predictions!
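To put a rough number on that gap, here is a minimal sketch in Python, purely illustrative, which takes the hit rates quoted above as assumed inputs (0.10 for random selection, 0.17 for the handicappers), simulates the two picking strategies, and works out the roughly 70% relative improvement. It doesn’t reproduce Slovic’s experiment, only the arithmetic behind the comparison.

```python
import random

RACES = 10_000          # number of simulated 10-horse races
RANDOM_HIT_RATE = 0.10  # chance of picking the winner at random (1 in 10)
EXPERT_HIT_RATE = 0.17  # hit rate reported for the handicappers

def simulate(hit_rate: float, races: int = RACES) -> float:
    """Simulate `races` independent picks, each succeeding with probability `hit_rate`."""
    wins = sum(random.random() < hit_rate for _ in range(races))
    return wins / races

random_accuracy = simulate(RANDOM_HIT_RATE)
expert_accuracy = simulate(EXPERT_HIT_RATE)

# Relative improvement of the experts over random selection (~70%)
improvement = (EXPERT_HIT_RATE - RANDOM_HIT_RATE) / RANDOM_HIT_RATE

print(f"Simulated random accuracy: {random_accuracy:.3f}")
print(f"Simulated expert accuracy: {expert_accuracy:.3f}")
print(f"Relative improvement: {improvement:.0%}")  # -> 70%
```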

What can we do about it?  One possible answer is to diminish the ‘illusion of explanatory depth’.  Rozenblit and Keil first demonstrated this illusion in a series of experiments. Subjects were asked to rate how well they understood everyday objects, such as a zipper, a flush toilet or a sewing machine. On a 7-point scale, the average self-rated level of understanding was about 4. When asked to explain the object and how it worked, their self-assessment fell to just above 3, and when given a diagnostic question about the device, it fell again to below 3. Here is the data from p9 of this pdf.

The illusion of explanatory depth is that we think we know more than we do. Since Rozenblit and Keil’s original work, this has been found to be a powerful and pervasive influence on the way we think.  Sloman and Fernbach (in their book The Knowledge Illusion) took understanding of this illusion a layer deeper. They presented hot political topics of the day (e.g. the US imposing unilateral sanctions against Iran) and asked people to judge how well they understood the topic. Participants were then asked: “Please describe all the details you know about the impact of instituting unilateral sanctions against Iran, going from the first step to the last step and providing causal connections between the steps”. Again, they showed the illusion of explanatory depth: they thought they knew more about the issue until asked to explain it, at which point they realised their understanding was less deep than they’d previously thought. Sloman and Fernbach then asked another question: how strongly were they for or against the issue? The more people realised they didn’t really understand the issue, the more their strength of opinion moderated. This effect applied equally to both sides of the political spectrum: people with strongly held views both for and against sanctions felt less strongly once their illusion of explanatory depth had been revealed to them.

And how do we help people gain a better understanding of the mechanics underpinning a device, a political issue or even a strategic issue in a business setting? With strategy mapping, of course!  It helps them avoid over-confidence in their decisions and judgements, and helps them moderate extreme positions for which they don’t have the necessary justification.
