News

The Perils of Perception

Session 1 Bobby Duffy - 2019

by Sam Leith @ OffGrid Sessions – Pic James Clarke

Bobby Duffy — professor of public policy at KCL, former MD of Public Affairs at Ipsos MORI and author of The Perils of Perception — kicked off this year’s OffGrid Sessions by asking a peculiar question: “Is the Great Wall of China visible from space?” About a third of the audience, it turns out, thought it was — even though, as Prof Duffy pointed out, the wall is only nine metres wide and roughly the same colour as the surrounding terrain.

Why do we think that? Because we’ve heard it a lot (the “illusory truth effect” means that the more often you hear a lie, the more likely you are to believe it), because we’re thinking quickly rather than reasoning it out, because we tend to muddle scales (the wall is huge — but it’s long, not wide) — and because it’s a cool fact and we kinda want it to be true.

Welcome to the territory of cognitive biases — the territory of Prof Duffy’s book. He set out his stall: “We’re often incredibly wrong.” Just how wrong, and about what, is the question that Prof Duffy and his colleagues sought to answer in a huge study of 100,000 people in 40 countries. The subjects were asked to estimate the facts on a wide range of subjects, from immigration and violent crime to sex — and their guesses were compared with reality to expose huge biases and errors.

What the study shows is useful, he said, not because it tells us we’re wrong — but because it helps us to discover why we’re wrong. And in discovering why we’re wrong, we’ll get a better idea of what we can do about it.

Take the birth rate among teenage girls. The real figure is about 1.4 per cent in the UK, 2 per cent in the US and 6 per cent in Argentina; but people in those countries, on average, guessed 19 per cent, 24 per cent and 37 per cent respectively.

This isn’t a simple error of arithmetic: we’re drawn to vivid emotional stories (so the media covers these outliers) and we suffer from “emotional innumeracy”: we overestimate things we’re worried about just as much as we worry about things we overestimate. (Which means our overestimations are a good guide to what worries us.) Also, baked deep into our psyches by evolution is a tendency to give more attention to negative news and to ignore slow improvements — useful when avoiding sabre-toothed tigers; less so when estimating social change.

Stats on murder rates bear this out too (most respondents thought they’d gone up over the last two decades; in fact they’re down 29 per cent) — as well as showing how what psychologists call “rosy retrospection” comes into play: we edit our ideas of the past to make it seem more positive.

Other examples showed how “directionally motivated reasoning” causes us to bend the facts to fit our identities rather than vice versa. Democrats and Republicans in the US come out with drastically different estimates of the proportion of violent deaths in the US involving guns; and two thirds of Leave voters believe the notorious £350 million claim, while less than half of Remainers do.

And we generalise, wrongly, from our own experience. An online survey asked Indians what percentage of their compatriots had Facebook accounts. They guessed 67 per cent; the real figure is 18 per cent. The poll was conducted online: so these wired Indians assumed that what was true of them would be true of their countrymen.

The most poignant finding was, as so often, about sex. Everyone, it seems, thinks 18–29-year-olds are having more sex than they are. The real figure is 4–5 times a month; the average guess was 17 times a month. Young men, meanwhile, estimated that young women were having sex 32 times a month. Just not, obviously, with them.

Prof Duffy ended with words of hope, though. This is not a new crisis — we were just as wrong about everything in the 1940s. On the whole, our negativity bias means that things are mostly not as bad as we think. And by the application of conscious effort, we can learn to resist our own “fast thinking” biases — and find ways of using our natural predispositions to resist disinformation effectively.

Three key takeaways:

1) Cognitive biases are powerful, but they aren’t the whole story.

2) You can push back against your biases: avoid thinking you’re normal; avoid having your attention drawn by the extremes; and actively work to unfilter your world.

3) Facts aren’t useless. Facts and storytelling are not opposites: you can use them together to cut through the disinformation.