
BBC Feature


"We would rather hide our heads in the sand than listen to evidence questioning our beliefs, even if the facts are solid" For BBC future series. Photo: Eddie Gerald / Getty

  • By David Robson

24 March 2016

If you ever need proof of human gullibility, cast your mind back to the attack of the flesh-eating bananas. In January 2000, a series of chain emails began reporting that imported bananas were infecting people with “necrotizing fasciitis” – a rare disease in which the skin erupts into livid purple boils before disintegrating and peeling away from muscle and bone.

According to the email chain, the FDA was trying to cover up the epidemic to avoid panic. Faced with the threat, readers were encouraged to spread the word to their friends and family.

The threat was pure nonsense, of course. But by 28 January, the concern was great enough for the US Centers for Disease Control and Prevention to issue a statement decrying the rumour.

Did it help? Did it heck. Rather than quelling the rumour, they had only poured fuel on its flames. Within weeks, the CDC was hearing from so many distressed callers it had to set up a banana hotline. The facts became so distorted that people eventually started to quote the CDC as the source of the rumour. Even today, new variants of the myth have occasionally reignited those old fears.

We may laugh at these far-fetched urban myths – as ridiculous as the ongoing theory that Paul McCartney, Miley Cyrus and Megan Fox have all been killed and replaced with lookalikes. But the same cracks in our logic allow the propagation of far more dangerous ideas, such as the belief that HIV is harmless and vitamin supplements can cure AIDS, that 9/11 was an ‘inside job’ by the US government, or that a tinfoil hat will stop the FBI from reading your thoughts.

Why do so many false beliefs persist in the face of hard evidence? And why do attempts to deny them only add grist to the rumour mill? It's not a question of intelligence – even Nobel Prize winners have fallen for some bizarre and baseless theories. But a series of recent psychological advances may offer some answers, showing how easy it is to construct a rumour that bypasses the brain’s deception filters.

One, somewhat humbling, explanation is that we are all “cognitive misers” – to save time and energy, our brains use intuition rather than analysis.

As a simple example, quickly answer the following questions:

“How many animals of each kind did Moses take on the Ark?”

“Margaret Thatcher was the president of what country?”

Between 10% and 50% of study participants presented with these questions fail to notice that it was Noah, not Moses, who built the Ark, and that Margaret Thatcher was the prime minister, not the president – even when they have been explicitly asked to note inaccuracies.

Known as the “Moses illusion”, this absentmindedness illustrates just how easily we miss the details of a statement, favouring the general gist in place of the specifics. Instead, we normally just judge whether it “feels” right or wrong before accepting or rejecting its message. “Even when we ‘know’ we should be drawing on facts and evidence, we just draw on feelings,” says Eryn Newman at the University of Southern California, whose forthcoming paper summarises the latest research on misinformation.

Based on the research to date, Newman suggests our gut reactions swivel around just five simple questions:

  • Does a fact come from a credible source?

  • Do others believe it?

  • Is there plenty of evidence to support it?

  • Is it compatible with what I believe?

  • Does it tell a good story?

Crucially, our responses to each of these points can be swayed by frivolous, extraneous details that have nothing to do with the truth.

Consider the questions of whether others believe a statement or not, and whether the source is credible. We tend to trust people who are familiar to us, meaning that the more we see a talking head, the more we will begrudgingly start to believe what they say. “The fact that they aren’t an expert won’t even come into our judgement of the truth,” says Newman. What’s more, we fail to keep count of the number of people supporting a view; when that talking head repeats their idea on endless news programmes, it creates the illusion that the opinion is more popular and pervasive than it really is. Again, the result is that we tend to accept it as the truth.

Sticky nuggets

Then there’s the “cognitive fluency” of a statement – essentially, whether it tells a good, coherent story that is simple to imagine. “If something feels smooth and easy to process, then our default is to expect things to be true,” says Newman. This is particularly true if a myth easily fits with our expectations. “It has to be sticky – a nugget or soundbite that links to what you know, and reaffirms your beliefs,” agrees Stephan Lewandowsky at the University of Bristol in the UK, whose work has examined the psychology of climate change deniers.

A slick presentation will instantly boost the cognitive fluency of a claim, while raising its believability. In one recent study, Newman presented participants with an article (falsely) saying that a well-known rock singer was dead. The subjects were more likely to believe the claim if the article was presented next to a picture of him, simply because it became easier to bring the singer to mind – boosting the cognitive fluency of the statement. Similarly, writing in an easy-to-read font, or speaking with good enunciation, has been shown to increase cognitive fluency. Indeed, Newman has shown that something as seemingly inconsequential as the sound of someone’s name can sway us: the easier it is to pronounce, the more likely we are to accept their judgement.

In light of these discoveries, you can begin to understand why the fear of the flesh-eating bananas was so infectious. For one thing, the chain emails were coming from people you inherently trust – your friends – increasing the credibility of the claim, and making it appear more popular. The concept itself was vivid and easy to picture – it had high cognitive fluency. If you happened to distrust the FDA and the government, the thought of a cover-up would have fitted neatly into your worldview.

Newman suggests one helpful strategy: rather than simply debunking a myth, replace it with a better story. When considering the fears that the MMR vaccine may be linked to autism, for instance, it would be better to build a narrative around the scientific fraud that gave rise to those fears, rather than writing the typical “myth-busting” article that unwittingly reinforces the misinformation. Whatever story you choose, you need to increase its cognitive fluency with clear language, pictures, and good presentation. And repeating the message, a little but often, will help to keep it fresh in people’s minds. Soon, it begins to feel as familiar and comfortable as the erroneous myth, and the tide of opinion should begin to turn.

At the very least, staying conscious of these flaws in your thinking will help you to identify when you may be being deceived. Both Newman and Lewandowsky point out that there is a flurry of misinformation flying around the forthcoming US presidential election, as seen in Donald Trump’s claims that Mexican immigrants bring sexual violence and drug trafficking, and Hillary Clinton’s claim that Isis is using videos of Trump to recruit terrorists. (Neither statement held up to fact-checking.)

It’s always worth asking whether you have thought carefully about the things you are reading and hearing. Or are you just being a cognitive miser, persuaded by biased feelings rather than facts? Some of your dearest opinions may have no more substance than the great banana hoax of the year 2000.
