How to better dispel myths


Myths are tenacious. Many of us get frustrated in our attempts to set things straight, because people seem to believe them against all the evidence. Trying to convince them they are wrong does not seem to work. But maybe we are going about it the wrong way, two authors argue, and we end up actually strengthening the belief rather than weakening it.

Stephan Lewandowsky and John Cook set out to review the science on this topic, and even carried out a few experiments of their own. This effort led to their “Debunking Handbook”, which gives practical, evidence-based techniques for correcting misinformation about, say, climate change or evolution. But the findings apply to any situation where the facts are falling on deaf ears.

The first thing their review turned up is the importance of “backfire effects” – when telling people that they are wrong only strengthens their belief. In one experiment, for example, researchers gave people newspaper corrections that contradicted their views and politics, on topics ranging from tax reform to the existence of weapons of mass destruction. The corrections were not only ignored – they entrenched people’s pre-existing positions.

They suggest that rather than trying to correct the misinformation directly, we should ignore the myth and instead provide an alternative explanation for the phenomenon.

What you must do, they argue, is start with the plausible alternative (the one you obviously believe is correct). If you must mention the myth at all, mention it second, and only after clearly warning people that you’re about to discuss something that isn’t true.

This debunking advice is also worth bearing in mind if you find yourself clinging to your own beliefs in the face of contradictory facts. You can’t be right all of the time, after all.

That last point is important. We have to be on our guard against falling into the same trap of believing myths against the evidence because they appeal to our prejudices.

Comments

  1. says

    I’ve heard about several effects regarding the persistence of wrong information, whether because it fits a person’s belief system or simply because it was the first thing they heard on a particular subject. IIRC, even when people take in the new information without protest, they tend to fall back on the false information later. Interesting now to have some possible methods to work with that.

  2. says

    I remember listening to linguist George Lakoff on NPR about 10 years ago talking about how to change people’s minds: not by presenting competing facts, but by presenting a compelling story to compete with the narrative they currently hold. Our brains don’t really hold facts as discrete items; we need to string them together in an interconnected way in order to really believe them.

  3. Ed says

    In a general sort of way, this reminds me of how surveys (at least in very religious countries) that are intended to measure the prevalence of atheism and skepticism show very low numbers when the term “atheist” is actually used, and low but slightly higher numbers with the language of “don’t believe in God(s).”

    But then the level of admitted unbelief, skepticism, and naturalism goes up significantly when people are asked about their certainty about theism (“true or false: I know there is a God/gods”), belief in the efficacy of prayer, belief in an afterlife, etc.

    In other words, when the focus is on the “myth”, this tends to elicit loyalty to it from those indoctrinated in it. When the focus is on matters of reasoning and evidence, this may have a liberating effect, allowing taboo possibilities to be considered.

    Similarly, many people have a huge emotional connection to conspiracy theories. They’re an outlet for the legitimate suspicion that one is being manipulated and lied to by the powerful, they are interesting in the way that suspense thrillers are, and many of them are popular (i.e., a person’s friends are likely to believe some of the more widespread ones even if they have mainstream views on most things).

    I can certainly see how a direct presentation of the verifiable facts, and reasonable inductions about an event that is the subject of a conspiracy theory, could damage the theory far more than a step-by-step debunking would.
