The Debunking Handbook is a guide to debunking myths, by John Cook and Stephan Lewandowsky. Although there is a great deal of psychological research on misinformation, there has been no summary of the literature that offers practical guidelines on the most effective ways of reducing the influence of misinformation. This Handbook boils the research down into a short, simple summary, intended as a guide for communicators in all areas (not just climate) who encounter misinformation.
This is part three in a five-part series cross-posted from Skeptical Science.
One principle that science communicators often fail to follow is making their content easy to process. That means easy to read, easy to understand and succinct. Information that is easy to process is more likely to be accepted as true.1 Merely enhancing the colour contrast of a printed font so it is easier to read, for example, can increase people’s acceptance of the truth of a statement.2
Common wisdom holds that the more counter-arguments you provide, the more successful you will be in debunking a myth. It turns out that the opposite can be true. When it comes to refuting misinformation, less can be more. A debunk that offered three arguments, for example, was more successful in reducing the influence of misinformation than one that offered twelve arguments, which ended up reinforcing the myth.1
The Overkill Backfire Effect occurs because processing many arguments takes more effort than just considering a few. A simple myth is more cognitively attractive than an over-complicated correction.
The solution is to keep your content lean, mean and easy to read. Making your content easy to process means using every tool available. Use simple language, short sentences, subheadings and paragraphs. Avoid dramatic language and derogatory comments that alienate people. Stick to the facts.
End on a strong and simple message that people will remember and tweet to their friends, such as “97 out of 100 climate scientists agree that humans are causing global warming” or “Study shows that MMR vaccines are safe.” Use graphics wherever possible to illustrate your points.
Scientists have long followed the principles of the Information Deficit Model, which suggests that people hold erroneous views because they don’t have all the information. But too much information can backfire. Adhere instead to the KISS principle: Keep It Simple, Stupid!
References
- Schwarz, N., Sanna, L., Skurnik, I., & Yoon, C. (2007). Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information campaigns. Advances in Experimental Social Psychology, 39, 127-161.
- Reber, R., & Schwarz, N. (1999). Effects of perceptual fluency on judgments of truth. Consciousness and Cognition, 8, 338-342.