We are currently awash in fake news and conspiracy theories. From reader Jason, I received a 2018 article that discussed what effect cognitive abilities have on people’s vulnerability to fake news. It highlights one particularly significant risk factor: people identified as having low cognitive ability have an especially hard time rejecting misinformation.
What do they mean by ‘cognitive ability’?
First proposed by the cognitive psychologists Lynn Hasher and Rose Zacks, this theory holds that some people are more prone to “mental clutter” than other people. In other words, some people are less able to discard (or “inhibit”) information from their working memory that is no longer relevant to the task at hand—or, as in the case of Nathalie, information that has been discredited. Research on cognitive aging indicates that, in adulthood, this ability declines considerably with advancing age, suggesting that older adults may also be especially vulnerable to fake news. Another reason why cognitive ability may predict vulnerability to fake news is that it correlates highly with education. Through education, people may develop meta-cognitive skills—strategies for monitoring and regulating one’s own thinking—that can be used to combat the effects of misinformation.
Repeated exposure to a statement tended to make people think it was true even after they were informed that it was false, even for purely factual statements.
The results revealed that repetition increased the subjects’ perception of the truthfulness of false statements, even for statements they knew to be false. For example, even if a subject correctly answered “Pacific Ocean” to the question “What is the largest ocean on Earth?” on the knowledge test, they still tended to give the false statement “The Atlantic Ocean is the largest ocean on Earth” a higher truth rating if it was repeated. When a claim was made to feel familiar through repetition, subjects neglected to consult their own knowledge base in rating the claim’s truthfulness.
I think we are all familiar with that effect. A statement that seems familiar tends to also feel more plausible. This is why echo chambers like those on Facebook, where false information ricochets all over the place and hits readers from many different angles, tend to breed belief in misinformation.
The article suggests certain metacognitive strategies to reduce the chances of falling prey to fake news.
If you are convinced that some claim is true, ask yourself why. Is it because you have credible evidence that the claim is true, or is it just because you’ve encountered the claim over and over? Also ask yourself if you know of any evidence that refutes the claim. (You just might be surprised to find that you do.)
This is commonsense advice that is not at all new. Long ago, the physicist Arnold Arons, whom I consider a guru of physics teaching, said his goal in teaching introductory physics students at the University of Washington was to get them to instinctively ask themselves the questions: What do I believe? Why do I believe it? What is the evidence for it? Is there any evidence against it? He did this by structuring his lectures in such a way that these questions were always surfacing.
In my own teaching, I tried to follow Arons’s advice and structure my lectures in that way too. But to be really effective, that approach, which comes under the general umbrella of what is called ‘inquiry-based learning’, has to be carried out at all levels of the K-12 curriculum, not just in this or that isolated course. I was involved in such efforts in Ohio, but like so many good education reform efforts in the US, they lacked widespread and sustained implementation over the long term. That failure was compounded by the fact that schools and teachers are under-resourced and pressured to teach students to pass standardized tests, which puts more emphasis on rote learning.