Vulnerability to fake news


We are currently awash in fake news and conspiracy theories. From reader Jason, I received this 2018 article that discussed what effect cognitive abilities have on people’s vulnerability to fake news. It highlights a particularly significant risk factor, that people they identified as having low cognitive ability have a particularly hard time rejecting misinformation.

What do they mean by ‘cognitive ability’?

First proposed by the cognitive psychologists Lynn Hasher and Rose Zacks, this theory holds that some people are more prone to “mental clutter” than other people. In other words, some people are less able to discard (or “inhibit”) information from their working memory that is no longer relevant to the task at hand—or, as in the case of Nathalie, information that has been discredited. Research on cognitive aging indicates that, in adulthood, this ability declines considerably with advancing age, suggesting that older adults may also be especially vulnerable to fake news. Another reason why cognitive ability may predict vulnerability to fake news is that it correlates highly with education. Through education, people may develop meta-cognitive skills—strategies for monitoring and regulating one’s own thinking—that can be used to combat the effects of misinformation.

Repeated exposure to a statement tended to make people think it was true even after they had been informed that it was false, and this held even for purely factual statements.


The results revealed that repetition increased the subjects’ perception of the truthfulness of false statements, even for statements they knew to be false. For example, even if a subject correctly answered Pacific Ocean to the question What is the largest ocean on Earth? on the knowledge test, they still tended to give the false statement The Atlantic Ocean is the largest ocean on Earth a higher truth rating if it was repeated. When a claim was made to feel familiar through repetition, subjects neglected to consult their own knowledge base in rating the claim’s truthfulness.

I think we are all familiar with that effect. A statement that seems familiar tends to also feel more plausible. This is why echo chambers like Facebook, where false information ricochets all over the place and hits readers from many different angles, tend to breed misinformation.

The article suggests certain metacognitive strategies to reduce the chances of falling prey to fake news.

If you are convinced that some claim is true, ask yourself why. Is it because you have credible evidence that the claim is true, or is it just because you’ve encountered the claim over and over? Also ask yourself if you know of any evidence that refutes the claim. (You just might be surprised to find that you do.)

This is commonsense advice that is not at all new. Long ago, physicist Arnold Arons, whom I consider to be a guru for physics teaching, said his goal in teaching introductory physics students at the University of Washington was to get them to learn to instinctively ask themselves the questions: What do I believe? Why do I believe it? What is the evidence for it? Is there any counter-evidence against it? He did this by structuring his lectures in such a way that these questions kept surfacing.

In my own teaching, I tried to follow Arons's advice and structure my lectures in that way too. But to be really effective, that approach, which comes under the general umbrella of what is called 'inquiry-based learning', has to be carried out at all levels of the K-12 curriculum, not just in this or that isolated course. I was involved in such efforts in Ohio but, like so many good education reform efforts in the US, they lacked widespread and sustained implementation over the long term. Compounding the problem, schools and teachers are under-resourced and over-pressured to get students to pass standardized tests, which results in more emphasis on rote learning.

Comments

  1. sonofrojblake says

    “people they identified as having low cognitive ability”

    How did they identify such people?

”When a claim was made to feel familiar through repetition, subjects neglected to consult their own knowledge base in rating the claim’s truthfulness.

    I think we are all familiar with that effect”

I’m not. I had “there is a god” drilled into me through my whole primary and secondary education, and was a firm atheist at the end of it. This is only one example. I accept my experience may be non-typical.

  2. brucegee1962 says

    What do I believe? Why do I believe it? What is the evidence for it? Is there any counter-evidence against it?

    These are excellent questions — they should be put above the doors of every institution of higher education.
It’s interesting that they were stressed by a physics teacher — physics being a discipline we think of as mostly being about “facts” and hard science. Over here in the humanities, we tend to think of these things as our province, but of course they belong to everyone. The philosophy teachers I know structure their courses around an exploration of these questions, as do I in my composition classes. I tell my students “Our job here in college is to try to get you to question your beliefs. If you end up deciding to keep them, that’s fine, but if you graduate without ever questioning them, then we haven’t done our job.”
Republicans, of course, see this attitude in higher education as a profound threat to their party and the status quo — and they’re absolutely right.
    You bring up a good point that these critical thinking skills are sadly neglected in K-12 education, and I’m not sure how to address that.

  3. Curt Sampson says

    …a particularly significant risk factor, that people they identified as having low cognitive ability have a particularly hard time rejecting misinformation.

    Well, there is that, but it’s also the case that high cognitive ability brings its own problems. Tim Harford, in his recent (and excellent) article “Facts v feelings: how to stop our emotions misleading us,” writes:

    …people with deeper expertise are better equipped to spot deception, but if they fall into the trap of motivated reasoning, they are able to muster more reasons to believe whatever they really wish to believe.

And motivated reasoning, it turns out, can ambush you even in situations where it should be obvious:

    In 1997, the economists Linda Babcock and George Loewenstein ran an experiment in which participants were given evidence from a real court case about a motorbike accident. They were then randomly assigned to play the role of plaintiff’s attorney (arguing that the injured motorcyclist should receive $100,000 in damages) or defence attorney (arguing that the case should be dismissed or the damages should be low).

The experimental subjects were given a financial incentive to argue their side of the case persuasively, and to reach an advantageous settlement with the other side. They were also given a separate financial incentive to accurately guess the damages that the judge in the real case had actually awarded. Their predictions should have been unrelated to their role-playing, but their judgment was strongly influenced by what they hoped would be true.

    Tim provides some advice on helping yourself work against this:

    Before I repeat any statistical claim, I first try to take note of how it makes me feel. It’s not a foolproof method against tricking myself, but it’s a habit that does little harm, and is sometimes a great deal of help. Our emotions are powerful. We can’t make them vanish, and nor should we want to. But we can, and should, try to notice when they are clouding our judgment.

The article is a fairly quick read, and I feel it is worth reading in its entirety even if you already think you know all about this.

[Curt: You did not provide a link to the Tim Harford article you cite but I found one here -- Mano]
