
Why Political Lies Work

Douglas LaBier writes in Psychology Today to explain why voters are so susceptible to political lies. The answer seems rather obvious to me: Because people tend to think tribally and generally don’t think critically, preferring cognitive shortcuts because they take less effort.

Lies tend to stick in people’s minds, and can sway the outcome of elections, as well as public opinion in many arenas. So, what happens within our minds and emotions that makes us receptive to lies, and then resistant to information that exposes the truth? A study led by Stephan Lewandowsky of the University of Western Australia explains part of what may happen. The researchers found that “Weighing the plausibility and the source of a message is cognitively more difficult than simply accepting that the message is true — it requires additional motivational and cognitive resources.”

They point out that rejecting false information requires more cognitive effort than just taking it in. That is, weighing how plausible a message is, or assessing the reliability of its source, is more difficult, cognitively, than simply accepting that the message is true. In short, it takes more mental work. And if the topic isn’t very important to you or you have other things on your mind, the misinformation is more likely to take hold.

Moreover, when you do take the time to evaluate a claim or allegation, you’re likely to pay attention just to a limited number of features, the study found. For example: Does the information fit with other things you already believe? Does it make a coherent story with what you already know? Does it come from a credible source? And do others believe it?

In other words, we tend to use cognitive shortcuts. Rather than evaluating claims and arguments independently and rationally, we filter out facts that don’t support our preconceived ideas. While the human brain is certainly capable of thinking rationally, it’s also quite good at preventing us from doing so. Thinking rationally takes effort.

Comments

  1. ajb47 says

    “Being Wrong: Adventures in the Margin of Error” by Kathryn Schulz goes over this, too, with the same conclusion. There are a lot of pressures on us to go along with our “tribe”.

  2. Michael Heath says

    I wonder if this explains the total population rather than some non-partisan subset. I get how it explains the latter, but not the whole. For example, conservatives do not believe truthful assertions by the president, let alone false assertions he makes.

  3. says

    While the human brain is certainly capable of thinking rationally, it’s also quite good at preventing us from doing so.

    That suggests that there’s some “we” trying to think rationally, with our brains somehow standing in the way. I know you didn’t mean to slip into dualism here by pitting “us” against “our brains,” but it would actually be more accurate to stop speaking as if thinking rationally is the norm, the default, the thing we’re continually aiming for but failing to reach. By and large, rather, thinking rationally (in the sense of thinking logically, requiring evidence for conclusions) is the exception. “The human mind is designed to reason adaptively, not truthfully or even necessarily rationally” is not a strong enough statement. The truth is, we are creatures made of intuition and bias who manage to diverge into logical thought only occasionally and with great effort.

  4. valhar2000 says

    The truth is, we are creatures made of intuition and bias who manage to diverge into logical thought only occasionally and with great effort.

    Yup.

  5. Paul W., OM says

    There are important social effects that amplify this sort of thing.

    When deciding whether to pass along information that you think is true, your main concern is usually not whether you’re very sure it’s true, but whether the information, if true, is useful.

    You generally don’t just want to pass on whatever information is most certain (e.g., the thousandth digit of pi), but whatever true information will do the most good—be interesting to the receiver, affect the receiver’s behavior in ways that will benefit them, or you, or society, etc.

    So if you’re 90 percent sure something is true, and think it’s a very good thing for people to know about, you’ll typically pass it along rather than investigate it further first and pass it along only if that makes you 99 percent sure.

    Being interested in the subject, you may later find out you were wrong, but you may be less motivated to correct the record, because the fact that the information turns out to be false seems less important than its truth seemed back when you believed it—it doesn’t advance your cause, or help the listener, or whatever. It’s just an unfortunate mistake.

    (For example, if you give an illustrative example of a trend you’re sure is real, but it turns out to be a bad example, the fact that there are such counterexamples doesn’t mean your original statistical argument wasn’t right. If you were talking about how smoking causes cancer, and you gave an example of a heavy smoker whose cancer turns out by chance to have in fact been caused by something else—Bill Hicks, maybe?—what difference does it really make? It was more important at the time to illustrate that smoking causes cancer than it is now to talk about particular exceptions to the rule. It doesn’t really matter if Bill Hicks himself didn’t die from smoking, because plenty of people like him do.)

    Even if you do issue a retraction, the audience is likely to apply the same kind of filter, and the correction is likely not to propagate as far as the error did.

    Another basic factor in what ideas get spread is how memorable they are—you can’t retell a story if you can’t remember it.

    That is why various mythologies tend to be similar—they all tend to combine simple well-known elements in simple ways to tell dramatic stories that are easily understood.

    It is also why supernatural entities tend to have similar characteristics around the world—and to be similar to stuff in superhero stories.

    It’s really easy to remember and retell stories about simple modifications of well-known things, even if those simple modifications make no sense. For example, a pagan god or superhero is pretty much just like a human being, except for having some easy-to-describe differences, like being super-strong, or able to fly, or able to breathe underwater, or whatever.

    Actually explaining how such a thing could plausibly be true is very difficult, but getting the high-level idea and retelling it to someone else (without any real explanation) is dead easy. So that’s what happens, with complicated stories getting stripped down in multiple retellings into a simple essence, with gross stereotypes plus a few striking and easily-remembered exceptions.

    The inevitable retellings of many such stories create new stereotypes: when you have enough memorable and oft-retold stories of otherwise pretty normal people with a few inexplicable abilities, new stereotypes emerge, some of them minor and concrete, some big and abstract.

    So on one hand you get the Samaritan who’s good, the genius with no common sense, the hooker with a heart of gold, and the welfare queen with a Cadillac, and all that.

    And on the other hand you get more general stereotypes like the general idea of a pagan god or superhero as a very humanlike entity with a superpower or two. So you get pagan religions with pantheons where for any interesting X you can have the god of X, or a literary genre of superhero stories, where for any interesting ability X, you can have a superhero with that X factor.

    In both those cases, the main constraints are constraints of narrative convenience, determined by how people remember things in terms of stereotypes and a very few striking exceptions. (That applies to plots as well as characters. There are relatively few basic plots people can accurately remember, and you can’t twist them too far or they forget some of the minor twists and strip the story down into something more memorable in the retelling.)

    An important consequence of this is that there ends up being a whole lot of inconsistencies. Some of those inconsistencies are exactly what make things memorable—e.g., Hercules is just a very stereotypical Good Guy plus he’s Super Strong. Others are just the consequence of various stories being memorable in themselves, and thus consistently retold, without comparable pressure to make those stories consistent with each other. (You may have an all-loving God who is extremely jealous and vengeful, too, because it’s more important that you can “get” and remember each of those ideas—and use them to tell and retell various interesting stories about God—than that it actually makes any sense at all to put them together.)

    And of course people are good at rationalizing the inconsistencies that come up due to such narrative pressures—usually by invoking more lazy stereotypes. (E.g., Hercules can be ridiculously strong because um… it’s a magical gift, and magic is like that, with magical essences of X and such. And God is all-loving because he’s Our Father, but like a father he has to discipline his unruly brood, and besides you’re just a dumb kid so what the fuck do you know about right and wrong, trust him.)

    Put together individuals’ thinking in terms of stereotypes with the stricter constraints of communicating with other individuals in terms of shared stereotypes, and you get all sorts of emergent nonsense, including new, fundamentally nonsensical stereotypes.

  6. says

    In other words, we tend to use cognitive shortcuts. Rather than evaluating claims and arguments independently and rationally, we filter out facts that don’t support our preconceived ideas.

    Actually it is a great deal more than filtering out facts that don’t support, or that may even strongly contradict, our preconceived ideas. Our brains are hardwired around what we know and believe. The more important an idea, the stronger the connection and the more central it will be to the processing of new ideas. Time also strengthens the connections and shapes how new connections form.

    I understand that what I am saying doesn’t contradict, and can even include, what you are saying. The problem is that you fail to assert the actual scope of the issue. We aren’t talking about just circumventing simple cognitive shortcuts. We are talking about circumventing neural pathways, the very structures with which our brains process information. And while these structures are malleable, they are not easily shifted and may be nearly impossible to work around. This is why cognitive dissonance is so very central to our thinking.

    Keep in mind that the cognitive pressures you are talking about here are extreme and often cause extreme reactions. It is much easier to ignore evidence, to accept weak evidence that contradicts stronger evidence, or to avoid the information altogether, than it is to manage extreme cognitive dissonance. People who are thrown into sharp conflict with critically important central ideas, and who cannot resolve the dissonance, may well choose to end their life or go completely batshit. We are talking about central connections that have influenced all sorts of other connections. Uprooting these central connections may mean uprooting so much else that there is little left in terms of identity.

  7. fastlane says

    Michael Heath:

    I wonder if this explains the total population rather than some non-partisan subset. I get how it explains the latter, but not the whole. For example, conservatives do not believe truthful assertions by the president, let alone false assertions he makes.

    I think this part:

    Moreover, when you do take the time to evaluate a claim or allegation, you’re likely to pay attention just to a limited number of features, the study found. For example: Does the information fit with other things you already believe? Does it make a coherent story with what you already know? Does it come from a credible source? And do others believe it?

    somewhat explains it. For a lot of people (I know you’re familiar with Bob Altemeyer’s work), tribalism is their main driver, if you will, and the last part of the quoted bit, ‘Do others believe it?’, if written as ‘Do others in my tribe believe it?’, would explain a large part of what you’re describing. So conservatives have already had the statements of the current POTUS ‘pre-evaluated’ for them by the tribe, making the accuracy of said information a moot point.

  8. says

    Well, it is like one of those matryoshka dolls, isn’t it? One belief layered into another set, layered inside yet another set… and what matters is how you built the outer layers in the first place, since every specific instance is evaluated based on all the larger beliefs you hold.

    For instance, this week we’ve had a pretty major storm system hit the east coast of the United States, putting millions of people in the dark and causing billions of dollars of damage. So Republican governor Chris Christie of New Jersey was on Fox News praising the Democratic president’s response to the catastrophe, and rejecting attempts by the on-air personalities to attack the president or support his electoral opponent, Republican Mitt Romney.

    So how do I evaluate this? I could use the smallest “nested” belief, that anything that happens on Fox News is a lie and ignore it on principle. I could use a different layer of nested beliefs that evaluate anything in favor of Democrats in a more favorable light. If I were of a particular mindset I would assume that anything favorable of a Democrat must be a lie, and therefore Governor Christie is either being positive towards Obama because he’s being blackmailed to say good things in exchange for federal relief dollars, or that Christie is actually part of the great Muslim Marxist conspiracy to blow up the moon in order to make white people feel bad.

    Or, I could use what I think is a more reasonable evaluation based on my knowledge of politics, which is that governors tend to be more supportive of “big government” regardless of party because they are more directly responsible to their constituents than members of Congress, so they are more cooperative with the Executive branch. But is that right? Or is that based on some incorrect assumptions on yet another level of thinking?

  9. iangould says

    There’s a very simple experiment that demonstrated this beautifully.

    Subjects were divided into two groups and each was asked to state their political affiliation.

    Both groups were then shown pictures and biographies of candidates representing the two main political parties and asked to choose who they’d vote for and justify their choice.

    They were also asked to rate the candidates on scales for attractiveness, honesty, likeability and intelligence.

    Unsurprisingly, people chose the candidate of the party they identified with and also tended to rate them more highly in all categories.

    The kicker, of course, is that the party affiliations attributed to the candidates were switched between the two groups.

    The rationalizations were particularly interesting. If the candidate for YOUR party has lots of different jobs on their resume it’s because they’re ambitious and want to try new things. If a candidate for the OTHER party has had lots of jobs it’s because they’re unreliable and can’t stick to one thing for any length of time or because they get fired a lot.
