The desire for belief preservation.


In the previous post we saw evidence that human beings are not natural critical thinkers: we tend to believe the first plausible explanation that comes along, without seeing these initial explanations as merely hypotheses to be evaluated against competing hypotheses.

One might think that once we are exposed to alternative hypotheses, we would shift gears into a critical mode. But Tim van Gelder, writing in the article Teaching Critical Thinking: Some Lessons from Cognitive Science (College Teaching, Winter 2005, vol. 53, no. 1, pp. 41-46), argues that what foils this is the human desire for belief preservation.

He quotes the seventeenth-century philosopher Francis Bacon, who said:

The mind of man is far from the nature of a clear and equal glass, wherein the beams of things should reflect according to their true incidence; nay, it is rather like an enchanted glass, full of superstition and imposture, if it be not delivered and reduced.

In other words, van Gelder says, “the mind has intrinsic tendencies toward illusion, distortion, and error.” These tendencies arise from a combination of evolutionary hard-wiring in our brains, the natural development of our brains as we grow up in the Earth’s environment, and the influence of our societies and cultures. “Yet, whatever their origin, they are universal and ineradicable features of our cognitive machinery, usually operating quite invisibly to corrupt our thinking and contaminate our beliefs.”

All these things give us cognitive biases and blind spots that prevent us from seeing things clearly, and one of the major blind spots is belief preservation. van Gelder says that “At root, belief preservation is the tendency to make evidence subservient to belief, rather than the other way around. Put another way, it is the tendency to use evidence to preserve our opinions rather than guide them.”

van Gelder says that when we strongly believe something or desire it to be true, we tend to do three things:

1. “We seek evidence that supports what we believe and do not seek, and avoid or ignore, evidence that goes against it.”
2. “We rate evidence as good or bad depending on whether it supports or conflicts with our belief. That is, the belief dictates our evaluation of the evidence, rather than our evaluation of the evidence determining what we should believe.”
3. “We stick with our beliefs even in the face of overwhelming contrary evidence as long as we can find at least some support, no matter how slender.”

This would explain why (as vividly demonstrated in the popular video A Private Universe) people hold on to their erroneous explanations about the phases of the moon even after they have been formally instructed in school about the correct explanation.

This would also explain the question that started these musings: Why for so long had I not applied the same kinds of questioning to my religious beliefs concerning god, heaven, etc. that I routinely applied to other areas of my life? The answer is that since I grew up in a religious environment and accepted the existence of god as plausible, I did not seek other explanations. Any evidence in favor of belief (the sense of emotional upliftment that sometimes occurs during religious services or private prayer, some event that could be interpreted as indicating god’s action in my life or in the world, or scientific evidence that supported a statement in the Bible) was seized on, while counter-evidence (such as massive death and destruction caused by human or natural events, personal misfortunes or tragedies, or scientific discoveries that contradicted Biblical texts) was either ignored or explained away. It was only after I had abandoned my belief in god’s existence that I was able to ask the kinds of questions that I had hitherto avoided.

Did I give up my belief because I could not satisfactorily answer the difficult questions concerning god? Or did I start asking those questions only after I had given up belief in god? In some sense this is a chicken-and-egg problem, and looking back, it is hard to say. Probably it was a little of both. Once I started taking some doubts seriously and began questioning, this probably led to more doubts and more questions, until finally the religious edifice that I had hitherto believed in simply collapsed.

In the series of posts dealing with the burden of proof concerning the existence of god, I suggested that if we use the common yardsticks of law or science, the burden of proof lies with the person postulating the existence of any entity (whether it be god or a neutrino or whatever), and that in the absence of positive evidence in favor of existence, the default is to assume the non-existence of the entity.

In a comment on one of those postings, Paul Jarc suggested that the burden of proof actually lies with the person trying to convince the other to change his views. It may be that we are both right. What I was describing was the way I thought things should be, while Paul was describing the way things actually are, given the tendency of human beings to believe the first thing that sounds right and makes intuitive sense, coupled with the desire to preserve strong beliefs once formed.

van Gelder ends his article with some good advice:

Belief preservation strikes right at the heart of our general processes of rational deliberation. The ideal critical thinker is aware of the phenomenon, actively monitors her thinking to detect its pernicious influence, and deploys compensatory strategies.

Thus, the ideal critical thinker
• puts extra effort into searching for and attending to evidence that contradicts what she currently believes;
• when “weighing up” the arguments for and against, gives some “extra credit” for those arguments that go against her position; and
• cultivates a willingness to change her mind when the evidence starts mounting against her.

Activities like these do not come easily. Indeed, following these strategies often feels quite perverse. However, they are there for self-protection; they can help you protect your own beliefs against your tendency to self-deception, a bias that is your automatic inheritance as a human being. As Richard Feynman said, “The first principle is that you must not fool yourself – and you are the easiest person to fool.”

The practice of science requires us to think this way routinely. It is not easy to do, and even scientists find it hard to give up cherished theories in the face of contrary evidence. Because scientific practice demands this kind of thinking, this may also be why science is perceived as ‘hard’ by the general public: not because of its technical difficulties, but because it constantly asks us to give up beliefs that seem so naturally true and intuitively obvious.

POST SCRIPT: The people who pay the cost of war

I have nothing to add to this powerful short video, set to the tune of Johnny Cash singing Hurt. Just watch. (Thanks to Jesus’ General.)
