There is a group of people
(they’ll be named before too long)
Who are likely to believe a thing
They’ve just been shown… is wrong.
They’re wrong, it seems, quite often,
But in truth, it brings no shame;
The errors are so commonplace
We’ve given each a name!
There’s type 1 and type 2 error—
The distinction here is this:
The former is a false alarm;
The latter is a miss.
Suppose you find significance
(and do a little dance)
There’s a certain probability
It’s only random chance
Or maybe you find nothing
And you’re pulling out your hair
It’s possible you missed it
But there still is something there!
Belief opposed to evidence
Is faith—or so they say
But scientists (I’ve named the group!)
May do it every day!
At the risk of being misunderstood and quote-mined, I need to address something. I was reminded, by a really bad attempt at taking down “the new atheism”, that there are people who don’t know what they are talking about when it comes to belief. These people exist among the faithful, but also among us godless sorts. And, in part, slight misunderstandings on one hand are jumped upon by opportunists on the other hand (see the really bad attempt for an example) to misrepresent the process of science.
You see, the weird thing is, scientists are not immune to all the belief heuristics that everybody else falls prey to. Scientists want to be right, and will (or at least may) pay more attention to confirming evidence than disconfirming evidence when it comes to their own pet theory, fall prey to predictable ingroup-outgroup biases, and be far more eager to tear apart a competing theory than their own.
The difference between scientists and non-scientists, and the difference between believers and non-believers, isn’t so much in how individuals believe. (I mean, yes it can be, but this is a result of what I am about to say, not a cause.) Rather, it’s a difference in the structure that surrounds them.
The scientific community does not deny the effects of these biases. Rather, it harnesses them. A structure that systematically lets people support their own and tear down others’ ideas (call it “peer review”) harnesses our individual biases for the long-term good of the community. We don’t really need to be self-critical (though some certainly are) when we can set up an environment that will do that for us, systematically and more effectively.
So, yeah, when you are dealing with data that are probabilistic in nature, and when you intentionally and systematically make falsifiable predictions, there will necessarily be times when the data don’t play out the way you expect. And sometimes you will just know that you are right and the data are wrong. And sometimes you will be wrong. And sometimes you will be right, and yes, the data are wrong (the cool thing is, for some of these values, an understanding of probability can let us know fairly well just how often; the bad news, of course, is that it won’t tell us which times you are right and the data are wrong).
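To make the poem’s two errors concrete, here is a minimal sketch, a toy simulation of my own devising (not from any particular study), of coin-flip experiments tested at the conventional 0.05 significance level. The coin biases, sample size, and the simple z-test for a proportion are all illustrative assumptions.

```python
import random
import math

random.seed(42)


def prop_p_value(successes, n, p0=0.5):
    """Two-sided z-test p-value for a proportion against null p0."""
    se = math.sqrt(p0 * (1 - p0) / n)
    z = (successes / n - p0) / se
    # Normal CDF via the error function
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))


ALPHA = 0.05   # conventional significance threshold
TRIALS = 10_000
N = 100        # flips per experiment

# Type 1 (false alarm): the coin really is fair, yet we sometimes
# "find significance" anyway; this happens at roughly the rate alpha.
false_alarms = sum(
    prop_p_value(sum(random.random() < 0.50 for _ in range(N)), N) < ALPHA
    for _ in range(TRIALS)
)

# Type 2 (miss): the coin really is biased (here, 0.56), yet with only
# 100 flips the test frequently fails to detect it.
misses = sum(
    prop_p_value(sum(random.random() < 0.56 for _ in range(N)), N) >= ALPHA
    for _ in range(TRIALS)
)

print(f"Type 1 rate (false alarms): {false_alarms / TRIALS:.3f}")
print(f"Type 2 rate (misses):       {misses / TRIALS:.3f}")
```

The false-alarm rate comes out near 0.05 by construction, which is exactly the "certain probability it’s only random chance" from the verse; the miss rate depends on the effect size and sample size, and for a small effect like this one it is uncomfortably high. What the simulation can never do is tell you, for any single significant result, whether it was one of the false alarms.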
But the scientific community doesn’t care about your pet theory. You can be biased all you want; so are your peers, and some of them want nothing more than to prove you wrong. And you want nothing more than to prove them wrong (and you right). And in the long run, you are crash-testing ideas, and only ideas that can survive the process will survive. In the long run.
There is a similar (well, functionally similar, but not similar in result) structure you can find in religion. Seriously. It’s called “apologetics”, and it is the system’s response to disconfirming data. But rather than rewarding and encouraging the refutation of an apparently bad idea, as the scientific community does (by others within that community, at least), apologetics encourages finding excuses for why the idea actually works. (And if you can’t make it work… start a new religion!)
So… people are people. It’s the structures we are part of that allow us to rise above our biases to achieve more collectively than we ever could singly. It’s also the structures we are part of that can fortify our ignorance and make a virtue of non-questioning faith.
Scientists need not be anything special. But science is. And there is no reason to suspect that religious individuals are particularly prone to cognitive biases. But my goodness, religion as an institution (or set of institutions) seems to elevate the bias to an art form. The environments we have created, and which shape us, make us better, or worse, than we would be without them.
And that (as Frost said) has made all the difference.