Public ignorance/awareness of science

Have you ever read one of those news stories reporting that Americans are shockingly ignorant of science? For example, here’s an article saying that a quarter of Americans think the Sun orbits the Earth.

And if you found that shocking, brace yourself for the next one. According to this article, a quarter of Americans think the Sun orbits the Earth!

Okay, so both of the news articles are saying the same thing. But one of the articles is from 2014, and the other is from 1988. They’re both reporting on an NSF study, which has been repeated every couple of years for three decades. It always asks whether the Sun orbits the Earth or the Earth orbits the Sun, and it consistently finds that about a quarter of USians don’t know or get it wrong. It also asks whether electrons are smaller than atoms, whether lasers work by focusing sound waves, and whether antibiotics kill viruses.* News sources like to put the Sun/Earth statistic in their headlines, because it sounds the most shocking to readers.

You might guess from my tone that I’m a bit more apathetic about the whole thing. Yeah, it’s bad that USians are ignorant of elementary astronomy. But science is not a collection of factoids, and factoids are not the most important component of scientific literacy. As far as facts go, there are way too many for anyone to know all of them, and it’s difficult to judge which facts are more or less important for people to know.  If a fact is “basic” and “obvious”, that might make it socially unacceptable to be ignorant of it, but it also might make it less important to know, given how easy it is to look up the answer.  In my opinion, it’s far more important for people to understand scientific reasoning, like how experiments are designed, and how to read graphs.

I thought we might focus on scientific factoids because they’re much easier to test. Even though knowledge of factoids does not constitute scientific literacy, perhaps it is an indicator of scientific literacy.  Perhaps no better indicators are available. But it turns out the NSF study does ask questions that test scientific reasoning and understanding of the scientific process. You can see it right there in their latest report, with results dating back to 1999. But for some reason news stories never talk about that part.

To be fair, if I reported that 51% of USians were able to correctly answer a question about experimental design, that information isn’t particularly meaningful without knowing what the question was or how the answers were judged. It’s much easier to look at the question about the Sun and the Earth and immediately say, “Only 73% of USians know this?  How appalling!” So perhaps the 51% statistic is just too opaque, and the question about the Sun and Earth really is the best indicator of scientific literacy available.

But where do these immediate judgments come from? Why do we think that a particular percentage is “high” or “low”? Why is the Sun/Earth statistic so shocking? If I were to tell you only 34% of USians know that water boils at lower temperatures at high altitudes, would that be more or less shocking?

It seems to me that the Earth/Sun statistic is shocking because we think of it as a fact that’s “obvious”, which “everyone” knows. Suppose you lived in a country where far more people got the wrong answer, say 50%. You might perceive it as less shocking because some of those 50% would be your friends and family. Maybe the 50% would even include you, and you’d say to yourself, “I didn’t know that before, but I got along with my life fine anyway.” Social context is a huge factor when we make judgments about which factoids are most important to know.

The cautious approach is to treat knowledge of scientific facts as just an indicator of scientific literacy. Whether the raw percentage is 73%, 51%, or 34% isn’t particularly meaningful in itself. It’s more informative to compare percentages and look at the trends. For instance, look at the changes over time (the US has been about the same for three decades), or the differences between countries (the US is doing worse than Canada, but comparable to the EU and better than many other countries).

*They also ask if humans developed from an earlier species of animal, but that’s not really the same kind of scientific ignorance. Like, if you asked people what they think scientists think, then the rate of correct responses would go up significantly. That topic is beyond the scope of this post.

Another question asks if the universe began with a huge explosion.  Now that just annoys me. While the Big Bang could be characterized as an explosion, that is not the first way I would describe it, and people who understand it in this way tend to have a variety of misconceptions.  What good is it to know that the universe began with an explosion, if you think it was like a bomb?


  1. Dunc says

    Also, it’s often said that you can get about 20% of people to agree to almost any proposition in a survey… I think it’s quite possible that around 1 in 5 people just like to screw with pollsters.

  2. says

    I think 20% bad faith responses is a severe overestimate. I run an online community survey, which is where you’d expect to get the most bad faith responses. But troll responses are more detectable than you might think, and they make up less than 0.1%. Far more common is misreading questions, typos, misclicks, or "satisficing".

    Anyway, you could put a rough upper bound on survey shenanigans by looking for a question where nearly 100% of people give the same answer. The NSF study has 85% of USians agreeing that the center of the earth is very hot. And maybe there's another one closer to 100% in one of the report's other chapters.
