Where do numbers come from?

When I was addressing this lunacy about how God exists because minds and mathematics are supernatural, I was also thinking about a related set of questions: biologically, how are numbers represented in the brain? How did this ability evolve? I knew there was some interesting work by Ramachandran on the representation of digits and numerical processing, coupled to his work on synesthesia (which is also about how we map abstract ideas onto a biological substrate), but I was wondering how I can have a concept of something as abstract as a number — as I sit in my office, I can count the vertical slats in my window blinds and see that there are 27 of them. How did I do that? Is there a register in my head that stored a tally as I counted them? Do I have a mental abacus that’s summing everything up?

And then I realized all the automatic associations with the number 27. It’s an odd number — where is that concept in my cortex? It’s 3³. It’s the atomic number of cobalt, the sum of its digits 2 and 7 is 9, it’s the number of bones in the human hand, 2 times 7 is 14, 2⁷ is 128, it’s my daughter’s age, 1927 was the year Philo Farnsworth first experimentally transmitted television pictures. It’s freakin’ weird if you think about it. 27 isn’t even a thing, even though we have a label and a symbol for it, and yet it’s all wrapped up in ideas and connections and causes sensations in my mind.
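Just to make the arithmetic in that little cascade concrete, here’s a trivial Python sketch that checks the computable bits (the cobalt, hand-bone, daughter, and Farnsworth associations are facts, not calculations, so they’re left out):

```python
# Sanity-check the arithmetic associations with 27.
n = 27

assert n % 2 == 1       # it's an odd number
assert 3 ** 3 == n      # it's 3 cubed
assert 2 + 7 == 9       # the sum of its digits is 9
assert 2 * 7 == 14      # 2 times 7 is 14
assert 2 ** 7 == 128    # 2 to the 7th is 128

print("all the arithmetic checks out")
```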

And why do I have a representation of “27” in my head? It’s not as if this was ever useful to my distant ancestors — they didn’t need to understand that there were precisely 27 antelope over on that hillside, they just needed an awareness that there were many antelope, let’s go kill one and eat it. Or here are 27 mangoes; we don’t need to count them, we need to sort them by ripeness, or throw out the ones that are riddled with parasites. I don’t need a map of “27” to be able to survive. How did this ability evolve?

Really, I don’t take drugs, and I wasn’t sitting there stoned out of my head and contemplating 27. It’s a serious question. So I started searching the scientific literature, because that’s what one does. There has been a great deal of work done that is tangential to these questions. Human babies can tell that 3 things is more than 2 things. An African Grey parrot has been taught to count. Neurons in the cortex have been speared with electrodes and found to respond to numbers of objects with differing degrees of activity. The problem with all that work is that it doesn’t actually address the question: I know we can count, I know there is brain activity involved, I can appreciate that being able to tell more from less is a useful ability, but none of it addresses the specific properties of this capacity called number. Worse, most of the literature seems muddled on the concept, and mistakes a qualitative understanding of relative quantity for a precursor to advanced mathematics.

But then, after stumbling through papers that were rich in detail but vague on the concept, I found an excellent review by Rafael Núñez that brought a lot of clarity to the problem and summarized the ongoing debates. It also lays out explicitly what had been nagging me about all those other papers: they often leap from “here is a cool numerical capability of the brain” to “this capability is innate and evolved” without adequate justification.

Humans and other species have biologically endowed abilities for discriminating quantities. A widely accepted view sees such abilities as an evolved capacity specific for number and arithmetic. This view, however, is based on an implicit teleological rationale, builds on inaccurate conceptions of biological evolution, downplays human data from non-industrialized cultures, overinterprets results from trained animals, and is enabled by loose terminology that facilitates teleological argumentation. A distinction between quantical (e.g., quantity discrimination) and numerical (exact, symbolic) cognition is needed: quantical cognition provides biologically evolved preconditions for numerical cognition but it does not scale up to number and arithmetic, which require cultural mediation. The argument has implications for debates about the origins of other special capacities – geometry, music, art, and language.

The author also demonstrates that he actually understands some fundamental evolutionary principles, in contrast to the rather naive versions of evolution I found myself recoiling from elsewhere (I’ll include an example later). He also recognizes the clear difference between estimating quantity and having a specific representation of number. He even coins a new word (sorta; it’s been used in other ways) to describe the former ability: “quantical”.

Quantical: pertaining to quantity related cognition (e.g., subitizing) that is shared by many species and which provides BEPs for numerical cognition and arithmetic, but is itself not about number or arithmetic. Quantical processing seems to be about many sensorial dimensions other than number, and does not, by itself, scale up to produce number and arithmetic.

Oops. I have to unpack a few things there. Subitizing is the ability to immediately recognize a quantity without having to count the items sequentially; we can do this only for small numbers, typically up to about 4. Drop 3 coins on the floor and we can instantly subitize them and say “3!”. Drop 27 and you’re going to have to scan through them and tally them up.

BEPs are biologically evolved preconditions.

Biologically evolved preconditions (BEPs): necessary conditions for the manifestation of a behavioral or cognitive ability which, although having evolved via natural selection, do not constitute precursors of such abilities (e.g., human balance mechanisms are BEPs for learning how to snowboard, but they are not precursors or proto-forms of it)

I think this is subtly different from an exaptation. Generally (though not necessarily), an exaptation is an existing feature with one functional purpose that evolution co-opts for a new one; feathers for flight in birds are an exaptation of feathers for insulation in dinosaurs. Núñez is arguing that we have not evolved a native biological ability to do math, but that these BEPs are a kind of toolkit that can be extended cognitively and culturally to create math.

He mentions snowboarding as an example in that definition. No one is going to argue that snowboarding is an evolved ability because some people are really good at it, but for some reason we’re more willing to argue that the existence of good mathematicians means math has to be intrinsic. He carries this analogy forward; I found it useful to get a bigger picture of what he’s saying.

Other interesting data: numbers aren’t universal! If you look at non-industrialized cultures, some have limited numeral systems, sometimes only naming quantities in the subitizing range and then modifying those with quantifiers equivalent to “many”. Comparing fMRIs of native English speakers carrying out a numerical task with those of native Chinese speakers (both groups having a thorough grasp of numbers) produces different results: “the neural circuits and brain regions that are recruited to sustain even the most fundamental aspects of exact symbolic number processing are crucially mediated by cultural factors, such as writing systems, educational organization, and enculturation.”

Núñez argues that many animal studies are over-interpreted. They’re difficult to do; it may take months of training to get an experimental animal to respond in a measurable, specific way to a numerical task, so what we’re actually looking at is a plastic response to an environmental stimulus, a response that may be constrained by the basic properties of the brain being tested but that isn’t there in an unconditioned animal. It tells us this ability is within the range of what the brain can do if it is specifically shaped by training, not that it is a built-in adaptation.

What we need is a more rigorous definition of what we mean by “number” and “numerical”, and he provides one.

Strangely (this is one case where I agree with human exceptionalism), he argues that the last point is a signature of Homo sapiens, but that it is not hard-coded into us, and that it may even be possible to teach non-humans how to do it. I have to add that all of those properties are hard-coded into computers, which currently lack conscious awareness or intent, so being able to process numbers is not sufficient for intelligence, and an absence of the cultural substrate that enables numerical processing does not imply a lack of intelligence either.

The paper doesn’t exactly answer all of my questions, but at least it provides a clearer framework for thinking about them.


Up above, I said I’d give an example of bad evolutionary thinking from elsewhere in the literature. Conveniently, the journal Trends in Cognitive Sciences provides one — it links to a rebuttal by Andreas Nieder. It’s terrible and rather embarrassing. It’s not often that I get flashed by a naked Panglossian like this:

Our brain has been shaped by selection pressures during evolution. Therefore, its key faculties – in no way as trivial as snowboarding – are also products of evolution; by applying numbers in science and technology, we change the face of the earth and influence the course of evolution itself. The faculty for symbolic number cannot be conceived to simply ‘lie outside of natural selection’. The functional manifestations of the brain need to be adaptive because they determine whether its carrier survives to pass on its genes. Over generations, this modifies the genetic makeup of a population, and this also changes the basic building plan of the brains and in turn cognitive capabilities of the individuals of a population. The driving forces of evolution are variation and natural selection of genetically heritable information. This means that existing traits are replaced by new, derived traits. Traits may also shift their function when the original function becomes less important, a concept termed ‘exaptation’. In the number domain, existing brain components – originally developed to serve nonverbal quantity representations – may be used for the new purpose of number processing.

I don’t think snowboarding is trivial at all — there are a lot of cognitive and sensory and motor activities involved — but just focus on the central claim: “The functional manifestations of the brain need to be adaptive because they determine whether its carrier survives to pass on its genes.” It’s absurd. It’s claiming that if you find any functional property of the brain at all, it had to have had an adaptive benefit and have led to enhanced reproduction. So, apparently, the ability to play Dungeons & Dragons was shaped by natural selection. Migraines are adaptive. The ability to watch Fox News is adaptive. This really is blatant ultra-adaptationism.

He also claims that “it has been recognized that numerical cognition, both nonsymbolically and symbolically, is rooted in our biological heritage as a product of evolution”. OK, I’ll take a look at that. He cites one of his own papers, “The neuronal code for number”, so I read that, too.

It’s a good paper, much better than the garbage he wrote in the rebuttal. It’s largely about “number neurons”, individual cells in the cortex that respond in a roughly quantitative way to visual presentations of number. Show a monkey a field with three dots in it, and no matter what the size of the dots or their pattern, you can find a neuron that responds maximally to that number. I can believe it, and I also think it’s an important early step in working out the underlying network behind number perception.
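To make the “number neuron” idea concrete, here’s a toy sketch of my own (not Nieder’s actual model or analysis): each simulated cell has a preferred numerosity and a bell-shaped tuning curve around it on a logarithmic scale, which is one common way such tuning is modeled, so the cell tuned to 3 fires hardest for three dots regardless of their size or arrangement.

```python
import math

def tuned_response(stimulus, preferred, width=0.35):
    """Toy 'number neuron': maximal response at its preferred numerosity,
    falling off as a Gaussian on a log scale (an assumption here, meant only
    to illustrate approximate, ratio-dependent tuning)."""
    return math.exp(-((math.log(stimulus) - math.log(preferred)) ** 2)
                    / (2 * width ** 2))

preferred_numbers = range(1, 6)  # a small population of cells tuned to 1..5 items

for dots in (2, 3, 4):
    responses = {p: tuned_response(dots, p) for p in preferred_numbers}
    best = max(responses, key=responses.get)
    print(f"{dots} dots -> strongest response from the cell tuned to {best}")
```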

What it’s not is an evolutionary study, except in the sense that he has a strong preconception that if something exists in the brain, it had to have been produced by selection. All he’s doing in that sentence is affirming the consequent. It also does not address the explanation brought up by Núñez, that these are learned responses. With sufficiently detailed probing, you might be able to find a small network of neurons in my head that encodes my wife’s phone number. That does not imply that I have a hard-wired faculty for remembering phone numbers, or even for that one specific number, honed by generations of my ancestors foraging for 10-digit codes on the African savannah.

Nieder has done some valuable work, but Núñez is right — he’s overinterpreting it when he claims this is evidence that we have a native, evolved ability to comprehend numbers.


Núñez RE (2017) Is there really an evolved capacity for number? Trends in Cognitive Sciences 21(6):409–424.

Nieder A (2017) Number faculty is rooted in our biological heritage. Trends in Cognitive Sciences 21(6):403–404.

Nieder A (2016) The neuronal code for number. Nature Reviews Neuroscience 17(6):366–382.

Who’s the zombie here?

Jonathan Wells has a “new” book — he’s rehashing that dreadful crime against honesty and accuracy, Icons of Evolution. It’s titled, ironically, Zombie Science: More Icons of Evolution, and from Larry Moran’s discussion it sounds like pretty much the same old book, flavored this time with an obstinate refusal to honestly consider any new evidence or the previous rebuttals of his distortions.

I do like the cover, though, if only for its stunningly oblivious, self-referential lack of self-awareness. Wells is kind of a zombie himself: his arguments were destroyed 20 years ago, and he just keeps shambling back.

Botanical Wednesday: Enough, Australia, you’re getting carried away

Even the trees are vicious? And not really trees at all?

Australia has a parasite believed to be the largest in the world, a tree whose greedy roots stab victims up to 110m away. The Christmas tree (Nuytsia floribunda) has blades for slicing into the roots of plants to steal their sap. The blades are sharp enough to draw blood on human lips. They cause power failures when the tree attacks buried cables by mistake. Telephone lines get cut as well.

Never confuse climate with weather

The temptation is strong. I remember some amazingly fierce winters in the Pacific Northwest in the late 1960s and 1970s, where we had several feet of snow on the ground, the ponds froze solid, and the Green River was a churning mass of ice chunks. At that time, there were also a few popular magazine articles that speculated about a coming ice age…which was ridiculous. February is always colder than July, but we don’t mourn on New Year’s Day that the planet is doomed by this recent cold spate called Winter, and if there’s anything we know about weather it’s that it fluctuates.

Nowadays, though, one of the techniques used to discredit concerns about global climate change is to pretend that scientists’ opinions are as flighty as the weather, and therefore just as dismissible. Suddenly we have denialists arguing that scientists in the 1970s were claiming that the climate was slipping toward an Ice Age. Nonsense. So here’s a paper by Peterson, Connolley, and Fleck in which they did some history and asked what scientists were actually thinking back then.

Climate science as we know it today did not exist in the 1960s and 1970s. The integrated enterprise embodied in the Nobel Prizewinning work of the Intergovernmental Panel on Climate Change existed then as separate threads of research pursued by isolated groups of scientists. Atmospheric chemists and modelers grappled with the measurement of changes in carbon dioxide and atmospheric gases, and the changes in climate that might result. Meanwhile, geologists and paleoclimate researchers tried to understand when Earth slipped into and out of ice ages, and why. An enduring popular myth suggests that in the 1970s the climate science community was predicting “global cooling” and an “imminent” ice age, an observation frequently used by those who would undermine what climate scientists say today about the prospect of global warming. A review of the literature suggests that, on the contrary, greenhouse warming even then dominated scientists’ thinking as being one of the most important forces shaping Earth’s climate on human time scales. More importantly than showing the falsehood of the myth, this review describes how scientists of the time built the foundation on which the cohesive enterprise of modern climate science now rests.

So even at the time of severe winter storms, scientists were objectively looking at long term trends and determining what was going on from the data, not from looking out their window and watching snowflakes.

One way to determine what scientists think is to ask them. This was actually done in 1977 following the severe 1976/77 winter in the eastern United States. “Collectively,” the 24 eminent climatologists responding to the survey “tended to anticipate a slight global warming rather than a cooling” (National Defense University Research Directorate 1978).

They also analyze the scientific literature of the period, and nope, no “global cooling”, it was all greenhouse effect.

The denialists have resorted to faking magazine covers to spread the myth of a global cooling fad. That’s how desperate they are.

The plain lesson is to never confuse climate with weather, but also, never confuse Time magazine with the scientific literature, especially when it’s been forged.

Sarah Kendzior rips on graduate school

Wow. Sarah Kendzior has the most cynical, depressing take on grad school. She’s not entirely down on it and sees some virtue in advanced study, but she also has some venom for the academic complex, and it’s deserved.

Graduate students live in constant fear. Some of this fear is justified, like the fear of not finding a job. But the fear of unemployment leads to a host of other fears, and you end up with a climate of conformity, timidity, and sycophantic emulation. Intellectual inquiry is suppressed as “unmarketable”, interdisciplinary research is marked as disloyal, public engagement is decried as “unserious”, and critical views are written anonymously lest a search committee find them. I saw the best minds of my generation destroyed by the Academic Jobs Wiki.

I don’t know about that. I know there were people who had the fast track to academic success because they’d mastered the drill of churning out grants and papers that were exercises in technique and in throwing money at a problem rather than in actually thinking broadly, but there was still room for creative play in the lab. I was lucky to have mentors who thought public engagement was important; part of that, I think, was the fact of teaching, which keeps an academic grounded.

The cult mentality of academia not only curtails intellectual freedom, but hurts graduate students in a personal way. They internalize systemic failure as individual failure, in part because they have sacrificed their own beliefs and ideas to placate market values. The irony is that an academic market this corrupt and over-saturated has no values. Do not sacrifice your integrity to a lottery — even if you are among the few who can afford to buy tickets until you win.

I knew professors who believed in grad school as a winnowing process, where you make suffering the goal so only the strong survive. They were the minority, but the misery of being in their lab was deep.

Anthropology PhDs tend to wind up as contingent workers because they believe they have no other options. This is not true – anthropologists have many skills and could do many things – but there are two main reasons they think so. First, they are conditioned to see working outside of academia as failure. Second, their graduate training is oriented not toward intellectual exploration, but to shoring up a dying discipline.

Of my graduate school cohort, maybe 5-10% ended up in academia. There is a tendency to see continuing to do whatever you’re doing, only on a slightly more elevated plane, as “success”. We’ve been working at the undergraduate level to make students aware that becoming a professor is only one narrow slice of the range of outcomes of training in STEM.

We also don’t have the idea of being in a “dying discipline” — biology is thriving, or at least doing as well as any scientific field can in this age of Republican anti-intellectualism. Kendzior is an anthropologist; I don’t feel that anthropology is dying so much as being under-appreciated.

Gillian Tett famously said that anthropology has committed intellectual suicide. Graduate students are taught to worship at its grave. The aversion to interdisciplinary work, to public engagement, to new subjects, to innovation in general, is wrapped up in the desire to affirm anthropology’s special relevance. Ironically, this is exactly what makes anthropology irrelevant to the larger world. No one outside the discipline cares about your jargon, your endless parenthetical citations, your paywalled portfolio, your quiet compliance. They care whether you have ideas and can communicate them. Anthropologists have so much to offer, but they hide it away.

I got a lot of bad advice in graduate school, but the most depressing was from a professor who said: “Don’t use up all your ideas before you’re on the tenure track.” I was assumed to have a finite number of ideas, and my job as a scholar was to withhold them, revealing them only when it benefited me professionally. The life of the mind was a life of pandering inhibition.

Jebus. That’s terrible advice. I had the benefit of a graduate advisor who seemed to reinvent himself every few years: from immunologist to neuroscientist to cytoplasmic-signalling researcher to lineage-tracing developmental biologist to geneticist. It kept us on our toes, and there were times we wondered what, exactly, our lab did. I think he set a good example, and he never seemed to run out of ideas.

I ignored this along with other advice – don’t get pregnant, don’t get pregnant (again), don’t study the internet, don’t study an authoritarian regime – and I am glad I did. Graduate students need to be their own mentors. They should worry less about pleasing people who disrespect them and more about doing good work.

Because in the end, that is what you are left with – your work. The more you own that, the better off you will be. In the immortal words of Whitney Houston: “No matter what they take from me, they can’t take away my dignity.” And in the equally immortal words of Whitney Houston: “Kiss my ass.” Both sentiments are helpful for navigating graduate school.

Heh. Yes. I got married and we had two kids while we were both in grad school — you’ll notice most of your academic mentors didn’t get tenure until they were in their 40s, and for a 20-year-old, putting real life on hold that long is unwise. Grad school, or your job, whatever it may be, is not the whole of your life.

Academic training does not need to change so much as academic careerism. There is little sense in embracing careerism when hardly anyone has a career. But graduate school can still have value. Take advantage of your time in school to do something meaningful, and then share it with the world.

At least that section ends on a positive note. I agree. The whole point of education is to open your mind: not to get you a job, but to prepare you for any opportunity that comes around.

What happened to octodon.social?

I’m a fan of Mastodon, the new microblogging service that is trying to break the hegemony of Twitter. It’s better, cleaner, free of most of the trolls, and it promises to take seriously complaints about racists and nazis and misogynists, unlike Twitter. It also has an interesting approach: the servers that manage the whole show are decentralized, so you can pick a Mastodon instance that best reflects your interests.

But that might also be a vulnerability. I went for octodon.social and was trying to contribute regularly to it, and then…kablooiee. It’s been down for a couple of days now. It appears to be no fault of the administrator; the hosting service itself has screwed up.

Anyway, just be prepared for occasional breakdowns. Mastodon is great, but it demands some flexibility that you don’t need with a monolithic service like Twitter.