We can’t handle the truth, I guess

Mayor Mitch Landrieu of New Orleans made a powerful speech on the removal of Confederate monuments from the city — read the whole thing. What was most notable about it is how strongly he exposes the lie behind the defense of these statues honoring traitors as a part of Southern history. Yeah, the Civil War was real history, but the stories of a genteel Antebellum South and noble Southern aristocrats fighting for their liberty were all propaganda, a lie promoted by regressive monied interests that attempted to romanticize slavery with a set of myths. Tear down the lies, expose the truth.

The historic record is clear, the Robert E. Lee, Jefferson Davis, and P.G.T. Beauregard statues were not erected just to honor these men, but as part of the movement which became known as The Cult of the Lost Cause. This ‘cult’ had one goal – through monuments and through other means – to rewrite history to hide the truth, which is that the Confederacy was on the wrong side of humanity. First erected over 166 years after the founding of our city and 19 years after the end of the Civil War, the monuments that we took down were meant to rebrand the history of our city and the ideals of a defeated Confederacy. It is self-evident that these men did not fight for the United States of America. They fought against it. They may have been warriors, but in this cause they were not patriots. These statues are not just stone and metal. They are not just innocent remembrances of a benign history. These monuments purposefully celebrate a fictional, sanitized Confederacy; ignoring the death, ignoring the enslavement, and the terror that it actually stood for.

After the Civil War, these statues were a part of that terrorism as much as a burning cross on someone’s lawn; they were erected purposefully to send a strong message to all who walked in their shadows about who was still in charge in this city. Should you have further doubt about the true goals of the Confederacy, in the very weeks before the war broke out, the Vice President of the Confederacy, Alexander Stephens, made it clear that the Confederate cause was about maintaining slavery and white supremacy. He said in his now famous ‘corner-stone speech’ that the Confederacy’s “cornerstone rests upon the great truth, that the negro is not equal to the white man; that slavery — subordination to the superior race — is his natural and normal condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”

America is well practiced at lying about our history. We say the Pilgrims were virtuous people looking for religious freedom, rather than admitting that they were puritanical religious bigots who wanted to impose their freakish religious beliefs without question. We praise the Declaration of Independence and the Constitution for their high-minded language about freedom, while sidestepping the hypocrisy that our economy was built on the forced enslavement of black people. We call the US Cavalry heroic for fighting off savages and advancing the cause of civilization, when they were actually murdering people of a culture reeling from the onslaught of European diseases, and sequestering the survivors on barren reservations to live for generations in poverty. The Civil War was about States’ Rights…yeah, fuck it, that’s all bullshit.

Good nations and good people are not built on a history of lies. It’s about time we started waking up. I’m just afraid our media will not be helping, but will instead be constructing new myths. Witness the newspaper of record, the NY Times, busily papering over the first decade of the 21st century.

Both George W. Bush and Barack Obama flexed their executive muscles. Mr. Bush enhanced the president’s control over national security after the Sept. 11 attacks by opening Guantánamo, trying terrorism suspects before military tribunals, and authorizing warrantless wiretapping. Mr. Obama took unilateral aggressive actions to reduce greenhouse gas emissions and reform immigration.

They left the office stronger than when they arrived. Although their policies were controversial, both presidents were given deference because they made their judgments conscientiously and led the government professionally.

Hey, look, Bush and Obama were just the same: let’s sweep the fact that Bush started an unjustified war that drained the treasury and killed hundreds of thousands of people under the rug, and compare lawlessness and the erosion of privacy to efforts to protect the environment from out-of-control capitalism and to opening the doors to refugees…some of them from the regions Bush devastated. But they’re just the same, don’t you know.

And look at that: Bush must be gratified that now, suddenly, “history” of the sort peddled by propaganda organs, is deciding that he was “conscientious” and “professional”. Dear god. Maybe New Orleans needs to replace a statue of Robert E. Lee with one of George W. Bush, so that some later generation can recover their sanity and tear it down.

The culture we have is the culture they enforce

Richard Collins III was murdered in cold blood in public by Sean Urbanski. That’s heinous enough, and Urbanski deserves to be locked up for the rest of his life for his vicious crime. The other factor, though, is that Collins was black; Urbanski is white. Urbanski was a member of a Facebook group of like-minded assholes.

Look closely at that. It’s so representative of the current state of the internet.

First, this was permitted by Facebook. If you show a photo of a breast-feeding mother or any flash of nudity, Facebook will barrel in with righteous indignation and ban your offensive ass, but promote Nazi ideology and brag about killing mud people, and you’re fine, until you actually get out and murder someone. Then your group gets taken down, but not because they’re endorsing racism and criminality…but because it’s suddenly become bad PR for Facebook. Fuck you, Facebook.

Secondly, notice their facade — it’s the same one every troll who has sent me a threatening email has used, the same one used to excuse harassing women. Controversial humor. Memes. It’s funny, dude. Lighten up!

Only it’s not funny. It’s people wallowing in hate and justifying it as a joke — they know that it’s evil shit, they know that they are merrily endorsing it, and they also know that healthy, civilized people find it vile and repugnant, so they try to cover that turd with a candy coating and call it “humor”. Fuck you too, people who indulge in sickness and racism as a “joke”.

We know that our social media outlets, places like Facebook and Twitter and YouTube and Google, are fully capable of policing content, because they already do — it’s just that they choose to silence people who make tech culture uncomfortable, rather than people who violate basic human decency, because, apparently, they’re run by horrible people who don’t care about human decency.

Just a suggestion, but one place Facebook could start is by taking the 1155 scumbags who joined “Alt-Reich: Nation” and kicking them out. Clean up your act. Get rid of the toxic people.


There’s also a personal element to this story. It’s easy for me to empathize with the family of Richard Collins.

The Anti-Gwyneth

Gwyneth Paltrow is feeling a slight irritation despite being cosseted away in her airy palace of privilege and pretense, raking in money for peddling quackery. She’s challenging critics to “bring their A-game”; her work is such lazy crap that I doubt that is necessary, but I’m also confident she’ll continue to skate along, skimming cash from her fellow rich white women until, of course, the Revolution.

Anyway, she has also pissed off Jen Gunter. Goopy Gwyneth is in trouble, although she doesn’t know it yet. Eventually, if you continue to blithely babble anti-scientific nonsense like this:

More and more scientists will start exposing you.

Gwyneth doesn’t need to worry, though. Like Chopra or Dr Oz or the Health Ranger, she’ll continue to get rich in a material sense, because there is no shortage of rubes out there with more money than sense. She’s just going to lose dignity and self-respect, ironically, those spiritual things she claims to value so much.

The Nebulas have been announced, and I haven’t read a single one

I am a terrible person, but in my defense, this has been a rough and stressful academic year, and I haven’t been keeping up. That’ll change this summer, though, so give me a chance. I’m putting all of these on my Amazon wishlist.

  • Best Novel: All the Birds in the Sky, by Charlie Jane Anders. I’ve heard many raves about this one.

  • Best Novella: Every Heart a Doorway by Seanan McGuire. Just bought it. Figured this ought to be high on my list, since McGuire will be at Convergence in July, and there’s a tiny chance I might have to stammer out a few words in a conversation with her. Also, I’ve read most everything she’s written, so why stop now?

  • Best Novelette: The Long Fall Up, by William Ledbetter. Who? I don’t know this person at all, so I guess I get to discover a new author. And hey, that issue of F&SF is free on Kindle Unlimited!

  • Best Short Story: “Seasons of Glass and Iron” in The Starlit Wood: New Fairy Tales, by Amal El-Mohtar. Oh, I did meet her, very briefly, at NerdCon. She was nice, and her stories are lovely. I’ll have to read this one, too.

  • The Bradbury award went to Arrival. I’ve seen that!

  • The Norton Award for Young Adult fiction went to Arabella of Mars by David Levine, another one I don’t know…but YA stuff is remarkably fresh and good and often more challenging than the “adult” stuff. What’s categorized as adult is too often the conventional crap with military hardware and sexy times and surprisingly frequent violence.

So that’s my next week of light reading sorted.

One question: why novel, novella, and novelette? Isn’t it rather arbitrary to set up categories defined by the length of the work? We don’t have categories for Best Picture Over 3 Hours Long vs. Best Picture Less Than 2 Hours Long, or Best Actor Over or Under 6 Feet Tall. Is this a vestige of a genre that was accustomed to its authors getting paid by word count?
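For what it’s worth, the categories aren’t entirely arbitrary in practice: they’re drawn by word count. Here’s a quick Python sketch of the rule; the thresholds are the commonly cited Nebula cutoffs, so treat the exact numbers as an assumption:

```python
def nebula_category(word_count: int) -> str:
    """Classify a work of fiction by the Nebula length categories.

    Thresholds follow the commonly cited definitions:
    short story < 7,500 words; novelette 7,500-17,499;
    novella 17,500-39,999; novel >= 40,000.
    """
    if word_count < 7_500:
        return "short story"
    elif word_count < 17_500:
        return "novelette"
    elif word_count < 40_000:
        return "novella"
    else:
        return "novel"

print(nebula_category(5_000))    # short story
print(nebula_category(30_000))   # novella
```

The suspicion about pay-by-the-word economics still stands: the boundaries only make sense as publishing conventions, not as artistic distinctions.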

The most punchable face in America

Richard Spencer gets interviewed.

I personally have no desire to punch anyone, but when this guy smirks and sniggers I have nothing but sympathy for those in his presence who just feel a need to reach out and smack him one. Charles Barkley and Gerald Griggs are to be commended for their restraint.

I searched YouTube for a clip of this interview, which was a scarring experience. The first one I stumbled across was by someone calling himself “Atheism-Is-Unstoppable”, which meant, as you might guess, that the video was a horror show, with “A-I-U” covering it over with his idiot commentary, consisting mostly of asserting that America is a white nation. I finally found this short clip which doesn’t include any commentary by the uploader.

You definitely do not want to read any of the comments, unless you’re looking for confirmation that YouTube has been overrun with racist assholes.

So…when creationists sneak bad papers into legit journals, does evolution collapse?

A few days ago, a paper was pointed out to me as a particularly horrible example of bad social science: it was titled “The conceptual penis as a social construct”. I glanced at it. It was a murky mess, so bad that I couldn’t even get past the first paragraph, so I abandoned it as simply too much effort to criticize. As it turns out, it was a hoax: the authors were trying to pull a Sokal and expose “‘academic’ fields corrupted by postmodernism”.

We intended to test the hypothesis that flattery of the academic Left’s moral architecture in general, and of the moral orthodoxy in gender studies in particular, is the overwhelming determiner of publication in an academic journal in the field. That is, we sought to demonstrate that a desire for a certain moral view of the world to be validated could overcome the critical assessment required for legitimate scholarship. Particularly, we suspected that gender studies is crippled academically by an overriding almost-religious belief that maleness is the root of all evil.

The lead author is Peter Boghossian, whose own biases are rather obvious in that passage, and I think he overplayed his hand. He actually completely failed to demonstrate what he set out to do.

He sent the crap paper to NORMA: International Journal for Masculinity Studies, a journal with an impact factor of 0, and it was rejected. So, wait, the fake paper was punted? How does that demonstrate that “gender studies is crippled academically”?

NORMA nicely sent them off to resubmit to an even more poorly ranked journal, Cogent Social Sciences, which is so new it doesn’t even have an impact factor, and which is also a pay-to-publish journal. Boghossian then coughed up $625 to convince them to publish it.

At this point the hoax has become completely meaningless. There are bad, predatory journals out there that will take anything a hack scribbles up and publish it for a profit. This is not news. It is also not unique to gender studies or sociology. I’ve pointed out these bad papers more than a few times in journals in science fields.

So when I point out that Erik Andrulis published, in complete seriousness, a paper titled Theory of the Origin, Evolution, and Nature of Life that attempts to explain chemistry, development, and evolution as functions of spiral gyres, does that discredit those fields? When David Abel of the Department of ProtoBioCybernetics and ProtoBioSemiotics publishes a paper on the origin of life that is packed full of buzzwords and pseudoscience, does that mean that Nick Lane and Bill Martin are full of crap, too? Because the Journal of Cosmology exists, astronomy is fake science? John Bohannon created an automatic molecular biology paper generator that churned out garbage papers. They were accepted by 157 science journals. I guess we can scratch the entire field of molecular biology.

As I wrote about that last example:

I agree that there is a serious problem in science publishing. But the problem isn’t open-access: it’s an overproliferation of science journals, a too-frequent lack of rigor in review, and a science community that generates least-publishable-units by the machine-like application of routine protocols in boring experiments.

The lesson to be learned here is that Boghossian designed a poor experiment that didn’t succeed in what he engineered it to do, and which was embarrassingly derivative, and then analyzed the results badly. At least it cost the hack $625 to attempt some click-bait sensationalism.


There’s more. See Kris Wager, and Ketan Joshi, who lists lots of examples of hoaxes in science disciplines that didn’t indict entire broad fields of research.

Where do numbers come from?

When I was addressing this lunacy about how God exists because minds and mathematics are supernatural, I was also thinking about a related set of questions: biologically, how are numbers represented in the brain? How did this ability evolve? I knew there was some interesting work by Ramachandran on the representation of digits and numerical processing, coupled to his work on synesthesia (which is also about how we map abstract ideas on a biological substrate), but I was wondering how I can have a concept of something as abstract as a number — as I sit in my office, I can count the vertical slats in my window blinds, and see that there are 27 of them. How did I do that? Is there a register in my head that’s storing a tally as I counted them? Do I have a mental abacus that’s summing everything up?

And then I realized all the automatic associations with the number 27. It’s an odd number — where is that concept in my cortex? It’s 3³. It’s the atomic number of cobalt; the sum of the digits 2 and 7 is 9; it’s the number of bones in the human hand; 2 times 7 is 14; 2⁷ is 128; it’s my daughter’s age; 1927 was the year Philo Farnsworth first experimentally transmitted television pictures. It’s freakin’ weird if you think about it. 27 isn’t even a thing, even though we have a label and a symbol for it, and yet it’s all wrapped up in ideas and connections and causes sensations in my mind.
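(If you want to check 27’s arithmetic party tricks mechanically, here’s a trivial Python sketch; purely illustrative, of course.)

```python
n = 27

# 27 is odd
assert n % 2 == 1

# 27 is a perfect cube: 3 cubed
assert 3 ** 3 == n

# the digit sum of 27 is 9
assert sum(int(d) for d in str(n)) == 9

# 2 times 7 is 14, and 2 to the 7th power is 128
assert 2 * 7 == 14
assert 2 ** 7 == 128

print("all of 27's associations check out")
```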

And why do I have a representation of “27” in my head? It’s not as if this was ever useful to my distant ancestors — they didn’t need to understand that there were precisely 27 antelope over on that hillside, they just needed an awareness that there were many antelope, let’s go kill one and eat it. Or here are 27 mangoes; we don’t need to count them, we need to sort them by ripeness, or throw out the ones that are riddled with parasites. I don’t need a map of “27” to be able to survive. How did this ability evolve?

Really, I don’t take drugs, and I wasn’t sitting there stoned out of my head and contemplating 27. It’s a serious question. So I started searching the scientific literature, because that’s what one does. There has been a great deal of work done tangentially to these questions. Human babies can tell that 3 things is more than 2 things. An African Grey parrot has been taught to count. Neurons in the cortex have been speared with electrodes and found to respond to numbers of objects with differing degrees of activity. The problem with all that is that it doesn’t actually address the problem: I know we can count, I know there is brain activity involved, I can appreciate that being able to tell more from less is a useful ability, but none of it addresses the specific properties of this capacity called number. Worse, most of the literature seems muddled on the concept, and mistakes a qualitative understanding of relative quantity for a precursor to advanced mathematics.

But then, after stumbling through papers that were rich on the details but vague on the concept, I found an excellent review by Rafael Núñez that brought a lot of clarity to the problem and summarized the ongoing debates. It also lays out explicitly what had been nagging me about all those other papers: they often leap from “here is a cool numerical capability of the brain” to “this capability is innate and evolved” without adequate justification.

Humans and other species have biologically endowed abilities for discriminating quantities. A widely accepted view sees such abilities as an evolved capacity specific for number and arithmetic. This view, however, is based on an implicit teleological rationale, builds on inaccurate conceptions of biological evolution, downplays human data from non-industrialized cultures, overinterprets results from trained animals, and is enabled by loose terminology that facilitates teleological argumentation. A distinction between quantical (e.g., quantity discrimination) and numerical (exact, symbolic) cognition is needed: quantical cognition provides biologically evolved preconditions for numerical cognition but it does not scale up to number and arithmetic, which require cultural mediation. The argument has implications for debates about the origins of other special capacities – geometry, music, art, and language.

The author also demonstrates that he actually understands some fundamental evolutionary principles, unlike the rather naive versions of evolution that I was recoiling from elsewhere (I’ll include an example later). He also recognizes the clear differences between estimating quantity and having a specific representation of number. He even coins a new word (sorta; it’s been used in other ways) to describe the prior ability, “quantical”.

Quantical: pertaining to quantity related cognition (e.g., subitizing) that is shared by many species and which provides BEPs for numerical cognition and arithmetic, but is itself not about number or arithmetic. Quantical processing seems to be about many sensorial dimensions other than number, and does not, by itself, scale up to produce number and arithmetic.

Oops. I have to unpack a few things there. Subitizing is the ability to immediately recognize a number without having to sequentially count the items; we can do this with a small number, typically around 4. Drop 3 coins on the floor, we can instantly subitize them and say “3!”. Drop 27, you’re going to have to scan through them and tally them up.
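A toy way to picture that difference: recognition time is roughly flat within the subitizing range, then grows with every item beyond it as you fall back on sequential counting. This little model is purely illustrative; the function name and the millisecond values are made up for the sketch:

```python
def toy_recognition_time(n_items: int, subitize_limit: int = 4,
                         base_ms: float = 300.0,
                         per_item_ms: float = 250.0) -> float:
    """Toy model of subitizing vs counting.

    Items within the subitizing range are recognized in roughly
    constant time; beyond it, each extra item adds a counting cost.
    All numbers here are illustrative, not empirical.
    """
    if n_items <= subitize_limit:
        return base_ms
    return base_ms + (n_items - subitize_limit) * per_item_ms

# 3 coins: instant; 27 coins: a long tally
print(toy_recognition_time(3))   # 300.0
print(toy_recognition_time(27))  # 6050.0
```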

BEPs are biologically evolved preconditions.

Biologically evolved preconditions (BEPs): necessary conditions for the manifestation of a behavioral or cognitive ability which, although having evolved via natural selection, do not constitute precursors of such abilities (e.g., human balance mechanisms are BEPs for learning how to snowboard, but they are not precursors or proto-forms of it)

I think this is subtly different from an exaptation. Generally, but not necessarily, exaptations are novel properties that have a functional purpose that can be modified by evolution to have additional abilities; feathers for flight in birds are an exaptation of feathers for insulation in dinosaurs. Núñez is arguing that we have not evolved a native biological ability to do math, but that these BEPs are a kind of toolkit that can be extended cognitively and culturally to create math.

He mentions snowboarding as an example in that definition. No one is going to argue that snowboarding is an evolved ability because some people are really good at it, but for some reason we’re more willing to argue that the existence of good mathematicians means math has to be intrinsic. He carries this analogy forward; I found it useful to get a bigger picture of what he’s saying.

Other interesting data: numbers aren’t universal! If you look at non-industrialized cultures, some have limited numeral systems, sometimes only naming quantities in the subitizing range, and then modifying those with quantifiers equivalent to many. Comparing fMRIs of native English speakers carrying out a numerical task with native Chinese speakers (both groups having a thorough grasp of numbers) produces different results: “the neural circuits and brain regions that are recruited to sustain even the most fundamental aspects of exact symbolic number processing are crucially mediated by cultural factors, such as writing systems, educational organization, and enculturation.”

Núñez argues that many animal studies are over-interpreted. They’re difficult to do; it may require months of training in a testable task to get an experimental animal to respond in a measurable and specific way to a numerical task, so we’re actually looking at a plastic response to an environmental stimulus, one that may be limited by the basic properties of the brain being tested, but that isn’t actually there in an unconditioned animal. It says this ability is within the range of what the brain can do if it is specifically shaped by training, not that it is a built-in adaptation.

What we need is a more rigorous definition of what we mean by “number” and “numerical”, and he provides one.

Strangely, because this is one case where I agree with human exceptionalism, he argues that the last point is a signature of Homo sapiens, but…that it is not hard-coded into us, and that it may also be possible to teach non-humans how to do it. I have to add that all of those properties are hard-coded into computers, although they currently lack conscious awareness or intent, so being able to process numbers is not sufficient for intelligence, and an absence of the cultural substrate to enable numerical processing also does not imply a lack of intelligence.

The paper doesn’t exactly answer all of my questions, but at least it provides a clearer framework for thinking about them.


Up above, I said I’d give an example of bad evolutionary thinking from elsewhere in the literature. Conveniently, the Trends in Cognitive Science journal provides one — they link to a rebuttal by Andreas Nieder. It’s terrible and rather embarrassing. It’s not often that I get flashed by a naked Panglossian like this:

Our brain has been shaped by selection pressures during evolution. Therefore, its key faculties – in no way as trivial as snowboarding – are also products of evolution; by applying numbers in science and technology, we change the face of the earth and influence the course of evolution itself. The faculty for symbolic number cannot be conceived to simply ‘lie outside of natural selection’. The functional manifestations of the brain need to be adaptive because they determine whether its carrier survives to pass on its genes. Over generations, this modifies the genetic makeup of a population, and this also changes the basic building plan of the brains and in turn cognitive capabilities of the individuals of a population. The driving forces of evolution are variation and natural selection of genetically heritable information. This means that existing traits are replaced by new, derived traits. Traits may also shift their function when the original function becomes less important, a concept termed ‘exaptation’. In the number domain, existing brain components – originally developed to serve nonverbal quantity representations – may be used for the new purpose of number processing.

I don’t think snowboarding is trivial at all — there are a lot of cognitive and sensory and motor activities involved — but just focus on the key claim, that “the functional manifestations of the brain need to be adaptive because they determine whether its carrier survives to pass on its genes.” It’s absurd. It’s claiming that if you find any functional property of the brain at all, it had to have had an adaptive benefit and have led to enhanced reproduction. So, apparently, the ability to play Dungeons & Dragons was shaped by natural selection. Migraines are adaptive. The ability to watch Fox News is adaptive. This really is blatant ultra-adaptationism.

He also claims that “it has been recognized that numerical cognition, both nonsymbolically and symbolically, is rooted in our biological heritage as a product of evolution”. OK, I’ll take a look at that. He cites one of his own papers, “The neuronal code for number”, so I read that, too.

It’s a good paper, much better than the garbage he wrote in the rebuttal. It’s largely about “number neurons”, individual cells in the cortex that respond in a roughly quantitative way to visual presentations of number. You show a monkey a field with three dots in it, no matter what the size of the dots or their pattern, and you can find a neuron that responds maximally to that number. I can believe it, and I also think it’s an important early step in working out the underlying network behind number perception.

What it’s not is an evolutionary study, except in the sense that he has a strong preconception that if something exists in the brain, it had to have been produced by selection. All he’s doing in that sentence is affirming the consequent. It also does not address the explanation brought up by Núñez, that these are learned responses. With sufficiently detailed probing, you might be able to find a small network of neurons in my head that encode my wife’s phone number. That does not imply that I have a hard-wired faculty for remembering phone numbers, or even that one specific number, that was honed by generations of my ancestors foraging for 10-digit codes on the African savannah.

Nieder has done some valuable work, but Núñez is right — he’s overinterpreting it when he claims this is evidence that we have a native, evolved ability to comprehend numbers.


Núñez RE (2017) Is There Really an Evolved Capacity for Number? Trends in Cognitive Sciences 21(6):409–424.

Nieder A (2017) Number Faculty Is Rooted in Our Biological Heritage. Trends in Cognitive Sciences 21(6):403–404.

Nieder A (2016) The neuronal code for number. Nat Rev Neurosci 17(6):366-82.