Who’s the zombie here?

Jonathan Wells has a “new” book — he’s rehashing that dreadful crime against honesty and accuracy, Icons of Evolution. It’s titled, ironically, Zombie Science: More Icons of Evolution, and judging from Larry Moran’s discussion, it sounds like pretty much the same old book, flavored this time with an obstinate refusal to honestly consider any new evidence or any of the rebuttals of his previous distortions.

I do like the cover, though, if only for its stunningly oblivious lack of self-awareness and its self-referential nature. Wells is kind of a zombie himself: his arguments were destroyed 20 years ago, and he just keeps shambling back.

Botanical Wednesday: Enough, Australia, you’re getting carried away

Even the trees are vicious? And not really trees at all?

Australia has a parasite believed to be the largest in the world, a tree whose greedy roots stab victims up to 110m away. The Christmas tree (Nuytsia floribunda) has blades for slicing into the roots of plants to steal their sap. The blades are sharp enough to draw blood on human lips. They cause power failures when the tree attacks buried cables by mistake. Telephone lines get cut as well.

Never confuse climate with weather

The temptation is strong. I remember some amazingly fierce winters in the Pacific Northwest in the late 1960s and 1970s, when we had several feet of snow on the ground, the ponds froze solid, and the Green River was a churning mass of ice chunks. At that time, there were also a few popular magazine articles speculating about a coming ice age…which was ridiculous. February is always colder than July, but we don’t mourn on New Year’s Day that the planet is doomed by this recent cold spell called Winter, and if there’s anything we know about weather, it’s that it fluctuates.

Nowadays, though, one of the techniques used to discredit concerns about global climate change is to pretend that scientists’ opinions are as flighty as the weather, and therefore just as dismissible. Suddenly we have denialists arguing that scientists were claiming in the 1970s that the climate was slipping toward an Ice Age. Nonsense. So here’s a paper by Peterson, Connolley, and Fleck in which they did some actual history and asked what scientists were really thinking back then.

Climate science as we know it today did not exist in the 1960s and 1970s. The integrated enterprise embodied in the Nobel Prizewinning work of the Intergovernmental Panel on Climate Change existed then as separate threads of research pursued by isolated groups of scientists. Atmospheric chemists and modelers grappled with the measurement of changes in carbon dioxide and atmospheric gases, and the changes in climate that might result. Meanwhile, geologists and paleoclimate researchers tried to understand when Earth slipped into and out of ice ages, and why. An enduring popular myth suggests that in the 1970s the climate science community was predicting “global cooling” and an “imminent” ice age, an observation frequently used by those who would undermine what climate scientists say today about the prospect of global warming. A review of the literature suggests that, on the contrary, greenhouse warming even then dominated scientists’ thinking as being one of the most important forces shaping Earth’s climate on human time scales. More importantly than showing the falsehood of the myth, this review describes how scientists of the time built the foundation on which the cohesive enterprise of modern climate science now rests.

So even at the time of severe winter storms, scientists were objectively looking at long term trends and determining what was going on from the data, not from looking out their window and watching snowflakes.

One way to determine what scientists think is to ask them. This was actually done in 1977 following the severe 1976/77 winter in the eastern United States. “Collectively,” the 24 eminent climatologists responding to the survey “tended to anticipate a slight global warming rather than a cooling” (National Defense University Research Directorate 1978).

They also analyze the scientific literature of the period, and nope, no “global cooling”, it was all greenhouse effect.

The denialists have resorted to faking magazine covers to spread the myth of a global cooling fad. That’s how desperate they are.

The plain lesson is to never confuse climate with weather, but also, never confuse Time magazine with the scientific literature, especially when it’s been forged.

Sarah Kendzior rips on graduate school

Wow. Sarah Kendzior has the most cynical, depressing take on grad school. She’s not entirely down on it and sees some virtue in advanced study, but also has some venom for the academic complex that is actually deserved.

Graduate students live in constant fear. Some of this fear is justified, like the fear of not finding a job. But the fear of unemployment leads to a host of other fears, and you end up with a climate of conformity, timidity, and sycophantic emulation. Intellectual inquiry is suppressed as “unmarketable”, interdisciplinary research is marked as disloyal, public engagement is decried as “unserious”, and critical views are written anonymously lest a search committee find them. I saw the best minds of my generation destroyed by the Academic Jobs Wiki.

I don’t know about that. I know that there were people who had the fast-track to academic success because they’d mastered the drill of churning out grants and papers that were exercises in technique and throwing money at a problem, rather than actually thinking broadly, but there was still room for creative play in the lab. I think I was lucky to have mentors who thought public engagement was important — I think part of that was the fact of teaching, which keeps an academic grounded.

The cult mentality of academia not only curtails intellectual freedom, but hurts graduate students in a personal way. They internalize systemic failure as individual failure, in part because they have sacrificed their own beliefs and ideas to placate market values. The irony is that an academic market this corrupt and over-saturated has no values. Do not sacrifice your integrity to a lottery — even if you are among the few who can afford to buy tickets until you win.

I knew professors who believed in grad school as a winnowing process, where you make suffering the goal so only the strong survive. They were the minority, but the misery of being in their lab was deep.

Anthropology PhDs tend to wind up as contingent workers because they believe they have no other options. This is not true – anthropologists have many skills and could do many things – but there are two main reasons they think so. First, they are conditioned to see working outside of academia as failure. Second, their graduate training is oriented not toward intellectual exploration, but toward shoring up a dying discipline.

Of my graduate school cohort, maybe 5-10% ended up in academia. There is a tendency to see continuing to do whatever you’re doing, only on a slightly more elevated plane, as “success”. We’ve been working at the undergraduate level to make students aware that becoming a professor is only one narrow slice of the range of outcomes of training in STEM.

We also don’t have the idea of being in a “dying discipline” — biology is thriving, as well as any scientific field in the age of Republican anti-intellectualism can be said to be doing well. Kendzior is an anthropologist; I don’t feel that anthropology is dying so much as being under-appreciated.

Gillian Tett famously said that anthropology has committed intellectual suicide. Graduate students are taught to worship at its grave. The aversion to interdisciplinary work, to public engagement, to new subjects, to innovation in general, is wrapped up in the desire to affirm anthropology’s special relevance. Ironically, this is exactly what makes anthropology irrelevant to the larger world. No one outside the discipline cares about your jargon, your endless parenthetical citations, your paywalled portfolio, your quiet compliance. They care whether you have ideas and can communicate them. Anthropologists have so much to offer, but they hide it away.

I got a lot of bad advice in graduate school, but the most depressing was from a professor who said: “Don’t use up all your ideas before you’re on the tenure track.” I was assumed to have a finite number of ideas, and my job as a scholar was to withhold them, revealing them only when it benefited me professionally. The life of the mind was a life of pandering inhibition.

Jebus. That’s terrible advice. I had the benefit of a graduate advisor who seemed to reinvent himself every few years: from immunologist to neuroscientist to cytoplasmic signalling researcher to lineage-tracing developmental biologist to geneticist. It kept us on our toes, and there were times we wondered what, exactly, our lab did. I think he set a good example, and he never seemed to run out of ideas.

I ignored this along with other advice – don’t get pregnant, don’t get pregnant (again), don’t study the internet, don’t study an authoritarian regime – and I am glad I did. Graduate students need to be their own mentors. They should worry less about pleasing people who disrespect them and more about doing good work.

Because in the end, that is what you are left with – your work. The more you own that, the better off you will be. In the immortal words of Whitney Houston: “No matter what they take from me, they can’t take away my dignity.” And in the equally immortal words of Whitney Houston: “Kiss my ass.” Both sentiments are helpful for navigating graduate school.

Heh. Yes. I got married and we had two kids while we were both in grad school — you’ll notice most of your academic mentors don’t get tenure until they’re in their 40s, and for 20-year-olds, putting real life on hold that long is unwise. Grad school, or your job, whatever it may be, is not the whole of your life.

Academic training does not need to change so much as academic careerism. There is little sense in embracing careerism when hardly anyone has a career. But graduate school can still have value. Take advantage of your time in school to do something meaningful, and then share it with the world.

At least that section ends on a positive note. I agree. The whole point of education is to open your mind, not to get you a job, but to prepare you for any opportunity that comes around.

What happened to octodon.social?

I’m a fan of Mastodon, the new microblogging service that is trying to break the hegemony of Twitter. It’s better, cleaner, free of most of the trolls, and promises to take seriously complaints about racists and nazis and misogynists, unlike Twitter. It also has an interesting approach, decentralizing the servers that manage the whole show, so you can pick the Mastodon server that best reflects your interests.

But that might also be a vulnerability. I went for octodon.social, was trying to contribute regularly to it, and then…kablooiee. It’s been down for a couple of days now. It appears to be no fault of the administrator, but the hosting service itself has screwed up.

Anyway, just be prepared for occasional breakdowns. Mastodon is great, but it demands some flexibility that you don’t get with the monolithic monolith.

That’s quite the racket

Nature Biotechnology published a rather startling paper: DNA-guided genome editing using the Natronobacterium gregoryi Argonaute. It claims “that the Natronobacterium gregoryi Argonaute (NgAgo) is a DNA-guided endonuclease suitable for genome editing in human cells,” which would make it an alternative to CRISPR/Cas9, and would make the authors rich.

I don’t know any of the details, though, because it’s behind a paywall, and my university doesn’t have an institutional subscription (universities don’t automatically get every journal, and the ones we do get cost the institution an arm, a leg, a pound of flesh, and a bucket of blood). I could pay for it personally, but Nature would charge me $32 for a pdf. If you think about it, it’s quite the deal: the authors do all the research work and then pay for the privilege of publishing in a Nature journal, and then Nature charges readers to see it. The last part would be understandable if they charged a reasonable fee, but of course they don’t.

Imagine if the New York Times worked that way. They fire all their journalists, and tell them that their new model is that if they’re very, very good they can continue to be published in the NYT if they pay Arthur Sulzberger for the privilege. Also, Arthur will change subscription policies: it’ll cost you $10,000/year to subscribe, but you could also just pay for individual articles. Yeah, you’ll pay $32 each week to read David Brooks.

But it’s all moot anyway! The paper has been retracted — no one could replicate the results. Or, at least, there’s an editorial expression of concern.

Guess what? I can’t read that one either. $32. Both the article and its ‘retraction’ are still available for a fee.

This is an amazing business model. Publish a tantalizing paper that is crap; charge people to read it. Publish an announcement that said tantalizing paper is crap; charge people to read it. What we need next is an editorial justifying the journal’s predatory exploitation, so they can charge people to read that, too.

(via Neuroskeptic)

Matt Herron sent along the paper and the “expression of concern”, if you were curious about the 3 paragraphs you could get for $32.

Editorial Expression of Concern: DNA-guided genome editing using the Natronobacterium gregoryi Argonaute

Feng Gao, Xiao Z Shen, Feng Jiang, Yongqiang Wu & Chunyu Han

Nat. Biotechnol. 34, 768–773 (2016); published online 2 May 2016; addendum published after print 28 November 2016

The editors of Nature Biotechnology are issuing an editorial expression of concern regarding this article to alert our readers to concerns regarding the reproducibility of the original results. At this time, we are publishing the results of three groups (http://dx.doi.org/10.1038/nbt.3753) that have tried to reproduce the results in the critical Figure 4 in the original paper by Han and colleagues, which demonstrates editing of endogenous genomic loci in mammalian cells. None of the groups observed any induction of mutations by NgAgo at any of the loci or under any of the conditions tested above the sensitivity of the assays used. Similar results have been recently reported by a different group of authors in Protein & Cell (doi:10.1007/s13238-016-0343-9).

We are in contact with the authors, who are investigating potential causes for the lack of reproducibility. The authors have been informed of this statement. While the investigations are ongoing, Chunyu Han and Xiao Z. Shen agree with this editorial expression of concern. Feng Gao, Feng Jiang and Yongqiang Wu do not feel that it is appropriate at this time.

We will update our readers once these investigations are complete.

Scott Adams embarks on the Johnny Hart road

I don’t normally read Dilbert — I’ve seen far too much of the benighted ignorant psyche of its creator — but this one was just laid out on a table at the coffee shop yesterday, and I knew I’d have to deal with it. In this one, Dilbert goes full climate science denialist. This might be fun, to dissect Dilbert, because even though it will kill what little humor is present in it, at least we’ll have a good time laughing at Scott Adams. Let’s dissect the shit out of this thing.

Here’s the setup.

OK, this is sort of fine. I think it’s a good idea for companies to think about what impact climate change will have on them, and how they affect the environment. I’m at a green university, and we’ve had these sorts of discussions. Still do, all the time.

It is definitely true that human activity is warming the Earth. Whether it will lead to a global catastrophe depends on how you define catastrophe: it will cause acute economic disruption, resource wars, and the deaths of millions. Is that catastrophic enough for you?

By the way, I notice that the scientist is a goateed and balding white man in a lab coat. It’s either unconscious bias (that’s how scientists are supposed to look!) or, I can’t help but notice, a weak resemblance to Michael Mann.

Next panel, Dilbert asks Scott Adams’ idea of a smart question.

On the face of it, yes, that is a good question. I’d encourage students to ask that every time an instructor told them something. But consider the context. The answer to that question is readily available — google it. You can read the papers. You should have the answer to that from your high school earth science class. So why is Dilbert being made to ask this trivial question right at the start of this meeting? I can tell right away that this is not a sincere question, this is a derailing tactic to justify a software engineer speaking out of his ass to the scientific expert. Sound familiar?

Then we get the eternal dilemma of the science popularizer. Do you just scorch this ass with contempt because you can see right through him, or do you try to take the question seriously and give the primer in kindergarten climatology he’s asking for?

You can’t win, you know. The game is rigged. If you do the former, you’ll be accused of being hostile and mean. If you do the latter, you’re patronizing and people will write scornful blog posts about how you think raw data dumps will cure all the scientific misunderstandings in the world.

So what do you do? Most of us will take the generous view and try to explain exactly what the questioner is asking for, like our Michael Mann surrogate here:

And that’s also fine. So far, the strip has been true to the characters, and the nature of their interactions. It’s denialist vs. scientist, familiar territory, and now it’s time for the funny, clever twist…but Adams can’t deliver. He has to resort to sticking words in the mouth of the scientist that are not at all true to the character.

That’s just wrong. It’s not what climate scientists say or even think. It’s what Scott Adams, who is no scientist of any kind, says and thinks. And with that betrayal of the premise of the joke, it abruptly falls flat and dies. If all you can do to discredit a point of view is to lie and make puppets say falsehoods, it’s your position that fails. Adams does this because he lacks any insightful response to the honest arguments of scientists.

I guess there’s supposed to be a punchline of some sort next. Once again, Adams fails to meet the minimal standards of his medium.

I think the punchline is supposed to be implying that science supporters can only defend their position by calling True Skeptics mean names. Of course, the entire point of the two panels just above that is to call climate scientists conscious liars.

The only people who will find this at all funny are the denialists who see the panels in which the climate scientist openly maligns his own methodology as affirmations of their beliefs. That’s OK, it’ll finally be the death of Dilbert — I skimmed the comments and noticed several people were shocked that Scott Adams endorsed an anti-scientific claim. Apparently they’ve never read his blog before.

I shouldn’t claim it’ll kill Dilbert, though. Nothing kills syndicated comics. Johnny Hart went full-blown creationist/evangelical Christian/anti-Muslim bigot, and newspapers just kept right on buying up the strips. Hart died in 2007, and B.C. is still going.

And people think tenured professors have it easy.

Creationists need better evidence than that

I found Mark Armitage’s claim to have determined that a triceratops fossil was only a few thousand years old ridiculous. He has a defender, Jay Wile, who disagrees with me. Wile makes two main points.

First, I said that carbon dating a dinosaur fossil is absurd — the 14C levels will be too low to get a reliable ratio. Wile thinks that you can, and that being able to cite a number makes it true.

Well, had Dr. Myers bothered to click on the link given in my post, he would have seen that an age was reported: 41,010 ± 220 years. As I state in that link, this is well within the accepted range of carbon-14 dating, and it is younger than many other carbon-14 dates published in the literature. In addition, the process used to make the sample ready for dating has been spelled out in the peer-reviewed literature, and it is designed to free the sample of all contamination except for carbon that comes from the original fossil. Now as I said in my original post, it’s possible that the reading comes from contamination. However, I find that unlikely, given the process used on the sample, the cellular evidence that Armitage found, and the fact that such carbon-14 dates are common in all manner of fossils that are supposedly millions of years old or older.

There are two sources of 14C we have to be concerned with. The bulk of it is cosmogenic, formed in the upper atmosphere from cosmic ray bombardments of ordinary, stable 14N. This 14C decays at a geologically rapid rate, with a half-life of 5730 years. Living things respire and tend to equilibrate their 14C levels with the environment. Another source, though, is the radioactive decay of other elements that generate high energy particles that can also bang into atoms to generate unstable radioactive isotopes. This is a much rarer event, though, so objects that are dead and buried and isolated from the atmosphere tend to equilibrate to a much lower concentration of 14C.

In carbon dating, the 14C to 12C ratio is measured. If it’s close to that of the atmosphere, it was recently exchanging carbon with the atmosphere. If it’s somewhere above the level of dead carbon buried deep in rocks (which has a non-zero level of 14C), it’s older, and we can estimate how much older from the ratio. You can always calculate a ratio. You can always measure a date. However, it will hit a ceiling of about 50,000 years, because of the limits of precision and because the ratio can converge to a value indistinguishable from the background level of 14C. Date a carbon sample that’s a hundred thousand years old; it will return an age of 50,000 years. Carbon date a chunk of coal from the Carboniferous, 300 million years ago, and it will return an age of 50,000 years.

That an “age” was reported is meaningless. An age of 40,000 years means that about seven 14C half-lives have passed, i.e. that less than 1% of the atmospheric level of 14C was present in the sample. Wile doesn’t understand this at all. He doesn’t seem to comprehend that there could be a source of 14C other than equilibration with the atmosphere. He thinks it is significant that ancient carbon can have non-zero amounts of 14C.
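To make that arithmetic concrete, here’s a minimal sketch of the ratio-to-age conversion. The ~0.2%-of-modern background floor used below is an illustrative assumption (the exact floor varies by lab and instrument), but the shape of the result doesn’t depend on it: any sample at the floor reads as roughly 50,000 years old, no matter its true age.

```python
import math

HALF_LIFE = 5730.0  # 14C half-life in years

def apparent_age(ratio_to_modern):
    """Apparent radiocarbon age, given the measured 14C/12C ratio
    expressed as a fraction of the modern atmospheric ratio."""
    return -HALF_LIFE / math.log(2) * math.log(ratio_to_modern)

def surviving_fraction(age_years):
    """Fraction of the original 14C remaining after a given true age."""
    return 2 ** (-age_years / HALF_LIFE)

# A 40,000-year date corresponds to about seven half-lives, i.e.
# less than 1% of the atmospheric 14C level remains:
print(surviving_fraction(40_000))   # ~0.0079

# A sample whose 14C sits at the measurement/background floor
# (assumed here to be 0.2% of modern) reads as ~51,000 years,
# whether it is 100,000 or 300 million years old:
print(apparent_age(0.002))          # ~51,000 years
```

This is why a Carboniferous chunk of coal and a hundred-thousand-year-old sample both “date” to about 50,000 years: past seven or so half-lives, the measured ratio converges on the background level and the computed age hits a ceiling.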

However, creation scientists have carbon-dated fossils, diamonds, and coal that are all supposed to be millions of years old. Nevertheless, they all have detectable amounts of carbon-14 in them. For example, this study shows detectable levels of carbon-14 in a range of carbon-containing materials that are supposedly 1-500 million years old. Surprisingly, the study includes diamonds from several different locations! Another study showed that fossil ammonites and wood from a lower Cretaceous formation, which is supposed to be 112-120 million years old, also have detectable levels of carbon-14 in them. If these studies are accurate, they show that there is something wrong with the old-earth view: Either carbon dating is not the reliable tool it is thought to be for “recent” dating, or the fossils and materials that are supposed to be millions of years old are not really that old. Of course, both options could also be true.

Or that there are underground sources of radioactive decay that can generate low levels of 14C, and that Jay Wile doesn’t understand basic principles of radiometric dating.

Wile also dismisses the possibility that an inclusion of recent biological material in the sample might skew the date younger, which is unjustifiable. Armitage himself writes that “Soft, moist, muddy material can be seen surrounding pores of bone vessels on inner horn surfaces and rootlets penetrating lower, interior surface of samples” where he claims to spot intact Triceratops cells.

But contamination can’t possibly be a confounding problem, oh no.

The second main point Wile makes is that gosh, those cells sure look like osteocytes, which have a distinctive shape with many branching processes. How would osteocytes have gotten in there?

Armitage did not compromise his own results. He simply wrote truthfully about his fossil. In addition, anyone with a basic understanding of histology would know why plant roots, fungal hyphae, and insect remains do not compromise his results in any way. Based on all the visual evidence, the cells he found are osteocytes. They are not only the shape and size one expects from osteocytes, they have the filipodial extensions that are characteristic of osteocytes. They also have the cell-to-cell junctions one expects in groups of osteocytes. Thus, they cannot be the result of contamination, since plants, fungi, and insects do not have osteocytes.

My answer to that is…I don’t know. It’s weird. And Armitage doesn’t know either, and everything he says about the sample is incompatible with these being intact, preserved osteocytes.

The fact that any soft tissues were present in this heavily fossilized horn specimen would suggest a selective fossilization process, or a sequestration of certain deep tissues as a result of the deep mineralization of the outer dinosaur bone as described by Schweitzer et al. (2007b). As described previously, however, the horn was not desiccated when recovered and actually had a muddy matrix deeply embedded within it, which became evident when the horn fractured. Additionally, in the selected pieces of this horn that were processed, soft tissues seemed to be restricted to narrow slivers or voids within the highly vascular bone, but further work is needed to fully characterize those portions of the horn that contained soft material. It is unclear why these narrow areas resisted permineralization and retained a soft and pliable nature. Nevertheless it is apparent that certain areas of the horn were only lightly impacted by the degradation that accompanied infiltration by matrix and microbial activity. If these elastic sheets of reddish brown soft tissues are biofilm remains, there is still no good explanation of how microorganisms could have replicated the fine structure of osteocyte filipodia and their internal microstructures resembling cellular organelles. Filipodial processes show no evidence of crystallization as do the fractured vessels and some filipodial processes taper elegantly to 500 nm widths.


  • The tissue is not isolated or protected in any way. It’s wet, unmineralized, and filled with a “muddy matrix”. Some of the soft tissues, the “vessels”, are crystallized.

  • The “osteocytes”, though, are perfectly preserved down to the level of organelles, ultrastructural junctions, and delicate processes.

Doesn’t anyone else have a problem with this? I’ve had to struggle with fixative cocktails to get good preservation of single-cell levels of detail; I’ve had animal tissue bathed in a soothing, perfectly balanced medium under my microscope, and seen bacterial infections turn it into disintegrating, collapsing blobs of blebbed-out fragments of decaying cells within minutes.

Yet somehow Armitage finds picture-perfect “osteocytes” in tissues that have been soaking in mud, perforated by plant roots, and presumably have been lying there rotting since, by his measure, some time around the Great Flood, a few thousand years ago.

I’m just curious. As an experiment, if we killed a cow and then left it to rot in a damp field for just a month, would that be a good way to make useful histological samples of bone tissue?

How about if we left it there for a year? Or 40,000 years?

The Schweitzer papers on preserved cells in dinosaur bone at least demonstrate careful technique to minimize contamination and artifacts. They also don’t include comments that reveal the author doesn’t understand the basic principles of radiometric dating. The Armitage papers, on the other hand, are sloppy, get improbable results, and reveal a lot of biased reasoning.

I don’t know how cells that look like osteocytes got there, but I’m very suspicious.