Who’s the zombie here?

Jonathan Wells has a “new” book — he’s rehashing that dreadful crime against honesty and accuracy, Icons of Evolution. It’s titled, ironically, Zombie Science: More Icons of Evolution, and apparently, from Larry Moran’s discussion, it sounds like pretty much the same old book, flavored this time with an obstinate refusal to honestly consider any new evidence or any of the rebuttals of his earlier distortions.

I do like the cover, though, if only for its stunningly oblivious lack of self-awareness and its self-referential nature. Wells is kind of a zombie himself: his arguments were destroyed 20 years ago, and he just keeps shambling back.

Botanical Wednesday: Enough, Australia, you’re getting carried away

Even the trees are vicious? And not really trees at all?

Australia has a parasite believed to be the largest in the world, a tree whose greedy roots stab victims up to 110m away. The Christmas tree (Nuytsia floribunda) has blades for slicing into the roots of plants to steal their sap. The blades are sharp enough to draw blood on human lips. They cause power failures when the tree attacks buried cables by mistake. Telephone lines get cut as well.

Never confuse climate with weather

The temptation is strong. I remember some amazingly fierce winters in the Pacific Northwest in the late 1960s and 1970s, when we had several feet of snow on the ground, the ponds froze solid, and the Green River was a churning mass of ice chunks. At the time, there were also a few popular magazine articles speculating about a coming ice age…which was ridiculous. February is always colder than July, but we don’t mourn on New Year’s Day that the planet is doomed by this recent cold spell called Winter, and if there’s anything we know about weather, it’s that it fluctuates.

Nowadays, though, one of the techniques used to discredit concerns about global climate change is to pretend that scientists’ opinions are as flighty as the weather, and therefore just as dismissible. Suddenly we have denialists arguing that scientists were claiming in the 1970s that the climate was slipping toward an Ice Age. Nonsense. So here’s a paper by Peterson, Connolley, and Fleck in which they actually did some history and asked what scientists were really thinking back then.

Climate science as we know it today did not exist in the 1960s and 1970s. The integrated enterprise embodied in the Nobel Prizewinning work of the Intergovernmental Panel on Climate Change existed then as separate threads of research pursued by isolated groups of scientists. Atmospheric chemists and modelers grappled with the measurement of changes in carbon dioxide and atmospheric gases, and the changes in climate that might result. Meanwhile, geologists and paleoclimate researchers tried to understand when Earth slipped into and out of ice ages, and why. An enduring popular myth suggests that in the 1970s the climate science community was predicting “global cooling” and an “imminent” ice age, an observation frequently used by those who would undermine what climate scientists say today about the prospect of global warming. A review of the literature suggests that, on the contrary, greenhouse warming even then dominated scientists’ thinking as being one of the most important forces shaping Earth’s climate on human time scales. More importantly than showing the falsehood of the myth, this review describes how scientists of the time built the foundation on which the cohesive enterprise of modern climate science now rests.

So even at the time of severe winter storms, scientists were objectively looking at long term trends and determining what was going on from the data, not from looking out their window and watching snowflakes.

One way to determine what scientists think is to ask them. This was actually done in 1977 following the severe 1976/77 winter in the eastern United States. “Collectively,” the 24 eminent climatologists responding to the survey “tended to anticipate a slight global warming rather than a cooling” (National Defense University Research Directorate 1978).

They also analyze the scientific literature of the period, and nope, no “global cooling”, it was all greenhouse effect.

The denialists have resorted to faking magazine covers to spread the myth of a global cooling fad. That’s how desperate they are.

The plain lesson is to never confuse climate with weather, but also, never confuse Time magazine with the scientific literature, especially when it’s been forged.

Sarah Kendzior rips on graduate school

Wow. Sarah Kendzior has the most cynical, depressing take on grad school. She’s not entirely down on it, and sees some virtue in advanced study, but she also reserves some well-deserved venom for the academic complex.

Graduate students live in constant fear. Some of this fear is justified, like the fear of not finding a job. But the fear of unemployment leads to a host of other fears, and you end up with a climate of conformity, timidity, and sycophantic emulation. Intellectual inquiry is suppressed as “unmarketable”, interdisciplinary research is marked as disloyal, public engagement is decried as “unserious”, and critical views are written anonymously lest a search committee find them. I saw the best minds of my generation destroyed by the Academic Jobs Wiki.

I don’t know about that. I know that there were people who had the fast track to academic success because they’d mastered the drill of churning out grants and papers that were exercises in technique and throwing money at a problem, rather than actually thinking broadly, but there was still room for creative play in the lab. I was lucky to have mentors who thought public engagement was important; part of that was the fact of teaching, which keeps an academic grounded.

The cult mentality of academia not only curtails intellectual freedom, but hurts graduate students in a personal way. They internalize systemic failure as individual failure, in part because they have sacrificed their own beliefs and ideas to placate market values. The irony is that an academic market this corrupt and over-saturated has no values. Do not sacrifice your integrity to a lottery — even if you are among the few who can afford to buy tickets until you win.

I knew professors who believed in grad school as a winnowing process, where you make suffering the goal so only the strong survive. They were the minority, but the misery of being in their lab was deep.

Anthropology PhDs tend to wind up as contingent workers because they believe they have no other options. This is not true – anthropologists have many skills and could do many things – but there are two main reasons they think so. First, they are conditioned to see working outside of academia as failure. Second, their graduate training is oriented not toward intellectual exploration, but toward shoring up a dying discipline.

Of my graduate school cohort, maybe 5-10% ended up in academia. There is a tendency to see continuing to do whatever you’re doing, only on a slightly more elevated plane, as “success”. We’ve been working at the undergraduate level to make students aware that becoming a professor is only one narrow slice of the range of outcomes of training in STEM.

We also don’t have the idea of being in a “dying discipline” — biology is thriving, as well as any scientific field in the age of Republican anti-intellectualism can be said to be doing well. Kendzior is an anthropologist; I don’t feel that anthropology is dying so much as being under-appreciated.

Gillian Tett famously said that anthropology has committed intellectual suicide. Graduate students are taught to worship at its grave. The aversion to interdisciplinary work, to public engagement, to new subjects, to innovation in general, is wrapped up in the desire to affirm anthropology’s special relevance. Ironically, this is exactly what makes anthropology irrelevant to the larger world. No one outside the discipline cares about your jargon, your endless parenthetical citations, your paywalled portfolio, your quiet compliance. They care whether you have ideas and can communicate them. Anthropologists have so much to offer, but they hide it away.

I got a lot of bad advice in graduate school, but the most depressing was from a professor who said: “Don’t use up all your ideas before you’re on the tenure track.” I was assumed to have a finite number of ideas, and my job as a scholar was to withhold them, revealing them only when it benefited me professionally. The life of the mind was a life of pandering inhibition.

Jebus. That’s terrible advice. I had the benefit of a graduate advisor who seemed to reinvent himself every few years: from immunologist to neuroscientist to cytoplasmic signalling researcher to lineage-tracing developmental biologist to geneticist. It kept us on our toes, and there were times we wondered what, exactly, our lab did. I think he set a good example, and he never seemed to run out of ideas.

I ignored this along with other advice – don’t get pregnant, don’t get pregnant (again), don’t study the internet, don’t study an authoritarian regime – and I am glad I did. Graduate students need to be their own mentors. They should worry less about pleasing people who disrespect them and more about doing good work.

Because in the end, that is what you are left with – your work. The more you own that, the better off you will be. In the immortal words of Whitney Houston: “No matter what they take from me, they can’t take away my dignity.” And in the equally immortal words of Whitney Houston: “Kiss my ass.” Both sentiments are helpful for navigating graduate school.

Heh. Yes. I got married and we had two kids while we were both in grad school — you’ll notice most of your academic mentors didn’t get tenure until they were in their 40s, and for 20-year-olds, putting real life on hold that long is unwise. Grad school, or your job, whatever it may be, is not the whole of your life.

Academic training does not need to change so much as academic careerism. There is little sense in embracing careerism when hardly anyone has a career. But graduate school can still have value. Take advantage of your time in school to do something meaningful, and then share it with the world.

At least that section ends on a positive note. I agree. The whole point of education is to open your mind, not to get you a job, but to prepare you for any opportunity that comes around.

What happened to octodon.social?

I’m a fan of Mastodon, the new microblogging service that is trying to break the hegemony of Twitter. It’s better, cleaner, free of most of the trolls, and it promises to take complaints about racists and nazis and misogynists seriously, unlike Twitter. It also has an interesting approach, decentralizing the servers that manage the whole show, so you can even pick a Mastodon server that best reflects your interests.

But that might also be a vulnerability. I went for octodon.social, was trying to contribute regularly to it, and then…kablooiee. It’s been down for a couple of days now. It appears to be no fault of the administrator, but the hosting service itself has screwed up.

Anyway, just be prepared for occasional breakdowns. Mastodon is great, but it demands some flexibility that you don’t get with the monolithic monolith.

That’s quite the racket

Nature Biotechnology published a rather startling paper: DNA-guided genome editing using the Natronobacterium gregoryi Argonaute. It claims “that the Natronobacterium gregoryi Argonaute (NgAgo) is a DNA-guided endonuclease suitable for genome editing in human cells,” which would make it an alternative to CRISPR/Cas9, and would make the authors rich.

I don’t know any of the details, though, because it’s behind a paywall, and my university doesn’t have an institutional subscription (universities don’t automatically get every journal, and the ones we do get cost the institution an arm, a leg, a pound of flesh, and a bucket of blood). I could pay for it personally, but Nature would charge me $32 for a pdf. If you think about it, it’s quite the deal: the authors do all the research work and then pay for the privilege of publishing in a Nature journal, and then Nature charges readers to see it. The last part would be understandable if they charged a reasonable fee, but of course they don’t.

Imagine if the New York Times worked that way. They fire all their journalists, and tell them that their new model is that if they’re very, very good they can continue to be published in the NYT if they pay Arthur Sulzberger for the privilege. Also, Arthur will change subscription policies: it’ll cost you $10,000/year to subscribe, but you could also just pay for individual articles. Yeah, you’ll pay $32 each week to read David Brooks.

But it’s all moot anyway! The paper has been retracted — no one could replicate the results. Or, at least, there’s an editorial expression of concern.

Guess what? I can’t read that one either. $32. Both the article and its ‘retraction’ are still available for a fee.

This is an amazing business model. Publish a tantalizing paper that is crap, and charge people to read it. Publish an announcement that said tantalizing paper is crap, and charge people to read it. What we need next is an editorial justifying the science journal’s predatory exploitation; they can charge people to read that one, too.

(via Neuroskeptic)


Matt Herron sent along the paper and the “expression of concern”, if you were curious about the 3 paragraphs you could get for $32.

Editorial Expression of Concern: DNA-guided genome editing using the Natronobacterium gregoryi Argonaute

Feng Gao, Xiao Z Shen, Feng Jiang, Yongqiang Wu & Chunyu Han

Nat. Biotechnol. 34, 768–773 (2016); published online 2 May 2016; addendum published after print 28 November 2016

The editors of Nature Biotechnology are issuing an editorial expression of concern regarding this article to alert our readers to concerns regarding the reproducibility of the original results. At this time, we are publishing the results of three groups (http://dx.doi.org/10.1038/nbt.3753) that have tried to reproduce the results in the critical Figure 4 in the original paper by Han and colleagues, which demonstrates editing of endogenous genomic loci in mammalian cells. None of the groups observed any induction of mutations by NgAgo at any of the loci or under any of the conditions tested above the sensitivity of the assays used. Similar results have been recently reported by a different group of authors in Protein & Cell (doi:10.1007/s13238-016-0343-9).

We are in contact with the authors, who are investigating potential causes for the lack of reproducibility. The authors have been informed of this statement. While the investigations are ongoing, Chunyu Han and Xiao Z. Shen agree with this editorial expression of concern. Feng Gao, Feng Jiang and Yongqiang Wu do not feel that it is appropriate at this time.

We will update our readers once these investigations are complete.

Scott Adams embarks on the Johnny Hart road

I don’t normally read Dilbert — I’ve seen far too much of the benighted ignorant psyche of its creator — but this one was just laid out on a table at the coffee shop yesterday, and I knew I’d have to deal with it. In this one, Dilbert goes full climate science denialist. This might be fun, to dissect Dilbert, because even though it will kill what little humor is present in it, at least we’ll have a good time laughing at Scott Adams. Let’s dissect the shit out of this thing.

Here’s the setup.

OK, this is sort of fine. I think it’s a good idea for companies to think about what impact climate change will have on them, and how they affect the environment. I’m at a green university, and we’ve had these sorts of discussions. Still do, all the time.

It is definitely true that human activity is warming the Earth. It will lead to a global catastrophe, depending on how you define catastrophe: it will cause acute economic disruption, resource wars, and the death of millions. Is that catastrophic enough for you?

By the way, I notice that the scientist is a goateed, balding white man in a lab coat. It’s either unconscious bias (that’s just how scientists are supposed to look!) or…I can’t help but notice a weak resemblance to Michael Mann.

Next panel, Dilbert asks Scott Adams’ idea of a smart question.

On the face of it, yes, that is a good question. I’d encourage students to ask that every time an instructor told them something. But consider the context. The answer to that question is readily available — google it. You can read the papers. You should have the answer to that from your high school earth science class. So why is Dilbert being made to ask this trivial question right at the start of this meeting? I can tell right away that this is not a sincere question, this is a derailing tactic to justify a software engineer speaking out of his ass to the scientific expert. Sound familiar?

Then we get the eternal dilemma of the science popularizer. Do you just scorch this ass with contempt because you can see right through him, or do you try to take the question seriously and give the primer in kindergarten climatology he’s asking for?

You can’t win, you know. The game is rigged. If you do the former, you’ll be accused of being hostile and mean. If you do the latter, you’re patronizing and people will write scornful blog posts about how you think raw data dumps will cure all the scientific misunderstandings in the world.

So what do you do? Most of us will take the generous view and try to explain exactly what the questioner is asking for, like our Michael Mann surrogate here:

And that’s also fine. So far, the strip has been true to the characters, and the nature of their interactions. It’s denialist vs. scientist, familiar territory, and now it’s time for the funny, clever twist…but Adams can’t deliver. He has to resort to sticking words in the mouth of the scientist that are not at all true to the character.

That’s just wrong. It’s not what climate scientists say or even think. It’s what Scott Adams, who is no scientist of any kind, says and thinks. And with that betrayal of the premise of the joke, it abruptly falls flat and dies. If all you can do to discredit a point of view is to lie and make puppets say falsehoods, it’s your position that fails. Adams does this because he lacks any insightful response to the honest arguments of scientists.

I guess there’s supposed to be a punchline of some sort next. Once again, Adams fails to meet the minimal standards of his medium.

I think the punchline is supposed to be implying that science supporters can only defend their position by calling True Skeptics mean names. Of course, the entire point of the two panels just above that is to call climate scientists conscious liars.

The only people who will find this at all funny are the denialists who see the panels in which the climate scientist openly maligns his own methodology as affirmations of their beliefs. That’s OK, it’ll finally be the death of Dilbert — I skimmed the comments and noticed several people were shocked that Scott Adams endorsed an anti-scientific claim. Apparently they’ve never read his blog before.

I shouldn’t claim it’ll kill Dilbert, though. Nothing kills syndicated comics. Johnny Hart went full-blown creationist/evangelical Christian/anti-Muslim bigot, and newspapers just kept right on buying up the strips. Hart died in 2007, and B.C. is still going.

And people think tenured professors have it easy.