David Brooks still has a job?

The web has been resounding with a mighty echoing “WTF?” — David Brooks has written another column in the New York Times, and it’s weird, even for Brooks. He’s sneering at “Thought Leaders”, apparently this new generation of pundits who are beneath his contempt. At first I thought maybe it was entirely autobiographical, and that he was describing his own career, in which case he really needed to be put on suicide watch. And then I thought, nah, it’s David Brooks — I’m assuming a degree of self-awareness that simply isn’t there.

So I wrote my own impression of Brooks.


David Brooks. Paris. 1789.

David Brooks was awakened early by the shouting and rumbling of carts outside his apartment window — why, it was perhaps as early as 11 o’clock, a most uncivilized hour for a gentleman. It was one of the obligations of nobility, however, to be willing to address his duties at any hour, and by God, he could rise even before the sun had reached its zenith.

He rang the little silver bell by his bedside to summon his servants to come and dress him, rose, and slipped on his silk dressing gown. A pinch of snuff to invigorate the blood, and he was ready to investigate. He looked out his window, down upon the unwashed mobs of Paris.

The tumbrels were rolling. Yet another day when the ranks of the aristocracy would be purged of their dead weight, he thought, leaving only the deserving to lead the country. He recognized one of the men roped in the cart, despite the shabbiness of his velvet coat and the loss of his wig; that young cockerel! His great grandparents had been merchants, and even now he was rumored to dabble in trade. No loss there. Just another trumped up nobody who had dared to regard himself as a match for those privileged by righteous birth.

He was moved to write another missive for the King — the last had been well received and read aloud at court, and he was gaining quite the reputation as the clever wordsmith. His dismissal of the middle class as the “bohemian bourgeoisie” had provoked mirthful titters from the right courtiers. An elegant letter explaining how the regime was right and natural and safe, and that the elimination of the arrogant young upstarts was only right and proper would strike just the right tone. He rang his bell again. Where were those lazy servants? He had work to do! These nouveau rapscallions needed to be named and chastised. How else will everyone know the right people to rebuke? And behead?

He rang the bell insistently. Hands on his hips, he stood facing the entry door: the instant that worthless layabout finally arrived, he was going to receive the fiercest verbal scourging, and be thrown out on the street with the rest of the rabble. You do not question the right of David Brooks to be treated with respect and dignity and the deepest humility. You do not delay him.

He waited.

There was a loud and ferocious pounding on the main doors downstairs. The servants will get it.

What is that crashing great racket?


Damn it. Charles Pierce has already done it so much better.

Many people wonder how they too can become Thought Leaders and what the life cycle of one looks like.

Well, you start out being a coddled little genius nurtured by the think tanks and vanity publications and fanzines of the American right. Then you make a career out of whatever pop sociology text you read 10 minutes ago. Then you write a couple of books about how the American genius for mindless consumerism is the future of the country. Then you get a column in the New York Times. Unfortunately, there comes a conservative president who fks up everything from hell to breakfast, and all of the intellectual arboretums in which you were raised fall into disrepute. Dutch Elm disease of the mind becomes epidemic. So you backpedal as fast as you can, running over several of your previous selves in the process until you finally end up one day writing a column in which you pretend that you haven’t spent your adult life pumping your speaking fees and grazing the buffet tables at various brainiac circle jerks.

I’m sorry. Were we talking about someone else?

Yeah, that’s David Brooks alright.

Megyn Kelly gets DailyShowed and WaPoed

Don’t feel bad for Megyn Kelly. Jon Stewart exposes her stupidity with wonderful thoroughness…

And you might be thinking, “No way can she show up in public without the pointing and laughing,” but keep in mind that she’s a Fox News host. Blithering obliviousness is part of the job description.

Besides, on the same day she got this lovely tongue-bath from the Washington Post. I see that journalistic standards at the WaPo are roughly equivalent to those at the HuffPo.

Seriously, Time magazine?

They’ve announced their person of the year, and it’s…Pope Francis, The People's Pope.

You have got to be fucking kidding me. They’ve got this great pulpit with mass media attention to actually highlight the important events and people on the planet, and they pick the pablum-spewing head of an antique organization that demands its followers adhere to obsolete and dangerous beliefs, and this is what they say about it?

The papacy is mysterious and magical: it turns a septuagenarian into a superstar while revealing almost nothing about the man himself. And it raises hopes in every corner of the world—hopes that can never be fulfilled, for they are irreconcilable. The elderly traditionalist who pines for the old Latin Mass and the devout young woman who wishes she could be a priest both have hopes. The ambitious monsignor in the Vatican Curia and the evangelizing deacon in a remote Filipino village both have hopes. No Pope can make them all happy at once.

Jebus. It’s no more mysterious and magical than the Mafia, or the Medellin Cartel, or Philip Morris, or the NRA, and the people who turn a septuagenarian into a “superstar” are the sycophants in the media.

And this…

But what makes this Pope so important is the speed with which he has captured the imaginations of millions who had given up on hoping for the church at all. People weary of the endless parsing of sexual ethics, the buck-passing infighting over lines of authority when all the while (to borrow from Milton), “the hungry Sheep look up, and are not fed.”

His virtue is solely palliative — he’s there to say soft words and create the illusion that the church isn’t the domain of child-rapists and oppressors. The church denies family planning to women in Africa, bounces pedophiles around to unsuspecting dioceses, buries tales of generations of abuse in Ireland, demands that women die in the name of fetus worship…oh, look! Pope Francis said atheists might get to go to heaven! Aww, he’s so folksy and kind.

The sheep are still not fed. But maybe they’ll be a little quieter in the slaughtering pen.

There can be only one reply.

As for Time magazine…I remember the old magazines that would gather dust on the coffee table at my grandparents’ house, Reader’s Digest and Look and others so tired that I can’t even recall their names, and I would read them because I was desperately bored, and I would mainly be curious about them because they represented what old people cared about (and near as I could tell, they didn’t even care that much about them). The magazines survived on subscription by habit, I suspect, and even then I could tell they were doomed. Welcome to that club, Time.

A new journal

It’s true that science publishing has some serious problems — can you access the latest results from federally funded research? Do you think Science and Nature are really the best science journals in the world? — so it’s good that some people are taking the lead in changing their approaches and developing alternative publishing models.

Leading academic journals are distorting the scientific process and represent a "tyranny" that must be broken, according to a Nobel prize winner who has declared a boycott on the publications.

Randy Schekman, a US biologist who won the Nobel prize in physiology or medicine this year and receives his prize in Stockholm on Tuesday, said his lab would no longer send research papers to the top-tier journals, Nature, Cell and Science.

Schekman said pressure to publish in "luxury" journals encouraged researchers to cut corners and pursue trendy fields of science instead of doing more important work. The problem was exacerbated, he said, by editors who were not active scientists but professionals who favoured studies that were likely to make a splash.

Easy for Schekman to do. He’s got a Nobel; I don’t think he has to worry about getting and maintaining a position, or even about getting published wherever he wants anymore. Cutting out the “luxury” (I think they prefer to be called “prestige”) journals doesn’t discomfit him in the slightest.

Schekman is scathing in his assessment of the popular big name journals. But at least he’s also trying to do something to correct the situation: he is promoting a new open-access journal, eLife, of which he is the editor.

I took a look. It was a bit off-putting at first: Schekman’s face is plastered in the middle of the page, and there’s a link up top to “Follow Randy’s Nobel Journey”, and I thought…uh-oh, are we going to replace “luxury” journals with vanity journals? But then I browsed the several hundred currently published articles, and they’re not bad, at least if you’re interested in cell and molecular biology (oh, hey, I am!).

Looks like I’m adding another journal to the list I regularly check.

Media bias!

OK, so that Daily Mail piece that talked about my refutation of the MFAP hypothesis — the idea that humans are ape-pig hybrids — has been getting a fair amount of attention. The article actually did quote me tearing into the idea fairly thoroughly, although it did also give far too much attention to a crackpot, but here’s the thing: the media doesn’t care that much what I said, it’s all the kook nonsense driving the reporting.

For example, Jennifer Raff heard it on Alex Jones’ radio show. Jones is a wacky, hateful, loud-mouthed conspiracy theorist, and what is he talking about? Not that there are good explanations for human evolution, but that a kook with a ridiculous idea has demolished the theory of evolution.

What really drew my interest in the subject was the way Alex Jones discussed the news article. He called Eugene McCarthy a “top geneticist” and an “expert”, and while rightly dismissing the hypothesis as idiotic, he implied that it was in line with the current scientific consensus on human origins: that evidence was increasingly disproving evolution as an explanation. Instead, alternative ideas, such as an Intelligent Designer (Jones kept calling it “aliens” and mentioning the movie “Prometheus”) were becoming mainstream explanations for human origins, and this chimp-pig hybridization idea was yet another example. Framing the story in this way, he went on to say that “They (scientists) have no idea what they’re doing” (quote paraphrased from my memory), and therefore evolution is nonsense. And he’s not alone. The creationist/Intelligent Design site Uncommon Descent also dismisses McCarthy’s hypothesis while simultaneously dismissing “neo-Darwinism”.

Raff goes on to list all the red flags thrown on McCarthy’s loony hypothesis, which ought to get any reasonable journalist to immediately question and reject the idea. But the lesson I’m seeing from the differential attention given to my explanation and McCarthy’s is that I clearly did not throw enough red flags.

So I’m pointing out that a Christian, conservative, dominant minority is conspiring to suppress my radical idea that evolution, a natural process, has generated all the diversity of life on earth, and that as a top expert authority I plan to lead a revolution in scientific thinking. I know that’s a wild claim and that I’ll probably be pilloried by the mainstream establishment, but I’ve just got to get past the scientific gatekeepers to bring this secret knowledge to the masses.

Alex Jones, call me. We’ll talk.

#HeavenAndBack

So I watched this show with Anderson Cooper’s name on it; he didn’t bother to show up, so maybe he has some sense of shame. It was dreadful. It was three anecdotes about people who had experienced serious trauma, and then invented lovely narratives about a happy afterlife to make themselves feel better, or to justify their prior religious beliefs. There was no fact-checking. It was just these three women getting interviewed and telling unverifiable accounts of events that happened while they were unconscious.

First woman: She claims to have “died” in a kayaking accident in Chile. Her kayak was pinned underwater by a rock; she describes all of her sensations, including her legs breaking when her friends dislodged the boat and she was torn free by the current. Her friends were frantic, yet she’s happy to claim that they accurately described the passage of time, and that she was under water and deprived of oxygen for 30 minutes. She said she “gave herself up to god”, was visiting spirits/angels/whatever while resuscitation was attempted, and that she had a conversation with Jesus who told her she had to go back to take care of her husband. Her husband was later diagnosed with lung cancer. Thanks, Jesus! Also, she’s flogging a book.

Verdict: completely unverified account of a “death”. This was a religious woman who experienced a serious trauma, and who had also experienced the death of a child and wanted to believe that there was a purpose to life. It was a wish-fulfillment fantasy.

CNN’s verdict: “Amazing”. Not one word of doubt about anything in the account.

Christian Mingle is advertising on this show, of course.

Second woman: A child growing up in Hong Kong, of Indian descent. A friend dies of cancer, and she becomes paranoid; she is later diagnosed with Hodgkin’s lymphoma herself. She deteriorates under treatment, and later lapses into a coma. Claims to have heard doctors talking while she was in a coma, and that they said she was going to die within 24 hours. She was, she said, “in another world” where she felt peace, and her dead friends were all there. Dead people told her to go back and live, so she did.

She recovered consciousness, her cancer went into remission, and she’s still alive. In fact, nothing in her account said she died at all.

Verdict: A lot of storytelling and confabulation. Nothing remarkable in the story at all; Hodgkin’s has a five-year survival rate above 80%, and she was apparently getting good medical care.

CNN’s verdict: Accepted every bit of it without reservation. No attempt to verify any of the claimed facts, not that there was anything particularly unusual about it.

Third woman: Has a son with a serious heart condition. He and his mother engaged in a fair bit of Jesus talk. One day he collapses and is hospitalized, and claims to see a bright light and an angel. Later he collapses at his school again, and claims to have been in a good place and not wanting to come back. But “he came back for a reason”. The family does a lot of praying and bible reading. Then the son dies on Christmas day. He doesn’t come back.

Verdict: Absolutely nothing remarkable or unexplainable. No evidence of much of anything presented.

CNN’s verdict: Ends with a clip of a video of the dead boy holding up a sign saying he believes in god and angels.

Overall assessment: Gullible dreck, lots of fantasizing, no evidence presented of much of anything, and no critical thinking from the reporters at all. A disgrace.

I didn’t believe a word of it. There’s only one comment on the show website, and Randy didn’t believe it, either, but for rather different reasons.

This is all a liar, heaven is a holy place and those that Enter must be born again of the water and of the spirit, those that have excepted Jesus as their personal savior and have been born again of the baptism of the Holy Ghost will make it in.

I despair.

It’s the silences, the neglect, the moving on to more important matters

What if the National Association of Science Writers convened a panel on sexual harassment and discrimination, and no one cared? This report on sexual harassment and science writing at NASW is strangely, delicately neglectful, from the beginning where it irrelevantly claims that the Bora Zivkovic story no longer dominates science blogs (So has sexual harassment vanished? Or should we be asking where it will rise up again?), to the bizarrely abrupt segue in which they “Return You to Our Regularly Scheduled Program”, which is all about calculating the number of habitable worlds in the galaxy and more self-promoting fluff from SETI. Apparently, the concerns of women in science are of dwindling interest, a distraction from the Important Subjects of Speculative Astronomy.

The middle is equally weird. It has two sections: Hearing from Women, a two paragraph summary of what the women on the panel said, followed by Hearing from Men, with four paragraphs dedicated to the reactions (admittedly sympathetic) of the men in the audience, which are described as “some of the most powerful and significant statements”. At least the women’s section closed with an ironic comment: “The medical profession is now also heavily female, she [Ginger Campbell] said, but there, too, invisibility is everywhere.” How true that is.

I would like to have read more about “Hearing from Women”, but not only could the writer not be troubled to include more of the women’s statements, but she didn’t even bother to link to any of the panelists. I can correct that, at least: Christie Aschwanden, Deborah Blum, Florence Williams, Kate Prengaman, Kathleen Raven, Maryn McKenna, and Emily Willingham. Isn’t it odd that an article purportedly about this panel didn’t even link to the panelists’ professional pages, neglected to even name one of them, yet still made that special effort to capture men’s opinions on it?

You should read Emily Willingham’s assessment of the article. It’s not at all flattering.

Start looking for the invisible women, and it’s amazing how often you can find these curious omissions. Here, for instance, is a student at Michigan State plugging the virtues of social media for advancing your career in science (and I agree with him!), but he’s especially promoting reddit as a tool…which is problematic if you’re a woman, or have a reputation as a feminist. He touts reddit as the “best bang for the buck” for “thousands of young men and women” and obliviously shows this graph of internet readers who use reddit, titled “Young males are especially likely to use reddit.”

Chart showing that many more men than women use reddit

Apparently we can just ignore the pale blue bars showing that women represent less than a third of the audience you’ll reach on reddit. We’re not even going to notice the discrepancy, even if it leaps out at you as the most significant factor illustrated by the chart, and even if the title itself calls attention to it. The sexism problem on reddit isn’t even worth mentioning in an article about promoting science.

But that’s the big question that ought to be asked. Why isn’t it? Because invisible people aren’t as important.

Finally, here’s something that’s at least stirring and loud. It’s from a television show (as we all know, fictitious politicians are far more honest and bold than the real ones) in which a woman points out all the subtle signifiers the media and other politicians use to put her in her place.

Are you saying that Governor Reston is sexist?

Yes. I am. And it’s not just Governor Reston speaking in code about gender. It’s everyone, yourself included. The only reason we’re doing this interview in my house is because you requested it. This was your idea. And yet here you are, thanking me for inviting me into my “lovely home.” That’s what you say to the neighbor lady who baked you chocolate chip cookies. This pitcher of iced tea isn’t even mine. It’s what your producers set here. Why? Same reason you called me a “real live Cinderella story.” It reminds people that I’m a woman without using the word.

For you it’s an angle, and I get that, and I’m sure you think it’s innocuous, but guess what? It’s not. Don’t interrupt me when I’m speaking. You’re promoting stereotypes, James. You’re advancing this idea that women are weaker than men. You’re playing right into the hands of Reston and into the hands of every other imbecile who thinks a woman isn’t fit to be commander-in-chief.

Don’t you ever forget, ladies, that the most important parameter of your existence is how well you fit your stereotyped role. But don’t worry, no one will ever let you forget it.

Balance

Science is always working a tough room. It’s inherently progressive — we’re constantly achieving incremental improvements in our understanding, with occasional lurches forward…and sometimes sudden lurches backward, when we realize that we got something wrong. We’re performing for a crowd, the general citizenry and most importantly, the funding agencies, that expect us to fix problems and make the world better, and they’re a fickle bunch who will turn on us with disdain if we don’t keep delivering new medical therapies and tinier electronics and more spectacular robots landing on alien worlds.

Unfortunately, there are a couple of sources of tension in our act.

One problem is that we aren’t doing what everyone thinks we’re doing. The world outside the sciences thinks we’re all about making material improvements in your standard of living, or increasing our competitiveness with other countries. Wrong. We do what we do to increase our understanding. There is an applied side to science that is asking about, for instance, better treatments for cancer, but it’s built on a foundation of scientists just asking, “how do cells work?”

An analogy: imagine building race cars. Everyone watching is thinking that it’s all about winning races (that’s also the case for the backers who are paying for all the machines). But the scientists are the ones who are just thinking about what’s going on inside the engine, tracing the flow of fuel and energy, optimizing and adjusting to make it work. Put a scientist in the driver’s seat, and she wouldn’t be thinking about winning the race; if she heard a mysterious “ping!” at some point, her instinct would be to just pull over then and there and take things apart until she’d figured out what caused it. And figuring out the ping would be more satisfying than finishing the race.

So everyone criticizes the scientist for not winning any races, but the scientist is feeling triumphant because her performance wasn’t what you thought it was — she learned a little bit more about what makes the engine tick, and you should be happy about that!

So that’s one source of tension. Here’s another: funding and public support thrives on positive results, that constant reassurance that yes, we’re proceeding apace towards the finish line, but science itself thrives on criticism. Probing and patching and making fruitful errors and getting criticism that forces us to reconsider our premises and rebuild our hypotheses…that’s the progressive force behind science. And we should be appreciative when someone tells us that a major chunk of research is invalid (and as scientists, we are), but at the same time, we’re thinking that if we have to retool our labs, retrain our students, rethink everything from the ground up, as exciting as it is in a scientific sense, it’s going to be a really hard sell to NSF or NIH. The granting agencies, and the media, love the safe, reliable churn of data that looks like progress from the outside.

Which brings me to an interesting argument. On one side, John Horgan gets all cynical and critical of science, pointing out deep and fundamental flaws in peer review, the overloading of science journals with poor quality work, the lack of progress in many of our goals for science, and bemoaning the reassuring positivity of the media towards science.

…I’m struck once again by all the “breakthroughs” and “revolutions” that have failed to live up to their hype: string theory and other supposed “theories of everything,” self-organized criticality and other theories of complexity, anti-angiogenesis drugs and other potential “cures” for cancer, drugs that can make depressed patients “better than well,” “genes for” alcoholism, homosexuality, high IQ and schizophrenia.

And he’s right! We don’t have any cures for cancer or schizophrenia, and as he also points out, the scientific literature is littered with trash papers that can’t be replicated.

But on the other side, Gary Marcus says wait a minute, we really have learned a lot.

Yet some depressed patients really do respond to S.S.R.I.s. And some forms of cancer, especially when discovered early, can be cured, or even prevented altogether with vaccination. Over the course of Horgan’s career, H.I.V. has gone from being universally fatal to routinely treatable (in nations that can afford adequate drugs), while molecular biologists working in the nineteen eighties, when Horgan began writing, would be astounded both by the tools that have recently been developed, like whole-genome-sequencing, and the detail with which many molecular mechanisms are now understood: reading a biology textbook from 1983 is like reading a modern history text written before the Second World War. Then there is the tentative confirmation of the Higgs boson; the sequencing of Neanderthal DNA; the discovery of FOXP2, which is the first gene decisively tied to human language; the invention of optogenetics; and definitive proof that exoplanets exist. All of these are certifiable breakthroughs.

And he’s right!

See what I mean? It’s conflict and tension all the way through. The thing is that the two are looking at it from different perspectives. Horgan is asking, “how many races have we won?” and finds the results dispiriting. Marcus is asking “have we figured out how the engine works?” and is pleased to see that there is an amazing amount of solid information available.

Here, for example, are some data on cancer mortality over time. In this instance, we are actually looking at the science as a race: the faster that we can get all those lines down to zero, the happier we’ll all be.

Charts of cancer death rates over time

Weinberg, The Biology of Cancer

Look at the top graph first. That’s where we’re doing well: the data from stomach and colon and uterine cancer show that those diseases are killing a smaller percentage of people every year (although you can probably see that the curves are beginning to flatten out now). Science did that! Of course, it’s not just the kind of science that finds a drug that kills cancer; much of the decline in mortality precedes the era of chemotherapy and molecular biology, and can be credited to better sanitation and food handling (hooray for the FDA!), better diagnostic tools, and changes in diet and behavior. We’re winning the war on cancer!

Wait, hold on a sec, look at the bottom graph. It’s more complicated than that. Look at lung cancer; science was helpless against the malignant PR campaigns of the tobacco companies. Some cancers seem relentless and unchangeable, like pancreatic and ovarian cancer, and show only the faintest hint of improvement. Others, like breast cancer, held steady in their rate for a long time and are just now, in the last few decades, showing signs of improvement. It’s complicated, isn’t it? Horgan is right to point to the War on Cancer and say that the complex reality is masked by a lot of rah-rah hype.

But at the same time…Horgan got his journalism degree in 1983, and I got my Ph.D. in 1985. He’s on the outside looking in and seeing one thing; over that same time period, I’ve been on the inside (still mostly looking in), and I’ve seen something completely different.

If I could show my 1985 self what 2013 science publishes as routine, 1985 self would be gibbering in disbelief. Transgenic mice? Shuffling genes from one species to another? Whole genome sequencing? Online databases where, with a few strokes of the keyboard, I can do comparisons of genes in a hundred species? QTLs that allow us to map the distribution of specific alleles in whole populations? My career spans an era when it took a major effort by a whole lab group to sequence a single gene, to a period when a grad student could get a Ph.D. for completing the sequencing of a single gene, to now, when we put the DNA in a machine and push a button.

You can look at those charts above and wonder where the cure for cancer is, or you can look at all the detailed maps of signaling pathways that allow scientists to say we understand pretty well how cancer works. Do you realize that hedgehog was only discovered in 1980, and the activated human ras oncogene was only identified in 1982? It’s rather mindblowing to recognize that genes that we now know are central to the mechanisms of cancer have only emerged in the same short period that Horgan finds disappointing in the progression of science.

Everyone on the outside is missing the real performance!

Unfortunately, a growing problem is that some of the people on the inside are also increasingly focused on the end result, rather than the process, and are skewing science in unfortunate directions. There’s grant money and tenured positions on the line for getting that clear positive result published in Cell! As Horgan points out, “media hype can usually be traced back to the researchers themselves”. We’ve seen that with dismaying frequency; recently I wrote about how the ENCODE project seems to have fostered a generation of technicians posing as scientists who don’t understand the background of biology (and Larry Moran finds another case published in Science this week!). We’re at a period in the culture of science when we desperately need more criticism and less optimism, because that’s how good science thrives.

That’s going to be tricky to deliver, though, because the kind of criticism we need isn’t about whether we’re winning the race or not, or translating knowledge into material benefits or not, but whether the process of science is being led astray, and how that’s happening: by the distorting influence of big biomedical money, by deficiencies in training scientists in big picture science, or by burdensome biases of science publication, or by all of the above and many more.

But ultimately we need the right metrics and well-defined outcomes that we’re measuring. It doesn’t help if the NIH measures success by whether we’ve cured cancer or not, while scientists are happily laboring to understand how cell states are maintained and regulated in multicellular eukaryotic organisms. Those are different questions.

How to make a funny-looking mouse

I’m going to tell you about a paper that was brought to my attention by some poor science journalism, so first I have to complain about the article in the Guardian. Bear with me.

This is dreadfully misleading.

Though everybody’s face is unique, the actual differences are relatively subtle. What distinguishes us is the exact size and position of things like the nose, forehead or lips. Scientists know that our DNA contains instructions on how to build our faces, but until now they have not known exactly how it accomplishes this.

Nope, we still don’t know. What he’s discussing is a paper that demonstrates that certain regulatory elements subtly influence the morphology of the face; it’s an initial step towards recognizing some of the components of the genome that contribute towards facial architecture, but no, we don’t know how DNA defines our morphology.

But this is disgraceful:

Visel’s team was particularly interested in the portion of the genome that does not encode for proteins – until recently nicknamed “junk” DNA – but which comprises around 98% of our genomes. In experiments using embryonic tissue from mice, where the structures that make up the face are in active development, Visel’s team identified more than 4,300 regions of the genome that regulate the behaviour of the specific genes that code for facial features.

These “transcriptional enhancers” tweak the function of hundreds of genes involved in building a face. Some of them switch genes on or off in different parts of the face, others work together to create, for example, the different proportions of a skull, the length of the nose or how much bone there is around the eyes.

NO! Bad journalist, bad, bad. Go sit in a corner and read some Koonin until you’ve figured this out.

Junk DNA is not defined as the part of the genome that does not encode for proteins. There is more regulatory, functional sequence in the genome that is non-coding than there is coding DNA, and that has never been called junk DNA. Look at the terminology used: “transcriptional enhancers”. That is a label for certain kinds of known regulatory elements, and discovering that there are sequences that modulate the expression of coding genes is not new, not interesting, and certainly does not remove anything from the category of junk DNA.

Alok Jha, hang your head in shame. You’re going to be favorably cited by the creationists soon.

But that said, the paper itself is very interesting. I should mention that nowhere in the text does it say anything about junk DNA — I suspect that the authors actually know what that is, unlike Jha.

What they did was use ChIP-seq, a technique for identifying regions of DNA that are bound by transcription factors, to find areas of the genome actively bound by a protein called the P300 coactivator — which is known to be expressed in the developing facial region of the mouse. What they found is over 4,000 scattered spots in the DNA that are recognized by a transcription factor. A smaller subset of these thousands of regions was analyzed for patterns of activation, and three of these potential modulators of face shape were selected for knockout experiments, in which the enhancer was completely deleted.
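For readers who like the logic of a screen spelled out, here is a toy sketch of the winnowing step — start with thousands of candidate peaks, keep the ones with strong binding signal, pick a handful for follow-up. This is not the authors’ actual pipeline; the coordinates, signal values, and threshold are all invented for illustration.

```python
# Toy sketch of candidate selection from ChIP-seq "peaks": regions where
# P300 binding was detected, each with a binding-signal score.
# All data and thresholds below are invented for illustration.

def select_candidate_enhancers(peaks, min_signal=10.0, n_candidates=3):
    """peaks: list of (chrom, start, end, signal) tuples.
    Keep peaks above the signal threshold, rank by signal,
    and return the top candidates for knockout follow-up."""
    bound = [p for p in peaks if p[3] >= min_signal]   # well-supported binding only
    bound.sort(key=lambda p: p[3], reverse=True)       # strongest binding first
    return bound[:n_candidates]                        # small subset for deletion

# Hypothetical peak list (a real screen would have thousands of these).
peaks = [
    ("chr2", 105_000, 105_600, 42.1),
    ("chr7",  88_200,  88_900, 35.7),
    ("chr15", 12_400,  13_000,  8.2),   # below threshold: dropped
    ("chr4",  51_100,  51_800, 27.9),
    ("chr9",  77_300,  77_950, 11.4),
]

candidates = select_candidate_enhancers(peaks)
for chrom, start, end, signal in candidates:
    print(chrom, start, end, signal)
```

The point of the sketch is just the shape of the experiment: the 4,000-plus bound regions are the haystack, and only a few needles get the expensive treatment of a full deletion experiment.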

The genes these enhancers modulate were known to be important for facial development — knocking them out creates gross deformities of the head and face. Modifying only the enhancers leaves the actual genes intact, so you wouldn’t expect as extreme an effect.

One way to think of it is that there are genes that specify how to make an ear, for instance. So when these genes are switched on, they initiate a developmental program that builds an ear. The enhancers, though, tweak it. They ask, “How big? How high? Round or pointy? Floppy or firm?” So when you go in and randomly change the enhancers, you’d expect you’d still get an ear, but it might be subtly shifted in shape or position from the unmodified mouse ear.

And that’s exactly what they saw. The mice carrying deletions had subtle variations in skull shape as a consequence. In the figures below, all those mouse skulls might initially look completely identical, because you aren’t used to making fine judgments about mousey appearance. Stare at ’em a while, though, and you might begin to pick up on the small shifts in dimensions, shifts that are measurable and quantifiable and can be plotted in a chart.

[Figure from Attanasio et al.: skulls of enhancer-deletion mice compared with controls]

This is as expected — tweaking enhancers (which are not, I repeat, junk DNA) leads to slight variations in morphology — you get funny-looking mice, not monstrous-looking mice. Although I shouldn’t judge, maybe these particular shifts create the Brad Pitt of mousedom. That’s also why the implication that we now know exactly how DNA accomplishes its job of shaping the face is far from true: Attanasio and colleagues have identified a few genetic factors that affect craniofacial shaping, but not all of them, and they aren’t even close to working out all the potential interactions between different enhancers. You won’t be taking your zygotes down to the local DNA chop shop for prenatal genetic face sculpting for a long, long time yet, if ever.


Attanasio C, Nord AS, Zhu Y, Blow MJ, Li Z, Liberton DK, Morrison H, Plajzer-Frick I, Holt A, Hosseini R, Phouanenavong S, Akiyama JA, Shoukry M, Afzal V, Rubin EM, FitzPatrick DR, Ren B, Hallgrímsson B, Pennacchio LA, Visel A. (2013) Fine tuning of craniofacial morphology by distant-acting enhancers. Science 342(6157):1241006. doi: 10.1126/science.1241006.