Treat press releases about science with caution


Journals frequently send pre-publication copies of what they consider important papers to science journalists under an embargo: the journalists are free to research the topic and gather material for articles, but cannot publish them until the release date the journal specifies. This lets journalists write articles that put the research in context and provide alternative and critical views in a timely manner. In the hands of good science journalists, the practice enables the general public to get a reasonable sense of what new research reveals.

But it has also become common practice to issue press releases about scientific research. These releases are sometimes produced by the universities where the research is done, sometimes by the journals where it is published, and sometimes by the companies that funded the research and think the results will be good for business. The idea is to garner publicity for the research, the institution, or the journal. There are also organizations that take these releases and reproduce them verbatim on sites that look like magazines, without identifying them as press releases, and the casual reader can be fooled into thinking that they are articles produced by science journalists and thus have some credibility. TV news channels, looking for cheap ways to fill airtime, have also been known to treat the contents of these press releases as ‘news’. While I try to check whether the science articles I read are written by science journalists or are press releases, it is not always easy to spot the difference.

This can lead to highly misleading claims. One recent example was a report that a new study had found a sharp drop in obesity rates among US preschoolers.

New federal data published Tuesday show a 43 percent drop in obesity rates among children ages 2 to 5 during the past decade, providing another encouraging sign in the fight against one of the country’s leading public health problems, officials said.

In a statement, first lady Michelle Obama praised the progress in lowering obesity rates among young children and said that participation in her Let’s Move! program was encouraging healthier habits.

While this was welcomed because it reversed a disturbing trend, it was also surprising because the decrease was not accompanied by corresponding drops in other age groups as one might have expected.

It now appears that the conclusion was misleading even though the study was done by the Centers for Disease Control and the results were published in the prestigious Journal of the American Medical Association. So what happened?

It turns out that the sample of preschoolers (children ages two to five) included in the study was not large enough. For the results to be considered reliable, the study would have needed a larger sample; it included just 871 children in that age group. Because the obesity rate for this age group is fairly low, such a small sample makes errors due to random chance more likely. The CDC knew that its sample size was limited and said so in the study.
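To get a feel for how much room random chance leaves with numbers like these, here is a rough back-of-the-envelope sketch in Python. It assumes, purely for illustration, a simple random sample of 871 children and the two obesity rates reported in the paper; the CDC's actual survey design is more complex than simple random sampling, so the real uncertainty is larger than this calculation suggests.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion,
    assuming a simple random sample (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

n = 871  # children ages 2 to 5 in the sample, as reported
for p in (0.084, 0.139):  # the two obesity rates quoted in the paper
    print(f"rate {p:.1%}: about +/- {margin_of_error(p, n):.1%} from sampling alone")
```

Even under that idealized assumption the margins are a sizeable fraction of the rates themselves, and the intervals actually reported in the paper are wider still.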

Knowing all of this, the CDC scientists themselves concluded that there were no significant changes in obesity among youth or adults. They even noted that the results needed to be interpreted with caution.

So given the highly tentative nature of the authors’ own conclusions, what happened to make this into a big story? The problem was the press release issued by the CDC itself.

The latest CDC obesity data, published in the February 26 issue of the Journal of the American Medical Association, show a significant decline in obesity among children aged 2 to 5 years. Obesity prevalence for this age group went from nearly 14 percent in 2003-2004 to just over 8 percent in 2011-2012 – a decline of 43 percent.

One can hardly blame the writer of the press release, because part of the problem was that the authors themselves had said in the paper’s abstract that “There was a significant decrease in obesity among 2- to 5-year-old children (from 13.9% to 8.4%; P = .03)”, which is where the 43% reduction came from.

One had to dig deep into the paper and look at the actual results given in Table 6 to get the true picture. There you find that the prevalence went from 13.9% (95% CI: 10.8 to 17.6) in 2003-2004 to 8.4% (5.9 to 11.6) in 2011-2012, a change of -5.5 percentage points (-9.6 to -1.4). When you see how large the uncertainties are in each number, you realize that the drop should be treated with healthy skepticism.
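In fact, one can roughly reconstruct the uncertainty in that drop from just the two intervals quoted above. The short Python sketch below is only an approximation (it assumes symmetric 95% intervals and independent estimates, which the paper’s survey-adjusted methods do not exactly satisfy), but it lands close to the published figure of -5.5 (-9.6 to -1.4):

```python
def se_from_ci(lo, hi, z=1.96):
    """Infer an approximate standard error from a symmetric 95% confidence interval."""
    return (hi - lo) / (2 * z)

# Figures quoted from Table 6 of the paper (obesity prevalence, percent)
p_2003, ci_2003 = 13.9, (10.8, 17.6)   # 2003-2004
p_2011, ci_2011 = 8.4, (5.9, 11.6)     # 2011-2012

# Combine the two standard errors, assuming the estimates are independent
se_diff = (se_from_ci(*ci_2003) ** 2 + se_from_ci(*ci_2011) ** 2) ** 0.5
diff = p_2011 - p_2003
lo, hi = diff - 1.96 * se_diff, diff + 1.96 * se_diff
print(f"change: {diff:.1f} points, 95% CI roughly ({lo:.1f}, {hi:.1f})")
# -> change: -5.5 points, 95% CI roughly (-9.9, -1.1)
```

An interval that stretches from a drop of nearly ten percentage points down to barely one is a long way from a crisp ‘43 percent decline’, which is why the authors’ own caution was warranted.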

Press releases are often written by people who are generalists and have little detailed knowledge of the science they are writing about. Their job is to present the research in an interesting way, and that can lead to missteps like this one. It does not help when the authors of papers use stronger language than is justified in the abstracts and conclusions of their papers, which are the only sections that many lay people read.

Comments

  1. corwyn says

    Press releases are often written by people who are generalists and have little detailed knowledge of the science they are writing about.

    There is no obesity science in your explanation above, just statistics. I will make the claim that anyone in the field of disseminating science should have a grasp of statistics.

  2. Kevin Kehres says

    Disclaimer: My ox is being gored here. I am a former science journalist and a former public relations executive who wrote TONS of press releases on science subjects.

    1. Marcus — your characterization of the purpose of press releases is far from the truth. It’s an insult, most especially when you say their use is to “fool” people. Trust me, that’s quite impossible to “fool” anyone with a mere press release. For that, you need Koch brothers / Faux News-style long-term propaganda.

    2. It would be FAR more likely for me as a science journalist to misinterpret a paper than it would be for a press release to contain a misinterpretation. As a science journalist, I was required to write about cancer one day, the moon landings the next, “sick kid of the week” the next … and on and on. There is no way one can be a specialist in science journalism with sufficient knowledge of every subject to avoid the occasional mis-step.

    Two stories a day, a longer Sunday piece, plus long-term research and writing on a series — that was my usual work load. Try that for a week on subjects that you’re not a PhD expert on — see how you do. I’ll bet you screw up big-time at least once. Oh, and you can’t just take a paper and abstract it. You have to find one of the lead authors of the paper and some other expert on the subject, schedule and conduct interviews, and then write your piece. No cheating. You have 3 hours — better get moving. Make sure it’s 1000-1200 words, has a great lead, and interprets the findings in a way that the average 9th grade-level reader can understand. Your quotes will be fact-checked, so get them right. Oh, and if there’s a fire in the neighborhood, go cover that as well. Get quotes from the fire chief.

    3. What is your evidence that the writer of the CDC press release was not a scientist or was otherwise unqualified to interpret the data? You have none. Most of the contacts I’ve had with the CDC have been with people with at least Master’s level education, and the vast majority are PhDs and/or MDs. In addition, there are many degree and certificate programs in science and medical writing these days. The science writing world is chock-a-block full of folks with undergraduate science degrees who went on to science writing programs. So, characterizing writers as being uneducated or otherwise unfit to write about their subject is grossly in error.

    4. A press release writer does not work in a vacuum. They are not given free reign to write whatever comes into their head. They are told which parts are to be emphasized. Their work — especially the emphasis of key findings — is checked and re-checked. The “spin” comes from the top, not the bottom. If there’s someone to “blame” for emphasizing the statistics in the abstract and not in the body of the paper, that would come from WAY above the writer’s pay grade. The signs are right there in the release — the White House wanted these results to be emphasized. But you’re blaming the writer for misinterpreting the data. Unfair and inaccurate.

  3. corwyn says

    Trust me, that’s quite impossible to “fool” anyone with a mere press release.

    That is objectively false. I cite hundreds of April fools day press releases as examples. But you have included the standard preface to a lie: ‘trust me’. Those words always indicate that a lie follows.

    The rest is just an attempt at blame shifting.

  4. dysomniak "They are unanimous in their hate for me, and I welcome their hatred!" says

    I’m with Marcus, a press release is biased by definition. And of course it’s painfully predictable that some PR flack would wander in to blather about how they don’t do what anyone with a brain can plainly see them doing every day. Just like on every story about police violence some authoritarian creep has to show up to go on about how most unfair it is to judge cops based on the civilians they shoot.

  5. John Horstman says

    I’m still disturbed most by the proportion of both science journalists and SCIENTISTS THEMSELVES who misrepresent correlations as causal relationships, entirely without justification.

  6. Rob Grigjanis says

    corwyn @4:

    Those words always indicate that a lie follows.

    ‘Always’: That word usually signals an overgeneralization.

    Marcus Ranum @2:

    Why does anyone release anything to the press: to manipulate or fool them.

    Or, you know, to get important info out there.
