Health News Review

I shuddered as soon as I read the BMJ news release headline, which read: “Estimated risk of breast cancer increases as red meat intake increases.”  I shuddered because I predicted to myself that many headlines, if not complete news stories, would report this as proof of cause and effect.  Or, at the very least, caveats about association ≠ causation would be missing.

Call it splitting hairs if you wish.  But the wording matters.  And the wording of the BMJ news release – in my opinion – can easily mislead one into thinking that the study in question made a causal link – not just a statistical association.  Let me break it down.  The release – in short – stated that increased red meat intake increases breast cancer risk.  “Increases risk” is a statement of causation.

Did it mislead journalists?  I can’t prove that.  I’m quick to say I can’t establish cause-and-effect.  But I will suggest there may have been an association between the BMJ news release and stories headlined like this:

Medical News Today asked: “Could red meat consumption increase breast cancer risk?”  (This is the old “question mark” form of journalism that Jon Stewart has mocked on the Daily Show.)

In this case, if you don’t understand observational studies, or you don’t understand what “association ≠ causation” means, and you have a major medical journal’s news release using causal language… but you’re still confused or uncertain… you just ask a provocative question, slap a question mark on the end, and you’re safe.

KARE-11, the NBC station in Minneapolis-St. Paul, had no such uncertainty.  They made it a statement of fact, headlining: “Study: Red meat intake could increase breast cancer.”  (Thereby answering Medical News Today’s question, “Could it?”)

Fox News in Philadelphia stated, “Red meat may raise breast cancer risk.”  Yes, Fox/Philly, it may.  But then again it may not.

The Daily Express in the UK made a statement of fact: “Red meat raises the risk of breast cancer by a quarter.”  Nope.  That’s inaccurate.  “Raises risk” implies proof of causation.

The UK’s Daily Mail reported: “Three bacon rashers a day raises breast cancer risk for young women.”  Huh?  Where did that come from?

Nature World News reported: “Red Meat Ups Breast Cancer Risk.”  Nope.  Statistical association, not causation. “Ups risk” implies cause and effect.

There were many other examples of stories that inappropriately used causal language to describe the study in question.

The Behind the Headlines service in the UK did its usual fine job explaining the evidence, stating at the end: “it should not be concluded from this particular study alone that red meat and processed meat increase the risk of breast cancer.”

And I was pleased to see that at least one news organization – there may have been others – explicitly addressed my pet peeve.  The Guardian quoted a cancer epidemiologist who said, “association does not necessarily imply causation.”

There you have it.  6 little words.  That’s all I was looking for. The rest of the BMJ news release was OK; in fact, it mentioned association four times.  But it never gave the 6 words that journalists – and the public – need to hear about such observational studies.

The BMJ could have included those 6 little words and perhaps helped avoid all of the misreporting – and certainly avoided another one of my rants against their news releases.

Please, all offending parties, if you haven’t done so already, read our primer: “Does The Language Fit The Evidence? – Association Versus Causation.”

The words matter.  Public comprehension of research matters.  Avoiding public confusion matters.

And none of this is to suggest that the observational study in question wasn’t important.  It is important work.  It may point to a strong statistical association – and that alone may be enough to guide public recommendations.  But it is simply wrong to use causal verbs to imply that this one study established cause-and-effect.  Journalists should care about that point of accuracy.  The public should care.  And journals that publish news releases should care.





R.W. Carmichael posted on June 11, 2014 at 3:16 pm

Good article! We have known for around 3,500 years that correlation does not equal causation, but the word seems to be very slow to get to medical schools. One problem is that much of this sort of research is commissioned or done by MDs with no further credentials. Medical doctors have absolutely no training or skills for designing, conducting, or evaluating these sorts of studies. They are easily misled by their own data as a result.

Zach posted on June 11, 2014 at 5:18 pm

Do journalists or journals get paid per click (or article view)? Because if so, I could see this problem continuing for eternity because their motives will be to create the most stirring headlines even if they are false or misleading.

Loren posted on June 11, 2014 at 11:25 pm

As with democracy itself (depending for its survival on an educated citizenry), perhaps what is really missing here is a course in media criticism. With so much information and media in their lives, shouldn’t children have the benefit of at least one course teaching them about credibility, bias, and propaganda?

Also, I think the problem you point at here is the ability of science to translate findings into the lives of so-called ordinary people. Okay, the journalists got it wrong from a technical perspective, but what about the hundreds of thousands of women who will be told they have breast cancer this year? If their parents had known during their child-raising years that eliminating (or reducing) meat consumption might be associated with reducing the chance of their daughter getting breast cancer, wouldn’t they call it splitting hairs to care about the difference between association and causation? Especially if reducing meat consumption had no negative side effects? Especially if the same Nurses’ Health Study data has found that the leading cause of death for the nurses was/is cardiovascular disease, which also has a risk-factor association with dietary cholesterol intake? Especially if that is also the leading cause of death for everyone living in the same country as the nurses? Shouldn’t that be the real news story here?

Trish Groves posted on June 12, 2014 at 6:07 am

I agree, of course, with the general point. And I’m well aware that nutritional epidemiology is often overinterpreted – by many authors, editors, press officers, and the media – with spurious causal associations.

But I really think you’ve been too hard on this one. Indeed, I’d argue that this is a good example of how to report a nutritional epidemiology study. Sure, it’s an observational study, but it’s very cautiously interpreted. It’s here, with open access for all to read:

Farvid MS, Cho E, Chen W, Eliassen AH, Willett WC. Dietary protein sources in early adulthood and breast cancer incidence: prospective cohort study. BMJ 2014;348:g3437

Anyway, thanks for linking to the BMJ press release.

As with all The BMJ’s press releases, this was written by an experienced press officer, approved by the study’s authors before final sign-off by me, and sent out with the full paper plus a short embargo to allow journalists to read the study, contact the authors with questions, and confirm their stories.

I know you’ve previously said that editors are overly defensive. Probably fair comment.
But we welcome criticism and feedback. That’s why The BMJ has such flourishing post-publication peer review and commenting – with 95,990 Rapid Responses (eletters) posted as of today.

I hope your readers actually read the press release and the paper before commenting here. And, if they have specific criticisms of The BMJ’s papers and/or press releases (which are always posted alongside the papers) I hope they’ll send us Rapid Responses so that we and the studies’ authors can keep trying to provide clear, accurate information.

Dr Trish Groves, Head of Research, The BMJ
Competing interests: I chair The BMJ’s research manuscript committee and I approve all press releases about The BMJ’s research papers.

    Gary Schwitzer posted on June 12, 2014 at 6:50 am


    Thanks for your note.

    6 words. That’s all I’m looking for.

    The Guardian used them.

    I wish BMJ news releases did.

Trish Groves posted on June 12, 2014 at 8:47 am

Good idea.
We will, forthwith.


Trish Groves, The BMJ

    Larry Husten posted on June 12, 2014 at 9:30 am

    I agree with Gary here, but I would go a bit further. I think those 6 words are the bare minimum. It would be preferable if journals used these press releases to educate journalists (and, by extension, their readers) more generally about the limitations of observational studies. Of course, authors and editors are often eager to encourage misinterpretation, so this limitation is frequently downplayed – not only in the original paper but in the press release.