Study: Lots of red meat increases mortality risk

Rating

4 Stars

Our Review Summary

This story covers a large study that reports a statistical association between a diet high in red meat and a higher death rate.

The report does a number of things well:

  • It indicates early on that the results support previous findings and current recommendations about red meat, while adding new data.
  • It quotes a variety of sources, including two self-interested representatives of the meat industry who question the findings. This attention to the opposition enhances rather than undercuts the report’s credibility.
  • It translates the findings into dinner-table terms: high consumption in this study is defined as about a quarter-pound hamburger per day, while low consumption is 5 ounces per week. That’s genuinely useful to readers.

The report has two significant shortcomings.

  1. The language of the story (e.g., “lots of red meat increases mortality risk”) suggests that cause and effect has been established, when this kind of observational study simply cannot establish causation; it can only point to a statistical association. The story never made that clear, so even discussing “risk” is an overstatement.
  2. Then, when it chose to discuss “risk,” the story used only relative data (“a 22 percent lower risk of death” from cancer for low-red-meat eaters, for example), not absolute data (1.3 percent compared to 0.06 percent); the sketch below makes the distinction concrete. But once the story started using “risk” terminology without explaining that cause and effect cannot be firmly established by such a study, it had already missed the mark.
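To make that distinction concrete, here is a minimal arithmetic sketch in Python. The figures are hypothetical round numbers chosen only for illustration (they are not the study’s data); the point is that the same gap can sound dramatic in relative terms and modest in absolute terms.

```python
# Relative vs. absolute risk: a minimal sketch with hypothetical numbers.

high_red_meat = 0.0128  # hypothetical: 1.28% of the high-consumption group died of cancer
low_red_meat = 0.0100   # hypothetical: 1.00% of the low-consumption group died of cancer

# Relative difference: how a "22 percent lower risk" style of figure is derived.
relative_difference = (high_red_meat - low_red_meat) / high_red_meat

# Absolute difference: the actual gap in death rates between the groups.
absolute_difference = high_red_meat - low_red_meat

print(f"Relative difference: {relative_difference:.0%}")   # 22% -- sounds dramatic
print(f"Absolute difference: {absolute_difference:.2%}")   # 0.28% -- sounds modest
```

Neither number is more “true” than the other, which is exactly why readers need both.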

We know we hold the bar high with this expectation that stories explain the quality of the evidence. But this is not simply splitting hairs; it is inaccurate to use language that does not fit the evidence. Journalists and consumers alike should review the excellent guest column by Mark Zweig and Emily DeVoto, entitled “Does The Language Fit the Evidence? – Association Versus Causation.”

One of our medical editors wrote to me about this topic:

“I can tell you, the average practicing physician (one of the presumed target audiences for the medical literature) has a less than firm grasp of this difference! I have been teaching evidence-based medicine for years, and it seems that our residents finally get this by the end, but I don’t think it’s a skill that was routinely taught in medical training before the past decade or two, nor one that sticks with most docs.”

None of this denigrates the importance of the study or the quality of the work. But all studies have limitations and news about research should explain those limitations.

Criteria

Does the story adequately discuss the costs of the intervention?

Not Applicable

No treatment is involved, so there are no costs to discuss. Speculation about the relative costs of high-red-meat vs. high-white-meat diets would not be productive.

Does the story adequately quantify the benefits of the treatment/test/product/procedure?

Not Satisfactory

The reporter does a fair job describing the study and the findings.

The article includes such details as the size of the study population, the duration of the study, and the method of data collection.

But, as already discussed, this kind of study can’t establish cause and effect or risk, only statistical associations. And once the story started describing risk, it did so incompletely.

It compares the relative risk of death from heart disease and cancer in the highest- vs. lowest-consumption groups for both men and women.

The story suffers, though, because it describes the findings only in terms of relative risk: men who ate the most red meat had a 27 percent higher risk of dying from heart disease than those who ate the least, for instance.

It fails to put these percentages into context. In the example above, about 1.2 percent of the men who ate the most red meat died of heart disease during the 10-year study, compared to 0.06 percent of those who ate the least.

Looking at male deaths from all causes, about 4 percent of the highest-red-meat group died, compared to about 2 percent of the lowest-red-meat group.

Overall, about 13 percent of study participants died during the 10-year study.

This is not to suggest the findings are insignificant. But reporters should always include absolute data to help readers understand the size of the risk an individual faces; the sketch below shows how much the framing matters. (Though, as already stated, risk language was risky in this story to begin with.)
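As an illustration only, here is a short sketch using the story’s own approximate all-cause figures, showing how differently the same association reads in the two framings. (And as noted above, even “risk” framing overstates what an observational study can show.)

```python
# The same association framed two ways, using the approximate all-cause
# figures quoted above (illustration only: association, not causation).

most_red = 0.04   # ~4% of the highest-consumption group died over 10 years
least_red = 0.02  # ~2% of the lowest-consumption group died over 10 years

relative_framing = (most_red - least_red) / least_red  # reads as "100% higher risk"
absolute_framing = most_red - least_red                # reads as "2 more deaths per 100 people"

print(f"Relative framing: {relative_framing:.0%} higher")
print(f"Absolute framing: {absolute_framing:.0%} of the group, i.e., 2 in 100 people")
```

A “100 percent higher” relative figure and a “2 in 100” absolute figure describe the same data; only the second tells a reader the size of the difference.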

Does the story adequately explain/quantify the harms of the intervention?

Satisfactory

The story focuses plainly on the death risks of the diets studied. However, risks are stated only in relative terms, not absolute ones. We won’t penalize the story for that here, since we already do under “quantification of benefits.”

Does the story seem to grasp the quality of the evidence?

Satisfactory

The report is based on results of a large, prospective epidemiological dietary study.

The reporter mentions the important caveats: the study group may be healthier than the general adult population, and the data are based on self-reports of eating habits, which can be unreliable.

We’ve already commented on the “association does not equal causation” issue, so we won’t rule this criterion unsatisfactory for that flaw.

Does the story commit disease-mongering?

Not Satisfactory

Because the story confuses association with causation, it does commit disease-mongering, repeatedly. It may frighten readers into thinking that cause and effect has been established when, in fact, only a statistical association has been shown. For example, the story says:

  • Headline: “Lots of red meat increases mortality risk.” Risk cannot be firmly established by such an observational study.
  • “22% higher risk of dying of cancer….” is an example of causal language that isn’t warranted by this kind of study.

It’s a common pitfall of such stories. Association does not equal causation. And it’s not just journalists who get it wrong: scientists and physicians commonly express this incorrectly, too. But that doesn’t make it acceptable. Most stories we saw on this study stated it incorrectly.

Does the story use independent sources and identify conflicts of interest?

Satisfactory

The reporter does an excellent job sourcing the story. Sources include:

  • the lead author
  • the author of a related editorial
  • one independent diet expert
  • two industry sources that dispute the findings

The reporter gets extra points for including two industry sources. While their comments appear very late in the story, including them makes the story seem, and actually be, more credible.

There are no financial conflicts to report.

Does the story compare the new approach with existing alternatives?

Satisfactory

In several places, the article appropriately indicates that limiting red meat in the diet may improve health.

The reporter carefully avoids implying that more extreme options, such as a meatless diet, would cut death risk further.

Does the story establish the availability of the treatment/test/product/procedure?

Not Applicable

This is a study of subjects’ voluntary diets, so no “treatment” is involved.

Does the story establish the true novelty of the approach?

Satisfactory

The story makes no claims for the novelty of this kind of research or this general finding.

In the second paragraph, the report states that these findings “bolster prior evidence” about the risks of diets high in red meat.

Does the story appear to rely solely or largely on a news release?

Satisfactory

Given the number of sources cited, the story does not appear to rely on the Archives of Internal Medicine press release or any other news release.

Total Score: 6 of 8 Satisfactory
