This NPR story is about a study, published in the Journal of the American Medical Association, which attempts to estimate the percentage of deaths from heart disease and diabetes associated with 10 specific dietary factors — things like too much sodium and processed meats, and not enough fruits and vegetables. The authors based their estimates on data from a large federal nutrition survey and other studies, most of which were observational in design.
Here’s the good news: The story puts a price tag on what diabetes costs the health care system, which is a great way to demonstrate why people should care about this kind of research. It also does a very nice job of cataloging the limitations of a study like this. It specifically acknowledges that observational research is subject to confounding bias, and that factors like socioeconomic status and physical activity might have skewed the results.
The bad news? These caveats come deep into the story, long after a number of untenable claims have been made, starting with the clickbait-ish headline: “Eating More — Or Less — Of 10 Foods May Cut Risk Of Early Death.” We wrote some tips on how to write better headlines for diet stories last week.
A complete discussion of the limitations of this study would have informed readers that observational studies cannot prove cause and effect. Thus, the story’s claims derived from such studies, including that specific dietary factors “may cut risk” … or “can help raise or lower the risk of death” … or “contribute to about 6-8 percent of deaths,” inappropriately overstate the evidence. All we can say is that there is a statistical association between these factors and death, not that one caused the other.
Overstatement of evidence from nutrition studies is rampant. And it’s one of the factors that contributes to diet news whiplash, where what’s good for you or bad for you seemingly changes from week to week and study to study. We published a broader analysis of this issue today on our blog.
The story addresses cost at both the individual and societal level, noting that poor diets are “linked to billions of dollars in healthcare spending. For instance, diabetes costs the U.S. $245 billion a year. In the U.S., a woman with diabetes incurs, on average, about $283,000 in lifetime health care costs. (Many cost studies don’t separate Type 1 and Type 2 diabetes.)” That’s helpful context.
The story includes some numbers, for instance that “Consuming too much salt was associated with 9.5 percent of the deaths” and that “diets low in seafood, whole grains and fruits and vegetables were found to contribute to about 6-8 percent of the deaths.” It also paints the larger picture: “In 2012, about 700,000 Americans died from these diseases. Diet was linked to nearly 319,000 of these deaths.”
This is a satisfactory summary of the numbers as far as they go, though the story could have provided more examples of how much more or less of these foods people would have to eat in order to achieve the claimed benefits. (How much salt is too much?) The bigger issue is the claim that certain dietary factors “contributed” to deaths, which overstates what the evidence can tell us. We’ll address that issue below under the “Evidence” criterion.
The likelihood of suffering harm from adopting any of this diet advice seems low, though concerns have been raised about the health effects of cutting sodium to very low levels. We’ll rate this Not Applicable.
The story makes repeated cause-and-effect leaps that aren’t justified by the evidence, such as the “may cut risk” and “contribute to … deaths” claims quoted above.
We were also concerned that the story gives a sense of precision that probably isn’t warranted. Sugary drinks were a factor in “7.4% of deaths”? Are we sure it isn’t 7.5%? Or 7.3%? By giving a number with that much precision, the article implies a much higher level of exactness than the evidence could produce.
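To see why that lone decimal place conveys false exactness, here’s a minimal sketch (in Python, with entirely hypothetical counts, since the story doesn’t report the underlying numbers) of the sampling uncertainty that surrounds any such proportion:

```python
import math

# Hypothetical illustration only: the story doesn't report the underlying
# counts, so suppose an estimate of "7.4% of deaths" came from n = 10,000
# observed deaths, 740 of them attributed to sugary drinks.
n = 10_000
k = 740
p = k / n  # point estimate: 0.074

# Wald 95% confidence interval for a proportion
se = math.sqrt(p * (1 - p) / n)
lo, hi = p - 1.96 * se, p + 1.96 * se
print(f"Estimate: {p:.1%}, 95% CI: {lo:.1%} to {hi:.1%}")
# -> Estimate: 7.4%, 95% CI: 6.9% to 7.9%
```

Even under these idealized assumptions the interval spans a full percentage point, and the study’s risk modeling adds further uncertainty on top of simple sampling error.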
We offer a primer to help journalists do better when describing observational research: “Observational studies: Does the language fit the evidence? Association vs. causation.”
Although the premise that poor diet is causing between 500 and 1,000 cardiovascular and diabetes deaths per day is presented with too much certainty, this doesn’t constitute disease-mongering.
The story quotes from an accompanying editorial, which leads to some useful discussion of the limitations of observational research.
The story discusses exercise as another lifestyle factor that might be associated with reduced risk of death. Smoking could also have been mentioned.
In addition, stress management and solid social connections are believed to be important lifestyle factors affecting health, yet they are unlikely to be captured in nutrition-related questionnaires.
Availability of the foods discussed in the story is not an issue, which is why we’ll rate this Not Applicable. However, the fact that so many “bad for you” foods are so much more readily available today than they used to be is something the story could have touched on. In addition, such foods are often cheaper than healthier alternatives in many communities.
The story doesn’t establish what exactly is novel about the research. All of the factors described in the story have previously been associated with higher risk of death and disease, yet the story didn’t distinguish these findings from that past research. And it let this quote go by unchallenged:
“The good news is that we now understand which foods we need to target to prevent Americans from dying prematurely from cardiometabolic diseases,” says lead study author Renata Micha, a public health nutritionist and epidemiologist at the Friedman School at Tufts University.
“We now understand” is a claim of novelty, yet it’s also a causal claim that should have been challenged more forcefully somewhere in the story.
The story includes original reporting and doesn’t rely too heavily on this news release.