
Can’t say we didn’t warn you: Study finds popular health news stories overstate the evidence

Joy Victory is Deputy Managing Editor of HealthNewsReview.org. She tweets as @thejoyvictory.

A new study confirms something we here at HealthNewsReview.org have been emphasizing for many years: Health news stories often overstate the evidence from a new study, inaccurately claiming that one thing causes another: that drinking alcohol might help you live longer, that facial exercises may keep your cheeks perky, or that diet soda might be a direct line to dementia.

Association is not the same as causation.

The researchers looked at the 50 “most-shared academic articles and media articles covering them” in 2015, according to data from the NewsWhip database. Seven of the 50 studies were randomized controlled trials, the gold standard for “causal inference” in medicine (meaning one can reasonably infer, though not always, that an intervention caused an outcome).

The rest were observational studies, which are what they sound like: studies that observe people and then see what happens to them (or what happened to them, if they draw on data collected in the past). They are not true experiments with a control or placebo group. Sometimes, with lots of observational data, after long-term, repeated findings in thousands of people from different studies that used terrific methodology, the evidence becomes so strong that it can make sense to change public health or medical practice based on observational data alone. Smoking and lung cancer is one such case. But it’s also clear that the literature has become littered with poorly done observational studies that make causal claims that cannot be supported.
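To see why an observational association can mislead, here is a minimal simulation sketch of our own (not from Haber’s study, and with invented names and numbers): a hidden trait, such as general health-consciousness, drives both an exposure (coffee drinking, in this toy example) and the outcome (lifespan). The exposure does nothing at all, yet the observational comparison shows a sizable difference, while a randomized comparison does not.

    # A toy simulation (ours, not from the study): a hidden trait drives both the
    # exposure and the outcome, so the exposure looks beneficial even though it does nothing.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    health = rng.normal(size=n)                           # hidden confounder: health-consciousness

    # Observational world: health-conscious people are more likely to drink coffee
    coffee_obs = (health + rng.normal(size=n)) > 0
    lifespan_obs = 80 + 2 * health + rng.normal(size=n)   # lifespan depends only on the confounder

    # Randomized world: a coin flip assigns coffee, severing the link to the confounder
    coffee_rct = rng.random(n) < 0.5
    lifespan_rct = 80 + 2 * health + rng.normal(size=n)

    obs_diff = lifespan_obs[coffee_obs].mean() - lifespan_obs[~coffee_obs].mean()
    rct_diff = lifespan_rct[coffee_rct].mean() - lifespan_rct[~coffee_rct].mean()

    print(f"Observational difference: {obs_diff:.2f} years")  # roughly +2 years, entirely spurious
    print(f"Randomized difference:    {rct_diff:.2f} years")  # close to zero

The only difference between the two comparisons is how the exposure was assigned, which is exactly the gap between an observational study and a randomized trial.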

They found a “large disparity” between what was written in the news stories and what the research showed:

  • “44% of the media articles used causal language that was stronger than the academic articles” (and many of those studies were overstated to start with).
  • “58% of the media articles contained at least one substantial inaccuracy about the study.”

X ‘may be caused by’ Y

One way that news stories can overreach is by inaccurately using language that implies x caused y:

  • “may be caused by”
  • “seems to result in”
  • “is caused by”
  • “is due to”

In many cases, the language needed to be dialed back to better describe the research.

Lead study author Noah Haber, a postdoctoral researcher at the University of North Carolina at Chapel Hill, said he’s always been interested in how research may or may not determine that a health exposure leads to an outcome, which he calls “causal inference.”

At the same time, he noticed that many of his friends were sharing health news articles on Facebook and Twitter that didn’t accurately describe the research (as in “new study shows drinking red wine seems to result in people living longer”).

“This study takes that process and takes it to research-driven extremes, where we’re looking at what is being shared across all of the Internet in 2015,” he explained.

Aligns closely with our own reviews of news stories

We’ve found that news stories about coffee research are often overstated.

Among the popular stories in 2015, some usual suspects made the list: diet, coffee/caffeine, pregnancy/childbirth, green space, medical devices/treatments, pets, and air pollution were toward the top. Some interesting outliers included the impact of horror movies, birth order, and weekend hospital admissions. These exposures were linked to a wide range of outcomes, including mood/mental health, cardiovascular disease, IQ, mortality and BMI (among many others). All of the data can be seen here.

Well-known institutions and journals were often the sources. For example, studies from Harvard University were covered in 18% of the news stories. The stories that were shared most often were produced by outlets we review regularly, including CBS News, the New York Times, The Guardian, Los Angeles Times and NPR.

The finding that 58% of the stories inaccurately reported the evidence closely matches our own figure of 61% for the more than 2,500 news stories we’ve reviewed and assessed on our evidence quality criterion.

Where does the misinformation start?

Haber emphasized that the study didn’t conclusively pinpoint who’s to blame for the misinformation. The published studies themselves slightly overstated the evidence, for example. And, as we’ve learned from reviewing news releases, publicity is a common source of misinformation. Haber’s work didn’t look at news releases, though he hopes to investigate that in the future.

Ideally, well-trained journalists should scrutinize both the news releases and the original research to look for problems that could mislead readers. That is the role of the journalist, after all.

All of this matters, he said, because people may make health decisions based on the misinformation they’ve read, a problem we are currently exploring in our series, Patient Harms from Misleading Media.

The media attention lavished on these topics also may have other unintended consequences. For example, it may encourage more researchers to study issues that they see grabbing headlines in major news outlets — since those questions may be viewed as having greater public importance and greater potential to advance careers.

“It also crowds out a lot of the good science information and changes the landscape of what people are producing,” Haber said. “There’s a feedback loop in these things.”

More: All of our content on observational research


Comments (2)


Jim Pantelas

June 14, 2018 at 11:57 am

This is a well-written piece. But can I add a thought about your reference to smoking and lung cancer?

The lung cancer community has been battling the notion and stigma associated with its relationship to smoking since the 60s. While it is true that smoking is a contributor to the disease, it is also well known that smoking is a contributor to multiple other cancers, heart disease, and a variety of other illnesses. The fact that most publications, yours included, readily mention smoking and lung cancer, but do not do so with all of these other diseases, helps to perpetuate the stigma.

Further, more than 15% of people diagnosed with lung cancer are never-smokers, and over 50% are people who have not smoked in many years. As a lung cancer survivor, I can tell you the most common first response to an admission of the disease is “How long did you smoke?” The assumption is almost universal.

I mention this because it inhibits our ability to raise money for research, attract more clinicians to the cause of lung cancer research, and move the lung cancer community, including the American Lung Association and the American Cancer Society, beyond their funding of smoking cessation programs and into actual, curative research. And it leaves those of us with the disease bearing the blame for an illness that is killing over 440 people every day in the US.

As for the 15% who never smoked, or the over 50% who haven’t smoked in years, it also puts them at a unique disadvantage because it inhibits the detection of their disease until it is simply the last thing to look for. The number of young, healthy, athletic women who are being diagnosed with stage IV disease is astounding. But, because they never smoked, their disease is often not even looked for until it is too late.

I know that 15% being never-smokers seems like a small percentage, but the actual numbers are not small, because so many people are diagnosed every year. In the US alone, over 260,000 people will be diagnosed this year, and over 157,000 people will die. Lung cancer is the single largest cancer killer of women and of men, and it kills more people than the next four cancers combined.

Can you help us curtail that stigma? Can you help us let the world know that smoking isn’t the only cause of lung cancer? And can you help us let people know that smoking is far more dangerous than just being a contributor to lung cancer?

Thank you.

Peggy Zuckerman

June 14, 2018 at 2:15 pm

I have seen the headlines in medical news coverage that overstate the findings of clinical trials. Headlines touting “new and better,” etc., often do not reflect the context of the study, e.g., that the studied patients were terribly sick, or carefully selected NOT to be terribly sick. Of course, we all must learn to read the whole article and see if there are errors of omission in the reporting. Gaping holes in the details, or the lack of any peer comments, are a clue. And in oncology or other highly specialized fields, there may be little value in getting the view of a general physician as to the true impact.