Joy Victory is Deputy Managing Editor of HealthNewsReview.org. She tweets as @thejoyvictory.
A new study confirms something we here at HealthNewsReview.org have been emphasizing for many years: Health news stories often overstate the evidence from new research, inaccurately claiming that one thing causes another — as in claims that drinking alcohol might help you live longer, that facial exercises may keep your cheeks perky, or that diet soda might be a direct line to dementia.
The researchers looked at the 50 “most-shared academic articles and media articles covering them” in 2015, according to data from the NewsWhip database. Seven of the 50 studies were randomized controlled trials, the gold standard for “causal inference” in medicine (meaning one can reasonably, though not always, infer that an intervention caused an outcome).
The rest were observational studies, which are what they sound like: researchers observe people and then see what happens to them (or what happened to them, if the study looks at data collected in the past). They are not true experiments with a control or placebo group. Sometimes, with lots of observational data (after long-term, repeated findings in thousands of people across different studies that used terrific methodology), the evidence becomes so strong that it can make sense to change public health or medical practice based on observational data alone. Smoking and lung cancer is one such case. But it’s also clear that the literature has become littered with poorly done observational studies that make causal claims that cannot be supported.
They found a “large disparity” between what was written in the news stories compared to what the research showed:
One way that news stories can overreach is by inaccurately using language that implies x caused y:
In many cases, the language needed to be dialed back to better describe the research. For help on learning how to do this, see:
Lead study author Noah Haber, a postdoctoral researcher at the University of North Carolina at Chapel Hill, said he’s always been interested in how research may or may not determine that a health exposure leads to an outcome, which he calls “causal inference.”
At the same time, he noticed that many of his friends were sharing health news articles on Facebook and Twitter that didn’t accurately describe the research (as in “new study shows drinking red wine seems to result in people living longer”).
“This study takes that process and takes it to research-driven extremes, where we’re looking at what is being shared across all of the Internet in 2015,” he explained.
We’ve found that news stories about coffee research are often overstated.
Among the popular stories in 2015, some usual suspects made the list: diet, coffee/caffeine, pregnancy/childbirth, green space, medical devices/treatments, pets, and air pollution were toward the top. Some interesting outliers included the impact of horror movies, birth order, and weekend hospital admissions. These were linked to a wide range of outcomes on things like mood/mental health, cardiovascular disease, IQ, mortality and BMI (and many others). All of the data can be seen here.
Well-known institutions and journals were often the sources. For example, studies from Harvard University were covered in 18% of the news stories. The stories that were shared most often were produced by outlets we review regularly, including CBS News, the New York Times, The Guardian, Los Angeles Times and NPR.
The finding that 58% of the stories inaccurately reported the evidence closely matches our own figure of 61% for the more than 2,500 news stories we’ve reviewed and assessed on our evidence quality criterion.
Haber emphasized that the study didn’t conclusively pinpoint who’s to blame for the misinformation. The published studies themselves slightly overstated the evidence, for example. And, as we’ve learned from reviewing news releases, publicity is often a common source of misinformation. Haber’s work didn’t look at news releases, though he hopes to investigate that in the future.
Ideally, well-trained journalists should scrutinize the news releases and the original research to look for problems that might produce misleading assumptions. That is the role of the journalist, after all.
All of this matters, he said, because people may make health decisions based on the misinformation they’ve read, a problem we are currently exploring in our series, Patient Harms from Misleading Media.
The media attention lavished on these topics also may have other unintended consequences. For example, it may encourage more researchers to study issues that they see grabbing headlines in major news outlets — since those questions may be viewed as having greater public importance and greater potential to advance careers.
“It also crowds out a lot of the good science information and changes the landscape of what people are producing,” Haber said. “There’s a feedback loop in these things.”
Comments (2)
Please note, comments are no longer published through this website. All previously made comments are still archived and available for viewing through select posts.
Jim Pantelas
June 14, 2018 at 11:57 am

This is a well-written piece. But can I add a thought about your reference to smoking and lung cancer?
The lung cancer community has been battling the notion and stigma associated with the disease’s relationship to smoking since the ’60s. While it is true that smoking is a contributor to the disease, it is also well known that smoking is a contributor to multiple other cancers, heart disease, and a variety of other illnesses. The fact that most publications, yours included, readily mention smoking and lung cancer, but do not do so with all of these other diseases, helps to perpetuate the stigma.
Further, more than 15% of people diagnosed with lung cancer are never-smokers, and over 50% are people who have not smoked in many years. As a lung cancer survivor, I find that the most common first statement that follows an admission of the disease is “how long did you smoke?” The assumption is almost universal.
I mention this because it inhibits our ability to raise money for research, to attract more clinicians to the cause of lung cancer research, and to move the lung cancer community, including the American Lung Association and the American Cancer Society, beyond funding smoking cessation programs and into actual, curative research. And it leaves those of us with the disease bearing the blame for an illness that is killing over 440 people every day in the US.
As for the 15% who never smoked, and the over 50% who haven’t smoked in years, the stigma also puts them at a unique disadvantage because it delays the detection of their disease until it is simply the last thing to look for. The number of young, healthy, athletic women being diagnosed with stage IV disease is astounding. But because they never smoked, their disease often isn’t even looked for until it is too late.
I know that 15% being never-smokers seems like a small percentage, but the actual numbers are not small because so many people are diagnosed every year. In the US alone, over 260,000 people will be diagnosed this year, and over 157,000 will die. Lung cancer is the single largest cancer killer of both women and men, and it kills more people than the next four cancers combined.
Can you help us to curtail that stigma? Can you help us to let the world know that smoking isn’t a cause of lung cancer alone? And can you help us to let people know that smoking is far more dangerous than just being a contributor to lung cancer?
Thank you.
Peggy Zuckerman
June 14, 2018 at 2:15 pm

I have seen the headlines in medical news coverage that overstate the findings of clinical trials. A headline touting “new and better,” etc., often does not reflect the context of the study, i.e., that the studied patients were terribly sick, or carefully selected NOT to be terribly sick. Of course, we all must learn to read the whole article and see if there are errors of omission in the reporting. Gaping holes in the details, or the lack of any peer comments, is a clue. And in oncology or other highly specialized fields, there may be little value in getting the view of a general physician as to the true impact.
Our Comments Policy
But before leaving a comment, please review these notes about our policy.
You are responsible for any comments you leave on this site.
This site is primarily a forum for discussion about the quality (or lack thereof) in journalism or other media messages (advertising, marketing, public relations, medical journals, etc.). It is not intended to be a forum for definitive discussions about medicine or science.
We will delete comments that include personal attacks, unfounded allegations, unverified claims, product pitches, profanity or any from anyone who does not list a full name and a functioning email address. We will also end any thread of repetitive comments. We don’t give medical advice so we won’t respond to questions asking for it.
We don’t have sufficient staffing to contact each commenter who left such a message. If you have a question about why your comment was edited or removed, you can email us at feedback@healthnewsreview.org.
There has been a recent burst of attention to troubles with many comments left on science and science news/communication websites. Read “Online science comments: trolls, trash and treasure.”
The authors of the Retraction Watch comments policy urge commenters:
We’re also concerned about anonymous comments. We ask that all commenters leave their full name and provide an actual email address in case we feel we need to contact them. We may delete any comment left by someone who does not leave their name and a legitimate email address.
And, as noted, product pitches of any sort – pushing treatments, tests, products, procedures, physicians, medical centers, books, websites – are likely to be deleted. We don’t accept advertising on this site and are not going to give it away free.
The ability to leave comments expires after a certain period of time. So you may find that you’re unable to leave a comment on an article that is more than a few months old.