Here we go again. Headlines across America today blaring lines like “Coffee may reduce stroke risk.”
It was a big study, but an observational study: not a trial, not an experiment. And, as we've said so many times on this site that you could almost join in with the chorus, observational studies have inherent limitations that should always be mentioned in stories. They can show a strong statistical association, but they can't prove cause and effect, so they can't prove benefit or risk reduction. And stories should say that.
USA Today, for example, did not explain that in its story. Nor did it include any of the limitations that were included in, for example, a HealthDay story, which stated:
“The problem with this type of study is that there are too many factors unaccounted for and association does not prove causality, said Dr. Larry B. Goldstein, director of the Duke Stroke Center at Duke University Medical Center.
“Subjects were asked about their past coffee consumption in a questionnaire and then followed over time. There is no way to know if they changed their behavior,” Goldstein said.
And, he noted, there was no control for medication use or other potential but unmeasured factors.
“The study is restricted to a Scandinavian population, and it is not clear, even if there is a relationship, that it would be present in more diverse populations. I think that it can be concluded, at least in this population, that there was not an increased risk of stroke among coffee drinkers,” he said.”
When you don’t explain the limitations of observational studies, or when you imply that cause and effect has been established, you lose credibility with some readers. And you should. Note some of the comments left on the USA Today website:
“Within a few weeks a new ‘study’ will come out telling us how bad coffee is for us.”
“Sign…I wish someone would make up their minds! Wasn’t it just a week or so ago there was a study about smog, coffee, etc., being bad for ya?”
“Remember when “scientific” studies were considered trustworthy and reliable?? How can anyone tell the few pearls of knowledge in a world of pointless studies that flip-flop results and rehash incessantly??”
“Drinking coffee reduces strokes per this study. Didn’t another say it causes cancer?”
USA Today wasn’t alone in being incomplete.
WebMD was just plain inaccurate when it stated: “1 or More Cups of Coffee a Day Reduces Stroke Risk in Women.” The study didn’t prove that.
CBSNews.com had a simply silly story that led:
“Ladies, you knew there was a good reason for that double mochachino you have every morning and maybe that one at lunch too.”
At least they came back later and explained:
“As for your mochachino, no word yet on the benefits of whipped cream and chocolate sauce.”
But why even go there to begin with?
ABCNews.com, by comparison, emphasized this study showed “association, not causation.” Kudos to them.
For anyone – journalist, consumer, or researcher, for that matter – who doesn’t grasp the importance of using the correct language to describe observational studies, please see our primer on this topic.
Comments (2)
Jason Crain
March 11, 2011 at 11:22 am
Thanks for posting this. It’s refreshing to read.
It’s also worth noting that journalists are not entirely to blame. There is a disturbing cognitive slip that happens when this data is publicized. Take, for example, this quote from the National Cancer Institute:
“There have been no controlled clinical trials on the effect of regular physical activity on the risk of developing cancer. However, observational studies have examined the possible association between physical activity and a lower risk of developing colon or breast…”
Most journalists, and publicists, looking to gain social or economic capital for some cause, would blow this entire section completely out of proportion.