Confusing association with causation – common journalistic pitfall

We’ve just posted a new guide on the importance of the language used to describe the results of observational studies. Day after day we see stories that use active, powerful verbs like “prevent…boost…lower your risk…may cut death rate” to describe the results of these studies. That’s misleading and inaccurate because such studies can’t prove cause-and-effect. Read more about why in this detailed, thoughtful piece by Mark Zweig, MD, and Emily DeVoto, PhD. (It’s actually a revision of a piece these two wrote for us two years ago. The new one fits nicely in our “Tips for Understanding Studies” section.)


Comments

Please note, comments are no longer published through this website. All previously made comments are still archived and available for viewing through select posts.

Nick

February 11, 2010 at 12:17 am

Heh. You can tell it was written by doctors. That ‘suggested language’ is terrible – long words, technical language, convoluted sentences, packed with qualifications.
The only good thing you can say about it is that readers are unlikely to come out the other end with the wrong impression of the study.
This is because they’re going to come out of it with no idea whatsoever about the study. Most won’t finish the first sentence, because it’s horrible writing.
Typical scientists. They want to speculate about cause and effect, but they don’t want to be SEEN to speculate about it, because it might not be true. After all, people are far too stupid to understand the true meaning of ‘may’, aren’t they??? Such condescension!
Epidemiology is about scanning the patterns in the world to find those that betray hidden causality. Some do, some don’t, but to pretend that it has nothing to do with underlying cause and effect is to doom the entire science to irrelevance.
You don’t refuse to report half-time sports scores because they might give the wrong impression of the final result.
Using the words “seems,” “may,” or “appears” is honest, not misleading. It’s how the scientists would feel about the result, even if it’s not what they would put into words.
Sure, reporters often go over the top. Headline writers, especially. But journalism’s number one priority is communication. Technical precision is pointless if the truth, the message, the story is lost along the way.
If this “suggested language” is the best we can do, we might as well give up on all science journalism right now.
By the way, I have this comic strip up on my desk as a constant reminder:
http://xkcd.com/552/

Jeremy Smith

February 11, 2010 at 4:22 am

Interesting piece. I work in the media (17 years and counting), and I definitely recognise the description of the hand-to-hand “improvement” of the message from study to final printed/broadcast story. Can I just say that, in my view, asking for more nuanced and careful language within the existing story framework may not be effective. It busts the genre and makes pieces look weak and hedged to a lay reader (which is, like it or not, commercially and creatively a bigger no-no, and a more urgent concern for lay writers, than making them look dumb to a specialist reader). So what is required is a new story framework within which these subtler messages will not seem out of place. I’m not sure what that might be, but my experience suggests that it will be easier to invent this and move some science reporting into it than to alter an established voice.

Gary Schwitzer

February 11, 2010 at 7:37 am

Nick,
Thanks for your comment. But I must disagree – strongly.
You wrote, “journalism’s number one priority is communication. Technical precision is pointless if the truth, the message, the story is lost along the way.”
I believe that journalism’s number one priority is accuracy. The truth IS, indeed, lost along the way if a story implies or states that research has shown cause-and-effect when it has not.
You also wrote,
“If this ‘suggested language’ is the best we can do, we might as well give up on all science journalism right now.”
I would say that if a journalist can’t explain the limitations of observational studies, then, yes, he/she should give up on science journalism and go into the sports reporting field that you mentioned.

Andrew Holtz

February 11, 2010 at 10:34 am

Nick is right that the examples of suggested language are not punchy. But he also says he wants stories to convey the truth… and that’s where the real problem is. The truth is that these observational studies provide clues and suggestions, not definitive conclusions about cause and effect.
When a study reports that people who eat fish every week have better eyesight than people who don’t… it is NOT telling the truth to report that the study found fish may help preserve eyesight. There are many, many reasons that the two items, fish-eating and eyesight, might cluster together, including income, education, other health behaviors and on and on.
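To make that concrete, here is a minimal sketch (Python, with made-up numbers, purely hypothetical): let a hidden third factor such as income nudge both fish consumption and eyesight, and the two will correlate even though neither has any effect on the other.

    # Minimal sketch (hypothetical numbers): a confounder can create a
    # strong association between two variables with no causal link.
    import random

    random.seed(0)
    n = 10_000
    pairs = []
    for _ in range(n):
        income = random.gauss(0, 1)             # hidden third factor
        fish = income + random.gauss(0, 1)      # more income -> more fish eaten
        eyesight = income + random.gauss(0, 1)  # more income -> better eyesight
        pairs.append((fish, eyesight))

    # Pearson correlation between fish intake and eyesight
    mf = sum(f for f, _ in pairs) / n
    me = sum(e for _, e in pairs) / n
    cov = sum((f - mf) * (e - me) for f, e in pairs) / n
    var_f = sum((f - mf) ** 2 for f, _ in pairs) / n
    var_e = sum((e - me) ** 2 for _, e in pairs) / n
    print(f"correlation: {cov / (var_f * var_e) ** 0.5:.2f}")  # ~0.5, zero causation

An observational study that doesn’t measure income (or education, or the rest) has no way to distinguish this scenario from a real effect of fish on eyesight.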
The grid with the “suggested language” sets up a bit of an unfair comparison, because the suggested language includes additional information that the other boxes leave out… information that is vital if readers are to understand the truth.
If including caveats, context and explanation makes a story too weak, there is a simple solution: don’t run it. One reason people feel so overwhelmed by apparently contradictory reporting on health science is that there are too many stories that sacrifice the (complicated and nuanced) truth in order to create a punchy lead and headline.
Weak reporting has real health consequences. Observational studies associated vitamin A with lower cancer risk… but then experiments showed smokers who took beta carotene supplements increased their lung cancer risk. Remember all the observational studies (and news reports) linking vitamin E and heart health? Too bad the experiments that followed blew holes in those hopes. And then there is hormone therapy for post-menopausal women. Lots of observational studies… and lots of punchy news leads… helped convince millions of women to pop estrogen pills in hopes of reducing heart disease risks. Then comes the Women’s Health Initiative… an actual experiment… that found that for certain women the hormones could cause more problems than they solved.
The truth is often messy. Reporters and editors need to deal with it. When a study says two things appear to go together… but there is no way to know which causes which (or if there might be a third factor causing both)… then say all of that… or just toss the story idea aside until some actual evidence of cause and effect comes along.
There is no benefit to news writing that takes shortcuts.
PS You might enjoy a radio commentary I wrote (Effects & Cause) that explores this topic… including the association between alarm clocks and sunrise. Oh, to make a snappier headline, I guess that should be “alarm clocks may cause the sun to rise.”
See the “Effects & Cause” link on this page: http://holtzreport.com/radio_commentaries.html

Ivan Oransky

February 12, 2010 at 9:34 am

To paraphrase and flip the old saw: I don’t think journalists should let a good story get in the way of facts, context, and accuracy.
I’m with Gary and Andrew. I think accuracy is paramount. Should news reports be engaging and readable? Of course. But only if they’re accurate.
Scientists speculate all the time about the reasons for a particular link. They do it in print. Reporters who read through to the discussion section of studies know that. Reporters who don’t go beyond press releases don’t.
It’s not just journos who recognize the problem with relying heavily on press releases. Here’s a recent editorial in Environmental Health Perspectives — a journal that publishes tons of epidemiological studies whose cause/effect relationships are unclear — in which an editor pledges better releases:
http://ehsehplp03.niehs.nih.gov/article/info:doi%2F10.1289%2Fehp.1001913
Most scientists are honest about the fact that they’re speculating. Reporting that speculation, clearly labeled as speculation, seems like a good idea, even if it gets in the way of a clean narrative.
At the risk of sounding like the part-time journo prof I am, journalism is also about keeping people honest. If scientists make claims they can’t support, call them on it.
Sometimes scientists won’t provide enough data to support that clean storyline. Easy: spike the piece. That’s what we did here at Reuters Health recently:
https://www.healthnewsreview.org/blog/2010/01/news-organization-decides-not-to-report-a-study-when-authors-dont-provide-data.html
Why would journalists want readers, listeners and viewers to get clean but inaccurate story lines? Isn’t it better to produce the smartest coverage than to overstate findings, often aided and abetted by a press release?
Our audience has access to more background and primary source data than ever. We insult them when we gloss over the facts they can find themselves in the name of a good clean story.
Ivan Oransky, MD
Executive Editor
Reuters Health
http://reutershealth.com
Adjunct Assistant Professor, New York University’s Science, Health, and Environmental Reporting Program
http://journalism.nyu.edu/prospectivestudents/coursesofstudy/serp/

Mary Knudson

February 12, 2010 at 10:43 am

Andrew Holtz makes a very important point in his comment: “If including caveats, context and explanation makes a story too weak, there is a simple solution: don’t run it. One reason people feel so overwhelmed by apparently contradictory reporting on health science is that there are too many stories that sacrifice the (complicated and nuanced) truth in order to create a punchy lead and headline.”
The very first step to good reporting of medical studies is judging whether a study is worth writing about for a general-public audience. How well done was the study? Many studies are not well done. What are the findings? Does this study prove something new? Does it strongly suggest cause and effect, or only an association? Talk to more than one study author to get a better sense of what the study found. Talk to scientists knowledgeable about the field who were not part of the study. Always find out who funded the study. Careful reporting will lead to knowledgeable writing. After that, writing talent and good editing kick in.
Mary Knudson