Michael Joyce is a writer-producer with HealthNewsReview.org and tweets as @mlmjoyce
If you’re one of our regular readers I’ll bet your ears echo from hearing us bang these drums over and over:
Finding examples illustrating that first drumbeat — observational studies dressed up as proof of cause and effect — is like falling off a log for us. Here are a few misleading headlines from observational studies published within just the past few weeks (click the links to read our critiques):
Hard-working women, go home early to avoid this disease (CNN)
Can smartphones trigger ADHD symptoms in teens? (HealthDay)
Motherhood may affect Alzheimer’s risk, study shows (NBC News)
Not only do observational studies lend themselves to exaggerated headlines like these, which can mislead readers; they can also lead those who should know better to promote glowing outcomes that just happen to align with their agendas: outcomes that, when tested more rigorously in RCTs, don't stand up.
A clear and clever illustration of this is highlighted in a provocative article written by pediatrician Aaron Carroll, MD, and published in the New York Times earlier this week:
The Ineffectiveness of Employer Wellness Programs, and the Importance of Randomized Trials
In the article, Carroll tells us about a unique study published earlier this summer that looked into the effectiveness of the wellness program at the University of Illinois at Urbana-Champaign. As Carroll points out, most evaluations of wellness programs are observational by design, which makes them quite prone to selection bias. So when results suggest that participation in such programs leads to healthier outcomes, it's hard to tease out whether those benefits are — as one of his sources points out — “due to differences in the people rather than differences from the [wellness] program.”
But the Illinois Workplace Wellness Study did something clever: It evaluated the program using both an observational AND a randomized controlled study approach.
The findings?
In a nutshell, every time the observational study suggested a benefit from participating in the program, the RCT analysis did not; in other words, no cause-and-effect benefit. Here are some of those findings:
Gym attendance: The observational approach suggested participants went to the gym 7.4 days/yr vs. 3.8 days/yr for nonparticipants. The RCT findings? 5.8 and 5.9 days/yr, respectively.
Healthcare spending: The observational approach suggested participants spent much less ($525 vs. $657). The RCT findings? $576 vs. $568.
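The mechanism behind numbers like these is easy to demonstrate. Here is a toy simulation — not the Illinois study's data, just an assumed setup — in which a single latent "health motivation" trait drives both gym attendance and the decision to join a (completely ineffective) wellness program. The observational comparison finds a large gap anyway, while a randomized comparison correctly finds none:

```python
import random
from statistics import mean

random.seed(42)

# Hypothetical population: one latent "motivation" trait drives both
# gym attendance and the decision to opt in to the program. The program
# itself does NOTHING here, so any gap the observational comparison
# finds is pure selection bias.
N = 100_000
motivation = [random.gauss(0, 1) for _ in range(N)]
gym_days = [max(0.0, 6 + 2 * m + random.gauss(0, 1)) for m in motivation]

# Observational comparison: more-motivated people opt in.
opted_in = [m > 0 for m in motivation]
obs_participants = mean(g for g, o in zip(gym_days, opted_in) if o)
obs_nonparticipants = mean(g for g, o in zip(gym_days, opted_in) if not o)

# RCT comparison: a coin flip assigns people, independent of motivation.
assigned = [random.random() < 0.5 for _ in range(N)]
rct_treatment = mean(g for g, a in zip(gym_days, assigned) if a)
rct_control = mean(g for g, a in zip(gym_days, assigned) if not a)

print(f"Observational: {obs_participants:.1f} vs {obs_nonparticipants:.1f} days/yr")
print(f"RCT:           {rct_treatment:.1f} vs {rct_control:.1f} days/yr")
```

The simulated observational gap (roughly 7.6 vs. 4.4 days) appears even though the "program" has zero effect — the same shape as the 7.4 vs. 3.8 finding above — while the randomized arms come out nearly identical.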
It’s not hard to imagine a journalist (or a university public relations writer) taking these observational findings and writing an article with headlines like these:
‘Wellness program participants exercise nearly twice as much as non-participants’
‘Need to save money? Join a wellness program’
Of course, these are hypothetical, and you might think: so what?
At issue is that nearly every day, of every week, for 12 years, we've come across headlines like these: headlines that seduce readers with misinformation based on observational findings that very likely — had they been subjected to a more rigorous study design — would have been shown to be inaccurate [see our tips for writing better health headlines].
That’s the elegance and beauty of the Illinois Workplace Wellness Study and why we give kudos to Carroll for writing about it, and why we wish more mainstream news outlets had covered it. It’s an important study.
And we were glad to see Carroll bring up these two (paraphrased) caveats:
A final thought worth bearing in mind: not infrequently, when observational data support an institution's agenda they are selectively highlighted, but when they do not they're often selectively downplayed [here's an example].
This is a choice made far upstream, at the source of much of our health care news. When journalists report such imbalanced results without careful scrutiny (as in the three very real headlines listed above), it all but ensures that polluted health care information will be more widely disseminated.
That’s why we’d be well served to always keep the Illinois Workplace Wellness Study in mind.
Just as we’d be well served to wait and see whether further RCTs support or refute its findings.
Over 50 million Americans are enrolled in wellness programs. If you’d like to learn more, check out our podcast from last year:
Comments (2)
louis clark
August 13, 2018 at 7:07 am
I always learn on HNR about something that impacts me or someone I love. Thanks for sharing this analysis about the impact of a wellness program.
As I move through the site, I’ll hold off on sending the Reader’s Digest article to my dad anytime soon.
Al Lewis
August 15, 2018 at 6:07 am
It is also the case that three “natural experiments” conducted by wellness promoters accidentally revealed the same result — 100% of apparent program savings attributable instead to par-vs-nonpar study design — but also did not get media coverage: https://www.ajmc.com/contributor/al-lewis-jd/2017/01/do-wellness-outcomes-reports-systematically-and-dramatically-overstate-savings