Why you should back away slowly from health care news stories relying on case studies

Joy Victory is deputy managing editor of HealthNewsReview.org. She tweets as @thejoyvictory.

When science journalist Dave Mosher tries to explain the risks of giving medical case studies a lot of attention, he turns to extraterrestrials.  

“Think like an alien who’s visiting Earth for the first time,” said Mosher, who also is one of our contributors. “If the first being you met was a cow, would you then assume all life on the planet is ‘cow’? There’s a similar risk with giving case studies too much weight, since they lack size and diversity.”

What are case studies, exactly? As explained by Duke University Medical School, case studies, also known as case reports and case series, “consist of collections of reports on the treatment of individual patients or a report on a single patient.”

A one-person case study doesn’t say much about how well a treatment works. Nor does a two-person case study. Same with a handful or a dozen people. Instead, it takes hundreds, if not thousands, of study participants (and a control/placebo group) to accurately test if a new treatment works.

Yet case studies are often featured in news releases and news stories, as we saw this Monday when the problem popped up in our criteria-driven reviews not once but twice: first in a Case Western Reserve University news release on a neural stimulation treatment, and then in a USA Today story on a smartwatch for Parkinson’s-related tremors.

Far from ‘proof’ that something works

For those of us who have dedicated our careers to improving the public dialogue about health care, reading news releases and news stories about medical case studies is not how we would ideally spend our Monday mornings.

That’s because, while they can offer up exciting new avenues of research for experts, case studies are far, far too preliminary to be framed as proof that an experimental intervention worked, noted Harry DeMonaco, a HealthNewsReview.org contributor and a visiting scientist at the MIT Sloan School of Management.

“Case studies are important contributors to the data set that should neither be taken as gospel nor dispelled outright,” he said. “Case studies of one or two patients can be viewed as hypothesis-generating and have been shown repeatedly to lead to formal studies and new treatments. But to be clear, a hypothesis is not a demonstration of proof.”

Yet that’s effectively what USA Today implied in its story on a smartwatch tested on one woman, starting with the headline “Microsoft shows off watch that quiets Parkinson’s tremors.” And Case Western’s news release is no better: “Stroke, MS patients walk significantly better with neural stimulation.”

‘It’s a sample size of one’

In both cases, readers would have to click through and read closely to figure out that these are preliminary, isolated cases that may never translate into new treatments.

And people who are not well-versed in the hierarchies of medical evidence may never realize this, partly because in both examples the limitations of case studies were not discussed.

“In research that is this preliminary, there should be ample discussion of the limitations. That’s missing here,” reviewers said of the Case Western news release. “[It] doesn’t effectively convey that a large-scale clinical study is needed to determine whether this technology can be effective…”

Reviewers noted the same problem in the USA Today story: “It’s a sample size of one, and what’s shown in the video may very well be a placebo effect without either the patient, designer, or anyone else knowing…The story should have made this point very strongly.”

The lure of a moving anecdote

Still, we can’t help but be drawn in by moving anecdotes and well-told stories, as is the case with the USA Today story, which includes a riveting video where we see the test subject draw without her hands trembling severely.

“This is a classic example of the power of a moving anecdote,” said Tim Caulfield, a Canada Research Chair in Health Law and Policy and a Professor in the Faculty of Law and the School of Public Health at the University of Alberta. “We know that a well-told story can overwhelm the facts about the actual science. And this is a well-told story. But, alas, it is not an actual study.” 

We saw a similar framing in a CBS News story we reviewed last year. The story focused on a case study report of two children with a rare and deadly genetic disease known as metachromatic leukodystrophy undergoing an experimental gene therapy as part of a clinical trial in Italy.

It’s heartbreaking to read about what the family has endured, and we feel enormous relief when we hear the experimental treatment appears to be working for them. The emotions overpower the evidence, in other words.

“The focus on one family uses a single experience to suggest that the experimental therapy is effective, when in fact the small trial has yet to be fully analyzed, published and peer reviewed,” our reviewers noted.

‘Read with abundant caution’

Last year we reviewed two news stories, one from TIME and one from STAT, about the same case study involving a type of cancer immunotherapy given to one patient.

STAT’s story earned four stars by not touting the one patient’s experience as evidence of broad “success” and by including important voices of caution to help keep expectations in check.

TIME’s story earned three stars. While it used a few cautious phrases to tamp down expectations, it sorely needed outside expert comment, reviewers noted.

Reading the stories left reviewers (myself included) feeling piqued. As we said on the TIME story, “Stories like this need an FTC or FDA cautionary label: ‘This is a Single-Patient-Only Story. Read With Abundant Caution.’”

Comments (2)


Lawrence McGinty

May 18, 2017 at 4:15 am

How ironic that an article attacking the use of case studies should start with…a case study.

    Kevin Lomangino

    May 18, 2017 at 7:47 am

    I don’t think that medical case studies are comparable to case studies in journalism, and so I don’t agree that there’s much irony here.

    Kevin Lomangino
    Managing Editor