
News outlets focus on one dramatic outcome. But did researchers omit data from hundreds in clinical trial?

Michael Joyce is a writer-producer with HealthNewsReview.org and tweets as @mlmjoyce

Last week we reported on ‘One cancer patient’s dramatic response to immunotherapy…’ and highlighted what we thought was a lack of healthy skepticism in the extensive (and mostly fawning) news coverage.

Most importantly, we wanted to show that by highlighting the dramatic response of a single patient — without cautioning readers that just one dramatic response by no means justifies using words like “miraculous” or “unprecedented” — some reporters were running the very real risk of giving people false hope, or misleading them.

“But the problem here is that this single patient is actually part of a larger clinical trial,” said Vinay Prasad, MD, a cancer specialist at Oregon Health and Science University.

“So this one amazing outcome is picked out of dozens or hundreds of cases in the trial … we have no way of knowing. Yet in the reporting form to the journal — which is designed by them to address increasing problems with transparency and reproducibility — the authors identify this as a ‘single case report’ and state that ‘no data was excluded.’ How can that be? Is the denominator 1 or is it the 332 people enrolled in the trial?”

And you can see that confusion regarding the numbers in our original reporting:

  • “Perkins (the patient with the dramatic response) was just one of three breast cancer patients in this Phase II clinical trial. One subject died of an infection and the other did not respond.” [source: CBS News]
  • NPR reported that among the 45 total patients in the trial with a variety of advanced cancers, “there were 7 responders” (15%).

Here is the clinical trial Prasad refers to (NCT01174121). It clearly lists 332 participants, and Ms. Perkins is just one of them.

And here is the required Reporting Summary (completed by the author) in Nature Medicine:

[Screenshot of the completed Reporting Summary from Nature Medicine]

Other important ways this matters

You can see how this might have a major effect on news coverage. At the very least, one can’t help but wonder how the reporting might have changed if it had been made crystal clear to reporters that Ms. Perkins was 1 of 332 subjects, and not simply 1 of 3 breast cancer patients (as reported by CBS), or 1 of 45 patients (as reported by NPR).

Does 1 of 332 lend itself to more caution? Fewer hyperbolic/eye-catching headlines?

I think it might. And Prasad agrees:

“Once I started looking for news stories on this I quickly found about 10 stories that all focused on the one amazing outcome. No mention of other subjects or other outcomes. That starts with the authors and the journal, and it’s unethical because there’s no transparency.”

That raises the issue of appropriate journal publication practices. When we approached Nature Medicine for an explanation we were told: “It’s our policy to refer scientific questions to the author.”

Really? When your form is designed to improve “reproducibility” and encourage “consistency and transparency,” doesn’t the journal have some accountability here?

We did try to reach out to the primary author, but two inquiries went unanswered.

Why we need watchdogs

What’s sobering here is that I missed this nuance, and I’m trained to look for it. Dr. Prasad was the only cancer specialist (that I know of) who chose to make a point of this discrepancy on social media. It makes you wonder, doesn’t it, how often in this hyper-competitive world of medical research — where publications and news coverage can make or break a career, and hyperbolic headlines routinely trump careful, investigative reporting — we’re being misled by incomplete information?

It reminds me of two things we don’t say nearly enough, maybe because they seem trite or self-evident. First, if a news story seems too good to be true, ask yourself why. Dig deeper. Second, no matter how reputable the academic institution (as we reported earlier this week), or how reputable the journal (as we reported last month), we still need to remain vigilant. We still need watchdogs.
