NOTE TO READERS: When this project lost substantial funding at the end of 2018, I lost the ability to continue publishing criteria-driven news story reviews and PR news release reviews - once the bread-and-butter of the site going back to 2006. The 3,200 archived reviews, while still educational, are getting old and difficult for me to technically maintain on the back end of the website. So I am announcing that I plan to remove these reviews from the site by April 1, 2021. The blog and the toolkit - two of the most popular features on the site - will remain. If you wish to peruse the reviews before they disappear, please do so by the end of March 2021. After that date you may still be able to access them via the Internet Archive Wayback Machine.

Can we learn anything from Japanese study of cancer news reporting?

A new analysis, “How do medical journalists treat cancer-related issues?”, was published in the journal ecancermedicalscience.

It’s by Japanese researchers looking at Japanese news coverage. An excerpt of their findings:

“This study provides important information about journalists who publish articles on cancer. First, the selection of topics is clearly biased; for example, aggressive treatments and survival rates attracted the attention of journalists much more than treatment failure, adverse events, end-of-life care, and death. Unexpectedly, 35 of the 48 participants (73%) had never reported on hospices, which is comparable to previous findings that only 7.6% of cancer-related articles focused on death and dying, whereas 32% focused on survival. This bias may give patients or the general public an inaccurate and optimistic view of the experience of cancer. The journalists should select the topics not for their interests, but for patient’s information needs. For physicians, on their part, they should provide appropriate information contributing to the treatment and end-of-life decision.”

How relevant is this for a US audience, or for US journalism? That’s difficult to say, because this was a very small study.

Earle Holland, who was the senior science and medical communications officer at Ohio State University for almost 35 years, and whom we’re proud to list as one of our new contributors, reacted to the study with these comments:

“Interesting, surely, but looking at the response rates, only 9 percent of those asked said that they had covered cancer therapy, so I’d be cautious about extrapolating the general conclusions too broadly.  Also (and this just may be my experience showing), Japanese journalists have seemed to me to be more accepting of statements by authorities and sources and less investigative, which to me suggests less critical analysis and problematic when applied to US journalists.

I’m always greatly interested in papers like this that look at the science/medical/environmental journalist behaviors, but I always focus first on the percentage of those who actually fill out such surveys.  Given the surveys of journalists I’ve done over the years, I think it is difficult to extrapolate any general conclusions that apply broadly.  That’s especially true now with the changing nature of the news media.”

Ivan Oransky wrote about the study on MedPage Today in “Are Reporters Too Optimistic About Cancer?”  He even pointed to a review we published this week of a story “overselling the benefits of proton therapy.”  Despite the small sample size and low response rate, Oransky gleaned some overarching take-home messages:

“I wonder if many reporters are hearing story ideas through their personal networks and not even realizing the sources of those stories are actually studies.

Either way, on the surface, relying on peer-reviewed research would seem to be a way to provide more balanced coverage, as long as reporters provide context. But I’m reminded that journals aren’t so good at dampening enthusiasm surrounding cancer research, either. A 2013 study found that two-thirds of studies didn’t mention toxicities in their abstracts. And in a third of studies, researchers relied on secondary endpoints if the treatments didn’t work as well as they’d hoped.

If not very many reporters were relying on journals, the authors took some solace in the fact that no journalists reported using drug company press releases — which the authors called “highly biased” — as a primary source. Then again, very few reporters said they relied on academic center press releases, according to the survey. But that may not be such a bad thing either, given the findings of another recent study.

“The journalists should select the topics not for their interests, but for patient’s information needs,” the authors conclude. Perhaps. But if you ask me, the real problem is a lack of skepticism. Study a way to solve that, and you’ll find me first in line to write about that research.”
