The story made an important attempt to evaluate the quality of the evidence, explaining that “studies looking back at cancer trends in a population are very unreliable when it comes to showing what caused those trends.” This doesn’t make up for some of the other gaps in the story but it provides readers with invaluable information for helping them make a very important medical decision.
This story says the new study “seems to make a powerful argument in favor of PSA testing.” It’s true that retrospective studies can be powerful tools for understanding health trends, but not all are created equal. This study raises more questions than it answers, and it’s important for health care journalists to frame stories about studies like this in the right way. That’s why the independent comments WebMD solicited are so crucial to helping readers understand the limitations of these backward-looking statistical modeling studies.
There is no cost information in the story, a gap also seen in the other stories that covered the topic, from TIME.com and HealthDay. Mentioning the cost of a PSA test and whether insurance typically covers it would have strengthened the story, as would noting the costs of some of the subsequent tests and treatments.
The story quantified the benefits claimed in the University of Rochester study, saying, “If it weren’t for routine PSA prostate cancer screening, an extra 17,000 Americans each year would learn that they had the worst form of the disease, a new study suggests.”
More important, it made an effort to question this type of statistical modeling “look-back” study.
The story lists some of the potential harms of screening at the very end.
In clear terms, the story explains how the study was conducted. “In 2008, about 8,000 U.S. men were diagnosed with metastatic prostate cancer. By projecting data from the pre-PSA era forward, Messing calculates that without routine PSA tests, 25,000 men would have been diagnosed in 2008 — an extra 17,000 cases of deadly disease.” Then it immediately follows up by saying, “Studies looking back at cancer trends in a population are very unreliable when it comes to showing what caused those trends.”
We wish the story had addressed more explicitly the limitations of such a statistical modeling analysis – to familiarize readers with such techniques and the potential flaws therein.
The story did not engage in disease mongering and did a good job explaining that prostate cancers are not all the same. “That kind of prostate cancer — metastatic prostate cancer, in which the cancer spreads to the bone or other parts of the body — is rapidly fatal, usually within two years or less.”
Most stories on this study relied heavily on the study’s lead author, Dr. Edward Messing, a urologist who, like most of his colleagues, believes strongly in the need for more PSA screening. He’s quoted here, too, of course, but the story also shares with readers important context from Dr. Barnett Kramer of the National Institutes of Health. And it includes input from a US Preventive Services Task Force member.
The alternative to PSA screening is the choice to decline screening, which is implicit in the discussion about the tradeoffs.
The story says that PSA screening is “widespread.”
We react pretty strongly to the statement way up high in the 3rd sentence of the story: “The new study seems to make a powerful argument in favor of PSA testing.” Even though that statement is countered 3 paragraphs later by Dr. Kramer, the sentence may have set an early tone for readers that is difficult to overcome with later skepticism. Instead, why not use the 3rd sentence of the story to say something about the limitations of such a statistical modeling analysis?
We admit we’re hard-line on this, but the framing of screening stories is important.
The story did not rely on a press release.