The story made an important attempt to evaluate the quality of the evidence, explaining that “studies looking back at cancer trends in a population are very unreliable when it comes to showing what caused those trends.” This doesn’t make up for some of the other gaps in the story, but it provides readers with invaluable information for making a very important medical decision.
This story says that the new study “seems to make a powerful argument in favor of PSA testing.” It’s true that retrospective studies can be powerful tools for understanding health trends, but they are not all created equal. This study raises more questions than it answers, and it’s important for health care journalists to frame stories about studies like this in the right way. That’s why the independent comments WebMD solicited are so crucial to helping readers understand the limitations of these backward-looking statistical modeling studies.
There is no cost information in the story – a big gap also seen in other stories that covered the topic, from TIME.com and HealthDay. Mentioning the cost of a PSA test and whether insurance typically covers it would have been valuable, as would the costs of some of the subsequent tests and treatments.
The story quantified the benefits claimed in the University of Rochester study, saying, “If it weren’t for routine PSA prostate cancer screening, an extra 17,000 Americans each year would learn that they had the worst form of the disease, a new study suggests.”
More important, it made an effort to question this type of statistical modeling “look-back” study.
The story lists some of the potential harms of screening at the very end.
In clear terms, the story explains how the study was conducted. “In 2008, about 8,000 U.S. men were diagnosed with metastatic prostate cancer. By projecting data from the pre-PSA era forward, Messing calculates that without routine PSA tests, 25,000 men would have been diagnosed in 2008 — an extra 17,000 cases of deadly disease.” Then it immediately follows up by saying, “Studies looking back at cancer trends in a population are very unreliable when it comes to showing what caused those trends.”
We wish the story had addressed more explicitly the limitations of such a statistical modeling analysis – to familiarize readers with such techniques and the potential flaws therein.
The story did not engage in disease mongering and did a good job explaining that prostate cancers are not all the same. “That kind of prostate cancer — metastatic prostate cancer, in which the cancer spreads to the bone or other parts of the body — is rapidly fatal, usually within two years or less.”
Most stories on this study relied heavily on the study’s lead author, Dr. Edward Messing, a urologist who, like most of his colleagues, believes strongly in the need for more PSA screening. He’s quoted here, too, of course, but the story also shares with readers important context from Dr. Barnett Kramer of the National Institutes of Health. And it includes input from a US Preventive Services Task Force member.
The alternative to PSA screening is declining to be screened, a choice that is implicit in the story’s discussion of the tradeoffs.
The story says that PSA screening is “widespread.”
We react pretty strongly to the statement placed high up, in the 3rd sentence of the story: “The new study seems to make a powerful argument in favor of PSA testing.” Even though that statement is countered 3 paragraphs later by Dr. Kramer, the sentence may have set an early tone for readers that is difficult to overcome with later skepticism. Instead, why not use that 3rd sentence to say something about the limitations of such a statistical modeling analysis?
We admit we’re hard-line on this, but the framing of screening stories is important.
The story did not rely on a press release.