The story tackled a complex topic and had some bright spots.
But its main weakness was a story line that bounced around: prostate cancer screening has been controversial for 20 years; major studies have just been published; yet these studies probably won't change anything very much.
It felt like a ping-pong game.
Why do experts disagree on whether longer follow-up will provide more conclusive findings?
To grasp this topic, readers needed careful analysis. But this story didn't deliver enough of it.
There was no discussion of costs. The story did note that to save one man from dying of prostate cancer, 1,400 men would need to be screened and 48 treated, and that 47 of those treated men would therefore be harmed. But it offered no estimate of the costs of these screenings and treatments, nor much discussion of the personal costs of side effects in those 47 men.
The story mentioned that one study found no difference in mortality between the men who were and weren't screened, while a second study found a 20% reduction in mortality between groups. A 20% relative reduction sounds far more impressive than an absolute reduction of about 7 deaths per 10,000 men screened. The writer should have reported the absolute risk reduction rather than the relative risk reduction alone.
To its credit, however, the story included data on the number needed to screen to save one life.
The story did mention that to save one life, 1,400 men would need to be screened, and 47 men would run the risk of side effects from treatment of a prostate cancer that never needed treating.
The story mentioned impotence and urinary incontinence as potential side effects of treatment but we wish it had provided an estimate of how commonly these occur.
The only descriptions of the two studies reported on were that they were just published and that they were controlled trials. There was also no discussion of the trials' weaknesses. Without such background, the information presented is less useful for readers trying to decide how seriously to take the results.
And though the story reported no survival benefit in one study, it reported that the second study found a 20% reduction in deaths. While the story mentioned that this difference was only barely statistically significant, reporting the absolute difference would have helped readers grasp the magnitude. Why not just include those numbers? The 20% relative reduction corresponds to a death rate of 0.29% among screened men versus 0.37% in the control group. It is an example of taking a small risk and making it a little smaller.
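To make the relative-versus-absolute distinction concrete, here is a minimal sketch that recomputes the key figures from the rounded percentages quoted above (0.29% and 0.37%). Because these are rounded, the derived numbers differ slightly from the story's figures, which come from the unrounded trial data (for instance, the rounded rates imply a number needed to screen of 1,250 rather than the story's 1,400).

```python
# Sketch: absolute vs. relative risk reduction, using the rounded
# death rates quoted in the review (not the raw trial data).

screened_rate = 0.0029  # prostate cancer death rate, screened group (0.29%)
control_rate = 0.0037   # prostate cancer death rate, control group (0.37%)

# Absolute risk reduction: the raw difference in death rates.
absolute_risk_reduction = control_rate - screened_rate

# Relative risk reduction: the difference as a fraction of the control rate.
relative_risk_reduction = absolute_risk_reduction / control_rate

# Number needed to screen to prevent one death: the inverse of the
# absolute risk reduction.
number_needed_to_screen = 1 / absolute_risk_reduction

print(f"Absolute risk reduction: {absolute_risk_reduction:.2%}")
print(f"Relative risk reduction: {relative_risk_reduction:.1%}")
print(f"Number needed to screen: {number_needed_to_screen:.0f}")
```

The point of the exercise: an "impressive" ~20% relative reduction and a modest ~0.08 percentage-point absolute reduction describe exactly the same result.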
The story stated that screening can lead to 'painful, debilitating and expensive medical treatments without obvious benefit'. But the studies reported on did not include quality-of-life outcomes, so this description is rather hyperbolic. Additionally, the European study did show decreased mortality from prostate cancer, an obvious benefit.
Quotes from several individuals with relevant expertise were included in this story.
The only "treatment options" outlined in the story were the choice to be screened or not to be screened.
The story mentioned that the PSA is a blood test that has been available for over 20 years. But it could have made clear how widespread its use is; surveys suggest that about 70% of US men between 50 and 75 have undergone PSA testing.
The story made clear that the controversy around screening men for prostate cancer has been around for 20 years.
The story did not rely on a press release.