The story tackled a complex topic and had some bright spots.
But its main weakness was a story line that bounced around: prostate cancer screening has been controversial for 20 years, major studies have just been published, but these studies probably won't change anything very much.
It felt like a ping-pong game.
Why do experts disagree on whether longer follow-up will provide more conclusive findings?
In order to grasp this topic, readers needed careful analysis. But this story didn’t deliver enough of it.
There was no discussion of costs. The story did note that to save one man from dying of prostate cancer, 1,400 would need to be screened and 48 men treated, and that 47 of those men would therefore be harmed. However, it offered no estimate of the costs of these screenings and treatments, nor much discussion of the personal costs of side effects in the 47 men treated without benefit.
The story mentioned that one study found no difference in mortality between the men who were and weren't screened, while a second study found a 20% reduction in mortality between groups. A 20% relative reduction sounds far more impressive than the absolute figure of about 7 fewer deaths per 10,000 men. The writer should have reported the absolute risk reduction rather than the relative one.
However – to its credit – the story included data on the number needed to screen to save one life.
The story did mention that to save one life, 1,400 men would need to be screened, and that 47 men would run the risk of side effects from treatment of a prostate cancer that did not need to be treated.
The story mentioned impotence and urinary incontinence as potential side effects of treatment but we wish it had provided an estimate of how commonly these occur.
The only descriptions of the two studies reported on were that they were just published and that they were controlled trials. There was also no discussion of the trials’ weaknesses. Without background about the studies reported on, the information presented in the story is less informative for readers trying to determine how seriously they should consider the results.
And though the story reported no survival benefit in one study, it reported that the second study found a 20% reduction in deaths. The story mentioned that this difference was only barely significant, but reporting the absolute difference would have been more useful in helping readers grasp the magnitude. Why not just include those numbers? While it was a 20% relative reduction, the underlying difference was 0.29% of screened men dying of prostate cancer versus 0.37% in the control group. It is an example of taking a small risk and making it a little smaller.
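The arithmetic behind this point can be sketched in a few lines. Note that the rates below are the rounded figures quoted in the story, so the derived numbers are approximate (the number needed to screen comes out near the story's 1,400, not exactly):

```python
# Why a relative risk reduction can overstate a benefit.
# Rates are the rounded figures quoted in the story (assumed approximate).
control_rate = 0.0037   # prostate cancer deaths, unscreened group (0.37%)
screened_rate = 0.0029  # prostate cancer deaths, screened group (0.29%)

# Relative reduction: the impressive-sounding headline number (~20%)
relative_reduction = (control_rate - screened_rate) / control_rate

# Absolute reduction: the difference readers actually experience (0.08%)
absolute_reduction = control_rate - screened_rate

# Number needed to screen to prevent one death = 1 / absolute reduction
number_needed_to_screen = 1 / absolute_reduction

print(f"Relative risk reduction: {relative_reduction:.0%}")
print(f"Absolute risk reduction: {absolute_reduction:.2%}")
print(f"Number needed to screen: {number_needed_to_screen:.0f}")
```

The same 8-per-10,000 absolute difference underlies both framings; only the denominator changes.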
The story stated that screening can lead to ‘painful, debilitating and expensive medical treatments without obvious benefit’. But the studies reported on did not include discussion of quality-of-life outcomes, so this description of outcomes is rather hyperbolic. Additionally, the European study did show decreased mortality from prostate cancer, an obvious benefit.
Quotes from several individuals with relevant expertise were included in this story.
The treatment options outlined in the story were the choice to be screened or not to be screened.
The story mentioned that the PSA is a blood test that has been available for over 20 years. But it could have made clear how widespread its use is; surveys suggest that about 70% of men between 50 and 75 in the US have undergone PSA testing.
The story made clear that the controversy around screening men for prostate cancer has been around for 20 years.
The story did not rely on a press release.