This story reports on the results of one study examining the impact of using PSA testing to screen for prostate cancer. Two significant factors were not mentioned in this piece.
In addition, the story cast the study as important new results about which ‘there has been a lack of knowledge,’ when the results come from just one site of a large multicenter European prostate cancer screening trial that had already (March 2009) reported a survival benefit with screening, though of a lesser magnitude.
To understand the value of screening for prostate cancer, the potential for harm, as well as the chance of benefit from treatment, needs to be considered and reported.
While the story mentioned that PSA testing is rather routine practice in the US, it gave no indication of cost. The cost of the PSA test itself is relatively minimal, but this was a study not just about screening but also about treatment, a point Dr. Otis Brawley of the American Cancer Society addressed on the ACS blog.
We think the story should have included at least a nod in the direction of WHAT IT COST to achieve the result trumpeted in the headline, especially given the proliferation of expensive robotic surgical systems and newer radiation therapy techniques.
This is a classic example of how cost gets left out of the discussion.
While the story did a nice job presenting some of the data on the benefits of PSA screening, it should have taken more care with words in distinguishing between all-cause mortality and prostate cancer mortality. The story didn’t convey that PSA screening for prostate cancer did not change men’s overall chance of dying: some men who were spared death from prostate cancer simply died of other causes. This is an important oversight in the reporting.
Although it reported that ‘just 12 men would need to be diagnosed with prostate cancer in order to save one life,’ the story neglected to follow through and examine the harm done to the other 11 men. There are psychological consequences of a cancer diagnosis. And it wasn’t diagnosis alone that affected these men’s chance of dying from prostate cancer; at least some of them also received treatment, yet there was no discussion of the possible harms that follow from prostate cancer treatment.
These are not easy issues, but the story also did not mention that the number needed to screen to prevent one cancer death was 293. Those who were screened and got a false positive, perhaps leading to unnecessary follow-up testing such as a biopsy, experienced harms as well.
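To make the arithmetic concrete, here is a back-of-the-envelope sketch (ours, not the story’s, using the standard definition of number needed to screen as the reciprocal of the absolute risk reduction):

\[ \text{NNS} = \frac{1}{\text{ARR}} \quad\Rightarrow\quad \text{ARR} = \frac{1}{293} \approx 0.0034 \approx 0.34\% \]

In other words, screening 1,000 men would avert roughly 3 prostate cancer deaths, and if 12 diagnoses are needed to save one life, 11 of every 12 men diagnosed bear the burdens of diagnosis, and often treatment, without that mortality benefit.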
The story did a good job of presenting information about certain aspects of the study, including its design (a randomized trial). But it did not indicate that even 20,000 subjects is a small number for a screening trial, or that the 44% risk reduction has a wide confidence interval: the true effect could range from a 61% reduction to one of only 18%. These are questions about the quality of the evidence that should have been mentioned.
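For readers unfamiliar with confidence intervals, the arithmetic works out as follows (our illustration, assuming the 44% figure reflects a rate ratio):

\[ \text{RR} = 0.56 \;(95\%\ \text{CI: } 0.39 \text{ to } 0.82) \quad\Rightarrow\quad 1 - \text{RR} = 44\%\ \text{reduction}\;(18\% \text{ to } 61\%) \]

The data are thus compatible with anything from a modest to a substantial effect, which is why the size of a screening trial matters.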
The story included quotes from several experts. But we’re going to rule the story’s performance on this criterion as unsatisfactory for reasons of balance. It’s possible to have several sources and still have an editorial imbalance in the story.
The entire story was about whether PSA screening is better than no screening.
The story accurately reported the widespread use of the PSA test to screen for prostate cancer in the U.S.
The story accurately reflected that PSA testing for prostate cancer is not new and that there has been an ongoing debate about the benefit of PSA screening.
The story does not appear to rely on a news release.