This story reports on the results of one study examining the impact of using PSA testing to screen for prostate cancer. Two significant factors were not mentioned in this piece.
Lastly, the story cast the study being reported on as important new results about which 'there has been a lack of knowledge,' when in fact the results come from just one site of a large multicenter European prostate cancer screening trial that had already (March 2009) reported a survival benefit with screening, though of a lesser magnitude.
In order to understand the value of screening for prostate cancer, the potential for harm as well as the chance for benefit from treatment need to be considered – and reported.
While mentioning that PSA testing is rather routine practice in the US, there was no indication of its cost. Now, while the cost of PSA testing itself is relatively minimal, this was a study not just about screening but treatment. As Dr. Otis Brawley of the American Cancer Society wrote on the ACS blog:
So we think the story should have included at least a nod in the direction of WHAT IT COST to achieve the result trumpeted in the headline – especially given the proliferation of expensive robotic surgical systems and newer radiation therapy techniques.
This is a classic example of how cost gets left out of the discussion.
While the story did a nice job presenting some of the data from the study about the benefits of PSA screening, it should have been more careful in distinguishing between all-cause mortality and prostate cancer mortality. The story didn't convey that PSA screening did not change men's overall chance of dying: men who were spared death from prostate cancer simply died of other causes. This is an important oversight in reporting.
Although it reported that 'just 12 men would need to be diagnosed with prostate cancer in order to save one life,' the story neglected to follow through and examine the harm done to the other 11 men. There are psychological consequences of a cancer diagnosis. In addition, it wasn't just the diagnosis of prostate cancer that affected men's chance of dying from it; at least some of these men also received treatment for prostate cancer. There was no discussion of the possible harms that are a consequence of prostate cancer treatment.
These are not easy issues, but the story did not mention that the number needed to screen to prevent one cancer death was 293. Those who were screened and got a false positive result, perhaps leading to unnecessary follow-up testing such as a biopsy, also experienced harms.
The story did a good job of presenting information about certain aspects of the study, including its design (a randomized trial). But it did not indicate that even 20,000 subjects is a small number for a screening trial, or that the 44% risk reduction has a wide confidence interval: the true effect could range from a 61% reduction to one of only 18%. These are questions about the quality of the evidence that should have been mentioned.
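The gap between an impressive-sounding 44% relative risk reduction and a number needed to screen of 293 comes down to how small the absolute risk of dying of prostate cancer is in the first place. A minimal sketch of that arithmetic, using illustrative absolute risks that were not reported in the story but are chosen here so the outputs match the quoted figures:

```python
# Illustrative arithmetic only: the absolute mortality risks below are
# assumptions chosen to reproduce the figures quoted in the review
# (a 44% relative reduction and a number needed to screen near 293).

def relative_risk_reduction(risk_control, risk_screened):
    """Fraction by which screening lowers prostate cancer mortality."""
    return (risk_control - risk_screened) / risk_control

def number_needed_to_screen(risk_control, risk_screened):
    """Men who must be screened to prevent one prostate cancer death."""
    absolute_risk_reduction = risk_control - risk_screened
    return 1 / absolute_risk_reduction

risk_control = 0.00776                    # assumed risk without screening
risk_screened = risk_control * (1 - 0.44) # 44% lower with screening

print(round(relative_risk_reduction(risk_control, risk_screened) * 100))  # 44
print(round(number_needed_to_screen(risk_control, risk_screened)))        # 293
```

The point the sketch makes is the one the story missed: because the baseline risk is well under 1%, a large relative reduction still means hundreds of men must be screened, and many diagnosed and treated, to prevent a single death.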
There was no overt disease-mongering.
The story included quotes from several experts. But we’re going to rule the story’s performance on this criterion as unsatisfactory for reasons of balance. It’s possible to have several sources and still have an editorial imbalance in the story.
The entire story was about whether PSA screening is better than no screening.
The story accurately reported the widespread use of the PSA test to screen for prostate cancer in the U.S.
The story accurately reflected that PSA testing for prostate cancer is not new and that there has been an on-going debate about the benefit of PSA screening.
The story does not appear to rely on a news release.