Health News Review

Vague, inaccurate report on a very preliminary study.

Our Review Summary

While the story makes much of the new test’s potential to be a more accurate alternative to prostate-specific antigen (PSA) testing, the current standard, it fails to include such basic information as how many samples were included in the study or how many false positives the new test generated. It also leaves readers with the unsupported impression that earlier detection of cancers (something the new test may be capable of) would be an unalloyed good for patients. In fact, we’re already finding and treating too many cancers that are so slow-growing that they would never cause a problem. What’s truly needed is a new test that can distinguish aggressive cancers from the low-risk ones.

Please note: We made an erroneous statement in our original review in the "evidence" criterion below. We said that "The story also never mentioned that the study was presented at a conference." The story did, indeed, report that the study was presented as a talk at a conference. But we went on to comment on how the story did not give "a caveat about the preliminary nature of such reports." We always expect stories to evaluate the quality of the evidence, including at least a brief discussion of the limitations of drawing conclusions from presentations at meetings. This story didn’t do that. We consistently grade such stories unsatisfactory on this "evidence" criterion. We take this point so seriously that we post a primer on our site on this very topic. See: http://www.healthnewsreview.org/tips-for-understanding-studies.php#tip4. Research presented at conferences is not subject to rigorous peer review prior to presentation; the vetting is far less rigorous than when a manuscript is submitted for publication in a journal. This is an important caveat for journalists and for the general public, and it is the point we wanted to emphasize. Our criterion comment was not well worded, and we have corrected the original erroneous wording but, for the reasons given above, have not changed the score.


Why This Matters

Prostate-specific antigen (PSA) testing is controversial. While one large randomized trial found that screening reduced the chance of dying from prostate cancer, PSA is not very accurate. The majority of men with an abnormal PSA test do not have prostate cancer found on biopsy. A substantial proportion of men with PSA-detected cancers are thought to be overdiagnosed: the cancers never would have caused problems during the man’s lifetime. However, these cancers are often treated aggressively, exposing men to treatment complications such as erectile dysfunction and urinary incontinence. Therefore, researchers are searching for the ideal biological test: one that is safe, affordable, easy to perform, with a low rate of false-positive results (which include both men without prostate cancer and those with only low-risk cancers) and a high probability of finding the cancers that are considered dangerous.


Criteria

Does the story adequately discuss the costs of the intervention?

Not Satisfactory

This test is still in a very early stage of development and precise cost figures probably aren’t available. Nevertheless, since the story focuses on the test’s potential to be more accurate than PSA, we think the story should have made some attempt to quantify the likely cost of the test in comparison with PSA.

Does the story adequately quantify the benefits of the treatment/test/product/procedure?

Not Satisfactory

The main thrust of the story is that the new test can help avoid false-positive tests and reduce unnecessary biopsies and treatments. And yet we never learn precisely how much better the new test is at reducing the false-positive rate.

In addition, the story says the new test can "distinguish actual prostate cancer from a more benign condition."  We wish the story had told us which "benign condition" it is talking about and provided some statistics to back up the claim. 

Another problem is that the story touts the test’s potential to find cancers at a very early stage of development. This is only going to be helpful if the test can distinguish high-risk from low-risk cancers, since the latter are already being overdiagnosed and overtreated based on PSA. The story never explained this.

Finally, the story notes that the new test is a "simple blood test," suggesting that this is somehow a benefit over PSA. But PSA is also a blood test. In either case, a patient with an abnormal result will be referred for a biopsy; no one would initiate treatment based on the new test alone.

Does the story adequately explain/quantify the harms of the intervention?

Not Satisfactory

The story mentions that PSA tests have a high false-positive rate, and says that this leads to unnecessary biopsies and treatment. This is misleading. While men do get biopsies that they don’t need based on PSA testing, this doesn’t necessarily lead to treatment. There is a big difference between taking small samples (a biopsy) and removing the entire prostate. 

Overtreatment is related to overdiagnosis, which occurs when men get treatment for slow-growing cancers that most likely would never have caused them any health problems. And the new test appears to have shortcomings similar to PSA in this regard. Unless the test detects high-risk tumors earlier than PSA does, we just trade the high false-positive rate of PSA for a higher rate of overdetection with the new test, which usually leads to overtreatment and associated adverse effects.

Lastly, the story never quantified how many men who test positive with PSA don’t actually have cancer (about two-thirds), nor did it ever state precisely how much better the new test is at avoiding false positives and the related unnecessary treatments.

Does the story seem to grasp the quality of the evidence?

Not Satisfactory

Surprisingly, this story provides more detail about a study that hasn’t yet been conducted than it does about the study ostensibly being reported on. The story tells us that this "much more exhaustive follow-on study" will take over 1,800 samples from men with a "whole range of ‘interfering diseases’" to help validate the new test. But we never learn how many samples, from what kind of men, were used in the study that the story is supposedly telling us about. The story omits such important information as the number of cancers, the mixture of early and advanced stage cancers, the mixture of low-risk and high-risk cancers, the number of controls and whether controls had benign prostate disease.

In addition, the story touts the test’s apparent 90% accuracy, but this statistic is virtually meaningless. Accuracy is simply the proportion of all patients correctly classified: the number of diseased patients correctly identified as having disease plus the number of non-diseased patients with negative tests, divided by everyone tested. By this measure, if 10 of 100 patients have the disease and the test concludes that all 100 are normal, the accuracy is still 90%. The more meaningful statistics are sensitivity and specificity. The report alludes to specificity (fewer false positives than PSA) but provides no data.
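To make the arithmetic concrete, here is a brief sketch of the calculation for the hypothetical 100-patient example above, using the standard definitions of accuracy, sensitivity and specificity (TP, TN, FP and FN stand for true/false positives and negatives):

```latex
% Hypothetical scenario from the example above: 100 patients, 10 with disease,
% and a test that labels every patient "normal" (negative).
% So TP = 0, FN = 10, TN = 90, FP = 0.
\[
\text{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN} = \frac{0 + 90}{100} = 90\%
\]
\[
\text{Sensitivity} = \frac{TP}{TP + FN} = \frac{0}{10} = 0\%,
\qquad
\text{Specificity} = \frac{TN}{TN + FP} = \frac{90}{90} = 100\%
\]
```

Despite the seemingly impressive 90% accuracy, such a test would miss every cancer, which is why sensitivity and specificity (and the underlying numbers) are the figures a story needs to report.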

The story mentioned that the study was presented at a conference but it didn’t say anything about the limitations of drawing conclusions from such conference presentations. See our primer on this topic.  In our view, a caveat about the preliminary nature of such reports is always a good idea.

Does the story commit disease-mongering?

Not Satisfactory

While this story didn’t exaggerate the impact of prostate cancer, it did oversell the effects of prostate screening and the importance of early detection. The last sentence of the article notes that the prostate cancer death rate is going down and that cancers are being found earlier. Though both statements are true, these trends are probably unrelated. While some of the decreasing death rate is attributable to screening, better treatments are also an important factor. Suggesting that earlier detection has benefits is concerning given the high likelihood of overdiagnosis, that is, of finding slow-growing cancers that get treated even though they would never pose a health threat.

Does the story use independent sources and identify conflicts of interest?

Not Satisfactory

The story does solicit a comment from someone not affiliated with the study or the company developing the new test, which normally would be enough for a satisfactory rating. However, this source’s one-line quote about personalized medicine has little bearing on the substance of the story, which is primarily about the prostate cancer test. Moreover, his tangential remark is very much overshadowed by the numerous lengthy comments of John Anson, a Vice President at the company developing the test.
 
This was a close call, but the story felt very unbalanced and would have benefited from some independent critical analysis of the study.

Does the story compare the new approach with existing alternatives?

Not Satisfactory

The story focuses primarily on the comparison with PSA testing and notes the important drawbacks of that test. It should have mentioned another option available to men: not getting tested at all.  

Does the story establish the availability of the treatment/test/product/procedure?

Satisfactory

It’s clear from the story that this test isn’t commercially available and won’t be for some time. Although we generally frown on stories which pass along manufacturer guesses about when a new product will be approved, this story’s prediction that the test might be available "in 10 or 15 years" seems unlikely to generate much false hope.    

Does the story establish the true novelty of the approach?

Not Satisfactory

The inadequacy of PSA testing is widely known and there are many researchers and companies working on similar promising techniques to improve the accuracy and usefulness of prostate cancer tests. The story didn’t mention any of this important background.

Does the story appear to rely solely or largely on a news release?

Not Applicable

Since the story includes two interviews, we can be sure it didn’t rely entirely on a news release.

Total Score: 1 of 9 Satisfactory

