This story addressed many of the issues that we like to see covered in discussion of experimental diagnostic tests. It explained that this test is not ready for prime time and probably won’t be any time soon. It emphasized that the results of a small study cannot tell us how the test will perform in the real world. It explained that there are already effective diagnostic tests for colon cancer that aren’t being optimally used. It included perspective from an expert who explained potential benefits and drawbacks of the new test. Overall, a very good report.
In any story about diagnostic testing or screening, the discussion has to go beyond how many people with the disease were correctly identified. We also need to hear about how many people without the disease were incorrectly identified (i.e. false positives), and what harms those individuals may face as a result of the incorrect test. Of the two stories we reviewed about the same experimental colon cancer test, only HealthDay made readers aware of this important distinction.
Costs were not mentioned, but since this test is in an early stage of development, it would be difficult to provide an accurate cost figure. That figure would have to include both the cost of the test and any unnecessary follow-up tests and procedures due to false-positive results. We’ll rule it not applicable.
Both this story and the competing WebMD piece repeated the accuracy statistic provided by the study (76%), but neither story provided much detail as to what this figure means. Good diagnostic tests not only have to identify people with the disease correctly; they also have to rule out people who don't have the disease. Ideally the story should report the true positive rate (sensitivity) and the false positive rate (1 minus the specificity), though it's not clear that the percentages listed represent those rates. We would have preferred to see the absolute numbers of people in each group (the cancer patients and controls) who were accurately diagnosed, and a comparison of how that stacks up against currently available tests.
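The distinction between sensitivity and specificity is easy to make concrete. The sketch below uses hypothetical confusion-matrix counts, not the actual study data, simply to show how the two rates are computed and why a single "accuracy" figure can hide the false-positive problem:

```python
# Illustrative only: hypothetical counts, not the actual study data.
def diagnostic_rates(tp, fn, tn, fp):
    """Compute sensitivity, specificity, and false positive rate
    from a 2x2 confusion matrix (true/false positives/negatives)."""
    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    false_positive_rate = 1 - specificity # healthy people flagged
    return sensitivity, specificity, false_positive_rate

# Suppose a test correctly flags 76 of 100 cancer patients and
# correctly clears 76 of 100 healthy controls.
sens, spec, fpr = diagnostic_rates(tp=76, fn=24, tn=76, fp=24)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} FPR={fpr:.2f}")
```

With these made-up numbers, a "76% accurate" test would miss 24 of 100 cancers and send 24 of 100 healthy people toward needless follow-up, which is exactly the detail a single accuracy figure obscures.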
This story did note that the primary benefit of a breath test would be increased convenience and less of an “ick” factor compared with existing tests. In addition, it pointed out that a 75% accuracy rate is a failure for 25% of patients — a nuance missing from WebMD’s coverage. It also explained that we don’t know whether this kind of test can detect precancerous polyps, which should ideally be identified and removed before they have a chance to develop into cancer.
A mixed bag, but overall, it meets our standard for a satisfactory rating.
The story mentioned that this test would inevitably produce false-positive results that would lead to needless invasive follow-up tests.
The story sprinkled useful caveats throughout the coverage.
There was no disease-mongering of colon cancer.
An expert from the American Cancer Society provides much-needed context and perspective on this research.
The story discusses existing options for colon cancer screening, and what some experts think are the appropriate ages and intervals at which these tests should be done.
It’s clear from the story that this test won’t be available any time soon.
The story acknowledges that the idea of using a breath test to catch cancer is not new, and mentions other relevant research.
It doesn’t appear that the story relied inappropriately on this press release.