Strengths: a fairly good evaluation of the limitations of the evidence.
Weaknesses: Nothing on harms. Didn’t compare the new approach with a competing new technology (as the New York Times did), nor with existing colon cancer screening methods in any meaningful, data-driven way. So it failed to give readers a sense of the true scope of the potential benefits.
This story is really about when we should pay attention to evidence on a new diagnostic test and how we should evaluate diagnostic tests against one another – a topic that is often inadequately discussed in both medicine and journalism.
A hesitant satisfactory score. The story stated, “the cost of the test has not yet been established. It is expected to cost more than a fecal occult blood test, but far less than a colonoscopy. A fecal occult blood test can cost as little as $23 while a colonoscopy can total $700.” It is expected by whom to cost in that range? What’s the source? (Reuters reported a $300-400 cost estimate from the company.)
The story does quantify the potential benefits, although the numbers are incomplete. It’s not clear, for one, how this compares to other colon cancer screening tests.
The story didn’t specify the sensitivity of the test; it stated only: “The sensitivity of the test is much better than what has been seen in other stool screening tests,” the ACS’ Brooks added.
Overall, we didn’t think the story gave readers a way to judge the scope of the potential benefits.
The story kept piling up the accolades – “one more advantage,” then “another benefit is…” – but in the end it was all benefits and no harms. The New York Times story, by comparison, addressed false positives.
The story made an effort to address the limitations of reporting on a talk that hasn’t even been given yet (!) when it stated: “Experts point out that studies presented at scientific meetings do not have to pass the rigorous peer review of studies published in reputable journals.” We applaud that effort, but it could easily have been improved: the story should have said explicitly that this is a limitation of drawing conclusions from such data. Wouldn’t that be far clearer to readers?
It did include an important independent perspective from Dr. Durado Brooks of the American Cancer Society, who called the findings interesting but said, “They will be more interesting if we ever get this kind of data in a screening population…. Showing that in a small group of samples is very different from demonstrating that in a population where only a small number of individuals are going to have polyps of that size.”
Ideally, the story would have explained that initial accuracy studies in non-representative samples of cases and non-cases (like this one) usually overestimate performance. That is why they are planning another study.
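To make that point concrete, here is a minimal illustrative sketch (in Python, using entirely hypothetical sensitivity, specificity, and prevalence figures – none of these numbers come from the study) of how a test’s positive predictive value shrinks when the same performance is applied in a low-prevalence screening population:

```
# Illustrative only: the sensitivity, specificity, and prevalence values
# below are hypothetical, not figures from the Cologuard study.

def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value via Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

sens, spec = 0.85, 0.90  # assumed performance from a case-enriched sample

# Case-control sample where half the specimens come from known cases:
print(f"PPV at 50% prevalence: {ppv(sens, spec, 0.50):.0%}")  # ~89%

# Screening population where only a few percent have large polyps:
print(f"PPV at 3% prevalence:  {ppv(sens, spec, 0.03):.0%}")  # ~21%
```

In other words, a test that looks excellent in a sample enriched with known cases may produce far more false alarms than true detections when used for screening – exactly the caution Brooks raised.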
At least the input of the American Cancer Society’s colorectal cancer director was important.
The story did not compare the new approach with other similar approaches now in development, and did not give adequate data-driven comparisons with other screening methods that are now used.
The story explains that “Cologuard is not yet available for sale. Clinical trials comparing the test with colonoscopy are slated to start next year.” But then it allows the lead researcher to get away with saying he “hopes that the test will be approved and available within two years.” Sure he does. That doesn’t make it a prediction you can bank on.
No comparison with – not even a mention of – other competing research, something the New York Times did provide.
It does not appear that the story relied on a news release, but we are curious why both this story and the Philadelphia Inquirer used the same “Holy Grail” analogy.