What does it mean to say that a new test is 87% accurate?
I think most journalists writing about a paper published in the journal Alzheimer’s & Dementia could not satisfactorily answer that question – about a statistic they repeatedly quoted in stories about that paper.
The answer is: Not much, if you don’t take positive predictive value into account.
And again, most journalists writing about the study won’t know what this means either.
The BBC called it “a major step forward in developing a blood test to predict the onset of Alzheimer’s disease.”
I’ve often warned readers to head for the hills if anyone promises a “simple blood test.” Are you in the hills yet? Because Reuters reported, “Study paves way for simple blood test to predict Alzheimer’s.” While there were caveats at the very end of the story, who at Reuters is responsible for this simplistic headline?
Other stories called it “a major breakthrough.” One stated definitively that “A simple blood test can predict Alzheimer’s Disease” while another said “Alzheimer’s blood test not far away.” I’m not sure that either news story should make the claim it made at this point in time.
Now let’s get to the analysis of the claims.
The NHS Choices Behind the Headlines site wrote, "Blood test for Alzheimer's 'no better than coin toss'." Excerpt:
The media coverage was broadly accurate, but none reported the positive predictive value of the test. This reduces the impressive-sounding 87% accuracy figure to around the 50% level, depending on the prevalence assumptions, giving the test the same predictive value as a coin toss.
This important information should have been highlighted to avoid overstating the utility of the test on its own.
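The NHS point that accuracy means little without prevalence follows from Bayes' rule: PPV = (sensitivity × prevalence) / (sensitivity × prevalence + (1 − specificity) × (1 − prevalence)). A minimal sketch, using the sensitivity (85%) and specificity (71%) figures cited later in this post, shows how sharply PPV swings with the assumed conversion rate:

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value via Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Sensitivity 85%, specificity 71%; prevalence assumptions vary
for prev in (0.10, 0.20, 0.50):
    print(f"prevalence {prev:.0%}: PPV = {ppv(0.85, 0.71, prev):.0%}")
```

At a 10% conversion rate the PPV is only about 25%; the test would need the condition to be far more common before a positive result became trustworthy.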
In a HypeWatch column on MedPage Today, John Gever writes, “Another Dementia Blood Test Oversold.” Excerpts:
Only about 10% of patients with MCI convert to clinical dementia per year. With nearly 30% of positive results false (remember, the specificity was 71%) as well as 15% of negative results false, most of the positive results in such a group will be false.
Yes, it’s time once again for a tutorial in positive predictive values. If we have 100 MCI patients and a 10% conversion rate, then 10 of them will develop dementia. These are the true positives. There will be 90 true negatives — the ones who don’t convert.
But with a specificity of 71%, the test will falsely identify 29% of the 90 true negatives, or 26, as positive. Meanwhile, with a false negative rate of 15%, only nine (rounding up from 8.5) of the 10 true positives will be correctly identified.
That’s 26 false positive results against nine correctly positive. That’s useless in a clinical setting. In fact, it’s worse than useless, since the false-positive results will expose patients to unnecessary clinic visits and treatments, and generate anxiety for them and their families.
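Gever's arithmetic can be verified with a short script, using only the figures quoted above (100 MCI patients, a 10% annual conversion rate, sensitivity 85%, specificity 71%):

```python
import math

# 100 MCI patients, 10% annual conversion rate
patients = 100
converters = round(patients * 0.10)        # 10 will develop dementia
non_converters = patients - converters     # 90 will not

# Sensitivity 85% (false-negative rate 15%), specificity 71%
true_positives = math.ceil(converters * 0.85)         # 9, rounding up from 8.5
false_positives = round(non_converters * (1 - 0.71))  # 26, from 26.1

ppv = true_positives / (true_positives + false_positives)
print(f"{false_positives} false positives vs. {true_positives} true positives")
print(f"positive predictive value: {ppv:.0%}")  # about 26%
```

Roughly three out of every four positive results in this group are wrong, which is what the "coin toss" headline was getting at.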
But here’s lead author Abdul Hye from King’s College London in the press release: “We now have a set of 10 proteins that can predict whether someone with early symptoms of memory loss, or mild cognitive impairment, will develop Alzheimer’s disease within a year, with a high level of accuracy.”
…
In the U.S., most coverage (such as this at Huffington Post) appeared to be simply rewrites of the press release with no independent viewpoint at all.
Sigh.
We’ve been down this road before.
And we shall travel it again.