This CNN story describes the results of a newly published study designed to see whether a protein that can be measured in a blood sample accurately distinguishes Parkinson's disease from a group of rarer, but generally more rapidly disabling, disorders with similar neuromotor symptoms.
The article earns good marks for its efforts to educate readers about the challenges of correctly diagnosing movement disorders and what the stakes are for patients who may get inadequate or overzealous treatment for the wrong condition. It also reported the false-positive and false-negative rates for the test, an important detail not often included in stories about screening tests.
The advent of a simple blood test that can, with significant accuracy, help primary care physicians as well as specialists reach the correct diagnosis is newsworthy. Readers of this story get a mostly thorough, non-hyped account of what the study found.
Costs are not included. Presumably a blood test would be cheaper than a test of spinal fluid, which requires a lumbar puncture.
Unlike many news stories of this kind, this one provided the actual sensitivity and specificity numbers for the blood test, and how they compare (in general) to those of the typical spinal fluid test.
That said, the story could have been a little more thorough in its discussion of what sensitivity and specificity mean. It says sensitivity is “the percentage of positives that are correctly identified, and specificity, the percentage of negatives that are correctly identified.” That’s accurate, but the story could have gone further to explain, for example, that a low-specificity test will have a high false-positive rate (more people who don’t have the disease are erroneously told that they have it). A low-sensitivity test, by contrast, will have a high false-negative rate (more people who actually have the disease are falsely reassured by a negative test result).
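For readers who want to see the arithmetic, here is a minimal sketch (in Python, using entirely hypothetical counts, not figures from this study) of how sensitivity and specificity translate into false-negative and false-positive rates:

```python
# Hypothetical illustration only -- these counts are invented,
# not taken from the study discussed in the CNN story.
# Suppose 100 tested people actually have the disorder and 100 do not.
true_positives = 90    # people with the disorder whom the test correctly flags
false_negatives = 10   # people with the disorder whom the test misses
true_negatives = 70    # people without the disorder whom the test correctly clears
false_positives = 30   # people without the disorder whom the test wrongly flags

sensitivity = true_positives / (true_positives + false_negatives)   # 0.90
specificity = true_negatives / (true_negatives + false_positives)   # 0.70

false_negative_rate = 1 - sensitivity   # 10% of affected people are falsely reassured
false_positive_rate = 1 - specificity   # 30% of unaffected people get a false alarm

print(f"Sensitivity {sensitivity:.0%} -> false-negative rate {false_negative_rate:.0%}")
print(f"Specificity {specificity:.0%} -> false-positive rate {false_positive_rate:.0%}")
```

In this invented example, the low specificity is what produces the large share of false alarms, which is exactly the trade-off the story could have spelled out.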
Harms of the experimental test were not discussed. One important potential harm is that the false-positive or false-negative rate may be much higher once the test is used in the general population. For now, it has only been tested in controlled settings where the health status of the study participants was already known.
The article does a good job of explaining what the study was intended to do and how it was designed. It’s especially good on explaining how blood tests like this should be validated.
The story does not disease monger, and it did a nice job describing how common these atypical diseases are, compared to Parkinson’s disease.
The story included two independent sources not connected with the research.
However, several of the study’s authors had connections to companies that work on this kind of test, and that wasn’t disclosed in the story.
The story discusses the alternative: a spinal tap test.
Via the lead researcher’s discussion of what the test may do in the future, the article implies that the blood test is not yet available.
Ideally that hint would have been more explicit, and perhaps accompanied by some details on what steps remain before the test becomes available.
The story makes it clear that the novelty here is that this would be a blood test to distinguish among Parkinson’s disease and similar disorders, versus the more invasive spinal tap.
The story does not appear to rely on a news release.