The photo that ran with this story appeared straight out of a dystopian fantasy: a mother holding her infant, her shoulder draped in a loose, shawl-like lattice of wires connected to a swarm of electrodes attached to her baby’s head. Is this the future? Fortunately for readers, this story did a better job than HealthDay at making use of independent experts, and it didn’t allow the lead author to get away with one of the more over-the-top statements quoted in the HealthDay story. But it did fail to explain the harms that may be associated with widespread use of scans or with false positives, the costs, and the alternatives to this type of screening.
Autism spectrum disorders are increasingly being diagnosed, yet the causes remain elusive and the treatments are far from a cure for most kids. Parents are hungry for solutions, as can be seen in the passionate comments below this article. What we do know is that early detection appears to be important because early intervention has shown promising results in mitigating the effects of the disorder. That’s why it is crucial for stories to carefully sort through the evidence to weed out the blips along the way from the true breakthroughs.
The story did not address costs. To see why this is such a serious omission, look no further than this comment posted below the story on the CNN website:
“This is great, but if your insurance won’t cover any Autism treatments, what’s the point? I have a 3.5 yr old and two year old boys, both on the Autism spectrum. Our insurance covers nothing for them with regard to any treatment. It’s left to us to absorb. Both require intensive speech and behavioral therapy at a cost of roughly $5,000 per month. I barely make $5,000 a month for a family of 4. …I wholeheartedly approve of the advances they are making, but if insurance won’t cover anything, why bother?”
Nowhere does the story address the significance of this: the main benefit of this test appears to be that it detected, with 80% accuracy and within a very small timeframe, that a child had an older sibling with autism. This means that the study, most of the time, correctly identified a risk factor that was already known to the parents of the child and to the researchers. So, in effect, the study had no actual benefit for the kids or the parents. At least the HealthDay story paraphrased one independent expert who noted that “the study predicted who was at high risk of autism, but it’s unknown if those babies actually went on to develop autism.” This seems like a giant hole that should have been explored.
Neither story quantified the harms. The obvious harms are parental anxiety and the labeling of an infant as having autism, given that there must be at least some risk of false positives.
This was framed as a potential screening test. All screening tests carry potential harms. Journalists often report only on screening tests’ potential benefits.
This story did a better job than the HealthDay story of explaining the study’s design, and it did probe some of the evidence through the use of independent experts. It also included this important statement: “Doctors and scientists not connected to the study are intrigued by the results but caution that this is very early research and not something concerned parents can be looking for as a screening tool for their babies any time soon.” There were multiple, cautionary notes in the article that were missing from the HealthDay piece. While the story could have been more critical and done much more to help readers understand the limitations of the study, we felt that it did an adequate job.
This story did a better job than the HealthDay story at avoiding disease-mongering. It accurately stated the prevalence of autism spectrum disorders and the much higher risk seen in siblings. We felt the overall tone of the story was still too anxiety-provoking, though, and it ended on a curious note with a fundraising plea from a researcher.
The story made good use of outside experts, although we were dismayed to see the clichéd “holy grail” reference, so beloved by researchers and journalists.
The story failed to compare this diagnostic approach to existing approaches, saying only that “Bosl says the first children enrolled in his study are now at 2 and 3 years old, which is the age when autism usually is diagnosed. This will now allow the researchers to evaluate them for autism and then look back at the brain activity patterns of the children who do fit the clinical criteria for autism.” How they are going to be evaluated for autism remains a big question. One presumes that they will be given a battery of behavioral tests, which, for now, are the gold standard. This should have been explained and the reasons why children younger than 2 cannot be accurately screened should have been explored.
The story made it clear that this was an experimental diagnostic tool and not ready for the pediatrician’s office.
The story is quite clear that EEG is not new but that the analysis method is new. The study was a novel contribution to the literature in that it screened earlier, during infancy, in siblings of affected children. At the same time, we wish the story had placed this finding into a broader context. In December, CNN ran a story that sounded so similar we had to check the dates on this story to make sure it was not the same one. That story said, “Scientists are finding more pieces of the autism puzzle with the help of MRI scans of brain circuitry, according to a study published Thursday online in the journal Autism Research.” Why not offer more context on other research that you’ve already reported on?
The story didn’t rely on a news release.