This story never cites a specific study, nor does it link to any of the several existing studies documenting benefits and harms.
The story didn’t give a sense of the strength of evidence behind this drug. For example, it didn’t tell readers what the comparison drug was in the trial, nor how many children were enrolled.
The evidence is so scant that readers aren’t likely to get much use out of this story.
But the story did not do enough to establish that the research is still preliminary: it has not been published or peer-reviewed, and everyone is relying on a small amount of data released by the company.
But the story would have been stronger had it described the findings using absolute rather than relative numbers.
The article makes effective use of a question-and-answer format to discuss previous research in this area and other aspects of the study.
With no independent reporting, the story glosses over how the study was conducted, what was measured, and what the limitations were.
On the flip side, it missed important limitations to the evidence, most notably that there is no evidence that clot-busters save lives in stroke patients.
This story fails to provide anything more than hyperbole and unsubstantiated claims. Readers deserve more.
The drug reduced migraines “by at least half in 30 percent of patients who had failed up to four previous treatments,” according to the story. These statistics needed to be explained more clearly.