We feel several facets of the study should have been given more attention. Most notably, the drugmaker funded it. Given the ongoing FDA review, we think that’s pretty important. But read our full review for our questions about how the story covered (or didn’t cover) the evidence, benefits and cost issues at play.
The story misses this very important background entirely and presents an incomplete and misleading summary. Our unsatisfactory ratings in this review identify the threads missing from the article that we feel are essential to weaving a fair portrait of the research.
No discussion of costs. Ticagrelor isn’t available in the US. But its cost – where available elsewhere – could have been cited. Cost will certainly be an issue if it’s approved because its competitor, clopidogrel, will soon be available in the US as a generic drug.
We have a few suggestions here. Most importantly, we think the take-home message of the study was oversimplified. The headline claims that Brilinta “Beats Plavix When Paired With Low-Dose Aspirin.” In fact, this study was a geographic analysis to explain why Brilinta wasn’t better than Plavix in the US, unlike in other countries, and the lead conclusion was: “The regional interaction could arise from chance alone.” Next, it says that the aspirin dose was “a possible explanation.” In fact, since last year the dose of aspirin has been suggested as a possible explanation for the regional differences in Brilinta’s effects, and that hypothesis has generated controversy. This new study is important to raise our antennae and spur new research, but it wasn’t designed to prove the effect of aspirin.
Of course, the article’s take-home message is the same one that many in the field (including the FDA) may derive from this study. The aspirin finding is most relevant to current practice, both for the use of ticagrelor globally now and perhaps soon in the US. It’s also valuable for thinking about aspirin in this setting in general. However, we think the context should not be boiled off from the message.
It’s important to keep in mind the limitations of this reported benefit, as we discussed under Evaluate the Quality of Evidence. The statistical details, such as how repeatedly slicing and dicing the same data weakens a study’s power, are beyond the scope of this article. But we think more of the big picture would have been appropriate. For journalists, the big picture is a question of tone, of conveying the caution urged by the researchers. For doctors, the big picture is that only future randomized trials can investigate whether the pattern observed in this study with aspirin dose is a true effect, a red herring resulting from chance, or a surrogate marker of another factor that’s more important to target.
Additionally, the benefit is quantified as a relative risk reduction. If you’re a regular here, you know how we feel about relative risks vs absolute ones. In absolute terms, the difference between the two drugs was about 2% for heart attack, stroke, and death.
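To see why the framing matters, here is a minimal sketch of how absolute and relative risk reductions diverge. The event rates below are hypothetical placeholders chosen only to produce roughly the ~2% absolute difference mentioned above; they are not figures taken from the PLATO paper.

```python
# Illustrative sketch: absolute risk reduction (ARR) vs relative
# risk reduction (RRR). All numbers are hypothetical, chosen to
# yield about a 2-percentage-point absolute difference.

def risk_summary(control_rate: float, treatment_rate: float) -> dict:
    """Compute ARR, RRR, and number needed to treat (NNT)."""
    arr = control_rate - treatment_rate
    rrr = arr / control_rate
    return {
        "ARR": arr,       # absolute percentage-point difference
        "RRR": rrr,       # fraction of the baseline risk removed
        "NNT": 1 / arr,   # patients treated to avoid one event
    }

# Hypothetical example: a 12% event rate on the older drug vs 10%
# on the newer one. The same comparison reads as a modest 2-point
# absolute difference or a much more impressive ~17% relative one.
summary = risk_summary(0.12, 0.10)
print(f"ARR: {summary['ARR']:.1%}")   # 2.0%
print(f"RRR: {summary['RRR']:.1%}")   # 16.7%
print(f"NNT: {summary['NNT']:.0f}")   # 50
```

The point of the exercise: a headline built on the relative figure sounds far larger than the absolute difference patients would actually experience.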
In the setting of heart attacks, the major harms of the drugs are related to their benefits — how well they prevent future cardiovascular tragedies. However, all the drugs in this article, including aspirin, impart a risk of bleeding, which is part of the comparison in PLATO and this follow-up analysis. It was good that the article mentioned, albeit briefly in a quote from Dr. Berger, the increased risk of bleeding with higher doses of aspirin.
On the pros, it acknowledged that conference research is preliminary until a paper has been published in a peer-reviewed journal. (The paper is now available from the journal Circulation.) It also gave us the follow-up period and the outcome measure behind the benefit figures.
On the cons, we would have liked more about the underlying PLATO study, such as its design and the number of subjects. We also think the article would have tied a lot of threads together by explaining what this particular analysis by Mahaffey et al actually was. Why was it done? How is it related to the delay in FDA deliberations? Was it all about aspirin? Such information could have ushered in the background that in the PLATO trial, ticagrelor actually wasn’t better than clopidogrel at study sites in the US. In fact that’s thought to be why the FDA has delayed its decision, and this geographic analysis was intended to help settle the issue.
That first step would have been to describe that background and what type of study this was. The next step would have been to evaluate its quality. The investigators themselves offered some helpful critiques in their paper: the study was post hoc and data driven, no adjustments were made for multiple comparisons, this type of exploratory analysis is potentially hazardous, and because of all these limitations the indictment of aspirin dose as the culprit behind reduced ticagrelor efficacy should be considered cautiously.
Please see also our comments under Benefits for further discussion on these lines.
The population is pretty well specified in the article as those with acute coronary syndromes, and it explains that the FDA is considering an indication for these patients who are undergoing angioplasty or stenting.
Our biggest bone to pick with the story is its failure to mention that the study was funded by the maker of the drug. Keep in mind that Brilinta is going up against Plavix, one of the best-selling drugs on the planet, and at US sites of PLATO, Plavix “won” (kind of). With FDA approval influenced by the results of this PLATO follow-up, AstraZeneca’s funding was a pretty important conflict of interest to disclose.
One independent source was cited, Dr. Berger. Further sources may have revealed more of the history and controversy in the field. In the discussion at the AHA conference where this new study was announced, at least one panelist said he thought the results were “spurious,” emphasizing the study’s conclusion that the result could have been due to chance.
The main PLATO study was a comparison of an experimental drug (ticagrelor) to an approved one (clopidogrel).
We would have preferred a different tone that didn’t suggest the drug is “awaiting” what then appears as almost inevitable FDA “approval.” The FDA panel that produced the 7-1 vote is not the panel that decides whether to approve drugs. It’s an advisory panel. The FDA usually follows its recommendations, but it has not decided whether to approve the drug. That makes the caveat that the “approval process is still ongoing” sound somewhat perfunctory, as if approval is only a matter of time. It would be more balanced to say the drug is still under review.
This was the latest analysis of the PLATO study, comparing different combinations of antiplatelet drugs, including one experimental drug that’s up for review. With more space, it would have been nice if the article further differentiated ticagrelor and clopidogrel for patients.
The article has several differences from the news release, and on that we pass it on this criterion. They do have a similar tone, and neither addresses the investigators’ recommendation for cautious interpretation of this exploratory study, which they conclude may be reporting the effects of chance. The release, however, does mention that AstraZeneca funded the study.