Kevin Lomangino is the managing editor of HealthNewsReview.org. He tweets as @KLomangino.
Ever since Amgen teased the public with claims of a landmark study showing that its new heart drug Repatha significantly reduced the risk of cardiovascular events, anticipation has been building for the day when Amgen would release the data to back up its bravado.
That day finally came on Friday, and the wall-to-wall news coverage suggests that Amgen’s public relations strategy was quite effective.
But what about the drug itself? How big was the “significant” reduction that Amgen touted in its news release (which some news outlets reported uncritically in the weeks preceding publication of the data)?
And more importantly, how carefully did news stories evaluate the results and did they provide consumers with the information they need to make informed decisions about treatment?
The coverage runs the gamut, and I can only paint in broad strokes.
The patient perspective was covered very thoroughly by patient advocate Dave deBronkart, also known as e-Patient Dave. His excellent round-up of news stories about the study encourages patients to “avoid relative risk reduction (headlines about percentages) and look instead for actual (absolute) numbers of patients helped.”
Here’s what those numbers look like and how they translate into cost:
-Major heart problems or strokes happened to 11.3% of patients WITHOUT the new drug, and 9.8% of patients WITH the new drug. In other words, 1.5% of patients avoided a problem event, but 9.8% still experienced a problem event despite taking the drug.
-1.5% means on average, 1 patient in 67 benefits from the drug. That’s the NNT – the number of patients needed to treat for one to get any benefit.
-Note, though, that the drug did not save lives: the same percentage of patients died whether or not they got the drug. So it prevented a major cardiac event in 1.5% of patients, but didn’t alter death rates – at least not during the time of this study.
-The drug costs $14,000/year, and these patients were watched a median of 2.2 years, so the cost was about $30,800 per patient.
-The 67:1 ratio means each prevented event came at a cost of 67 x $30,800 = $2.06 million.
-No new side effects were discovered. That’s good – many new drugs bring new risks, too. (But… this study was pretty short, so more news about side effects may come out later.)
deBronkart adds that “this study included very high risk patients, and if your risk isn’t as bad, then the benefits of the drug would not be comparable. You’d be much less likely to benefit, so the NNT for patients like you would be much larger.”
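deBronkart’s back-of-the-envelope figures are easy to verify. Here is a minimal Python sketch using only the numbers quoted above (the variable names and rounding conventions are mine):

```python
# Reproducing the arithmetic from deBronkart's summary.
# All figures come from the post; the rounding is my own.

control_event_rate = 0.113  # major heart problems or strokes WITHOUT the drug
treated_event_rate = 0.098  # the same events WITH the drug

arr = control_event_rate - treated_event_rate  # absolute risk reduction
nnt = 1 / arr                                  # number needed to treat

annual_cost = 14_000         # drug cost per patient per year ($)
median_followup_years = 2.2  # median follow-up in the trial
cost_per_patient = annual_cost * median_followup_years
cost_per_event_prevented = nnt * cost_per_patient

print(f"ARR: {arr * 100:.1f} percentage points")   # ARR: 1.5 percentage points
print(f"NNT: {round(nnt)}")                        # NNT: 67
print(f"Cost per patient: ${round(cost_per_patient):,}")
print(f"Cost per prevented event: ${round(cost_per_event_prevented):,}")
```

One small wrinkle: using the unrounded NNT (about 66.7) gives roughly $2.05 million per prevented event, while the post’s $2.06 million comes from rounding the NNT to 67 first. deBronkart’s caveat about lower-risk patients also falls directly out of this arithmetic: a smaller baseline risk means a smaller absolute risk reduction, so the NNT – and with it the cost per prevented event – grows proportionally.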
Bottom line: Patients, doctors and policymakers will have to make their own decisions about whether treatment with this drug makes sense for individuals and for society more generally. But those decisions should be based on complete information that has been thoroughly analyzed and reported.
Claims of “landmarks” and “breakthroughs” should be viewed skeptically in the absence of data. The evidence-based reality is almost always messier than initial reports would have you believe.
Comments
e-Patient Dave
March 21, 2017 at 2:43 pm

Thanks for the kind words!
Re my post: credit where credit is due: Our blog software only allows listing one author but two other people provided 2/3 of the content, including patient community leader Marilyn Mann and Peter Elias MD. I kinda buried that in the post, and everyone was crediting me, so overnight I added this at top of post: “Please cite this post as “by Dave deBronkart, Marilyn Mann and Peter Elias MD” or, on Twitter, “@ePatientDave, @MarilynMann & @PHEski.”
Truly everything I know about thinking critically about health news has sprung from my first exchanges back in 2008 with Gary. (I wish I could find the video clips we recorded that September at Medicine 2.0 in Toronto – my collar was askew but the words were true.)