
A diabetes drug study, and resulting news coverage, reflect problems with American health care

No heart safety issues seen with Merck Januvia diabetes drug: study

Our Review Summary

While this story is in some respects exemplary in its coverage of a study on Januvia, a blockbuster diabetes drug, we think it missed the big picture. A full explanation of this research would have required a little more space and a lot more context than this story provided. To know whether Januvia is worth the potential risks associated with taking it, the story would have needed to provide more information on the relative benefits of the drug versus other therapies, the costs of the drug in comparison with other therapies, and a more thorough analysis of the quality of the evidence in this study and perhaps other studies related to the drug. The story provides none of this information. It does provide a fairly good discussion of the risks associated with the drug — at least according to this study — but without that additional information, those risk comparisons are at best of little use and at worst highly misleading.


Why This Matters

Diabetes is very common and hard to treat. This drug is expensive and provides little to no benefit compared with other therapies. As the study shows, there was only a very small reduction in blood sugar, and no reduction in heart disease or risk of death in the Januvia-treated patients. It’s problematic that this study is characterized as a success and “setting the stage for a return to sales growth” simply for showing that the drug didn’t actively harm people, while providing no quantifiable benefit. Should we cheer for a drug that rakes in billions merely because it won’t kill us or give us heart disease? This is a classic example of why health care costs so much in America and doesn’t deliver commensurate value.


Does the story adequately discuss the costs of the intervention?

Not Satisfactory

There was no discussion of costs in this story. Cost information is available. It costs about $350 for 30 tablets of 100 mg each.

Does the story adequately quantify the benefits of the treatment/test/product/procedure?

Not Satisfactory

The focus of the story is on the drug’s risk profile, but we think the story should have provided at least some information about whether this drug has been shown to be more efficacious than other therapies. The story missed what was in our view one of the most important points of the study: the practically nonexistent benefit in reducing blood sugar compared with existing therapy. And, while the drug did not increase rates of cardiovascular disease, it also did not reduce them. Also, the study follow-up was relatively short, with a median of three years; that could have been mentioned somewhere.

Does the story adequately explain/quantify the harms of the intervention?

Satisfactory

The story is focused on the potential harms of the drug, and it does a fairly good job explaining the differences in risk between the drug and a placebo for various bad outcomes. We give very high marks to the story for providing comparisons using both percentages and absolute numbers throughout. So, for example, the story said, “There were 228 such hospitalizations for Januvia and 229 in the placebo group, according to data also published in the New England Journal of Medicine.”

It also does a nice job of explaining, at least in one instance, that some of the comparisons between the drug and placebo are not statistically significant. For example, the story said, “There was also no significant difference between Januvia and placebo in infections, cancer, kidney failure or severe hypoglycemia, which is dangerously low blood sugar, researchers reported. Acute pancreatitis, a concern with some diabetes drugs, was uncommon but higher with Januvia, 23 versus 12. That was not statistically significant.”

That said, we wish the story had mentioned that a three-year follow-up may not be long enough to catch some potential harms. And while the pancreatitis issue was downplayed, a larger and longer study might have shown the concern to be statistically significant if the trend continued. An independent expert analyst may well have commented on such concerns, but no such expert was quoted.

Does the story seem to grasp the quality of the evidence?

Satisfactory

The study itself was a good design and the story’s description is consistent with the quality of the evidence. So we’ll award a Satisfactory rating, although the story could have been a little clearer about where this evidence was coming from. The story talks about the drug company presenting information, and then information being presented at a conference, and then some data being published in a journal. There’s a big difference between comments made at a conference or in a company press release and findings that were subjected to peer review and published in a journal. We think the reporter had all those facts at hand but that, on deadline, they did not come out clearly in the final piece.

Does the story commit disease-mongering?

Satisfactory

There was no disease mongering in this piece.

Does the story use independent sources and identify conflicts of interest?

Not Satisfactory

There were several problems here. First, the story offered no independent expert perspective on the findings — only one of the study authors is quoted. And, as explained above, it is a little unclear where all the information in the story had its origin, and that connects to conflicts of interest. The story says, incorrectly it seems, that “the Tecos heart safety study was conducted by an independent academic research collaboration between the University of Oxford Diabetes Trials Unit and the Duke University Clinical Research Institute.” But it also talks about Merck making announcements about the study’s findings before they were published. The study itself says at the end, “Supported by Merck Sharp & Dohme, a subsidiary of Merck.” And the study authors reported on their disclosure forms conflicts of interest with Merck and other diabetes drug makers. While the study authors may describe their work as an “independent collaboration,” we expect stories to dig deeper and explain what exactly that means.

Does the story compare the new approach with existing alternatives?

Not Satisfactory

The story does not compare the drug to any alternatives, only to a placebo. It does not point out that the medications the subjects were already taking seemed to be just as effective without adding the new drug.

Does the story establish the availability of the treatment/test/product/procedure?

Satisfactory

The story makes it clear that the drug has been available but has been hampered in terms of sales because of safety concerns. The drug is widely available, widely used, expensive, and of unproven value in terms of important outcomes.

Does the story establish the true novelty of the approach?

Satisfactory

We’ll give the story the benefit of the doubt here. In the last line, the story explains that the study “was undertaken after heart safety concerns were raised over other diabetes medicines.” So that’s what’s new here. However, the story does not make it clear why this drug would be preferable over other drugs or whether it does anything novel — and that context would have been very valuable.

Does the story appear to rely solely or largely on a news release?

Satisfactory

The story does not overly rely on a news release. It appears that one of the study authors was interviewed.

Total Score: 6 of 10 Satisfactory

