
NY Times hails MitraClip as a ‘huge advance’ for heart failure, but independent sources might have said otherwise


3 Stars


Tiny Device Is a ‘Huge Advance’ for Treatment of Severe Heart Failure

Our Review Summary

This story reported on results of a randomized clinical trial of a device called the MitraClip, used to repair the mitral valve in patients with heart failure. Results of the trial, called COAPT, were published in the New England Journal of Medicine.

The story provided some very important details. For example, it addressed the cost of the device, provided data to quantify the extent of the benefit, and mentioned that the study was funded by the clip’s maker, Abbott.

But the story lacked balance in its sourcing. All of the physicians quoted were somehow involved in the study. The story only briefly mentioned a different trial that showed the device offered no benefit for people with heart failure, and didn’t mention that a third trial is underway. It also didn’t spell out that only about 10% of people with heart failure are similar to those in this trial.


Why This Matters

As this story states, there are millions of people living with heart failure who have very low quality of life, and few treatment options. An advance that could improve and lengthen their lives would indeed be welcome news. However, a single trial that is contradicted by other research is not sufficient to declare a “huge advance,” as this story does. Moreover, news stories should always use caution in reporting on new medical devices, which often have limited data to show they are effective and safe.


Does the story adequately discuss the costs of the intervention?
Satisfactory


The story reported that “the device itself costs about $30,000, not counting the cost of the hospital and doctors: a surgeon, an interventional cardiologist and an echocardiologist, among others, all in the operating room.” An idea of the total cost would have been useful, since it’s likely to be far higher than $30k.

Does the story adequately quantify the benefits of the treatment/test/product/procedure?
Satisfactory


The story said: “Among those who received only medical treatment, 151 were hospitalized for heart failure in the ensuing two years. Sixty-one died. In contrast, just 92 who got the device were hospitalized for heart failure during the period, and 28 died.”

We wish the story had clarified that the figures for deaths referred only to deaths from heart failure, not deaths from any cause.

Also, the story could have provided percentages to help readers draw precise comparisons. Among those who received only medical treatment, 56.7% were hospitalized for heart failure and 25.9% died from heart failure. Among those with the device, 35.7% were hospitalized for heart failure and 12% died from heart failure.

Death from any cause: During the study period, 29.1% in the device group died, and 46.1% in the control group died.
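To make those comparisons concrete, the percentages above can be converted into absolute risk reductions and an approximate number needed to treat (NNT). This short sketch does that arithmetic using only the figures already cited, rounding the NNT up as is conventional; it is our illustration, not a calculation from the story or the study.

```python
import math

# Two-year event rates as cited above (control = medical treatment only)
control = {"HF hospitalization": 0.567, "HF death": 0.259, "death, any cause": 0.461}
device = {"HF hospitalization": 0.357, "HF death": 0.120, "death, any cause": 0.291}

for outcome in control:
    arr = control[outcome] - device[outcome]  # absolute risk reduction
    nnt = math.ceil(1 / arr)                  # number needed to treat, rounded up
    print(f"{outcome}: ARR = {arr:.1%}, NNT ~ {nnt}")
```

On these figures, roughly five patients would need to be treated with the device to prevent one heart failure hospitalization over two years, and about six to prevent one death from any cause.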

Does the story adequately explain/quantify the harms of the intervention?

Not Satisfactory

The story did not address potential harms of this device, nor did it mention that medical devices are not required to provide robust safety data to go on the market in the U.S. According to the study, 3.4% of patients experienced complications related to the device within one year. It’s important to keep in mind that this adverse event rate was observed under ideal conditions, with patients closely monitored by physicians because of their clinical trial enrollment.

While the procedure may involve smaller incisions than standard surgery, inserting a metal clip into a heart valve is still quite invasive. Post-surgery, the device can malfunction and cause a host of problems, including death.

Does the story seem to grasp the quality of the evidence?

Not Satisfactory

The story did some things well, such as laying out the basic study design and how follow-up worked.

However, we wanted to see more about the limitations of this study, such as the need for longer-term data. The study’s authors said follow-up, which will continue through five years, is necessary to “fully characterize the safety and effectiveness of the device.”

Also, the story goes on to state that “doctors inserting the device first had to demonstrate their expertise doing so. An independent group of experts ascertained that patients’ medical care was optimal; all too often, heart failure patients do not receive ideal treatment.”

It’s worth exploring whether those “impeccable” conditions might have led to better relative outcomes for patients who received the device.

Does the story commit disease-mongering?
Satisfactory


The lead states that “almost two million Americans have heart failure, and for them even mundane tasks can be extraordinarily difficult.”

Later, we’re told that the number of patients who might ultimately be treated is smaller than the total number living with the condition.

The story did not fully capture the fact that only a limited number of patients might actually benefit from this technology. In other coverage, the lead investigator was quoted saying that 10% of all patients with heart failure are similar to those in this trial.

Does the story use independent sources and identify conflicts of interest?

Not Satisfactory

The story did a mixed job.

On the plus side, it reported that the maker of the clip, Abbott, funded the study. It also reported that one of the physicians quoted “reported no relevant conflicts, but said that Columbia University gets royalties from the sale of the MitraClip.”

However, the story did not say whether other physicians who were quoted have conflicts of interest. At least one had a substantial conflict that we found. Gilbert Tang, MD, received $57,600 in payments from Abbott in 2017.

Most importantly, all of the physicians quoted had some connection to the trial. The story would have benefited from at least one physician source who wasn’t a cheerleader for this device.

Does the story compare the new approach with existing alternatives?
Satisfactory


The story stated, “Until now, there has been little doctors can do” for people with heart failure, aside from drugs to control symptoms. This is perhaps a little pessimistic as there are a number of different medications that have been shown to improve outcomes in heart failure patients.

Does the story establish the availability of the treatment/test/product/procedure?
Satisfactory


This was a strong point.

The story stated: “If the device is approved by the Food and Drug Administration for treatment of severe heart failure, as expected, then insurers, including Medicare, most likely will cover it.”

It also stated, “Not every cardiologist is equipped to insert the clip.”

Does the story establish the true novelty of the approach?

Not Satisfactory

The story quoted several doctors who spoke of the trial results in glowing terms: “huge advance,” “very, very powerful message,” “game changer,” and “massive.”

But it downplayed another recent trial that showed the clip offered no benefit, saying “that research included many patients with less severe valve problems, the procedure was not performed as adeptly, and the patients’ medications were not as well optimized as in the new study.”

And the story didn’t mention that there’s a third trial, results of which are due to be released soon.

What the story also didn’t say is that some physicians are apparently waiting to see the results of all three trials before rendering a verdict.

As cardiologist John Mandrola, MD, put it on his blog:

What we have then is two randomized trials with conflicting results. That means we don’t know whether the device works, and we need a tie-breaker trial. Fortunately, one is coming soon.

Does the story appear to rely solely or largely on a news release?
Satisfactory


The story doesn’t rely on a news release.

Total Score: 6 of 10 Satisfactory

