This story about a study on the overuse of implanted defibrillators packs in more information than many competing stories and hits nearly all of our marks, giving readers good context and a clear understanding of the study and its implications. We would have liked to have seen a bit more background on the registry and on the guidelines. We are mystified by the lack of attention to the accompanying journal editorial, which places the study in an important frame of reference and provides a good counterbalance.
The study in question examined data from a national registry covering more than 100,000 patients and concluded that one-fifth of the implants fell outside existing guidelines. That number is troubling at face value, but the picture appears to be more complicated. The accompanying editorial places the results in a slightly different perspective: of the four criteria examined, only one (placement of the ICD in patients with severe heart failure) is an absolute. The other three relate purely to the timing of implantation, not to whether the device is of value. The story did not clarify this nuance.
Both stories we reviewed discussed costs, but the LA Times provided more detail: “They can be lifesaving in appropriate patients. The devices cost $20,000 to $30,000, and normal hospitalization expenses can bring the total cost to $40,000 or more.” A bit more information would have painted a more complete picture. While the cost per patient is high, the true economic picture is more complicated: in terms of cost per life-year saved, ICDs in appropriate patients rival treatments for high blood pressure and elevated cholesterol.
The story quantifies the potential benefits of following the guidelines for implants in a couple of different ways. “Deaths in the hospital among patients receiving defibrillators outside the guidelines were 0.57%, compared with 0.18% among those who received them within the guidelines. About 3.23% of those receiving the devices outside guidelines had complications, compared with 2.41% of those within guidelines. The median length of a hospital stay was three days for those outside the guidelines, compared with one day for those within them.”
High in the story, the LA Times explains that unnecessary defibrillators increase the risk of harm, and it also provides important context: “Although the absolute risk of dying is still low — less than 1% — such patients also endure longer hospitalizations and other complications and add substantially to the nation’s healthcare costs, researchers from the Duke Clinical Research Institute in Durham, N.C., reported in the Journal of the American Medical Assn.” The story also quantifies harms later in the piece.
The story does a good job examining the evidence with the help of some outside experts. One of its best features, in comparison with the Wall Street Journal blog, is that it explains, in clear terms, why certain patients didn’t meet the guidelines for the implants: “The team found that 25,145 of the implants were for causes that were not covered by the guidelines. Of those, 9,257 were in patients who had had a heart attack within 40 days and 15,604 were in patients with newly diagnosed heart failure. These problems fall outside the guidelines because they have not been studied in clinical trials of the devices or because clinical trials have shown that the devices are not effective in treating them.” We wish the story had taken advantage of the accompanying editorial. In reality, only one of the criteria (severe heart failure) is an absolute, and it applies to only about 2.5% of the patients who received ICDs. The other patients fell outside the criteria because of the timing of the implant. Not all patients who have a heart attack or who undergo coronary artery surgery will eventually need an ICD, but many are left with a sufficiently damaged heart to meet criteria. So, while the story does a good job reporting on the study, its interpretation does not go as far as we would have liked.
The story does not engage in disease-mongering, far from it.
The story provides good insights and context from outside sources, including Dr. Shephal Doshi and Dr. Robert Ruelaz. We wish the story had pointed out, as the Wall Street Journal blog did, that the study was funded by Medtronic, which makes one of these devices. That is important context for readers. Without it, we feel this story does not meet this criterion.
And, as already noted above, we often wonder, as we did in this case, why the accompanying journal editorial wasn’t cited.
The story does not offer a comparison of the implants to other treatments. Even a line about other approaches might have satisfied this criterion.
Unlike the WSJ blog post we reviewed, this story pointed out the massive scope of this study and, thus, the widespread use of these implants. “The researchers examined a registry of implants maintained by the American College of Cardiology that covers an estimated 95% of all U.S. implants. The team studied 111,707 implants for what is called primary prevention, performed between January 2006 and June 2009. Primary prevention is for patients who had not had a recent heart attack or a problem with the bottom chambers of the heart.”
This story points out, as the WSJ blog story failed to do, that this same team of researchers had previously studied the underuse of these implants: “In previous studies, Al-Khatib and her colleagues have examined underuse of the devices and found that many patients who could benefit from the defibrillators did not get them. This time, she said, ‘we decided to look at the flip side.’” This speaks to the novelty and the importance of the devices in the population of patients who meet the guidelines.
The story does not rely on a news release.