If you are a candidate for one of these devices, you would be hard pressed to take information from this story and ask your cardiologist any good questions about how you should proceed.
Clinical trials examining the utility of medical devices or drugs are, by definition, artificial. The patient population selected tends not to have confounding medical or social issues and is followed in a rigid, standardized way. As a result, real-world application of a drug or medical device can produce outcomes that are a far cry from those seen in the clinical trial. This study is important because it is one of the few instances where results seen in actual usage of medical devices match those seen in a clinical trial. Patients with heart failure have a high incidence of sudden death. Numerous clinical trials have demonstrated the value of implantable cardioverter-defibrillators (ICDs) and cardiac resynchronization therapy-defibrillators (CRT-Ds) in heart failure patients who have not suffered an episode of near sudden death. These devices are designed to recognize irregular heart rhythms and deliver a “shock” to restore a normal rhythm. A manufacturer of the devices obtained approval from the FDA to establish a registry of people who had one of its devices implanted. This industry-sponsored registry allowed researchers to answer four basic questions: (1) whether remote monitoring of events affects survival; (2) how well patients do if they receive a shock for an irregular heart rhythm; (3) whether there are differences in survival among patients receiving the different kinds of devices; and (4) how survival of “real-world” patients compares with that of patients enrolled in trials. That’s what makes it so important for reporters to carefully break down the evidence for readers.
There is no mention of the costs of the devices, which is a shame for two reasons. First, these devices are costly, and the procedures to implant them are costly. Costs can run as high as $50,000 for the device alone, in addition to hospital and physician charges. The cost of the remote monitoring program would have been a welcome addition to the information as well. Given the national discussion on health care costs, some attention to the price of the treatments and monitoring should have been included. Second, we’re talking about comparing devices, so there should have been some discussion of cost-effectiveness. If we are claiming that there is a significant benefit to choosing one device over another or to subscribing to a remote monitoring plan, which the story says “is typically free with a cardiologist’s recommendation,” we should know how much each approach costs.
We give the story a barely passing grade here for at least providing readers with some numbers, although we feel that ultimately the numbers provide very little clarity. What was needed were some absolute numbers to show how many patients were in each group, how many died, how many lived and how many suffered side effects, etc. Instead, we are given sentences like this, “Among those outfitted with CRT devices alone, one-year survival was pegged at 82 percent, while five-year survival came in at 48 percent, the investigators found.”
In addition, the study was observational, and as such, the benefits of remote monitoring seen here may not be applicable to all patients. For example, selection bias (motivated patients and physicians signed on to the manufacturer’s registry) could explain a good deal of the results. The cautionary comments of Dr. Prystowsky provided sufficient balance to the comments of the investigator.
The fact that the story could be entirely about surgically implanted devices and never acknowledge that surgery has risks or that the devices themselves have risks is odd, to say the least.
The story starts out with a very confusing lead. “Implantable devices designed to control heart rhythm and efficiency while preventing sudden death among heart failure patients are as effective at ensuring patient survival in real-world situations as they are in controlled study environments, new research suggests.” But then, quickly, it raises an interesting red flag, saying, “The study authors noted that the observation is somewhat surprising, given that some of the patients not enrolled in structured studies have already experienced a cardiac event and are therefore prescribed such devices to prevent a recurrence.” We have read this sentence multiple times and still don’t know what it means. The rest of the story meanders through various numbers that appear to be differentiating between the devices, and then, suddenly, the story takes a sharp turn into a discussion of remote monitoring, claiming, “that patients whose implants were monitored remotely, on a continual basis, by a health facility network were about half as likely to die as patients who only had intermittent in-person assessments.” None of this evidence is carefully examined. Also, the study was observational and used existing datasets to determine patient outcomes. There are a number of limitations to this study design, including a lack of randomization and a lack of specific information for each patient in the dataset. The story does not provide the reader with these important provisos to the study results. The study was accompanied by a well written and thoughtful editorial that could have provided the reporter with a useful context for the story.
The story does not engage in disease-mongering, but it also fails to adequately describe the target population for these devices. Another barely satisfactory score.
The story does use one independent source, but it fails in another way: it does not point out that the study was funded by the device maker Boston Scientific. Also, Dr. Prystowsky has noted funding support from Boston Scientific (http://www.theheart.org/article/1050051.do), but this is not acknowledged.
The story fails to provide the reader with sufficient information on the differences between the devices in the study and the rationale for choosing one over another in patients with heart failure. The study sought to answer four questions, only one of which was highlighted in the story. As a result, the comparison between the various devices and remote monitoring is muddled, to say the least.
The story says, “The finding is based on an analysis of nearly 186,000 patients outfitted with either an implantable cardioverter defibrillator (ICD), a cardiac resynchronization therapy device (CRT), or a defibrillator combined with a CRT device (a CRT-D).” The size of the study might make one think that these devices are widely available, but because we are talking about three different devices, some nod should have been made to how widespread they are. The story does make it clear that the remote monitoring system used is available to patients. We give this a barely satisfactory as a result.
The novelty of these devices or of remote monitoring is never established. The story also could have established the relative novelty of the dataset: a manufacturer’s patient registry, a rather unusual circumstance, allowed the researchers to study real-world events in people living with ICDs and CRT-Ds. The story does not note this.
Not applicable because we can’t be sure of the extent to which the story relied on a news release.
We do know, because the story acknowledged this, that the quote from the study co-author came from a news release. Why?
The release, it should be noted, actually provides more and better information than the news story.