This blog post about a study revealing new data on the overuse of heart implants does some things quite well, but it falls short on a number of our measures. See the Los Angeles Times piece, which we also reviewed, for comparison. While the WSJ blog includes cost information, which is too often missing from stories, it does not provide enough hard data on the risks or benefits of heart implants. It also should have done a better job evaluating the evidence in this study but, instead, left reviewers – and, we bet, readers – confused.
When a study this large finds that 1 out of every 5 people may have unnecessarily undergone a surgery that led to longer hospital stays and, in a minority of cases, a shorter life, that’s big news. Reporters need to make sure that they give readers the proper context to process findings this dramatic. Identifying the best candidates for ICD placement is not an exact science. Guidelines have been developed to help clinicians and patients determine the best course of action. The guidelines are, at best, interpretations of existing information generated in clinical trials. While useful, they are by no means exact. The study in question examined data collected from a national registry on over 100,000 patients and concluded that 1/5th of the implants were outside existing guidelines. While the findings in the study are troubling at face value, the story appears to be a bit more complicated. The accompanying editorial places the results in a slightly different perspective. Of the four criteria examined, only one (placement of the ICD in patients with severe heart failure) is an absolute. The other three criteria relate purely to timing and not to whether the device is of value. Heart failure patients need the tools to help them ask their doctors the right questions. This story did not provide enough of those tools.
The story deserves high marks for bringing costs into the discussion right in the lead. “In more than 22% of cases, implantable defibrillators are given to heart patients who don’t meet the guidelines for receiving the pricey devices, according to a study just published in JAMA.” The story also says, “ICDs, which can cost north of $30,000, monitor the rhythm of the heart and produce a shock to bring irregular beating back to normal.” But just as with the Times story, the cost of the device and hospitalizations is only one piece of the economic picture. The additional costs of maintaining the device and replacing its battery drive the total higher. In reality, however, the overall economic impact of the devices is about the same as that of treating high blood pressure and elevated cholesterol when life-years (a commonly applied economic measure) are considered.
As with the harms, the benefits are not quantified.
The story says up high that patients who fell outside the guidelines had a “higher risk of dying in the hospital and of complications from the implantation.” But there are no numbers provided. In fact, there are more percentages provided in this story about the stock prices of the various device makers involved than there are about the harms or benefits. This is the Wall Street Journal, of course, but we still thought some additional numbers about harms or benefits would have been helpful to any reader.
The story attempts to evaluate the quality of the evidence, but it falls short. Compared with the way the LA Times story brings context to the shocking findings, this story is confusing and potentially misleading. We have read these paragraphs several times and still are not sure what they mean:
Did these patients attempt to sign up for clinical trials and were denied? And what is the bill that the 22.5% will not fit? The LA Times story examines the study in much clearer language and provides the right context.
The story does not engage in disease-mongering.
We spent some time on the fence here, but we were ultimately willing to give the story the benefit of the doubt. The blog calls out a link to a much more complete Wall Street Journal story about the study, and it says, “Read the WSJ story to get perspective from Medtronic and Boston Scientific.” But it spends too much time quoting the lead author and does not quote anyone else. It does, however, provide information that was missing in the LA Times and other coverage. “Authors of the study reported receiving funding from Medtronic and other device- and drug makers.” We thought that context, along with the link and a quote from a statement from the Heart Rhythm Society, were enough to pass this criterion. As with the LA Times story, though, we don’t understand why the editorial accompanying the journal article wasn’t cited.
Neither this story nor the LA Times story compares this approach to existing alternatives, although this story does at least briefly mention medication as the first line of treatment for patients with newly diagnosed heart failure. But no meaningful comparison to other approaches was given.
Like the LA Times story we reviewed, this story shows how widespread these implants are by explaining the scope of the study. “The research, based on 111,707 cases submitted to a national registry over a three-and-a-half-year period, also finds that those patients who didn’t meet the guidelines had a higher risk of dying in the hospital and of complications from the implantation.” Both stories could have benefited from a sentence about how likely hospitals in less populated areas are to have access to the implants and to have properly trained staff.
The story does not make clear that this approach is novel. Guidelines for the placement of ICDs have been evolving for a number of years and have been based on clinical trials. The use of a registry to identify clinical outcomes in the real world has been applied in a number of scenarios. The information provided by registries gives clinicians an improved understanding of the value of a technology in a heterogeneous patient population. We would have liked to have seen some mention of the intent of the registry in the story.
The story does not rely on a news release.