This story about spinal fusion surgeries that use bone-growth proteins [BMPs] pursues an important cost angle more aggressively than some other stories we saw on this same study. But it misses an important point:
The current study does not have the power to resolve the long-standing question of whether the higher short-term costs of BMPs are offset by savings on long-term follow-up care. All it does is confirm that short-term costs are higher.
The story mentions the lack of long-term data as a mere caveat.
If you strip down the study, you see a few interesting observations: About a quarter of spinal fusions now use BMPs. BMP use in fusions performed on the front of the spine near the neck is associated with more complications, such as difficulty swallowing. And BMP surgeries appear to be used more often in certain populations, but limitations in the data make any conclusions about disparities premature.
That isn’t much to wrap a story around.
A reader seeing a story about the cost or complications of BMPs has a broader set of questions:
Does the [initially] more expensive technique produce better outcomes? Which patients are best suited to this surgery? What really is the long-term picture? Is this yet another case of an expensive new technique making a company rich when an established treatment is just as good or better?
And what about the angle that most spinal fusion surgeries are deemed unnecessary anyway?
This study, a valuable piece of medical research, nonetheless doesn’t address any of those fundamental questions. And neither did the story.
The story fails most spectacularly on the issue of costs.
First, it's not clear why the story focuses primarily on the 11- to 41-percent higher in-hospital costs of BMP; the study's findings on prevalence and complication rates are at least as significant.
Yet despite that focus, the story lacks basic reporting on costs. It fails to say how much the surgery costs, how big that 11 to 41 percent premium really is, who pays for it, and in whose pocket the extra spending winds up.
As it happens, the journal study itself includes these details: The BMP product costs about $4,000. For anterior cervical fusion, the surgery without BMP cost about $31,000. With BMP, about $46,000.
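Using the rounded figures quoted above (in-hospital charges for anterior cervical fusion only, so a back-of-the-envelope check rather than a study result), the size of the unreported premium is easy to work out:

```latex
\underbrace{\$46{,}000}_{\text{with BMP}} - \underbrace{\$31{,}000}_{\text{without BMP}} = \$15{,}000
\qquad\text{i.e.\ a premium of roughly } \frac{15{,}000}{31{,}000} \approx 48\%.
```

Notably, that extra in-hospital spending is several times the roughly $4,000 cost of the BMP product itself, exactly the kind of detail the story could have surfaced for readers.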
But it gets worse.
As the story itself explains, the issue of total BMP costs has been controversial since the technology’s inception because [as BMP makers suggest] patients who get BMP surgery may require less treatment over the course of their lives than their peers. Despite the interested source, it’s a legitimate argument worth exploring.
And yet: As the authors of the journal article themselves explain, this study did not follow patients over time and therefore lacked the power to speak to the question of total costs at all. Its findings about in-hospital costs are useful but not in any way conclusive.
In other words: the story focuses on the finding of higher BMP costs even though the study cannot resolve the question of total costs, which, from a public health perspective, is the only one that counts.
The JAMA study is silent on whether BMP "works," which is to say, whether BMP improves outcomes, quality of life, morbidity or mortality.
No harm in that: This piece of research is designed to look only at prevalence, complications and costs.
But the news story's shorthand conclusions (that the surgery is common, costly and linked to certain complications) raise the question of whether better outcomes outweigh those negatives. It's hard to imagine what kind of reader would not want to know the answer.
Certainly at least some comparative outcome data exist. [Or if not, that itself is worth mentioning.] The reporter didn't even raise the issue.
In the final paragraph, the story mentions that BMP is linked to more complications, particularly difficulty swallowing in surgeries done high on the spine.
But the story didn't explain how big the potential harms were. According to the study's results, BMP use in anterior cervical fusion was associated with a 51.4 percent higher complication rate than in patients who did not receive BMP (7.09 percent vs. 4.68 percent, respectively). None of these data appeared in the story.
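To make the 51.4 percent figure concrete, a quick check using the rounded rates quoted above shows how the relative and absolute differences compare:

```latex
\text{relative increase} = \frac{7.09\% - 4.68\%}{4.68\%} \approx 51\%,
\qquad
\text{absolute increase} = 7.09\% - 4.68\% = 2.41 \text{ percentage points}.
```

Reporting both figures, not just the larger-sounding relative one, would have given readers a truer sense of the size of the harm.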
The story fails to provide necessary caveats to help readers understand the study’s limitations.
In fact, the story mentions only the study's strengths: that the data drew on a "broad U.S. sample of 328,000 spine surgeries" and included information from 20 percent of the nation's hospitals.
But the study has significant limitations: It used existing data from previous surgeries. It did not follow individual patients over time. It did not include information about subsequent outcomes, additional treatments, quality of life, morbidity or mortality.
The story should have stated these shortcomings plainly.
The story does not engage in disease mongering.
The story uses only one live source, the study's lead author. This is insufficient.
One or two additional voices could have put these findings in context and helped readers understand what they might mean.
Having said that, the reporter does earn extra points for trying to contact a researcher with clear economic interests to explain why his conclusions differ from the current study's.
The story doesn’t explain how someone with serious chronic back pain becomes a candidate for surgery, and then how he or she becomes a candidate for BMP.
A paragraph that explains the available treatment options at various stages of the condition would have been very useful.
This is particularly true since a long-standing question about spinal fusion surgery, regardless of technique, is whether it is overused and potentially harmful.
The story states correctly that about 25 percent of spinal fusion surgeries use bone-growth proteins [BMPs].
The story cites usage data going back to 2002, so it's clear this is not a brand-new product or procedure.
Given that history and the conflict-of-interest question raised in the story, it is safe to assume the story did not rely on a news release.