We don’t think the story conveyed the conclusion of the researchers, nor that of the editorial writer.
The researchers concluded:
“While robotic assisted and laparoscopic surgery are associated with fewer deaths, complications, transfusions and shorter length of hospital stay compared to open surgery, robotic assisted laparoscopic surgery is more costly than laparoscopic and open surgery. Additional studies are needed to better delineate the comparative and cost effectiveness of robotic assisted laparoscopic surgery relative to laparoscopic surgery and open surgery.”
That conclusion sends a somewhat different message than that delivered in the story – and the difference is important.
The editorial writer stated that the study “failed to capture 3 important elements,” and then clearly spelled those out. The story didn’t convey the depth or details of those concerns.
What are the outcomes that readers/patients should really care about? That’s the question the editorial raised – or that many other independent observers would raise about this study. While the story was functional, it could have helped readers become smarter health care consumers with a bit more context.
Much of the focus of the piece was on costs.
The story did an adequate job quantifying the operative mortality, blood transfusion and length-of-stay rates among the three groups studied. (Unfortunately, it did so inconsistently – sometimes giving data for all 3 types of surgery, sometimes providing results for only 2 surgical approaches.)
We’ll give it a satisfactory score here.
But are these the outcomes that really matter? That’s something we address in the “Evidence” criterion below.
As with the benefits criterion above, the details provided were adequate.
The story nodded in the direction of the accompanying editorial, but we wish it had captured the important points as written by the editorialist:
While this study provides some important information that will help patients, providers and policy makers make these value judgments, it fails to capture 3 important elements. First, as the authors note, the cost estimates do not include the capital investment of purchasing a robotic system or the indirect economic benefits of patients’ early return to work and increased productivity. Second, the analysis fails to capture the human cost of the learning curve. In other words, as providers learn new surgical techniques, outcomes are often worse for patients early in the learning curve as hypothesized by Hu and others. Finally, and perhaps most importantly, the study fails to capture patient reported outcomes such as postoperative pain, return to baseline functional status and health related quality of life, all of which are highly germane to the procedures under study here. It is difficult, if not impossible, to assess the cost-effectiveness of minimally invasive technologies without including these critical outcomes. Future comparative effectiveness studies of these techniques must include patient reported outcomes as the primary end point if they are to inform the debate regarding the value of our interventions.
The story included some input from the editorial writer – although, as noted above in the “Evidence” criterion, perhaps not used to greatest effect.
The focus of the story was on a study comparing robot-assisted surgery with two other surgical techniques.
The availability of all 3 surgical approaches studied – while not explicitly described – could be inferred from the study details reported.
The story didn’t provide any context on the growing body of literature about comparative effectiveness questions in this field as robotic surgery proliferates.
It does not appear that the story relied on a news release.