The following is a guest post by Earle Holland, who, for almost 35 years, was the senior science and medical communications officer at Ohio State University. He’s now a member of our editorial team.
Sometimes, researchers can do all the right things in reporting their results and still the outcome falls short of the public's need for information. That's the case with a recent report published in the Journal of General Internal Medicine.
A team from Massachusetts General Hospital and the Harvard Medical School looked for a link between the ratings included on an institution's Facebook page and the likelihood of a high or low readmission rate for patients, one indicator of the quality of care at a medical center. They wanted to know if a high Facebook rating might suggest a low readmission rate.
And it did.
They looked at 791 hospitals whose readmission rates for patients fell outside the national average, either higher or lower. Facebook pages allow viewers to rate an institution on a five-star scale, and hospitals with low readmission rates tended to earn more stars. The opposite was also true, based on the data the researchers reviewed.
It seemed a clear correlation.
The problem arises when the public perceives that linkage as causal, in other words, that the higher Facebook rating grew directly from the lower readmission rate. And that clearly wasn't the case. In fact, the researchers were clear in saying just that:
“The study design is cross-sectional and correlative, which limits assigning causality in the findings.”
Still, people want simple answers, especially when it comes to complicated health care. And that's why social media has become such a key part of most institutions' communications programs. Social media (Facebook, Twitter, Instagram, etc.) has given a quick and easy voice to the masses. Moreover, every voice has, for the most part, the same value. An anecdote from a disgruntled patient can carry the same weight as the advice of a knowledgeable expert. Social media has leveled the playing field for all.
And as liberating as that might seem, where health care is concerned, all commentary isn’t equal.
A cursory, and admittedly unscientific, review of a half-dozen medical centers' Facebook pages shows the shortcomings of social media evaluations. True, Facebook will tally commenters' rankings and post the composite score for viewers (Massachusetts General got a 4.6), but people have to choose 1 through 5 stars to be allowed to comment. Viewers who said they wanted to offer a zero couldn't comment.
There are risks with assigning too much value to social media activity. It does offer institutions the chance to open a dialogue with consumers. But in the case of that cursory review I mentioned, commenters whose messages were negative toward the institution invariably got boilerplate responses, hardly the kind of feedback capable of changing attitudes.
Lastly, human nature plays a large role in social media. People who may have had a good or bad experience at an institution often don't take the time and effort to share that on a Facebook page, so their experiences aren't factored in. Commenters are always self-selected, adding a certain bias to any conclusions drawn from social media data.
In this case, the Mass General researchers seemed to include the necessary caveats in this study. They cited its limitations and were wary of extrapolating their findings too far. The same can't be said about the news release that the hospital issued about the study. It compounded the other problems.
To begin with, there's the headline on the release: "Hospital ratings on social media appear to reflect quality of care." I think that headline goes too far, appearing to reflect a causal link that actually hasn't been established.
The release also quotes the lead researcher saying, "While we can't say conclusively that social media ratings are fully representative of the actual quality of care, this research adds support to the idea that social media has quantitative value in assessing the areas of patient satisfaction . . . and other quality outcomes."
But research that "adds support" to an idea may not be information that will aid someone's informed decision making.
News releases like this should reflect more of the caveats and limitations of the published journal article.
Note from Publisher Gary Schwitzer: We’ve recently introduced our new feature of systematic, criteria-driven reviews of health care news releases. Look for our database of such reviews to grow soon.