What does the public really know about patient safety at US hospitals? Not much

Trudy Lieberman is a veteran health care journalist and regular blogger for HealthNewsReview.org. She often writes about consumer cost and safety issues. She tweets as @Trudy_Lieberman.

Nick Budnick, a reporter who knows his way around Oregon’s healthcare scene, just wrote a chilling tale in the Portland Tribune about a 52-year-old patient who died in 2015 after 12 teeth were pulled at Oregon Health & Science University’s dental surgery clinic. The man’s liver had trouble producing the clotting agents the body needs to stop bleeding. His infected teeth needed to be pulled in order for him to qualify for a liver transplant. However, instead of extracting a couple of teeth and hospitalizing him overnight to make sure his blood clotted properly, the university pulled 12 teeth at once, observed the man for an hour, and sent him home, a three-and-a-half-hour drive from the hospital.

“That series of decisions may have led” to the man’s death, judging by interviews with several doctors experienced with liver patients and oral surgery, Budnick reported. One hematologist said “the pre-surgery test results indicated red flags that were ‘clear.’”

A case that raises larger questions

Just as bad for the hospital was its attempt to pin the medical mistakes on David Lambert, an assistant professor of oral surgery.  Lambert told Budnick he never saw the patient or evaluated him. “I had absolutely nothing to do with this guy’s treatment. They were trying to hang this on me. They tried everything they could.” The Oregon Board of Dentistry agreed with Lambert and took enforcement action against his boss, Dr. Pamela Hughes, chair of the department of oral surgery. This January the Board proposed a consent decree with Hughes saying that her treatment plan for the man constituted unacceptable patient care. The dead man’s sister has sued the hospital along with Hughes and the surgeon-in-training on duty that day. The sordid details of the case are now trickling out.

Why are those details so important? As Budnick explained, they raise larger questions about how OHSU “supervises trainees, tracks internal records and handles vulnerable group[s] of patients.”

The revelations about what went wrong at the Oregon hospital are troubling, especially coming almost 20 years after publication of the landmark Institute of Medicine report, “To Err Is Human.” That report propelled the issue of hospital safety into the media and elevated public awareness that hospitals can — and do — kill or injure their patients. Mistakes continue, but media coverage is sporadic. The topic typically makes news when someone dies, a lawsuit is filed, a whistleblower steps forward, or the hospital itself warns the public, a rare occurrence.

Circling the wagons and keeping the public in the dark

What do patients and their families really know about the workings of a hospital that could potentially hurt them? The answer is not much. That’s ironic given all the talk about turning healthcare into a marketplace where patients can shop for price and quality like they shop for computers. They cannot do that when it comes to hospitals. For the most part, these institutions have circled the wagons to make sure that the public is kept in the dark when mistakes are made.

Data that could provide a better picture of the care provided at hospitals remain under wraps at some agencies involved with hospital safety. After years of prodding from the Association of Health Care Journalists (AHCJ), the U.S. Centers for Medicare and Medicaid Services (CMS) released data from state hospital inspection reports that can be accessed on a website maintained by AHCJ. However, there are significant limitations to those data, and many reports that could be shared are not. Furthermore, the Joint Commission, which is private and the largest accrediting agency, does not release its survey reports or any details of its inspections. The public can search for a hospital’s accreditation status online, and they should be skeptical of any facility that is not fully accredited.

Insider tip exposes outbreak at UC Irvine

The Los Angeles Times broke another frightening safety story focused on the UC Irvine Medical Center in April. Melody Petersen reported that 10 critically ill infants in the hospital’s neonatal intensive care unit were infected with MRSA, a lethal bacterium especially dangerous for premature babies. The outbreak began in August, but the Orange County Health Department learned about the infection in December and chose not to notify the public. Health officials claimed they had no evidence the infants being treated in the neonatal unit were at higher risk than babies admitted elsewhere. Marian Hollingsworth, a member of a state advisory committee on hospital-acquired infections who alerted the paper, said it appeared the hospital and government officials were quietly trying to handle the problem internally.

Petersen also reported the hospital said it had been disclosing the outbreak in a letter to parents of all babies in intensive care and was working aggressively to prevent more infections using measures that “meet or exceed best industry practices.” But a few days later, Petersen reported, a mother she had interviewed said the hospital did not warn her that infants in the NICU were being treated for MRSA. The mother, whose baby also became ill with the infection, pieced together the evidence and found that the hospital had moved a baby already infected into a room next to her son. “That baby should never have moved next door,” she told the Times. “To learn now they knew this had been spreading since August – why would you do that?” In May another Petersen story raised more questions about the hospital’s practices. It featured the hospital’s infection control expert, whose plan for keeping infections under control had not worked.

Petersen’s account, added to the Oregon tale and many similar stories, raises a further question: What responsibility do the news media have for keeping patients safe? Thinking that it was time once again to get answers, I turned to two of the best experts on the subject – Ashish Jha, MD, a practicing physician and professor of health policy at the Harvard T. H. Chan School of Public Health, and Lisa McGiffert, who heads the Safe Patient Project at Consumers Union.

‘The bad ones get off scot-free’

Ashish Jha, MD

Jha said, “As for openness and transparency, we’ve made a little bit of progress mostly with small stuff, but the really big things in patient safety have not been addressed effectively.” For one thing, he says, patient safety is not “super simple,” and no one wants accountability.  For another, something called “surveillance bias” casts doubt on whether a hospital is good or bad. Hospitals that are diligent and look for problems find them. Those are the ones that get penalized while “the bad ones get off scot-free.”  One solution is to move away from data based on insurance claims, Jha says. Such data may indicate there are no serious problems, but it’s possible a complex problem shows up three weeks later, after the claims are evaluated by researchers, and never gets measured. “If I wanted to go to the hospital in New York City with the lowest risk of safety problems and medical errors, I would have no idea,” he said. “I could triangulate and make a rough guess, but this is where we are 18 years after the IOM report came out.”

McGiffert’s take was similar. Some hospitals are doing a good job, but too many of them are still falling behind, she told me. Many are reluctant to devote the resources they need to control infections and prevent medical errors, which McGiffert says may be getting too little attention. But there’s another problem, which stems from the all-too-typical way we write hospital safety stories. At too many outlets it’s a once-a-year story usually based on scores released by Medicare, which collects data on most of the nation’s hospitals. If a hospital hasn’t done well on preventing central line or urinary tract infections, for example, do we look further to tell us why? Most of the time we don’t.

Data may offer clues to safety problems, but who’s looking?

Lisa McGiffert

McGiffert looked more closely at infection rates at California hospitals and found that the UC Irvine hospital flagged by the LA Times had infection rates significantly higher than those of other hospitals six times over the past three years, across various types of infections. For example, the hospital had significantly higher rates of Clostridium difficile, a nasty gastrointestinal infection, in each of those three years.

Reporting on hospital safety is almost as hard as carrying it out in practice. In my view, the best recent examples of hospital safety reporting are the stories by Karen Bouffard and Joel Kurth of the Detroit News last summer exposing serious problems with dirty surgical instruments at hospitals that were part of Detroit Medical Center. The reporters got a tip and then dug until they learned problems with dirty instruments had been going on for 11 years before the News exposé appeared.

The story was tough to do, especially because the “whole medical establishment is so afraid of this issue,” Bouffard said. “Nobody wants to be the hospital that comes up on Google for having dirty instruments.” None of the medical systems in the state — except for the VA — would let the reporters tour their facilities to discuss sterilization procedures. “Detroit Medical Center went into black-out mode,” Bouffard told me, “but more recently they’ve agreed to talk to us and let us in for a tour. They’ve fixed their processes and spent $1.6 million to buy more equipment.”

All of which helps answer my question about the role of the news media in keeping the public safe from hospital infections and mistakes. It’s simply indispensable.

Comments (1)

David Lambert

July 6, 2017 at 11:40 am

In the instance of the liver failure patient, this was less about a paucity of institutional safety than about practitioner error. What is significant, however, is that the institution – at the behest of risk management – attempted to displace the blame. It is a prime example of why many believe that risk management does not exist to protect clinicians but rather to protect the administrative hierarchy.

I fear the example made in my case will become increasingly familiar as clinicians have lost their administrative power and are pushed to do more with less. This helps create the rarified environment where unsuspecting clinicians can easily become marginalized – all the more so as pay for performance takes hold.