Kevin Lomangino is the managing editor of HealthNewsReview.org. He tweets as @KLomangino.
When advocates want to emphasize the human impact of a deadly disease or a treatment that might cure it, they sometimes draw an analogy to jumbo jets falling out of the sky. It happened recently in this NPR story about a potential new treatment for sepsis, which the story says kills about 300,000 people each year (estimates vary widely depending on the data sources used).
“So that’s the equivalent of three jumbo jets crashing every single day,” said Paul Marik, MD, an ICU physician quoted in the story.
I recall another time the analogy came up when I was reporting on public health recommendations related to salt intake. An American Medical Association official said that deaths due to excess sodium exceeded 150,000 per year.
“This is the equivalent of a jumbo jet with 400 people on it crashing every day,” said Dr. Steven Havas, then the AMA’s vice president of public health, in the Washington Post.
People dying of sepsis or heart disease don’t have much in common with plane crash victims. Hurtling to earth in a blaze of fire is an exceedingly rare and spectacular event – precisely the reason we can’t look away from such catastrophes.
Sepsis, by contrast, is common. And while it can occur suddenly in almost anyone who develops an infection, it most often occurs in the hospital among older patients or those with compromised immune systems and serious health problems. In many cases it’s the end of a long, slow descent rather than a sudden plummet from the sky.
And salt? There is no conclusive evidence that lowering salt intake would prevent anyone from dying, and so those planeloads of victims simply can’t be said to exist. They are phantom blips on a radar screen tracking a flight that never left the ground.
Risk communication experts I spoke with panned the plane crash analogy and advised journalists to avoid it.
“I don’t think [planes crashing] is a particularly good analogy,” says Steven Woloshin, MD, Professor of Medicine at the Dartmouth Institute for Health Policy and Clinical Practice and author of Know Your Chances, a book about understanding medical risks.
“Good risk messages communicate both magnitude and context,” he says. But the crash analogy delivers only on the magnitude side of the equation. “If you’re a journalist writing a story, instead of looking at plane crashes, you’d want to compare it with your chances of dying from something similar like cancer or heart disease.”
Mirjam Jenny, PhD, Head Research Scientist at the Harding Center for Risk Literacy, seconded Woloshin’s call for context. She said it was useful to compare risks for diseases with other causes of death that people are familiar with. However, she said the crash analogy didn’t serve this purpose because daily plane crashes simply don’t happen.
“In the ‘falling planes format,’ real risks (sepsis killing 300,000 people each year) are compared with completely made up risks (3 jumbo jets crashing every day),” she said. “Planes crash, of course, but at a very different rate, which makes the risk comparison conceptually useless and confusing at best and manipulative at worst. In my view, comparisons of different risks can provide the reader with useful context, but only if all risks in the comparison are real.”
She offered examples of real comparisons that could put large numbers — which are difficult for people to grasp — in context. For example: “To put this into perspective, roughly 11 out of 100 people who died in 2015 in the US, died of sepsis.” Or, “that is roughly 3 times the number of people that can be seated in the Texas Longhorns football stadium.”
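The arithmetic behind Jenny’s examples is easy to check. The sketch below reproduces her two comparisons, using the CDC/NCHS count of about 2.71 million total U.S. deaths in 2015 and an assumed stadium capacity of roughly 100,000 seats (both figures are approximations supplied here for illustration, not taken from her quote):

```python
# Rough arithmetic behind the two "real comparison" examples.
# Assumptions: ~2,712,630 total US deaths in 2015 (CDC/NCHS) and
# a stadium capacity of roughly 100,000 seats.

sepsis_deaths = 300_000            # annual US estimate cited in the NPR story
total_us_deaths_2015 = 2_712_630   # CDC/NCHS count for 2015
stadium_capacity = 100_000         # approximate Texas Longhorns stadium seating

per_100 = round(100 * sepsis_deaths / total_us_deaths_2015)
stadiums = sepsis_deaths / stadium_capacity

print(f"About {per_100} out of 100 US deaths in 2015 were attributed to sepsis.")
print(f"That is roughly {stadiums:.0f} times the stadium's seating capacity.")
```

Both comparisons hold up: about 11 deaths per 100, and about 3 stadiums’ worth of people — which is exactly the point. A reader can verify a real comparison; no amount of arithmetic can verify a fleet of imaginary crashing jets.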
Comparing medical deaths to plane crashes is an attempt to overcome your cognitive defenses. It’s designed to shock you into paying attention to something you wouldn’t ordinarily care about. It’s an appeal to fear and emotion.
The intent may sound noble – “Do something about sepsis!” – but in practice I find that these appeals are a prelude to oversimplified solutions. Having accepted that the problem is like jets crashing all around us, we are primed to accept whatever answer is being proposed to stop the carnage. Our decision-making is impaired.
In the case of sepsis, the quoted researcher was publicizing preliminary evidence showing that vitamin C may be an effective treatment. But instead of carefully calibrating that message to fit the early stage of the research, he trumpeted the treatment as a potential “cure” in a news release that received worldwide media attention.
The benefits of sodium reduction initiatives are similarly exaggerated. While it’s certain that cutting sodium would reduce population-wide blood pressures, it’s not clear that desalinating the food supply would yield the promised lifesaving benefits. Gold-standard clinical trials haven’t shown that cutting sodium prevents the outcomes that people care about like heart attacks and death (few such studies have been done). Some observational studies show that reducing sodium intake to levels recommended by public health groups is associated with adverse effects.
The plane crash analogy also has been extensively used to call attention to deaths caused by medical errors. One recent study found that medical errors contribute to the deaths of 250,000 people each year, making it the third leading cause of death in the United States.
Vox recently described the tally from a different study as “the equivalent of nearly 10 jumbo jets crashing every week — or the entire population of Birmingham, Alabama dying every year.”
Once again, readers should tread carefully and question whether the numbers here really add up.
The eye-popping 250,000 figure is extrapolated from data that includes just 35 deaths in all – hardly a representative sample from which to draw conclusions about the entire U.S. population.
Moreover, assigning blame for causes of death is a notoriously tricky business. Medical errors occur more frequently in older, sicker individuals who are receiving care that’s more complicated and more prone to mistakes. When an error occurs in someone who’s already on death’s door, is it fair to assign 100% of the blame for that death to a medical mistake? How much of the death is due to the error and how much to the underlying condition?
As a starting point, stories addressing this issue could make it clearer to whom the increased risk applies rather than simply passing along a massive estimate of deaths (the true size of which is vigorously disputed).
This doesn’t excuse medical errors or suggest that errors shouldn’t be fully investigated and addressed — no matter how old or sick the patient is or how close to death he or she might be.
Nor is it a knock on patient safety groups and watchdog journalists who are pressing for systemic reforms in this area to reduce medical mistakes. Their work is vitally important.
It’s a call for everyone to exercise care when communicating such big numbers to the public — and in choosing the imagery that accompanies the statistics.
There is potential for unintended consequences when we equate hospitals with crashing jetliners.
Might some people frightened by such messages decide to avoid medical care altogether and die unnecessarily from treatable conditions?
The answer is, we don’t know. And that brings up another concern that Woloshin raised with respect to the plane crash message: He doubts it’s ever been tested to see how people interpret it.
“Maybe it works and they hear it as meaning a big number – but maybe it introduces cognitive dissonance, as in plane crashes are really rare, so maybe this isn’t worth paying attention to?”
Most reporters, of course, are never going to test the messages they use in their stories. But they can follow best practices for messaging that are based on risk communication research.
Here’s a crash course.