The following post is by Joy Victory, who joined HealthNewsReview.org in March as deputy managing editor. She tweets as @thejoyvictory.
A recent 20-page policy report from the Environmental Working Group included alarming news: According to a study they conducted, “nearly three in 10 of the women had more mercury in their bodies than the EPA says is safe,” and rates were highest among women who ate seafood frequently.
Based on this, they issued a news release with the alarming headline “U.S. Seafood Advice Could Expose Women And Babies To Too Much Mercury, Not Enough Healthy Fats.”
That sounds like important–and clicky–news, and the media acted accordingly: At least a dozen different news outlets wrote about the study, including high-profile publications like The Washington Post (“Why it’s still so hard to eat fish and avoid mercury”), TIME Magazine (“Canned Tuna Is Too High In Mercury for Pregnant Women: Health Group”), and CNN (“Study of mercury in fish brings call to strengthen government guidelines”).
The seafood industry trade group National Fisheries Institute caught wind of the report and resulting news coverage, and fired back big time with a news release, “Mercury ‘Study’ Out of Step with Real Science.” They didn’t stop there, directing their ire specifically at TIME Magazine, asking, “Seriously….what is wrong with TIME Magazine?”
While it’s debatable whether this miffed tone helps or hurts the trade organization’s public relations effort, NFI does have a point: The news coverage, in general, could have been stronger.
Before we get into what journalists could have done differently, we do want to stress that EWG’s study conclusions–that mercury contamination in fish is more widespread than government agencies acknowledge–very well may be true. It’s just that their report doesn’t prove this, certainly not on its own.
As HealthNewsReview.org contributor and medical research communications expert Joann Rodgers says of EWG’s data, “there’s smoke, but there’s certainly no smoking gun.”
The first thing to keep in mind is that this was a self-published study, not peer reviewed and published in a journal. While peer review is far from an infallible process, it does set the bar higher and allow independent experts to weigh in and question the work before it’s published.
For this reason, it’s important to indicate if and when a study was published, and where, as well as to note any potential conflicts of interest of the researchers. Yet these stories didn’t make clear that this was a self-published study not vetted by outside experts.
But perhaps the bigger lapse was how the news coverage mischaracterized EWG’s study as more conclusive than it really was, while also leaving out important limitations.
In our reviews of health news reports, we expect journalists to include the actual data (not just broad summaries), and to assess the quality of the evidence by briefly detailing what the study’s data collection method actually was, how it fits into the levels of evidence hierarchy and what limitations should be kept in mind.
By leaving out key study details, The Washington Post left readers in the dark. There were few specifics about the study design in the Post report–only that it was “a study of more than 250 women of childbearing age who ate approximately the amount of seafood recommended by the federal guidelines.”
The Post then goes right on to explain the study results, not exploring the data collection methods at all. Later on, the story does include quotes from an independent expert who points out limitations of the study, which is important. Yet it ends up making for a confusing read, since we don’t know how the data was obtained.
Meanwhile, TIME Magazine was slightly more specific, stating “EWG asked 254 women …to record their seafood consumption and submit hair samples for mercury testing.”
And, CNN was similar, with: “The EWG tested hair samples from 254 women of childbearing age from 40 states who reported eating as much or slightly more fish than the government recommendations over a period of two months.”
These details at least let readers know this was self-reported data, a limitation of the study. But we do wish the news stories had hit harder on that: These women weren’t monitored to make sure they were eating the seafood they said they were, or even told to record it in a food diary as they ate it.
Instead, they were asked–once–to recall how many times they ate seafood over the past two months, and to state what kinds of fish they ate.
“That information should not masquerade as data about what these folks actually ate,” notes HealthNewsReview.org contributor Sharon Dunwoody, a journalism professor emeritus at the University of Wisconsin. “It is unlikely that these people could actually remember with that level of specificity.”
Studies designed this way can be helpful in detecting possible relationships, she noted, but they’re not meant to determine cause-and-effect.
And, importantly, a study of this size shouldn’t be used to extrapolate to all pregnant women in the U.S.
None of the news outlets noted this. Instead, they used broad, conclusive-sounding statements like “Women who frequently eat fish have 11 times higher mercury levels than those who rarely eat it” (CNN) and “eating fish the way the government recommends is exposing people, especially pregnant women, to unsafe levels of mercury.” (TIME)
But the reality is a lot less sexy: Researchers at an environmental advocacy group self-published a small, non-peer-reviewed study that showed seafood consumption might have been linked with higher mercury levels among study participants.
That leaves us with non-news about something we already know: Seafood consumption might increase mercury levels.
There’s certainly a strong argument to be made for passing on this one. Not every study is newsworthy, especially if it doesn’t hit some basic proof points (peer reviewed and published for starters; bonus points if it’s randomized and controlled and enrolled many people).
This is one that may well point to a wealth of broader story ideas–how to eat seafood as safely as possible, weighing the many dietary trade-offs of health risks and benefits during pregnancy–but doesn’t deserve headlines in and of itself.
But this assumes reporters can take a step back and look at the bigger picture. On deadline, this is hard to do, when the content beast needs feeding. Our request: Proceed with caution, and avoid using language that lends an air of false certainty to research that is far from certain.
Or, as Rodgers puts it, “the study lends some moderate evidence” to the issue that seafood dietary guidelines need to be updated.
“The real ‘lede’ here is that the EWG survey suggests–but does not absolutely prove–that the ‘do not eat’ lists need to be adjusted and expanded.”