News media sound the alarm on mercury in seafood during pregnancy — was it a false alarm?

The following post is by Joy Victory, who joined in March as deputy managing editor. She tweets as @thejoyvictory.

A recent 20-page policy report from the Environmental Working Group included alarming news: According to a study they conducted, “nearly three in 10 of the women had more mercury in their bodies than the EPA says is safe,” and rates were highest among women who ate seafood frequently.

Based on this, they issued a news release with the alarming headline of “U.S. Seafood Advice Could Expose Women And Babies To Too Much Mercury, Not Enough Healthy Fats.”

That sounds like important–and clicky–news, and the media acted accordingly: At least a dozen news outlets wrote about the study, including high-profile publications like The Washington Post (“Why it’s still so hard to eat fish and avoid mercury”), TIME Magazine (“Canned Tuna Is Too High In Mercury for Pregnant Women: Health Group”) and CNN (“Study of mercury in fish brings call to strengthen government guidelines”).

Seafood industry fires back with a hostile response that, well, partially made sense

The seafood industry trade group National Fisheries Institute caught wind of the report and resulting news coverage, and fired back big time with a news release, “Mercury ‘Study’ Out of Step with Real Science.” They didn’t stop there, aiming their ire specifically at TIME Magazine and asking, “Seriously….what is wrong with TIME Magazine?”

While it’s debatable whether this miffed tone helps or hurts the trade organization’s public relations effort, NFI does have a point: The news coverage, in general, could have been stronger.

Before we get into what journalists could have done differently, we do want to stress that EWG’s study conclusions–that mercury contamination in fish is more widespread than government agencies acknowledge–very well may be true. It’s just that their report doesn’t prove this, certainly not on its own.

As contributor and medical research communications expert Joann Rodgers says of EWG’s data, “there’s smoke, but there’s certainly no smoking gun.”

What journalists missed

The first thing to keep in mind is that this was a self-published study, not peer reviewed and published in a journal. While peer review is far from an infallible process, it does set the bar higher and allow independent experts to weigh in and question the work before it’s published.

For this reason, it’s important to indicate if, when and where a study was published, as well as to note any potential conflicts of interest among the researchers. Yet these stories didn’t make clear that this was a self-published study not vetted by outside experts.

But, perhaps the bigger lapse was how the news coverage mischaracterized EWG’s study as more conclusive than it really was, while also leaving out important limitations.

In our reviews of health news reports, we expect journalists to include the actual data (not just broad summaries), and to assess the quality of the evidence by briefly detailing what the study’s data collection method actually was, how it fits into the levels of evidence hierarchy and what limitations should be kept in mind.

By leaving out key study details, The Washington Post left readers in the dark. There were few specifics about the study design in the Post report–only that it was “a study of more than 250 women of childbearing age who ate approximately the amount of seafood recommended by the federal guidelines.”

The Post then goes right on to explain the study results, without exploring the data collection methods at all. Later on, the story does include quotes from an independent expert who points out limitations of the study, which is important. Yet it ends up making for a confusing read, since we don’t know how the data were obtained.

Meanwhile, TIME Magazine was slightly more specific, stating “EWG asked 254 women …to record their seafood consumption and submit hair samples for mercury testing.”

And, CNN was similar, with: “The EWG tested hair samples from 254 women of childbearing age from 40 states who reported eating as much or slightly more fish than the government recommendations over a period of two months.”  

The pitfalls of self-reported data

These details at least let readers know this was self-reported data, a limitation of the study. But we do wish the news stories had hit harder on that: These women weren’t monitored to make sure they were eating the seafood they said they were, or even told to record it in a food diary as they ate it.

Instead, they were asked–once–to recall how many times they ate seafood over the past two months, and to state what kinds of fish they ate.

“That information should not masquerade as data about what these folks actually ate,” notes contributor Sharon Dunwoody, a journalism professor emerita at the University of Wisconsin. “It is unlikely that these people could actually remember with that level of specificity.”

Studies designed this way can be helpful in detecting possible relationships, she noted, but they’re not meant to determine cause-and-effect.

And, importantly, a study of this size shouldn’t be used to extrapolate to all pregnant women in the U.S.

None of the news outlets noted this. Instead, they used broad, conclusive-sounding statements like “Women who frequently eat fish have 11 times higher mercury levels than those who rarely eat it” (CNN) and “eating fish the way the government recommends is exposing people, especially pregnant women, to unsafe levels of mercury.” (TIME)

But the reality is a lot less sexy: Researchers at an environmental advocacy group self-published a small, non-peer-reviewed study showing that seafood consumption might be linked with higher mercury levels among study participants.

That leaves us with non-news about something we already know: Seafood consumption might increase mercury levels.

So what should journalists have done?

There’s certainly a strong argument to be made for passing on this one. Not every study is newsworthy, especially if it doesn’t hit some basic proof points (peer reviewed and published for starters; bonus points if it’s randomized and controlled and enrolled many people).

This is one that may well point to a wealth of broader story ideas–how to eat seafood as safely as possible, weighing the many dietary trade-offs of health risks and benefits during pregnancy–but doesn’t deserve headlines in and of itself.

But this assumes reporters can take a step back and look at the bigger picture. On deadline, this is hard to do, when the content beast needs feeding. Our request: Proceed with caution, and avoid using language that lends an air of false certainty to research that is far from certain.

Or, as Rodgers puts it, “the study lends some moderate evidence” to the issue that seafood dietary guidelines need to be updated.

“The real ‘lede’ here is that the EWG survey suggests–but does not absolutely prove–that the ‘do not eat’ lists need to be adjusted and expanded.”

Comments (2)


Ned Groth

May 4, 2016 at 4:59 pm

The EWG study certainly has limitations and journalists could always do a better job of looking critically at findings. That said, the EWG study, while limited and relatively small in scope, is in fact quite powerful evidence and supports EWG’s conclusions very strongly. Unfortunately ALL studies that try to link health outcomes with dietary patterns suffer from the problems of self reporting and recall. That applies equally to studies showing benefits of fish consumption that the NFI so fondly cites. EWG addresses many of the issues in their methodological appendix, and their approach is both reasonable and defensible.

Their study shows what they say it shows–essentially, if women follow the current draft FDA/EPA advice, eat 8-12 ounces of fish weekly and have no detailed guidance about omega-3 and mercury content, they get far too little omega-3s, and 30 percent of them exceed the outdated guidelines for excessive mercury exposure. Those guidelines were based on a 1997 study; in the past 20 years about 15 well-designed epidemiological studies have associated adverse effects with doses right around the “reference” levels, which when adopted were thought to provide a 10-fold safety margin. I.e., current (peer-reviewed) evidence suggests that the 30 percent of EWG’s subjects who exceeded those guidelines were not just above some arbitrary safety limit, they were at risk for actual harmful effects.
I am an environmental health scientist, and I was aware of this study and advised EWG to publish it in a peer-reviewed journal. The study definitely is of publishable quality and their finding makes scientific sense: many women who eat a lot of fish get too much mercury, and need sharper advice to help them reduce their exposure, while they continue to eat fish for its benefits. Although the value of peer review is indisputable, the trade-off is delay, and FDA/EPA are in the process of trying to finalize a draft advisory. EWG chose to self-publish in part because their results are timely and EWG wanted them to be considered in that decision-making process.
Health News Review certainly has the right to criticize individual studies and the media for not doing a better job. But don’t throw the baby out with the bath water. Before using the NFI’s reaction as a jumping-off point, it would serve you well to read the literature–starting with those epidemiological studies I mentioned. The EWG study is, unfortunately, right on the money and absolutely in line with recent peer-reviewed research.

    Joy Victory

    May 5, 2016 at 11:49 am

    Ned, thanks so much for weighing in.

    You stated that the study was “limited and relatively small in scope” and yet “quite powerful evidence and supports EWG’s conclusions very strongly.”

    Perhaps the latter is true, but we think the media coverage could have said more (or even anything) about the former.

    As for your suggestion that we “read the literature” before commenting on the quality of news coverage: Keep in mind that very few news consumers are going to do that–they’re going to depend on the story to provide the appropriate context. And in this case, the necessary context was lacking in the stories we evaluated–which was the point of the post.

    We’re not making any definitive statements about the underlying science or whether U.S. seafood consumption guidelines should be changed.