A tale of two (BMJ) studies: one gets attention, the other gets neglected. Why?

Michael Joyce produces multimedia at HealthNewsReview.org and tweets as @mlmjoyce

Within the past couple of months the BMJ published two separate observational studies looking at how two very different lifestyle factors might impact memory and dementia.

Both studies draw from the same group of research subjects: the Whitehall II cohort, which has followed roughly 10,000 British civil service workers for the past 30 years or so. Here's what the studies investigated:

Is moderate alcohol consumption a risk factor for, or protective against, cognitive decline?

Does physical activity protect against dementia?

This captured our attention for two reasons.

Do positive findings trump negative ones?

First, the alcohol study – which found that moderate alcohol consumption was linked to some tissue wasting in one of the key memory centers of the brain (the hippocampus) – was widely covered by the media. But the exercise study – which didn't find any association between physical activity level and cognitive decline – received no mainstream coverage we could find.

Could it be that the alcohol study – with its "positive finding" (which some people might erroneously equate with showing cause and effect) – is somehow more attractive, dramatic, or reportable than the physical activity study, with its "negative finding" (which some people might erroneously equate with "finding nothing")?

In other words, is it possible that a headline which suggests moderate drinking is harmful to the brain is more click-worthy than one which can “only” claim physical activity level has no effect on memory? There were dozens of headlines featuring the alcohol study that certainly reinforce that notion. Here’s a sampling:

Newsweek: Alcohol & Brain Damage: Moderate Drinking Linked to Cognitive Decline

CBS News: Even moderate drinking could harm the brain

The Guardian: Even moderate drinking can damage the brain, claim researchers

Japan Times: Moderate drinking linked to brain damage: study

This is as good a point as any to raise the significant limitations of both these studies. Because both studies are observational, they cannot make statements regarding cause and effect. Likewise, both rely on self-reporting (of drinking and exercise patterns), which is notoriously unreliable.

Furthermore, the Whitehall II cohort isn't exactly representative of the public at large because it skews toward well-educated, middle-class white men.

Can news releases dictate conventional wisdom?

What I haven’t told you yet — and it could certainly affect the disproportionate media coverage mentioned above — is that the BMJ issued a news release for the alcohol study but not the activity study.

Why is that?

A Twitter reaction to the physical activity & cognitive decline study in BMJ

If it’s for the same reasons noted above (i.e. that “positive” results are more attractive than “negative”) then it raises compelling questions regarding how media coverage — and therefore public opinion, conventional wisdom, and even public policy — can be dictated, to some extent, right from story inception.

Not just which studies get published, but which studies get released for general public consumption.

Dr. José Merino is the US research editor for The BMJ. In my email exchanges with him, he quoted an email from his PR manager, who wrote that the journal sends out news releases based on the following:

“Press releases are designed to generate news, and content is selected on the basis of its news potential. We make decisions based on what we think will be of interest to journalists – who will hopefully want to cover the story for print, broadcast, or online.”

I asked Merino for his take on why some studies generate news releases, others don’t, and how this impacts media coverage.

“Perhaps [in this case] it’s related to the topic of the paper. Is alcohol more interesting than exercise? Studies more likely to be picked up by the media are those that find an association between exposure and outcome (‘positive’ studies), as well as those that have a press release. These are probably linked because it is possible that journals may be more likely to issue a press release when a study finds an association.

In this case it’s hard to know whether a press release would have meant that the exercise paper would have been picked up by more outlets, but we expect it would have. This may represent another aspect of positive publication bias: we publish a ‘negative’ study but don’t press release it, and even if we had done so, it’s possible that the press would not have picked it up, or would have picked it up less than the ‘positive’ one.”

Merino acknowledges the possibility that both publishers and readers may have an unconscious — and even conscious — bias against “negative” studies, and thinks it would make for a compelling study.

Why This Matters

  • News releases are powerful – perhaps increasingly so, as time-pressured journalists count on them for story ideas and may use their content, sometimes exclusively, for background information and even quotes. News releases therefore often function as both gateways and gatekeepers for the stories the public has access to.
  • So-called “negative” results are just as important as positive ones, yet we often don’t hear about them. People make decisions about their health based on what they read in the news; if that news is slanted toward positive findings, the public doesn’t have a solid foundation for making good choices.
  • Using clickability as a determinant of newsworthiness may sell ads, but it shortchanges informed public discourse. Even worse, it can misguide and cause real harm.

I’m certainly not accusing the BMJ of intentionally choosing to highlight just positive findings. Nor am I saying the news coverage was universally superficial.

What I am pointing out is a flow of information from scientific studies to public consumption that can be modified or disrupted at several key points along the way – from the medical journals that publish studies to the headlines most of us read. 

And I think it should give us all pause to realize that the person who controls the faucet is often a PR manager – not a scientist or health care professional, but someone motivated primarily by the “news potential” of a study, and not necessarily the overall health of the public.

And I think it is imperative to bear in mind that these editorial choices ultimately do influence the health care choices made by real people.

Comments (2)

David Littleboy

August 1, 2017 at 1:50 am

Sorry to be argumentative here, but I’m _not_ surprised the alcohol study got a lot of attention. A few years ago, there were lots of reports/articles claiming that moderate alcohol consumption was “better” than zero, and moderate alcohol consumption was at least at one point recommended over zero by the medical community. I thought that this was probably wrong (non-drinkers include people who don’t drink for medical reasons and skew the data), and it’s interesting that there are beginning to be studies that show that.

Since moderate drinking was recommended, this is an important study.

    Kevin Lomangino

    August 1, 2017 at 8:02 am

    Thanks for your comment David. I don’t think the post is arguing that the alcohol study didn’t deserve attention. It’s merely asking/wondering why that study was so widely covered while the exercise study received zero coverage. There has certainly been much news coverage suggesting that exercise is protective against dementia and this study contradicts that finding. For that reason, isn’t it an important study that also merited coverage?

    Kevin Lomangino
    Managing Editor