Health News Review

The following is a guest post by Kevin Lomangino, one of our story reviewers on HealthNewsReview.org. He is an independent medical journalist and editor who is currently Editor-in-Chief of Clinical Nutrition Insight, a monthly evidence-based newsletter which reviews the scientific literature on nutrition for physicians and dietitians. He tweets as @Klomangino.

——————————————————————————-

Misreporting of Observational Studies: Can Scientific Journals Help?

Readers of this blog know that there can be a huge gap between what scientific evidence tells us and what gets reported to the public by the media. And perhaps nowhere does this chasm yawn as widely as it does in the field of nutrition research, with its steady stream of observational studies linking particular foods and nutrients with risk of various diseases.

Even though they are not designed to prove cause and effect, these studies inevitably get reported with overblown headlines about how coffee, for example, “reduces cancer risk” (or maybe it “increases” risk?) and olive oil “cuts stroke risk.”

Gary has discussed the many reasons why journalists tend to sensationalize these kinds of studies, including the lack of time, space, and training needed to report more thoroughly. An important, but less appreciated, factor is the influence of press offices at scientific journals and at academic institutions. In an effort to increase publicity for their authors and organizations, they sometimes encourage journalists to pump up these studies beyond what the evidence warrants.


Consider a press release put out last week by the BMJ – British Medical Journal – with the following headline: “It’s official – chocolate linked to heart health.”

Intrigued by the headline (as the BMJ no doubt hoped I would be), I searched the release for an indication that some prestigious independent body – the Institute of Medicine? a World Health Organization expert committee? – had come together to evaluate the evidence on chocolate’s cardiovascular effects. As unlikely as I found that prospect, I recognized that it could justify an “official” declaration of an association between chocolate and heart disease outcomes.

But no: the “official” designation was apparently bestowed by a headline writer in the BMJ press office. It was prompted by the publication of a study that found a correlation between high levels of chocolate intake and lower risk of cardiovascular disease and stroke.

As with all observational studies (or, in this case, a meta-analysis that pooled data from 7 observational studies), the study had many important limitations that were well catalogued by the authors in their paper. They note that “the available literature on this topic is limited and novel” and that more studies are required “to confirm or refute the results” of the analysis. To its credit, the BMJ referred to some of these limitations high up in the press release, which noted that the results might be explained “by some other unmeasured (confounding) factor” besides chocolate.

But why then trumpet the results in the headline as evidence of an “official” and apparently conclusive link? Journalists might be swayed by such language to give this study much more weight than it deserves.

A related question is whether a scientific organization should be spending any of its limited resources to promote this kind of study to the public. After all, the data are quite preliminary and the underlying premise about chocolate is, in my opinion at least, dubious at best. While there is certainly evidence that chocolate with a high cocoa content can make heart disease risk markers move in the right direction, most of the chocolate sold today has a lower concentration of cocoa solids, but lots of sugar, fat, and calories. Eating more of it seems just as likely to increase cardiovascular risk as to decrease it.

I hope that researchers will one day develop more conclusive evidence that chocolate benefits heart health. In the meantime, isn’t there something more important for one of the world’s top medical journals to tell us about?

Comments

Carolyn Thomas posted on August 29, 2011 at 2:41 pm

Kevin, what are you trying to tell me? The only good thing about surviving my heart attack is the belief that I and all women living with heart disease now cling to: that lusciously dark chocolate (every day!) is what’s keeping us alive these days.
The BMJ knows this, which is why they didn’t trumpet the far less eye-catching but more accurate headline: “Limited Studies of Marginal Quality Suggest That Researchers Really Don’t Know Much About Chocolate’s Impact on Heart Health!” Who would read THAT?
But every woman heart patient I know is quoting the BMJ study – and why not? It IS the prestigious British Medical Journal after all . . .

Susan Fitzgerald posted on August 29, 2011 at 5:48 pm

I never believe these chocolate headlines and I don’t even read the articles. I have to figure they all fall into the “too good to be true” category…sadly. I wouldn’t mind eating a couple ounces of 70% dark chocolate a day (downed with several more ounces of red wine) if that really conferred more benefit than risk!

Deborah posted on August 30, 2011 at 8:28 am

It seems like nutrition science goes one way then the other; caffeine is good, caffeine is bad; chocolate is good, chocolate is bad; alcohol is good, alcohol is bad! With all these conflicting claims, I opt for a dose of moderation in the hopes that I am hitting the middle of the road.

Gary Schwitzer posted on August 30, 2011 at 8:55 am

Deborah,
Thanks for your note.
But don’t be too quick to think that it is the science that flip flops.
These stories are almost always based on observational studies which CANNOT establish cause-and-effect. They can only point to statistical associations. Maybe those associations are valid, maybe not. There may be many other factors at play to explain what is observed.
This is the real harm of news stories that don’t emphasize the inherent limitations of observational studies. It may lead some to think that scientists don’t know what they’re doing when, in fact, it is probably those who are communicating about the science who don’t realize what they are doing.

Tony Long posted on August 30, 2011 at 11:11 am

The unfortunate thing is that most people seem to be looking for things other than the obvious when it comes to their health, such as eating a plant-based diet, and will usually believe headlines such as the one you spoke of, only to find out later that they were not completely accurate.

Earle Holland posted on September 2, 2011 at 12:41 pm

Gary:
Kevin’s points are well-taken but he seems to be taking a broad-brush approach when he says the following: “An important, but less appreciated, factor is the influence of press offices at scientific journals and at academic institutions. In an effort to increase publicity for their authors and organizations, they sometimes encourage journalists to pump up these studies beyond what the evidence warrants.”
In no way am I offering a broad defense of press offices at academic institutions — much less at the journals — but it does seem fair to expect that, at a blog respected for its adherence to what the data say, he would offer something other than anecdotes. Granted, I do know of press offices at some institutions that hype their findings, and I’ve seen examples of journals doing even more just to build buzz around this or that paper, but neither my experiences nor Kevin’s qualify as what we would call data.
More importantly, at least at most big-time research institutions, stories and news releases are always vetted by the principal investigators, who have influence over how the final story reads. Therefore, they carry the largest share of blame when a release’s verbiage exceeds the scope of the actual research. Good institutional science writers don’t try to exaggerate a study’s significance — their job is hard enough just getting the science right while also making it interesting to a broad public.
It has always been convenient to blame the institutional PIO since they rarely have much sway in the power struggle with researchers. But it’s also a myth that university science writers would independently hype a release beyond what the research allows. That would be the quickest way for them to lose their jobs!
Earle Holland
Asst VP for Research Communications
Ohio State University

Kevin Lomangino posted on September 4, 2011 at 2:12 pm

I appreciate the thoughtful comments on this post and wanted to respond specifically to the points made by Earle Holland. I agree with Earle that researchers have a role to play in making sure their findings are conveyed appropriately. Some research suggests that study authors tend to “spin” their data right in the manuscript itself (http://jama.ama-assn.org/content/303/20/2058.full), so I don’t doubt that investigators can push press release language beyond the appropriate limits.
But if that’s what’s happening, then I would say that the communications people in press offices need to find a way to push back and make sure what they distribute to the media is accurate. (This may be easier said than done, but the right path isn’t necessarily an easy one.) In short, there’s plenty of blame to go around when it comes to misleading the public about scientific research findings. And I think everyone involved in the communication of scientific research – including researchers, press/publication information professionals, and journalists – will need to play a part in making things change.
Regarding Earle’s request for data: I admit that I can’t say how “important” the influence of press offices is when it comes to pumping up stories beyond what the data warrant. However, in my work as a journalist, I do see press releases like the BMJ example I wrote about on a reasonably regular basis. And as a reviewer for this site, I know that what gets put into a press release will often find its way into a published story. So I think it’s reasonable to conclude that press offices bear responsibility for misleading stories in some cases – as we saw with last week’s chocolate stories. I do not know if studies have examined how frequently scientific press releases hype or otherwise misstate research findings, so I don’t have any data to share. Maybe we need an HNR spinoff site focusing specifically on press releases to develop this data and make a more compelling case for change?

Cathy posted on September 6, 2011 at 7:42 am

It’s time to respin this headline. My class (undergraduates, cross-disciplinary course on medical evidence & decision making) is treating this headline as “heart attacks prevent chocolate”. After all, the “evidence” provided could go either way…

Earle Holland posted on September 6, 2011 at 1:39 pm

There’s really more agreement in Kevin’s and my comments than there is dissent. All parties do play a role in how accurately the science is reported, and therefore all hold a share of the blame. My stance is based on more than 35 years of doing science communications at universities, including an enormous amount of medical reporting. What’s problematic here is that the historic role of med center public information officers — that of conveying research information — has been usurped by hospital and med center marketing operations bent on public relations rather than public information. That is coupled with the heavy hand that many docs bring to conveying information about medical research. It is a very rare event for a PIO to be able to withstand the force of a doc bent on hyperbolic interpretation of research. I know of only a few folks in the country who have the ability to withstand those storms.
One root of the problem is that the person running the med center communications/PR/marketing shop is rarely someone with solid medical writing experience, so that shop’s leadership has little ability, if any, to take on overly aggressive docs.
Until med centers decide that seriously accurate representation of their research is the only way to maintain credibility with the public, we’ll continue to have the same plethora of problems we now face.
As to an HNR spinoff to evaluate press releases, I’m on record in favor of it and have mentioned just such a thing to Gary in the past. If you can figure out a way to set one up, count me in!