Mary Chris Jaklevic is a reporter-editor at HealthNewsReview.org. She tweets as @mcjaklevic.
Check out these enticing headlines based on a recent study linking whole grains to lower diabetes risk:
If that’s not enough to make you stock up on quinoa and oatmeal, the Post offered this catchy lead: “Eat your Wheaties to avoid diabetes.”
But before you run to the grocery store, consider why it might not be advisable to start eating more whole grains based solely on this study.
As we’ve written many times, observational studies such as this one can’t prove cause and effect; they can only show an association.
It’s possible that people who eat a lot of whole grain in Denmark — where this study was conducted — engage in other healthy behaviors or have other characteristics that make them less prone to diabetes.
While researchers may attempt to adjust the data to account for those factors, it’s not possible to know whether all of the numerous variables have been taken into account. The lingering effect of these issues is known as “residual confounding.”
Nevertheless, some news organizations love to feed the public’s hearty appetite for a dietary “magic bullet” — a single nutrient that will avert disease.
In this case, Chalmers University of Technology accommodated by serving up a heaping course of misleading causal language in its news release:
“It doesn’t matter if it’s rye, oats, or wheat. As long as it is wholegrain, it can prevent type 2 diabetes. … The comprehensive study is a strong confirmation of previous research findings on the importance of whole grains for prevention of type 2 diabetes.”
“Can prevent” diabetes? “Comprehensive” study? “Strong confirmation” of previous findings?
That’s not what the study authors actually wrote in their published paper. They concluded that they found “consistent inverse associations between whole-grain intake and the risk of type 2 diabetes.”
It’s possible this study constitutes a useful clue when considered in the context of other research, but on its own it does not prove that whole grains prevent diabetes.
Unfortunately, that more cautious message wasn’t transmitted to the public in these three news stories.
For example, HealthDay suggested the data did show cause and effect, writing: “Exactly how whole grains help prevent type 2 diabetes isn’t clear from this study.”
In the next sentence, HealthDay flip-flopped, acknowledging that suggestion wasn’t really valid: “Because it’s an observational study, it isn’t designed to prove a cause-and-effect relationship.” If that’s the case, why did HealthDay use cause-and-effect language earlier in its story?
The Post quoted a researcher suggesting the study’s findings challenge trendy low-carb diets. The Post advised: “So feel free to indulge in a slab of whole-wheat sourdough with dinner.”
With zero data, the Spectator asserted that the ability to use whole grain to prevent type 2 diabetes has actually “been known for a long time.” It reported that this study contributed data on the “role of different wholegrain sources” and “how much wholegrain is needed to reduce the risk of developing diabetes.”
In a recent op-ed that reflected widespread frustration with nutrition research, Stanford researcher John Ioannidis, MD, wrote that studies showing associations — often erroneously reported in the news media as demonstrating cause and effect — might erode public trust and harm public health.
“Unfounded beliefs that justify eating more food, provided ‘quality food’ is consumed, confuse the public and detract from the agenda of preventing and treating obesity,” he wrote in the piece, which appeared in the Journal of the American Medical Association.
He noted that in studies like this one, the estimates of benefit “probably reflect almost exclusively the magnitude of the cumulative biases in this type of research, with extensive residual confounding and selective reporting.”
In a recent interview with CBC News, Ioannidis more bluntly described nutritional epidemiology as a “scandal” that “should just go to the waste bin.” Rather than embrace such research, he said, news outlets should ignore it.
“What it ends up being is that you get things published that are what the investigators, the reviewers and the editors want to see,” he said.
Please note: Observational studies (as with the research that linked smoking to cancer and other problems) can indeed pile up such overwhelming evidence that it would be prudent to make public health recommendations on that basis. However, it’s rare that observational studies reported in the news media rise to this level of evidence. Moreover, an observational study cannot prove cause and effect. Statistical association is not proof of cause and effect. It is not unimportant, but no one should make it more than what it is.