“Killer Meat” – headlined an LA Times online column.
“Want to Live Longer? Cut Back on Red Meat” – pronounced CNN.com.
“Daily Red Meat Raises Chances Of Dying Early” warned washingtonpost.com.
It all sounds so certain.
But this was an observational study – not an experiment. It was based on responses to a questionnaire.
Such a study CAN NOT – simply CAN NOT – establish cause-and-effect and therefore CAN NOT establish risk.
So any story that said “higher risk” or “chances of dying” was simply wrong.
Stories about such studies are obliged to point out their potential weaknesses.
Journalists and consumers should read a column we published on HealthNewsReview.org, entitled “Does Your Language Fit the Evidence?”
And stories that gave these kinds of percentages (as the Washington Post did) are obliged to give you more:
Among women, those who ate the most red meat were 36 percent more likely to die for any reason, 20 percent more likely to die of cancer and 50 percent more likely to die of heart disease. Men who ate the most meat were 31 percent more likely to die for any reason, 22 percent more likely to die of cancer and 27 percent more likely to die of heart disease.
36% of what? 20% of what? 50% of what?
That’s like having a 50% off coupon and not knowing if it applies to the purchase of a Lexus or the purchase of a lollipop. Give the absolute risk reduction figures.
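A quick sketch makes the coupon point concrete. The numbers below are hypothetical baselines, not figures from the study: the same "36 percent more likely" headline describes very different realities depending on the absolute risk it starts from.

```python
# Hypothetical illustration (NOT the study's actual data): the same
# relative increase ("36% more likely to die") can mean a tiny or a
# large absolute change, depending on the baseline rate.

def absolute_from_relative(baseline_risk, relative_increase):
    """Convert a baseline risk plus a relative increase into absolute terms."""
    new_risk = baseline_risk * (1 + relative_increase)
    absolute_increase = new_risk - baseline_risk
    return new_risk, absolute_increase

# Two made-up baseline death rates: 1% and 10%
for baseline in (0.01, 0.10):
    new_risk, abs_increase = absolute_from_relative(baseline, 0.36)
    print(f"baseline {baseline:.0%} -> new risk {new_risk:.2%} "
          f"(absolute increase {abs_increase:.2%})")
```

With a 1% baseline, "36% more likely" means an absolute bump of roughly a third of a percentage point; with a 10% baseline, it means several points. That is why a story quoting only the relative percentages has told you almost nothing.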
I gave my undergrad health journalism students about 5 minutes to analyze one such story yesterday. They easily came up with the above flaws and more.
Come on, folks. We have to get smarter about evaluating studies – and news coverage of studies.