Not only did the quotes come from a news release, but the news came from a talk at a scientific meeting. There are limitations to the conclusions you can draw from such presentations, as they have not undergone the same kind of rigorous review that, for example, a journal article would. We offer a primer on this topic.
Adding to the woes of this approach, the story included no quotes from any independent expert.
This topic requires careful scrutiny. It didn’t get it from this story.
The selling of statins is already quite successful. Journalism shouldn't add to the marketing frenzy by passing along wholesale the claims that appear in university news releases (claims drawn from talks at scientific meetings that have not been rigorously peer-reviewed), by failing to add an independent voice to the story, and by framing the benefits in the more impressive-sounding relative (not absolute) risk reduction terms. Get the picture? Journalists need to evaluate evidence and scrutinize claims better than this.
Although statin medications have been on the market for a long time and information about the costs of both brand-name and generic versions of these drugs is readily available, the story included no information about costs.
The story only presented relative risk of being diagnosed with colon cancer (i.e. ‘patients who took statins had a 12 percent lower risk of being diagnosed with colon cancer than people who did not take the drugs’) rather than the absolute risk of being diagnosed with colon cancer in the two groups. Readers should ask, "12% of what?" See our primer on this topic.
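To see why "12% of what?" matters, here is a sketch with purely hypothetical baseline numbers (the story reported no absolute risks, so the 5% figure below is invented for illustration only; just the 12% relative reduction comes from the story):

```python
# Hypothetical illustration of relative vs. absolute risk reduction.
# The 12% relative figure is from the story; the baseline risk below
# is an invented assumption -- the story gave no absolute numbers.

baseline_risk = 0.05          # assumed diagnosis risk without statins (5%)
relative_reduction = 0.12     # the "12 percent lower risk" the story cited

statin_risk = baseline_risk * (1 - relative_reduction)   # about 4.4%
absolute_reduction = baseline_risk - statin_risk         # about 0.6 points

print(f"Risk without statins: {baseline_risk:.1%}")
print(f"Risk with statins:    {statin_risk:.1%}")
print(f"Absolute reduction:   {absolute_reduction:.1%}")
```

Under these assumed numbers, the impressive-sounding "12% lower risk" shrinks to a reduction of roughly half a percentage point in absolute terms, which is exactly why readers need both figures.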
The story mentioned potential harms associated with the use of statin medications, though it didn’t provide any insight about how commonly these occur. Is it 1 in 10? 1 in 100? 1 in a million?
It was not clear from the article whether the evidence for the meta-analysis came from randomized trials or from observational studies. The potential biases (such as the "healthy user effect") inherent in observational studies of pharmaceutical use should have been explicitly mentioned.
There was also no mention of the limitations of drawing conclusions from a talk given at a scientific meeting. See our primer on this.
There’s a reason we analyze whether a story evaluates the quality of the evidence. It’s not just an academic exercise: it’s vital to reader comprehension.
Not applicable because the story really delivered no substantive background information on colon cancer.
The only quotes came from the lead researcher, pulled from a university news release. No independent expert was quoted.
The story did not provide any information about other approaches to reducing the risk of colon cancer. It would have taken only a line to do so.
The story mentioned that statins are popular medications for managing cholesterol levels, so it was clear that they are currently in use.
It would have been useful to indicate that these are prescription medications available in both brand name and generic versions.
The story reported that it was about a meta-analysis of data on the impact of statin use on risk of colon cancer. But the story gave no context or background on other studies on statins and colon cancer, some of them dating back many years. We always urge journalists to put new research into the context of existing research.
Both quotes came from a university news release. Why? And why no interview with an independent expert?