Michael Joyce produces multimedia for HealthNewsReview.org and tweets as @mlmjoyce
I’ll admit this is predictable, but I can’t resist: It’s a tale of two news releases. Released on the same day, about the same study, but with very different headlines.
But first, the study: a randomized, placebo-controlled study run over four years by Creighton University, in collaboration with the University of California San Diego, with the objective of determining if dietary supplementation with vitamin D and calcium reduces the risk of cancer in older women. So what’s the answer?
Judging from the Creighton news release, vitamin D and calcium supplementation DID reduce the risk of cancer.
But according to the news release from the Journal of the American Medical Association (JAMA) — where the study was published this week — it DID NOT.
The Creighton news release highlights that in 2,303 healthy post-menopausal women over the age of 55, those who were randomly assigned to receive 2,000 international units (IU) of vitamin D3 and 1,500 mg of calcium had a 30 percent lower risk of developing cancer than the placebo group. But buried in the third paragraph is this:
“This difference in cancer incidence rates between groups did not quite reach statistical significance.”
So why the “decreases risk of cancer” headline?
Clearly, these results are speculative until borne out by other large, controlled studies like the pending VITAL study at Brigham and Women’s Hospital.
The authors go on to point out that two post hoc analyses were statistically significant. However, post hoc analyses, by definition, look at the data AFTER the experiment is concluded. They simply look for patterns, after the fact, that were not part of the original study design. That’s why some critics call them “fishing expeditions.” They are considered inconclusive in studies like this one, where the research question is clearly defined beforehand.
Even the authors, in their discussion of the study, point out that the post hoc observations “should be considered only exploratory and hypothesis-generating, and require assessment in further studies.”
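To see why statisticians are so wary of these fishing expeditions, here is a toy simulation, not based on the study’s data, in which ten unplanned subgroup comparisons are run on groups that have no real difference at all. The group sizes, number of subgroups, and significance cutoff below are arbitrary assumptions chosen purely for illustration.

```python
# A toy simulation, NOT the study's data: 10 unplanned subgroup comparisons
# per simulated trial, with NO real effect anywhere. How often does at least
# one of them come out "statistically significant" by chance?
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_trials = 2000      # simulated trials
n_subgroups = 10     # post hoc comparisons "fished" out of each trial
n_per_arm = 500      # arbitrary arm size, purely for illustration

trials_with_false_positive = 0
for _ in range(n_trials):
    found_one = False
    for _ in range(n_subgroups):
        # Both arms are drawn from the SAME distribution, so any "effect" is noise.
        a = rng.normal(loc=0.0, scale=1.0, size=n_per_arm)
        b = rng.normal(loc=0.0, scale=1.0, size=n_per_arm)
        _, p_value = stats.ttest_ind(a, b)
        if p_value < 0.05:
            found_one = True
    trials_with_false_positive += found_one

# With 10 independent null comparisons, expect roughly 1 - 0.95**10, or about 40%
print(f"Trials with at least one spurious 'significant' subgroup: "
      f"{trials_with_false_positive / n_trials:.0%}")
```

Under those assumptions, roughly four out of ten simulated trials turn up at least one spurious “significant” subgroup, which is exactly why pre-specified questions carry more weight than after-the-fact ones.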
The first post hoc analysis showed an inverse association between blood levels of vitamin D (specifically, 25-hydroxyvitamin D) and cancer in the supplement group. The problem is that this comparison is no longer truly randomized, so drawing cause-and-effect conclusions is impossible.
The second post hoc analysis, which excluded cancers, deaths, and drop-outs during the first year, showed that 3.17 percent of the supplement group and 4.86 percent of the placebo group had a new cancer diagnosis during years 2-4 of the study. But this result cannot unequivocally differentiate which agent, calcium or vitamin D, is responsible for that effect.
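For the curious, here is a rough, back-of-the-envelope version of that years 2-4 comparison using a standard two-proportion z-test. The arm sizes are my assumption (the release gives only the 2,303 total, and the published analysis is more involved than this), so treat it strictly as a sketch of how a contrast like this can edge past the conventional p < 0.05 bar even when the pre-specified primary comparison does not.

```python
# Back-of-the-envelope two-proportion z-test on the years 2-4 numbers quoted
# above. ASSUMPTION: roughly equal arms of ~1,150 women each; the release
# gives only the 2,303 total, and the published analysis is more involved.
from math import sqrt

p_supp, p_plac = 0.0317, 0.0486      # new cancer diagnoses, years 2-4
n_supp = n_plac = 1150               # hypothetical arm sizes

# Pooled proportion and standard error for the difference in proportions
p_pool = (p_supp * n_supp + p_plac * n_plac) / (n_supp + n_plac)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_supp + 1 / n_plac))
z = (p_plac - p_supp) / se

print(f"z = {z:.2f}")   # a bit above 2 under these assumed arm sizes,
                        # i.e. just past the 1.96 cutoff for two-sided p < 0.05
```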
But back to the news releases because there are two important considerations to bring up here.
First, we don’t know who was responsible for the glowing headline coming from the Creighton media affairs department. Whether public relations officials or study authors pushed this misleading narrative is unclear. But co-author Cedric Garland certainly should be challenged on this quote:
“This is the most important scientific study of this century to date.”
Really? A study whose main result didn’t reach statistical significance? Is it possible that Dr. Garland’s grandiose claim is influenced by the fact that he, along with half of the authors of this study, is associated with GrassrootsHealth, a nonprofit whose primary focus seems to be promoting the therapeutic or protective value of vitamin D for a variety of conditions such as diabetes, Alzheimer’s disease, autism, cystic fibrosis, and premature births, to name a few?
Second, news releases like the arguably self-serving one from Creighton are not benign. In a time crunch, which news release does a journalist choose? Maybe KMTV in Omaha went with the hometown Creighton release, as suggested by its headline, “Creighton study finds vitamin D decreases risk of cancer” (although, contrary to the headline, KMTV eventually mentions that the results were not statistically significant). Other news outlets, such as CBS News and WebMD, were more critical and comprehensive.
Lesson number one is that low levels of skepticism may be hazardous to your health. As Gary Schwitzer wrote in this recent BMJ article about what is or isn’t fake news, it’s probably good old-fashioned “spin” that is actually more ubiquitous, more manipulative, and maybe even more dangerous.
Lesson two is a variation on “measure twice, cut once,” which, when applied to reading the news, means going to two or more sources on stories you feel are important before rendering an opinion. This is a classic example of a story where reading just one source could have left you with exactly the wrong information.
The final lesson is a big one: special interests are everywhere. Readers might want to ask themselves these questions. Is it possible Creighton put a positive spin on these inconclusive results? If so, to whose benefit? Also, is it possible the authors of this study may have gone looking for an answer that suited their agenda? And when the results were inconclusive — which is important information — why not just share that openly?