Health News Review

On December 31 there may not have been much news. Journalists may have grown tired of “Best of 2013” or “Top 10 whatever” stories. So a paper in the Journal of the American Medical Association reporting that “patients with mild to moderate Alzheimer’s disease had slower functional decline” after taking big doses of vitamin E was bound to get a lot of attention.

And it did. More than 200 results on a web search.

And maybe it shouldn’t have – at least not the kind of attention it garnered.

My 93-year-old Dad is showing signs of at least mild cognitive impairment, so believe me, I perk up when I hear Alzheimer’s news. So get this straight: I’m not saying this isn’t important research. I’m not saying it wasn’t newsworthy. I’m simply saying that most of the stories I saw failed to evaluate the evidence, exaggerated benefits, and failed to give necessary context and caveats – or the balance of same was sorely out of whack.

My local Star Tribune newspaper put the story in prime real estate – page one above the fold.

Was this study worth that kind of placement?  I didn’t see another newspaper in the country give it that same kind of treatment.

But it undoubtedly got that kind of Star Tribune ink and prominence because the study was based at the local VA medical center.  And I bet you a lifelong supply of vitamin E pills that if the study had come from neighboring Wisconsin, Iowa, Illinois or the Dakotas, it would not have been page one.  It may not have been reported in the Star Tribune at all.

Let me emphasize: if you read the entire Star Tribune 900-word piece, you get a reasonable picture of the state of the research.  But readers are led to believe that an editorial weight has been given to a story that gets 900 words starting on page one above the fold.

Yet, buried deep in the story is this line: 

An accompanying editorial in JAMA called the results “modest.” 

But the editorial stated more:

“…the therapeutic effect seen was modest and more relevant to Alzheimer’s disease symptoms and consequences than to reversal of the disease process.”

The second part of that sentence carries an important point that was not captured in stories whose headlines emphasized slowing the disease process:

Boston Globe:  Study suggests vitamin E can slow Alzheimer’s

UPI: Vitamin E may slow Alzheimer’s brain decline

HealthDay story on Philadelphia Inquirer site: Daily High-Dose Vitamin E Might Delay Alzheimer’s

Reuters: Vitamin E may slow early Alzheimer’s decline

USA Today: Vitamin E may slow Alzheimer’s decline

And on and on and on in dozens and dozens of stories.

The Associated Press story reflected good sourcing, but those sources framed the news in vastly different ways.  For example, on the one hand:

“This is truly a breakthrough paper and constitutes what we have been working toward for nearly three decades: the first truly disease-modifying intervention for Alzheimer’s,” said Dr. Sam Gandy of Mount Sinai School of Medicine in New York. “I am very enthusiastic about the results.”

And on the other hand:

“It’s a subtle effect but it’s probably real,” Dr. Ron Petersen, the Mayo Clinic’s Alzheimer’s research chief, said of the benefit on daily living from vitamin E. “That has to be weighed against the potential risks” seen in earlier studies, he said.

What are readers to make of this?

Breakthrough…first truly disease-modifying intervention?

Or subtle effect that’s “probably” real?

To further confuse/clarify the issue, please note that the researchers reported a 3 point difference in vitamin E users on a 78-point scale of activities of daily living.  They write:

“Although there is not a consensus on a minimally clinically important difference for (that assessment), some clinicians, patients, or caregivers would consider a difference of 2 points as meaningful because it potentially represents, for example, a loss of dressing or bathing independently.”

But when you read the study, you see that the researchers reported a “confidence interval” for that score difference running from 0.92 (less than 1 point) up to 5.39. That means the researchers are confident the true answer lies somewhere in that range. It could be a change of less than 1 point – which most observers would not consider meaningful – or a difference of more than 5 points. The wider the confidence interval – and this one seems fairly wide to me – the less certain the estimate of benefit.

I didn’t see a single story that discussed the reported “confidence interval.”
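For readers who want to see the arithmetic, here is a minimal sketch of how a 95% confidence interval relates to a point estimate and its uncertainty. The 0.92-to-5.39 interval and the 2-point “meaningful” threshold come from the study and editorial quoted above; the standard error is back-calculated from the reported interval under a standard normal approximation, and the variable names are mine, not the study’s.

```python
# Illustrative sketch only -- not the study's actual analysis code.
# A 95% confidence interval is (roughly) the point estimate plus or
# minus 1.96 standard errors. Here the standard error is back-calculated
# from the interval the paper reported (0.92 to 5.39 points).

Z_95 = 1.96  # critical value for a 95% confidence level

lower_reported, upper_reported = 0.92, 5.39
point_estimate = (lower_reported + upper_reported) / 2   # midpoint, ~3.2 points
std_error = (upper_reported - lower_reported) / (2 * Z_95)

lower = point_estimate - Z_95 * std_error
upper = point_estimate + Z_95 * std_error

# The 2-point change some clinicians would consider clinically meaningful.
MEANINGFUL_DIFFERENCE = 2.0

print(f"95% CI: {lower:.2f} to {upper:.2f} points")
print(f"Interval includes values below {MEANINGFUL_DIFFERENCE}? {lower < MEANINGFUL_DIFFERENCE}")
```

Because the interval’s lower bound (under 1 point) sits below the 2-point threshold, the data are consistent with a benefit too small to matter – which is exactly the caveat most stories left out.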

Some readers are not stupid.

A reader on the New York Times website wrote:

“The confidence intervals run by the authors — and particularly the adjusted P values — demonstrate that this study found nothing, and absolutely demonstrated no causation. But that itself is news. Describing the trial as possibly having found a result is disingenuous. The Times’ headline on this article is just plain wrong.”

Though that is harsh, the reader’s attention to the confidence interval is noteworthy. Much more simply stated is this reader comment following a CNN blog post:

“I guess it goes without saying, but if Vitamin E may help Alzheimers it also may not help Alzheimers.”

Some have said that the low cost and apparent safety of Vitamin E might lead people to think, “Why not give this a try?”

Well, a closer look at the accompanying editorial is warranted.   The editorialists wrote:

“a meta-analysis of 19 randomized trials (found) that vitamin E in doses greater than 400 IU/d was associated with increased all-cause mortality.”

So the safety question of high-dose vitamin E (2,000 IU/d was the dose in the new study) is still unanswered.

The editorial notes that “the mechanism of action of vitamin E in Alzheimer’s disease is uncertain.”  Indeed, the Star Tribune story stated:

Asked how vitamin E might be working, (the principal investigator) laughed and said he had no idea. “We describe it of course as an antioxidant, but I’m hard-pressed to give you any reasonably cogent theory about why it might be doing anything in the brains of patients with Alzheimer’s,” he said.

Finally, though, back to the editorial for an important over-riding observation:

“Major AD (Alzheimer’s disease) treatment trials like this one use functional ability… as an outcome with increasing frequency. The use of functional ability measures for this purpose overtly or tacitly uses impairment in functional ability as though it were solely a consequence of AD progression. Such impairment, however, is not specific to AD but occurs frequently among older people as a consequence of many conditions. Some aspects of this trial highlight the nonspecificity of the link between AD and functional decline.”

Let’s recap what we just learned:

  • this was an effect that is independently judged by some as modest, subtle, “not a slam dunk, not a home run”
  • safety has not been established in the high doses studied
  • there is no “reasonably cogent theory” of a mode of action for vitamin E
  • this was another trial focusing on an outcome measure that is not specific to the disease being studied
  • the confidence interval that the researchers reported – something journalists should take into account – means the effect could fall outside the range of what would be considered meaningful

Why didn’t stories wrap up these caveats in one neat little package, instead of once again touting the potential benefits of something slowing symptoms, slowing decline, slowing progression? Most stories had one of these elements or another.  But I didn’t see a single story that addressed all of the five bullet points I listed above.  Given the growing burden of dementia on an aging population, we must do a better job communicating research results, seeking independent perspectives, and independently vetting claims.

Here are some stories that made noticeable efforts to offer analysis:

A CNN blog post quoted an Alzheimer’s Association statement:

“The results are positive enough to warrant more research to replicate and confirm these findings, but should not change current medical practice. No one should take vitamin E for Alzheimer’s except under the supervision of a physician.”

And it had a clearly labeled “Caveats” heading, noting:

  • Some results don’t completely add up, such as why none of the treatment groups did better than the placebo group in cognitive abilities.
  • None of the treatments seemed to be unsafe, according to this research, but “the size of the study did not allow us to detect infrequent but potentially significant adverse events,” the researchers wrote.

Another sound summary came on the aforementioned New York Times “The New Old Age” blog:

But other studies have found that vitamin E failed to delay dementia in people without symptoms or with mild cognitive impairment, which may precede Alzheimer’s.

“It was dead stone cold in the M.C.I. trial,” said the leader of that study, Dr. Ronald Petersen, director of the Mayo Clinic’s Alzheimer’s center. “You couldn’t have found a closer match to placebo.”

Dr. Denis Evans, a professor of internal medicine at Rush University, who wrote an editorial accompanying the new study, cautioned against extrapolating the results to anyone without mild to moderate Alzheimer’s.

“Does this mean that all of us who don’t want to develop Alzheimer’s should rush out and purchase a bottle of vitamin E?” he said. “Oh, please don’t.”

A Wall Street Journal story put this in the second sentence:

“However, the research… found no impact on memory and doctors said there was no evidence that vitamin E prevents the debilitating disease.”

Kudos to those who tried to add independent analysis – even on New Year’s Eve – whether by story framing and emphasis, by use of independent experts, or by offering context and caveats on their own.




Dan Browning posted on January 2, 2014 at 10:04 am

The meta-analysis found that 400 IU of vitamin E increased mortality, but the studies in the meta-analysis included just one group of Alzheimer’s patients. Three other studies of vitamin E in Alzheimer’s patients, including the large longitudinal study done by the VA, found no effect on mortality. But no one suggested that it’s safe to take large doses without medical supervision. Some drugs taken by elderly patients are dangerous when taken with vitamin E.

It’s fair to characterize the slower decline as modest. But it equates with three hours a day less care-giving time at the end of two years. Caregivers, I assure you, would gladly take those three hours.

    Gary Schwitzer posted on January 2, 2014 at 10:20 am


    Thanks for your note.

    As I wrote, I thought you gave a reasonable picture of the state of the research.

    You might be right that caregivers would gladly take those three hours. But you might be wrong in putting too much weight on the 3 hour estimate, for reasons I articulated in my post above. As I explained, the confidence interval shows the uncertainty with the estimated benefit; it could be an effect size that most would consider not meaningful.

Judy Graham posted on January 2, 2014 at 12:17 pm

We all know that people who report/write stories are not those who write headlines. So what can be learned from your examples here? Those who write headlines (who are untrained in the subtleties of science/health reporting) should run them by editors and/or reporters if possible. But this rarely happens….

    Gary Schwitzer posted on January 2, 2014 at 12:30 pm


    Thanks for your note.

    I didn’t mean to imply that the headlines were the main issue. Those headlines simply reflect a story framing that continued throughout the body of most of the stories I looked at. As I wrote, “most of the stories I saw failed to evaluate the evidence, exaggerated benefits and failed to give necessary context and caveats – or the balance of same was sorely out of whack.”

    What I think can be learned was what I tried to summarize in my bold font bullet points:

    • this was an effect that is independently judged by some as modest, subtle, “not a slam dunk, not a home run”
    • safety has not been established in the high doses studied
    • there is no “reasonably cogent theory” of a mode of action for vitamin E
    • this was another trial focusing on an outcome measure that is not specific to the disease being studied.
    • the confidence interval that the researchers reported – something journalists should take into account – means the effect could fall outside the range of what would be considered meaningful.

    While some, like the CNN and NYT blog examples noted, made a noticeable effort to independently evaluate the evidence, most stories missed on the bullet points I listed. Those issues could be addressed in fewer than 100 words.

Gary Schwitzer posted on January 2, 2014 at 9:00 pm

This post has drawn a lot of attention on Twitter. I’m copying some of those comments here.

From Ivan Oransky, VP & global editorial director for MedPage Today:

“.@garyschwitzer’s critique of vitamin E-Alzheimer’s stories makes me miss regular @HealthNewsRevu reviews even more.”

From Dr. Ash Paul of the UK:

“Scathing article on subject of coverage of Vit E and Alz paper.”

From London pharmacologist David Colquhoun:

“Blame JAMA and authors for ghastly hype.”

From self-proclaimed clinical trial enrollment nerd Paul Ivsin of Chicago:

“Excellent deflation of the coverage of Vitamin E or Alzheimer’s clinical trial results.”

From Tom Burton of the Wall Street Journal:

“Reasons for caution about study ‘showing’ vitamin E helps in Alzheimer’s. Excellent analysis via @garyschwitzer.”

From clinical exercise specialist Vik Khanna:

“And “help” is a very big word. Does it restore function? Limit disability? Change any HRQOL measure or alter mortality?”

From Canadian doc blogger Yoni Freedhoff:

“Before you go off getting all excited about Vitamin E and Alzheimer’s, read @garyschwitzer’s bodyslam of the story.”

From Yale primary care physician & health policy researcher Joe Ross:

“Especially need to discuss potential of harm.”

Thomas Anderson posted on January 2, 2014 at 11:43 pm

Results of tests such as these may depend on the form of the vitamin used in a given study. Eric Klein used dl-alpha in his prostate cancer study, resulting in more cancer. No surprise to me, as I discovered long ago that this form of the vitamin is virtually worthless and may even do harm. The natural d-alpha or even gamma might have achieved the hoped-for results.

AnneMarie Ciccarella posted on January 3, 2014 at 12:29 am

You hit a hot button topic for me. HEADLINES. I think that the media has a responsibility to the public to stop sensationalizing the trivial for the purpose of selling papers, increasing ratings or driving web traffic. Tell the truth. This is disgraceful. It’s time for a complete overhaul of the medical journalism field. Too many have a far greater emphasis on the journalism part at the expense of reporting the actual findings. It’s wrong. On every level. My two cents. Thanks, Gary, for keeping them honest. AM

Daniel Pendick posted on January 3, 2014 at 3:32 pm

Gary, I think you are right to emphasize that this study lies in the “marginal effect, possibly nothing” category, and that the audience should hear that. But it is worth noting that Alzheimer’s treatment remains so grim that people and their families will happily accept “marginal, and possibly not helpful.” Small and hypothetical cardiac risks of large-dose vitamin E seem hardly worth worrying about for a person with diagnosed dementia. Am I wrong?

    Gary Schwitzer posted on January 3, 2014 at 4:21 pm


    Thanks for your note.

    My site is only about your first sentence. It is called HealthNewsReview for a reason. It is about analyzing the accuracy, balance and completeness of media messages about health care.

    My site is not about your final two sentences. There is nothing in what I do that makes value judgments about what “people and their families will happily accept.”

    But since you brought it up, how can we know what fully informed people “will happily accept” if they are not fully informed?

Linda posted on January 13, 2014 at 9:34 am

Claims about vitamin E are part of a pattern of “flavor of the day.” Anyone who has been following Alzheimer disease research as long as I have (over 25 years) is so weary of these sensationalist claims. I have read them all, from blueberries to ginkgo biloba. Thank you, Gary, for highlighting another one to pile on the heap, keeping families from clinging to false hope.
I do take exception to a comment that vitamin E may reduce caregiving time. Even at the onset of the disease, symptoms are unpredictable and variable. People lose their ability to think and reason. Caregivers are on constant vigilance, never knowing what to expect. To claim that this or any other substance reduces caregiving time is misleading and bogus. People with Alzheimer’s will not return to the person they once were, and they should never be left on their own without monitoring.