Thanks largely to the inclusion of an independent perspective, this HealthDay piece was much stronger than its counterpart from TIME on the same study.
Though neither story picked up on the troubled history of this study (addressed here and here and in our review of the competing story from TIME), HealthDay at least featured a reality check on the evidence high up in the story. Citing an independent expert, it called the study “too limited in size and scope to allow conclusions as to whether meditation really reduces risk of death or disease” — an assessment with which we concur. There was also a better framing of the statistics, and more information on costs and availability than TIME was able to provide.
For the second time in as many weeks, an NIH-funded study of an alternative medical treatment has been published under murky circumstances. For reasons that aren’t entirely clear, the two studies have received markedly different receptions in the media. Last week, a troubled study of chelation, a controversial and potentially dangerous approach to lowering heart risk, was picked apart by the media and had its flaws exposed in fine detail. But this study of transcendental meditation seems to have escaped similar scrutiny by most mainstream news media, even though the circumstances surrounding its publication are also highly unusual and warrant additional investigation. As Larry Husten at CardioBrief points out, the study was originally submitted to Archives of Internal Medicine and had its publication put on hold just 12 minutes before it was scheduled to be released.* Now the study has reappeared in a different journal, with different data, and no explanation as to why it was originally pulled or whether any concerns about the original manuscript were satisfactorily addressed. CardioBrief identifies a number of other inconsistencies surrounding the paper that should certainly raise some red flags; we suggest you read his entire post for this context.
Now, it’s unfair to expect that every reporter is going to be aware of this study’s strange back story. And it’s unclear at this point whether the inconsistencies that CardioBrief identifies reflect anything more than poor communication and/or a lack of transparency on the part of the authors and editors. Nevertheless, we think reporters should be putting themselves in a position to ferret out this kind of background by talking to independent experts and skeptically appraising evidence. With the chelation study, experts seemed to be seeking out reporters in order to poke holes in the research and generate appropriate skepticism. There was no corresponding push with this study, and the result seems to be that some media outlets have largely accepted the results at face value.
According to the story, “An initial 10-hour course runs about $1,500 when taught by nonprofit groups, and there is continued lifetime follow-up.” And it’s generally not covered by insurance.
The story tells us that 20 people in the meditation group had a heart attack or stroke or died, compared with 32 in the health education group. It also provides the number of subjects who participated in the study (201). We would have liked to see both the numerator and denominator provided for completeness (e.g., 20/99 in the TM group vs. 32/102 in the HE group). Still, this is preferable to simply telling readers that there was a “48% reduction” in risk.
Same observation we made in reviewing the TIME.com story: An important indirect harm is that of “opportunity costs.” Training and ongoing programming in TM is expensive, especially given the mean household income of the participants – less than $18,000 annually. The story didn’t address this.
The independent expert addressed many of our concerns about drawing any sweeping conclusions from this study:
This last bullet is a key point of comparison with the TIME coverage, which misinterprets the statistical adjustments conducted by the authors as a strength of the study (!). In fact, statistical tinkering generally is not expected to change the conclusions of a randomized trial, and when it does (as in this case), it can raise suspicions about the strength and reliability of the findings.
An independent cardiologist is extensively quoted, bringing an appropriate dose of skepticism to the coverage.
The story hints at factors that can help reduce risk: diet, exercise, not smoking, cholesterol-lowering drugs, etc. But it never directly addresses this criterion. Not quite good enough.
The story indirectly addresses availability when it notes that live training in meditation from an instructor is necessary (a home video apparently won’t cut it), and insurance won’t cover the costs. It could have provided some sense of how easy it is to find qualified instructors in different areas (e.g., urban vs. rural settings).
The story notes that transcendental meditation has been studied previously, with mixed results on cardiovascular outcomes.
This story has enough original reporting that we can be sure it wasn’t based on a press release.