A research letter in this week’s JAMA Internal Medicine addresses an issue that has become a pet peeve of ours: the failure of medical journal articles, journal news releases, and subsequent news stories to address the limitations of observational studies. Observational studies, although important, cannot prove cause and effect; they can show statistical association, but association does not necessarily equal causation.
The authors, from the University of Auckland in New Zealand, analyzed a combined 538 documents, including major medical journal articles, the accompanying editorials in those journals, news releases from those journals, and news stories written about all of the preceding.
Why?
They wrote:
“Observational research is abundant and influences clinical practice, in part via publication in high-impact journals and dissemination by news media. However, it frequently generates unreliable findings. Inherent methodologic limitations that generate bias and confounding mean that causal inferences cannot reliably be drawn. Study limitations may be inadequately acknowledged and accompanied by disclaimers that diminish their importance.”
Here’s what they found:
“Any study limitation was mentioned in 70 of 81 (86%) source article Discussion sections, 26 of 48 (54%) accompanying editorials, 13 of 54 (24%) journal press releases, 16 of 81 (20%) source article abstracts (of which 9 were published in the Annals of Internal Medicine), and 61 of 319 (19%) associated news stories. An explicit statement that causality could not be inferred was infrequently present: 8 of 81 (10%) source article Discussion sections, 7 of 48 (15%) editorials, 2 of 54 (4%) press releases, 3 of 81 (4%) source article abstracts, and 31 of 319 (10%) news stories contained such statements.”
Graphically, it looked like this in the published JAMA Internal Medicine research letter:
That is an awful report card.
Why does it matter? The authors summarize nicely:
“A possible consequence of inadequate reporting of limitations of observational research is that readers consider the reported associations to be causal, promoting health practices based on evidence of modest quality. Up to 50% of such practices prove ineffective when tested in randomized clinical trials. Giving greater prominence to the limitations of observational research, particularly in the publication abstract and journal press releases, might temper this enthusiasm and reduce the need for subsequent reversals of practice.”
We’ve written about dozens and dozens of examples of news stories and other media messages that have failed to address the limitations of observational studies, thereby misleading the public.
We’ve criticized major medical journal news releases for the same failure – those from The BMJ and The Lancet, for example.
For years, we’ve posted a primer on this site to help journalists, news release writers, and the general public understand these limitations. The primer is entitled “Does the Language Fit the Evidence? Association Versus Causation.”
The exaggeration should stop. Observational studies play an important role. But communicators should not try to make them more than what they are.