Science news outlets were buzzing last week with findings from a major research initiative designed to replicate the results of studies in psychology.
The Reproducibility Project: Psychology re-ran 100 different studies that experts considered important foundational research in the field. This type of replication is important, explained project member Angela Atwood in a news release, because “Scientific evidence does not rely on trusting the authority of the person that made the discovery. Rather, credibility accumulates through independent replication and elaboration of the ideas and evidence.”
Alas, the effort to reproduce the studies did not go as well as one might have hoped — something that most media outlets emphasized in their coverage:
New York Times: Many Psychology Findings Not as Strong as Claimed, Study Says
Newsweek: Science’s Reproducibility Problem: 100 Psych Studies Were Tested and Only Half Held Up
Associated Press: Results of many psychology experiments can’t be duplicated, study finds
Gizmodo: A Lot of Published Psychology Results are Bullshit.
Of course, calling “bullshit” on the non-reproducible studies (as Gizmodo did) is taking things a step too far, as many other stories about the research pointed out. Although it’s possible that the original studies were flat out wrong and that’s why they couldn’t be duplicated, it’s also possible that the original studies were right, and the repeat study overlooked a real effect just by chance, the Associated Press notes. “Or both studies could be correct, with conflicting conclusions because of differences in how they were carried out,” AP explains.
Whatever the reason, the fact that many of these studies couldn’t be reproduced is certainly troubling news for the field of psychology. And psychology is far from alone in suffering from a replication crisis. As Stanford’s John Ioannidis commented to the Times, the problem could be even worse in fields such as neuroscience, clinical medicine, and animal research. Indeed, Nature magazine has a special collection on reproducibility that spans all of these disciplines and more.
The Reproducibility Project did not study any psychological treatments — it focused on basic research into concepts like memory and behavior. Nevertheless, the connection to health journalism and our own project here at HealthNewsReview.org should be readily apparent.
Journal articles are never written in stone.
We often caution health care journalists not to treat journal articles as if they were the stone tablets brought by Moses down from the mountaintop. In our review criteria, we ask journalists to probe for study limitations that might lead to an exaggerated or incorrect result. We ask for context on previous research and how the new research compares with other studies. We ask for independent sources who might cast a skeptical eye on the findings.
Insofar as they highlight the uncertainty of many study results, the conclusions of the Reproducibility Project underscore that these criteria are entirely appropriate and necessary elements of quality health journalism. And yet as our systematic reviews of news stories continue to demonstrate, all too often these elements are missing from news coverage of health studies.
Our project aims to change that.
Disclosure: The Reproducibility Project is funded by the Laura and John Arnold Foundation, which also supports HealthNewsReview.org.
Comments (1)
BabyBoomerWriter
September 3, 2015 at 11:05 am
Journalists may want to consider the role of the therapist in supporting the changes reported in the study. Any outcome that has as its core a one-on-one clinical dynamic is bound to be directly influenced by the skill of the clinician and the trust the client is willing to invest in that professional relationship. The personality of the therapist is not an inconsequential piece of any reported success in individual counseling sessions set up to support a client’s desire for behavioral change.