Psychology’s reproducibility problem is journalism’s problem, too

Science news outlets were buzzing last week with findings from a major research initiative designed to replicate the results of studies in psychology.

The Reproducibility Project: Psychology re-ran 100 different studies that experts considered important foundational research in the field. This type of replication is important, explained project member Angela Atwood in a news release, because “Scientific evidence does not rely on trusting the authority of the person that made the discovery. Rather, credibility accumulates through independent replication and elaboration of the ideas and evidence.”

Alas, the effort to reproduce the studies did not go as well as one might have hoped — something that most media outlets emphasized in their coverage:

New York Times: Many Psychology Findings Not as Strong as Claimed, Study Says

Newsweek: Science’s Reproducibility Problem: 100 Psych Studies Were Tested and Only Half Held Up

Associated Press: Results of many psychology experiments can’t be duplicated, study finds

Gizmodo: A Lot of Published Psychology Results are Bullshit.

Of course, calling “bullshit” on the non-reproducible studies (as Gizmodo did) is taking things a step too far, as many other stories about the research pointed out. Although it’s possible that the original studies were flat out wrong and that’s why they couldn’t be duplicated, it’s also possible that the original studies were right, and the repeat study overlooked a real effect just by chance, the Associated Press notes. “Or both studies could be correct, with conflicting conclusions because of differences in how they were carried out,” AP explains.

Whatever the reason, the fact that many of these studies couldn’t be reproduced is certainly troubling news for the field of psychology. And psychology is far from alone in suffering from a replication crisis. As Stanford’s John Ioannidis commented to the Times, the problem could be even worse in fields such as neuroscience, clinical medicine, and animal research. Indeed, Nature magazine has a special collection on reproducibility that spans all of these disciplines and more.

The Reproducibility Project did not study any psychological treatments — it focused on basic research into concepts like memory and behavior. Nevertheless, the connection to health journalism and to our own project here should be readily apparent.

Journal articles are never written in stone.

We often caution health care journalists not to treat journal articles as if they were the stone tablets brought by Moses down from the mountaintop. In our review criteria, we ask journalists to probe for study limitations that might lead to an exaggerated or incorrect result. We ask for context on previous research and how the new research compares with other studies. We ask for independent sources who might cast a skeptical eye on the findings.

Insofar as they highlight the uncertainty of many study results, the conclusions of the Reproducibility Project underscore that these criteria are entirely appropriate and necessary elements of quality health journalism. And yet as our systematic reviews of news stories continue to demonstrate, all too often these elements are missing from news coverage of health studies.

Our project aims to change that.

Disclosure: The Reproducibility Project is funded by the Laura and John Arnold Foundation, which also supports

Comments (1)

Please note, comments are no longer published through this website. All previously made comments are still archived and available for viewing through select posts.


September 3, 2015 at 11:05 am

Journalists may want to consider the role of the therapist in supporting the changes reported in the study. Any outcome that has at its core a one-on-one clinical dynamic is bound to be directly influenced by the skill of the clinician and the trust the client is willing to invest in that professional relationship. The personality of the therapist is not an inconsequential piece of any reported success in individual counseling sessions set up to support a client’s desire for behavioral change.