Deakin University

File(s) under permanent embargo

What meta-analyses reveal about the replicability of psychological research

journal contribution
posted on 2018-01-01, 00:00, authored by Tom Stanley, Evan C. Carter, Hristos Doucouliagos
Can recent failures to replicate psychological research be explained by typical magnitudes of statistical power, bias, or heterogeneity? A large survey of 12,065 estimated effect sizes from 200 meta-analyses and nearly 8,000 papers is used to assess these key dimensions of replicability. First, our survey finds that psychological research is, on average, afflicted with low statistical power. The median of median power across these 200 areas of research is about 36%, and only about 8% of studies have adequate power (using Cohen's 80% convention). Second, the median proportion of the observed variation among reported effect sizes attributable to heterogeneity is 74% (I²). Heterogeneity of this magnitude makes it unlikely that the typical psychological study can be closely replicated when replication is defined as study-level null hypothesis significance testing. Third, the good news is that we find only a small amount of average residual reporting bias, allaying some of the often-expressed concerns about the reach of publication bias and questionable research practices. Nonetheless, the low power and high heterogeneity that our survey finds fully explain recent difficulties in replicating highly regarded psychological studies and reveal challenges for scientific progress in psychology.
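The two headline quantities in the abstract, statistical power and I², are standard meta-analytic statistics. As an illustrative sketch only (not code from the paper, and using a normal approximation to the two-sample test rather than the authors' exact methods), they can be computed as follows; the effect sizes and sample sizes below are hypothetical examples, not values from the survey:

```python
# Illustrative sketch: conventional formulas for statistical power
# (two-sided two-sample z-approximation) and Higgins' I^2 heterogeneity.
# Not the paper's code; all inputs below are hypothetical.
from math import erf, sqrt

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def power_two_sample(d: float, n_per_group: int, z_alpha: float = 1.959964) -> float:
    """Approximate power of a two-sided two-sample test of a
    standardized mean difference d, with n_per_group per arm,
    at the 5% significance level (normal approximation)."""
    ncp = d * sqrt(n_per_group / 2.0)  # noncentrality parameter
    return (1.0 - norm_cdf(z_alpha - ncp)) + norm_cdf(-z_alpha - ncp)

def i_squared(q: float, df: int) -> float:
    """Higgins' I^2 (in %): the share of observed variation among
    effect sizes attributable to heterogeneity rather than
    sampling error, from Cochran's Q and its degrees of freedom."""
    return max(0.0, (q - df) / q) * 100.0

# Hypothetical small-sample study of a modest effect: power well
# below Cohen's 80% convention.
print(round(power_two_sample(d=0.35, n_per_group=20), 3))

# Hypothetical meta-analysis with Q = 100 on 26 degrees of freedom:
# I^2 of 74%, the median level the survey reports.
print(i_squared(q=100.0, df=26))
```

Under this approximation, a study of a modest effect (d = 0.35) with 20 participants per arm has power of roughly 20%, which makes concrete why a 36% median and an 8% adequately-powered share indicate chronically underpowered literatures.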

History

Journal

Psychological bulletin

Volume

144

Issue

12

Pagination

1325–1346

Publisher

American Psychological Association

Location

Washington, D.C.

ISSN

0033-2909

eISSN

1939-1455

Language

eng

Publication classification

C1 Refereed article in a scholarly journal

Copyright notice

2018, American Psychological Association