Systematic review and meta-analysis are viable research techniques only because primary research is reported transparently; one might therefore expect meta-analysts to demonstrate best practice in reporting their results and to achieve a degree of transparency that makes their work reproducible. This assumption has yet to be fully tested in the psychological sciences. We therefore aimed to assess the transparency and reproducibility of psychological meta-analyses. We conducted a meta-review by sampling 150 studies to extract information about each review's transparent and reproducible reporting practices. The results revealed that authors reported on average 55% of criteria and that transparent reporting practices increased over the three decades studied (b = 1.09, SE = 0.24, z = 4.519, p < .001). Review authors consistently reported eligibility criteria, effect-size information, and synthesis techniques. On average, however, review authors did not report specific search results, screening and extraction procedures, and, most importantly, effect-size and moderator information from each individual study. Far fewer studies provided the statistical code required for complete analytical replication. We argue that the field of psychology, and research synthesis in general, should require review authors to report these elements in a transparent and reproducible manner.
DOI: http://dx.doi.org/10.1177/1745691620906416