Full metadata record
DC Field: Value
dc.rights.license: CC-BY 4.0
dc.contributor.author: Lopez-Nicolas, Ruben
dc.contributor.author: Sanchez-Meca, Julio
dc.contributor.author: Lopez-Lopez, Jose Antonio
dc.contributor.author: Rubio-Aparicio, Maria
dc.identifier.citation: Lopez-Nicolas, R., Sanchez-Meca, J., Lopez-Lopez, J. A., & Rubio-Aparicio, M. (2021). Reproducibility-related Reporting Practices in Meta-analyses on the Effectiveness of Psychological Interventions. ZPID (Leibniz Institute for Psychology).
dc.description.abstract:
Background: The last decade has revealed significant problems with reproducibility in psychological research. The scientific psychology community has detected various errors and questionable research practices in the process of studying psychological phenomena. In this context, different proposals have been put forward to improve transparency and reproducibility practices. In the present study, we focus on meta-analytic methodology. Meta-analysis is a powerful and important tool for synthesizing the literature on a research topic. Like other kinds of research, meta-analyses must be reproducible to comply with the principles of the scientific method. Furthermore, reproducible meta-analyses can easily be updated with new data or reanalysed using new and more refined analysis techniques.
Objectives: We attempted to empirically assess the prevalence of reproducibility-related methodological reporting practices in published meta-analyses of psychological interventions. Our purpose was to identify the key points that could be improved, with the aim of providing recommendations for carrying out reproducible meta-analyses.
Research questions: Are meta-analytic reports comprehensive enough to allow for potential reproduction attempts? At which key points is reporting lacking?
Method: We conducted a meta-review of meta-analyses of psychological interventions published between 2000 and 2020. We searched the PubMed, PsycInfo and Web of Science databases. We then selected a random sample of 100 meta-analyses and extracted a range of transparency- and reproducibility-related indicators using an ad hoc checklist based on existing meta-analysis guidelines. We collected data concerning the systematic review methods (identification and selection of studies, and the data collection process) and the meta-analysis methods (handling of effect measures, and synthesis and analysis methods).
Results: We found major issues concerning the reporting of fully reproducible search procedures, the specification of the exact methods used to compute effect sizes, and the specification of certain elements of the statistical model used.
Conclusion and implications: We will use the findings from our meta-review to articulate recommendations intended to improve the transparency, openness and reproducibility-related reporting practices of meta-analyses in clinical psychology and related areas.
Keywords: Replicability; Meta-analysis; Transparency; Reproducibility; Meta-science.
Funding: This research has been funded by a grant from the Ministerio de Ciencia e Innovación and by FEDER funds (Project nº PID2019-104080GB-I00).
dc.publisher: ZPID (Leibniz Institute for Psychology)
dc.relation.ispartof: Research Synthesis & Big Data, 2021, online
dc.title: Reproducibility-related Reporting Practices in Meta-analyses on the Effectiveness of Psychological Interventions
Appears in Collections: Conference Object

Files in This Item:
File: Public Use (CC-BY 4.0)
Size: 471,2 kB
Format: Adobe PDF