Please use this identifier to cite or link to this item: http://dx.doi.org/10.23668/psycharchives.867
Title: Improving the Science in Replication Sciences
Authors: Steiner, Peter
Issue Date: 2018
Publisher: ZPID (Leibniz Institute for Psychology Information)
Series/Report no.: Die Rolle der psychologischen Methodenlehre in der Open Science Debatte (15.06.2018, FU Berlin)
Abstract: Given the central role of replication in the accumulation of scientific knowledge, researchers have reevaluated the robustness of seemingly well-established findings across multiple disciplines. The results of these efforts have not been promising. The Open Science Collaboration (OSC) replicated 100 experimental and correlational studies published in high-impact psychology journals and found that only 36% of the replications produced results with the same statistical significance pattern as the original study. Most researchers consider 36% a disappointingly low replication rate. But is it really reasonable to expect higher replication rates? And under which conditions can one actually expect a successful replication? To assess these questions critically, we present a novel framework for replication studies that spells out all the assumptions required to reproduce (within the limits of sampling error) the causal effect reported by an original study. In contrast to most other replication frameworks, ours relies not on procedural aspects of replication (i.e., using the same methods, instruments, etc.) but on the assumptions that both the original and the replication study must meet in order to successfully reproduce a causal effect. The long list of strong assumptions makes clear that researchers should not be too optimistic about successful replications of effects. In addition, we show that our replication framework can also be used to design studies that go beyond the direct replication of an effect and instead aim at investigating effect heterogeneity, generalizing causal effects, or assessing the performance of non-experimental methods.
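The abstract's question of whether higher replication rates are even reasonable can be illustrated with a simple power calculation: under a significance-based replication criterion, even a study of a perfectly real effect replicates only as often as its statistical power allows. A minimal sketch (the effect size d = 0.4 and the group size n = 50 are hypothetical illustration values, not figures from the talk):

```python
import math
from statistics import NormalDist

def power_two_sample(d, n, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test for a true
    standardized effect d with n subjects per group (normal approximation,
    known unit variance)."""
    norm = NormalDist()
    z_crit = norm.inv_cdf(1 - alpha / 2)   # critical value, e.g. 1.96 for alpha = .05
    se = math.sqrt(2 / n)                  # SE of the mean difference when sd = 1
    z = d / se                             # expected z statistic for the true effect
    # P(reject H0) = P(Z > z_crit) + P(Z < -z_crit) under the shifted distribution
    return norm.cdf(z - z_crit) + norm.cdf(-z - z_crit)

# Even if every original effect were real (d = 0.4) and every study ran
# n = 50 per group, a significance-based replication check would succeed
# only about half the time:
print(round(power_two_sample(0.4, 50), 2))  # → 0.52
```

Under these assumed numbers, a faithful replication of a true effect reaches significance only about half the time, so replication rates well below 100% need not imply that the original findings were false.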
URI: https://hdl.handle.net/20.500.12034/665
Appears in Collections:Books & Proceedings & Talks

Files in This Item:
File: PresOpenScienceSteiner.pdf (Conference Talk, 983,23 kB, Adobe PDF)


This item is licensed under a Creative Commons License.