Conference Object

Improving the Science in Replication Sciences

Author(s) / Creator(s)

Steiner, Peter

Abstract / Description

Given the central role of replication in the accumulation of scientific knowledge, researchers have reevaluated the robustness of seemingly well-established findings across multiple disciplines. Results from these efforts have not been promising. The Open Science Collaboration (OSC) replicated 100 experimental and correlational studies published in high-impact psychology journals and found that only 36% of these efforts produced results with the same statistical significance pattern as the original study. Most researchers consider 36% a disappointingly low replication rate. But is it really reasonable to expect higher replication rates? And under which conditions can one actually expect a successful replication? To assess these questions critically, we present a novel framework for replication studies that outlines all the assumptions required to reproduce (within the limits of sampling error) the causal effect reported by an original study. In contrast to most other replication frameworks, ours does not rely on procedural aspects of replication (i.e., using the same methods, instruments, etc.) but on the assumptions that both the original and the replication study must meet to successfully reproduce a causal effect. The long list of strong assumptions makes clear that researchers should not be too optimistic about successful replications of effects. In addition, we show that our replication framework can also be used to design studies that go beyond the direct replication of an effect and instead aim at investigating effect heterogeneities, generalizing causal effects, or assessing the performance of non-experimental methods.

Keyword(s)

Replication; Reproducibility; Causal Inference

Persistent Identifier

https://hdl.handle.net/20.500.12034/665
https://doi.org/10.23668/psycharchives.867

Date of first publication

2018

Is part of

Die Rolle der psychologischen Methodenlehre in der Open Science Debatte 2018, Berlin, Germany

Publisher

ZPID (Leibniz Institute for Psychology Information)

Citation

  • Author(s) / Creator(s)
    Steiner, Peter
  • PsychArchives acquisition timestamp
    2018-06-28T13:55:09Z
  • Made available on
    2018-06-28T13:55:09Z
  • Date of first publication
    2018
  • Publication status
    unknown
  • Review status
    notReviewed
  • Persistent Identifier
    https://hdl.handle.net/20.500.12034/665
  • Persistent Identifier
    https://doi.org/10.23668/psycharchives.867
  • Language of content
    eng
  • Publisher
    ZPID (Leibniz Institute for Psychology Information)
  • Is part of
    Die Rolle der psychologischen Methodenlehre in der Open Science Debatte 2018, Berlin, Germany
  • Keyword(s)
    Replication
    Reproducibility
    Causal Inference
  • Dewey Decimal Classification number(s)
    150
  • Title
    Improving the Science in Replication Sciences
  • DRO type
    conferenceObject