How do questionable research practices affect inferences of heterogeneity? A computer simulation.
Author(s) / Creator(s)
Hönekopp, Johannes
Linden, Audrey
Abstract / Description
Background: Heterogeneity reflects the extent to which the results of a body of studies investigating the same question disagree over and above the effects of sampling error. Large heterogeneity means that effect sizes differ across studies much more than the vagaries of sampling alone would suggest. Previously, we investigated heterogeneity in meta-analyses (which can be thought of as collections of conceptual replications) and in multiple close replications (e.g. Many Labs). We found that heterogeneity tends to be very large in conceptual replications but quite small in close replications. However, it is unclear to what extent publication bias and questionable research practices (QRPs) might distort the heterogeneity observed in meta-analyses. It is well known that publication bias and QRPs inflate effect sizes in the published literature; they might affect heterogeneity as well. Objectives: To understand to what extent we can trust the levels of heterogeneity empirically observed in meta-analyses. Research question(s) and/or hypothesis/es: How do publication bias and widespread QRPs affect the observed heterogeneity in meta-analyses? Method/Approach: We ran computer simulations in R. All simulations considered between-subjects experiments in which the means of an experimental group and a control group were contrasted. Multiple studies were run, published (or not), and (if published) summarised in a meta-analysis. We investigated the following factors in a fully crossed design: true effect size; true heterogeneity; number of studies per meta-analysis; strength of publication bias; and QRP environment. The QRP environment was defined by the extent to which (simulated) researchers engaged in the following: optional reporting from multiple dependent variables; optional stopping in participant recruitment; optional use of a moderator variable; and optional outlier removal. Results/Findings: Overall, publication bias and QRPs tend to moderately inflate observed heterogeneity. However, this bias appears small compared to the levels of heterogeneity actually observed. Conclusions and implications (expected): High levels of heterogeneity observed in meta-analyses need to be taken seriously and cannot be conveniently explained away as artefacts of publication bias and QRPs. We discuss wide-ranging implications for progress in psychological science and for its successful application to practical problems.
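The simulation pipeline described under Method/Approach (simulate studies, apply publication bias, meta-analyse the published ones, estimate heterogeneity) can be sketched as follows. The talk's simulations were written in R and modelled four QRPs in addition to publication bias; this minimal Python sketch models only the publication-bias component and estimates between-study variance with the standard DerSimonian-Laird estimator. All function and parameter names are illustrative assumptions, not the authors' code.

```python
import math
import random

def simulate_meta(true_d=0.3, tau=0.0, k=30, n=50, pub_bias=0.7, seed=1):
    """Simulate k *published* between-subjects studies and return the
    DerSimonian-Laird estimate of the between-study variance tau^2.

    true_d   : mean true effect size (Cohen's d)
    tau      : true between-study standard deviation (heterogeneity)
    n        : participants per group in each study
    pub_bias : probability that a non-significant study stays in the
               file drawer (0 = no publication bias)
    """
    rng = random.Random(seed)
    ys, vs = [], []
    while len(ys) < k:
        d_i = rng.gauss(true_d, tau)        # study-level true effect
        v = 2 / n + d_i ** 2 / (4 * n)      # approx. sampling variance of d
        y = rng.gauss(d_i, math.sqrt(v))    # observed effect size
        significant = abs(y / math.sqrt(v)) > 1.96
        if significant or rng.random() > pub_bias:
            ys.append(y)
            vs.append(v)
    # DerSimonian-Laird: tau^2 = max(0, (Q - (k-1)) / C)
    w = [1 / v for v in vs]
    ybar = sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, ys))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    return max(0.0, (q - (k - 1)) / c)

# With zero true heterogeneity, publication bias alone can still move
# the tau^2 estimate away from zero; comparing such runs against
# bias-free runs is the kind of contrast the fully crossed design makes.
print(simulate_meta(tau=0.0, pub_bias=0.9))
print(simulate_meta(tau=0.3, pub_bias=0.0))
```

Extending this sketch to the talk's design would mean wrapping `simulate_meta` in loops over true effect size, true heterogeneity, number of studies, bias strength, and QRP environment, and adding per-study QRP steps (optional stopping, multiple dependent variables, moderator use, outlier removal) before the publication decision.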
Persistent Identifier
Date of first publication
2019-03-13
Is part of
Open Science 2019, Trier, Germany
Publisher
ZPID (Leibniz Institute for Psychology Information)
Citation
Hönekopp, J., & Linden, A. (2019, March 13). How do questionable research practices affect inferences of heterogeneity? A computer simulation. ZPID (Leibniz Institute for Psychology Information). https://doi.org/10.23668/psycharchives.2397
File
j_3_Trier talk 2019.pdf (Adobe PDF, 501.44 KB; MD5: 504767101d1ba484e2a06f71dd39021b). Description: Conference Talk
PsychArchives acquisition timestamp
2019-04-03T12:36:30Z
Made available on
2019-04-03T12:36:30Z
Persistent Identifier
https://hdl.handle.net/20.500.12034/2029
https://doi.org/10.23668/psycharchives.2397
Language of content
eng
Dewey Decimal Classification number(s)
150
DRO type
conferenceObject
Visible tag(s)
ZPID Conferences and Workshops