Conference Object

Participation in online surveys in psychology. A meta-analysis.

Author(s) / Creator(s)

Burgard, Tanja
Kasten, Nadine
Bosnjak, Michael

Abstract / Description

Background and objectives: Nonresponse is one of the most severe problems in survey research (Hox & De Leeuw, 1994). If nonresponse is completely at random, it merely reduces the amount of data collected. If it is nonrandom, however, it can bias results, because the final respondents are no longer representative of the population of interest (Groves & Peytcheva, 2008). The main question of this meta-analysis is whether the initial participation rate in psychological studies has decreased over time. Moreover, possible moderators of this time effect will be examined: the design of the invitation letter, the contact protocol, the topic, the data collection mode, the burden of participating in the study, and the incentives offered to participants.

Research questions and hypotheses: Because participation in psychological studies is presumed to be influenced by values, culture, and communication habits, changes in these factors over time are expected to have contributed to a decline in participation rates during the last three decades. The first hypothesis is therefore: H1: The initial participation rate in psychological studies has decreased over time. In individualistic cultures, decisions tend to be based on an individual cost-benefit calculus, so the burden of participation, incentives, and interest in the topic matter more when convincing potential participants to comply (Esser, 1986). Given the growing weight of this cost-benefit calculus through individualization, longer studies can be expected to suffer more from the decline in participation than shorter ones. H2: A higher announced duration of the study aggravates the decline in response rates. An intensively researched topic in survey participation is the effect of incentives. It is largely unambiguous that incentives have a positive effect on response rates (e.g., Cook et al., 2000), so they can also be expected to compensate for the trend of decreasing response rates, especially in light of the assumed growing importance of individual costs and benefits in decision-making. Several studies have concluded that monetary incentives are more effective than non-monetary incentives (Dykema et al., 2012; Lee & Cheng, 2006). Moreover, it is plausible that a higher incentive increases response rates more strongly than a smaller one; Halpern et al. (2002) and Murdoch et al. (2014) provide evidence from randomized controlled trials for this assumption. These findings from cross-sectional research indicate that monetary and higher incentives should lessen the decrease in response rates. H3: The decrease in participation rates is less pronounced for monetary incentives than for other kinds of incentives. H4: The higher the incentive, the smaller the decrease in participation over time. The effect of an invitation letter on response rates varies considerably with its content and style (De Leeuw et al., 2007). One way to attract more attention is to personalize the invitation letter (Cook et al., 2000). Given the growing volume of communication, this measure should have become more important for reducing nonresponse. H5: Personalization of the invitation letter reduces the decrease in participation rates. Another way to attract attention and make participation in a study more appealing is a salient topic. H6: The decrease in participation rates is less pronounced for more salient topics. The mode of data collection also plays a role in survey response: Hox and De Leeuw (1994) found the highest response rates for face-to-face interviews, followed by telephone surveys, with mail surveys suffering the lowest response rates.
Yet most respondents preferred mail surveys over web surveys, as the meta-analysis of Shih and Fan (2007) showed. More than ten years later, and for psychological studies specifically, it is worth asking to what extent the further diffusion of the internet has reduced reservations toward online surveys. The overall increase in communication makes the easy access and fast completion of online surveys more attractive, suggesting that preferences among data collection modes may have changed. H7: The decrease is less pronounced for online surveys than for other survey modes.

Method/Approach: Eligible are psychological studies reporting initial participation rates and at least one of the study design characteristics mentioned above. Student samples are excluded because of their differing motivation structure and incentives. For panel studies, only the first wave is used, owing to panel attrition in later waves. Studies must have been published in the three decades between 1988 and 2018, in either English or German. Editorials and texts reviewing results of original articles are not included. Data are collected on two levels. At the level of the study report, general information on the publication is retrieved. Within a study report, there may be several study conduction conditions, for example a group offered an incentive compared with a group offered none. For each such condition there is one initial participation rate, so all information on the treatment and the sample is retrieved at the level of the effect sizes. A multilevel meta-analysis will be conducted, with the participation rate as the dependent outcome and the time of sampling as the independent variable in all tests. The moderating effects of the survey design will be tested using the characteristics of study conduction as moderator variables.
As the effects of the study design characteristics on the time effect are of interest, random-slopes models are used.

Conclusions and implications (expected): There is ample evidence of declining response rates in recent decades, a trend that can aggravate potential nonresponse bias. Knowing which factors moderate this trend makes it possible to guide survey operations with empirical evidence and to optimize survey response. Because the willingness to participate in scientific studies keeps changing, continuously updating the cumulative evidence is important.

References:
Cook; Heath; Thompson (2000): A meta-analysis of response rates in web- or internet-based surveys. Educational and Psychological Measurement, 60(6), 821-836.
De Leeuw; Callegaro; Hox; Korendijk; Lensvelt-Mulders (2007): The influence of advance letters on response in telephone surveys: A meta-analysis. Public Opinion Quarterly, 71(3), 413-443.
Dykema, Jennifer; Stevenson, John; Kniss, Chad; Kvale, Katherine; González, Kim; Cautley, Eleanor (2012): Use of monetary and nonmonetary incentives to increase response rates among African Americans in the Wisconsin Pregnancy Risk Assessment Monitoring System. Maternal and Child Health Journal, 16(4), 785-791.
Esser (1986): Über die Teilnahme an Befragungen [On participation in surveys]. ZUMA-Nachrichten, 18, 38-46.
Groves, Robert; Peytcheva, Emilia (2008): The impact of nonresponse rates on nonresponse bias: A meta-analysis. Public Opinion Quarterly, 72(2), 167-189.
Halpern, Scott; Ubel, Peter; Berlin, Jesse; Asch, David (2002): Randomized trial of 5 dollars versus 10 dollars monetary incentives, envelope size, and candy to increase physician response rates to mailed questionnaires. Medical Care, 40(9), 834.
Hox; de Leeuw (1994): A comparison of nonresponse in mail, telephone and face to face surveys. Quality and Quantity, 28(4), 319-344.
Lee, Soo-Kyung; Cheng, Yu-Yao (2006): Reaching Asian Americans: Sampling strategies and incentives. Journal of Immigrant and Minority Health, 8(3), 245-250.
Murdoch, Maureen; Simon, Alisha Baines; Polusny, Melissa Anderson; Bangerter, Ann Kay; Grill, Joseph Patrick; Noorbaloochi, Siamak; Partin, Melissa Ruth (2014): Impact of different privacy conditions and incentives on survey response rate, participant representativeness, and disclosure of sensitive information: A randomized controlled trial. BMC Medical Research Methodology, 14(1).
Shih; Fan (2007): Response rates and mode preferences in web-mail mixed-mode surveys: A meta-analysis. International Journal of Internet Science, 2(1), 59-82.
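The random-slopes multilevel model described in the Method/Approach section could be sketched roughly as follows. This is a hedged illustration on synthetic data, not the authors' actual analysis: the variable names (time, monetary, study) and the use of logit-transformed participation rates with statsmodels' MixedLM are assumptions for the sake of the example.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_studies, k = 30, 4                        # 30 study reports, 4 effect sizes each
study = np.repeat(np.arange(n_studies), k)  # level-2 grouping: study report
n = n_studies * k
time = rng.integers(0, 31, size=n)          # years since 1988 (hypothetical coding)
monetary = rng.integers(0, 2, size=n)       # 1 = monetary incentive offered

# Simulate a declining time trend (H1) that monetary incentives partly offset (H3),
# plus study-level heterogeneity and residual noise.
study_fx = rng.normal(0.0, 0.3, n_studies)[study]
logit_rate = (0.5 - 0.03 * time + 0.015 * monetary * time
              + study_fx + rng.normal(0.0, 0.2, n))

df = pd.DataFrame({"study": study, "time": time,
                   "monetary": monetary, "logit_rate": logit_rate})

# Random intercept and random slope for time across study reports;
# the time:monetary interaction tests moderation of the time trend.
m = smf.mixedlm("logit_rate ~ time * monetary", df,
                groups=df["study"], re_formula="~time")
fit = m.fit()
print(fit.params["time"])  # slope for studies without a monetary incentive
```

The interaction coefficient (`time:monetary`) is the quantity of interest for hypotheses like H3: a positive estimate would indicate that the decline over time is less pronounced when a monetary incentive is offered.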

Persistent Identifier

https://hdl.handle.net/20.500.12034/2099
https://doi.org/10.23668/psycharchives.2473

Date of first publication

2019-05-30

Is part of

Research Synthesis 2019 incl. Pre-Conference Symposium Big Data in Psychology, Dubrovnik, Croatia

Publisher

ZPID (Leibniz Institute for Psychology Information)

Citation

Burgard, T., Kasten, N., & Bosnjak, M. (2019). Participation in online surveys in psychology. A meta-analysis. ZPID (Leibniz Institute for Psychology Information). https://doi.org/10.23668/psycharchives.2473
  • PsychArchives acquisition timestamp
    2019-06-11T14:06:20Z
  • Made available on
    2019-06-11T14:06:20Z
  • Persistent Identifier
    https://hdl.handle.net/20.500.12034/2099
  • Persistent Identifier
    https://doi.org/10.23668/psycharchives.2473
  • Language of content
    eng
    en_US
  • Dewey Decimal Classification number(s)
    150
  • DRO type
    conferenceObject
    en_US
  • Leibniz institute name(s) / abbreviation(s)
    ZPID
  • Visible tag(s)
    ZPID Conferences and Workshops