
      Web Versus Other Survey Modes: An Updated and Extended Meta-Analysis Comparing Response Rates

      Journal of Survey Statistics and Methodology
      Oxford University Press (OUP)


          Abstract

          Do web surveys still yield lower response rates compared with other survey modes? To answer this question, we replicated and extended a meta-analysis done in 2008 which found that, based on 45 experimental comparisons, web surveys had an 11 percentage points lower response rate compared with other survey modes. Fundamental changes in internet accessibility and use since the publication of the original meta-analysis would suggest that people’s propensity to participate in web surveys has changed considerably in the meantime. However, in our replication and extension study, which comprised 114 experimental comparisons between web and other survey modes, we found almost no change: web surveys still yielded lower response rates than other modes (a difference of 12 percentage points in response rates). Furthermore, we found that prenotifications, the sample recruitment strategy, the survey’s solicitation mode, the type of target population, the number of contact attempts, and the country in which the survey was conducted moderated the magnitude of the response rate differences. These findings have substantial implications for web survey methodology and operations.
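To make the quantity being meta-analyzed concrete, the following is a minimal sketch of how a pooled web-minus-other response-rate difference could be estimated with a generic DerSimonian-Laird random-effects model. The study values below are hypothetical placeholders, and this is not the authors' actual analysis pipeline; it only illustrates the kind of pooling that produces a figure such as the 12-percentage-point difference reported above.

# Hypothetical sketch of a random-effects meta-analysis of response-rate
# differences (web mode minus comparison mode). Data are placeholders.
import math

# (web_rate, web_n, other_rate, other_n) for a few hypothetical comparisons
studies = [
    (0.25, 800, 0.38, 800),
    (0.31, 500, 0.40, 520),
    (0.18, 1200, 0.33, 1150),
]

effects, variances = [], []
for p_web, n_web, p_other, n_other in studies:
    diff = p_web - p_other                       # difference in response rates
    var = (p_web * (1 - p_web) / n_web
           + p_other * (1 - p_other) / n_other)  # variance of that difference
    effects.append(diff)
    variances.append(var)

# Fixed-effect (inverse-variance) estimate and Cochran's Q
w_fixed = [1 / v for v in variances]
fixed = sum(w * d for w, d in zip(w_fixed, effects)) / sum(w_fixed)
Q = sum(w * (d - fixed) ** 2 for w, d in zip(w_fixed, effects))
df = len(studies) - 1

# DerSimonian-Laird between-study variance tau^2
c = sum(w_fixed) - sum(w ** 2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (Q - df) / c)

# Random-effects pooled difference and its standard error
w_rand = [1 / (v + tau2) for v in variances]
pooled = sum(w * d for w, d in zip(w_rand, effects)) / sum(w_rand)
se = math.sqrt(1 / sum(w_rand))

print(f"Pooled web-minus-other response-rate difference: {pooled:.3f} (SE {se:.3f})")
print(f"Q = {Q:.2f} on {df} df, tau^2 = {tau2:.4f}")

With real data, each entry would correspond to one of the 114 experimental mode comparisons, and the moderators named in the abstract (prenotification, recruitment strategy, solicitation mode, target population, contact attempts, country) would be examined in a meta-regression rather than with a single pooled estimate.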


          Most cited references (85)


          Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement.


            Quantifying heterogeneity in a meta-analysis.

            The extent of heterogeneity in a meta-analysis partly determines the difficulty in drawing overall conclusions. This extent may be measured by estimating a between-study variance, but interpretation is then specific to a particular treatment effect metric. A test for the existence of heterogeneity exists, but depends on the number of studies in the meta-analysis. We develop measures of the impact of heterogeneity on a meta-analysis, from mathematical criteria, that are independent of the number of studies and the treatment effect metric. We derive and propose three suitable statistics: H is the square root of the χ² heterogeneity statistic divided by its degrees of freedom; R is the ratio of the standard error of the underlying mean from a random effects meta-analysis to the standard error of a fixed effect meta-analytic estimate, and I² is a transformation of H that describes the proportion of total variation in study estimates that is due to heterogeneity. We discuss interpretation, interval estimates and other properties of these measures and examine them in five example data sets showing different amounts of heterogeneity. We conclude that H and I², which can usually be calculated for published meta-analyses, are particularly useful summaries of the impact of heterogeneity. One or both should be presented in published meta-analyses in preference to the test for heterogeneity. Copyright 2002 John Wiley & Sons, Ltd.
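For reference, the three measures described in that abstract can be written compactly. This is a sketch in standard notation, where Q is the χ² heterogeneity statistic on k − 1 degrees of freedom for k studies and μ̂ denotes the pooled mean effect; the notation is assumed here rather than quoted from the abstract.

\[
H = \sqrt{\frac{Q}{k-1}}, \qquad
R = \frac{\mathrm{SE}\!\left(\hat{\mu}_{\mathrm{random}}\right)}{\mathrm{SE}\!\left(\hat{\mu}_{\mathrm{fixed}}\right)}, \qquad
I^{2} = \frac{H^{2}-1}{H^{2}} = \frac{Q-(k-1)}{Q},
\]
with \(I^{2}\) truncated at zero when \(Q < k-1\).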

              Conducting Meta-Analyses in R with the metafor Package


                Author and article information

                Journal
                Journal of Survey Statistics and Methodology
                Oxford University Press (OUP)
                2325-0984
                2325-0992
                June 2020
                June 01 2020
                May 13 2019
                Volume: 8
                Issue: 3
                Pages: 513-539
                Affiliations
                [1 ]GESIS, Leibniz Institute for the Social Sciences, B2, 1, 68159 Mannheim, Germany
                [2 ]Professor with the ZPID, Leibniz Institute for Psychology Information, Universitätsring 15, 54296 Trier, Germany
                [3 ]Associate Professor with the Faculty of Social Sciences, University of Ljubljana, Kardeljeva ploščad 5, 1000 Ljubljana, Slovenia
                Article
                10.1093/jssam/smz008
                887d52f8-180a-4ccd-8f99-c0dd7d5dc64b
                © 2019

                https://academic.oup.com/journals/pages/open_access/funder_policies/chorus/standard_publication_model
