      Is Open Access

      The Misleading count: an identity-based intervention to counter partisan misinformation sharing

      Research article


          Abstract

          Interventions to counter misinformation are often less effective for polarizing content on social media platforms. We sought to overcome this limitation by testing an identity-based intervention, which aims to promote accuracy by incorporating normative cues directly into the social media user interface. Across three pre-registered experiments in the US (N = 1709) and UK (N = 804), we found that crowdsourcing accuracy judgements by adding a Misleading count (next to the Like count) reduced participants' reported likelihood to share inaccurate information about partisan issues by 25% (compared with a control condition). The Misleading count was also more effective when it reflected in-group norms (from fellow Democrats/Republicans) compared with the norms of general users, though this effect was absent in a less politically polarized context (UK). Moreover, the normative intervention was roughly five times as effective as another popular misinformation intervention (i.e. the accuracy nudge reduced sharing misinformation by 5%). Extreme partisanship did not undermine the effectiveness of the intervention. Our results suggest that identity-based interventions based on the science of social norms can be more effective than identity-neutral alternatives to counter partisan misinformation in politically polarized contexts (e.g. the US).

          This article is part of the theme issue ‘Social norm change: drivers and consequences’.

          Related collections

          Most cited references: 36


          The spread of true and false news online


            Fighting COVID-19 Misinformation on Social Media: Experimental Evidence for a Scalable Accuracy-Nudge Intervention

            Across two studies with more than 1,700 U.S. adults recruited online, we present evidence that people share false claims about COVID-19 partly because they simply fail to think sufficiently about whether or not the content is accurate when deciding what to share. In Study 1, participants were far worse at discerning between true and false content when deciding what they would share on social media relative to when they were asked directly about accuracy. Furthermore, greater cognitive reflection and science knowledge were associated with stronger discernment. In Study 2, we found that a simple accuracy reminder at the beginning of the study (i.e., judging the accuracy of a non-COVID-19-related headline) nearly tripled the level of truth discernment in participants’ subsequent sharing intentions. Our results, which mirror those found previously for political fake news, suggest that nudging people to think about accuracy is a simple way to improve choices about what to share on social media.

              Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning

              Why do people believe blatantly inaccurate news headlines ("fake news")? Do we use our reasoning abilities to convince ourselves that statements that align with our ideology are true, or does reasoning allow us to effectively differentiate fake from real regardless of political ideology? Here we test these competing accounts in two studies (total N = 3446 Mechanical Turk workers) by using the Cognitive Reflection Test (CRT) as a measure of the propensity to engage in analytical reasoning. We find that CRT performance is negatively correlated with the perceived accuracy of fake news, and positively correlated with the ability to discern fake news from real news - even for headlines that align with individuals' political ideology. Moreover, overall discernment was actually better for ideologically aligned headlines than for misaligned headlines. Finally, a headline-level analysis finds that CRT is negatively correlated with perceived accuracy of relatively implausible (primarily fake) headlines, and positively correlated with perceived accuracy of relatively plausible (primarily real) headlines. In contrast, the correlation between CRT and perceived accuracy is unrelated to how closely the headline aligns with the participant's ideology. Thus, we conclude that analytic thinking is used to assess the plausibility of headlines, regardless of whether the stories are consistent or inconsistent with one's political ideology. Our findings therefore suggest that susceptibility to fake news is driven more by lazy thinking than it is by partisan bias per se - a finding that opens potential avenues for fighting fake news.

                Author and article information

                Contributors
                Roles: Conceptualization, Formal analysis, Investigation, Methodology, Writing – original draft
                Roles: Methodology, Writing – review & editing
                Roles: Investigation, Methodology, Writing – review & editing
                Roles: Methodology, Writing – review & editing
                Roles: Methodology, Writing – review & editing
                Roles: Resources
                Roles: Conceptualization, Supervision, Writing – review & editing
                Journal
                Philosophical Transactions of the Royal Society B: Biological Sciences (Philos Trans R Soc Lond B Biol Sci)
                The Royal Society
                ISSN: 0962-8436; eISSN: 1471-2970
                March 11, 2024
                January 22, 2024
                Volume 379
                Issue 1897, Theme issue ‘Social norm change: drivers and consequences’ compiled and edited by Giulia Andrighetto, Sergey Gavrilets, Michele Gelfand, Ruth Mace and Eva Vriens
                Article 20230040
                Affiliations
                [1] Department of Psychobiology and Methodology of Health Sciences, Universitat Autònoma de Barcelona, 08193 Barcelona, Spain
                [2] Department of Psychiatry and Forensic Medicine, Universitat Autònoma de Barcelona, 08193 Barcelona, Spain
                [3] Center of Conflict Studies and Field Research, ARTIS International, St Michaels, MD 21663, USA
                [4] Department of Psychology and Center for Neural Science, New York University, New York, NY 10003, USA
                [5] Centre for the Politics of Feelings, School of Advanced Study, Royal Holloway, University of London, London WC1E 7HU, UK
                [6] Department of Psychology, Royal Holloway, University of London, Egham, Surrey TW20 0EX, UK
                Author notes

                One contribution of 15 to a theme issue ‘Social norm change: drivers and consequences’.

                Electronic supplementary material is available online at https://doi.org/10.6084/m9.figshare.c.6980743.

                Author information
                http://orcid.org/0000-0003-2172-1184
                http://orcid.org/0000-0001-7753-7576
                http://orcid.org/0000-0002-2520-0442
                Article
                rstb20230040
                DOI: 10.1098/rstb.2023.0040
                10799730
                38244594
                © 2024 The Authors.

                Published by the Royal Society under the terms of the Creative Commons Attribution License http://creativecommons.org/licenses/by/4.0/, which permits unrestricted use, provided the original author and source are credited.

                History
                : March 2, 2023
                : July 25, 2023
                Funding
                Funded by: NOMIS Stiftung, http://dx.doi.org/10.13039/501100008483;
                Funded by: HORIZON EUROPE European Innovation Council, http://dx.doi.org/10.13039/100018703;
                Award ID: FETPROACT-EIC-05-2019
                Categories
                Articles
                Research Articles

                Philosophy of science
                misinformation, social media, social norms, social identity, intervention
