      Is Open Access

      Citizens Versus the Internet: Confronting Digital Challenges With Cognitive Tools

      research-article


          Abstract

          The Internet has evolved into a ubiquitous and indispensable digital environment in which people communicate, seek information, and make decisions. Despite offering various benefits, online environments are also replete with smart, highly adaptive choice architectures designed primarily to maximize commercial interests, capture and sustain users' attention, monetize user data, and predict and influence future behavior. This online landscape holds multiple negative consequences for society, such as a decline in human autonomy, rising incivility in online conversation, the facilitation of political extremism, and the spread of disinformation. Benevolent choice architects working with regulators may curb the worst excesses of manipulative choice architectures, yet the strategic advantages, resources, and data remain with commercial players. One way to address some of this imbalance is with interventions that empower Internet users to gain some control over their digital environments, in part by boosting their information literacy and their cognitive resistance to manipulation. Our goal is to present a conceptual map of interventions that are based on insights from psychological science. We begin by systematically outlining how online and offline environments differ despite being increasingly inextricable. We then identify four major types of challenges that users encounter in online environments: persuasive and manipulative choice architectures, AI-assisted information architectures, false and misleading information, and distracting environments. Next, we turn to how psychological science can inform interventions to counteract these challenges of the digital world. After distinguishing among three types of behavioral and cognitive interventions—nudges, technocognition, and boosts—we focus on boosts, of which we identify two main groups: (a) those aimed at enhancing people's agency in their digital environments (e.g., self-nudging, deliberate ignorance) and (b) those aimed at boosting competencies of reasoning and resilience to manipulation (e.g., simple decision aids, inoculation). These cognitive tools are designed to foster the civility of online discourse and protect reason and human autonomy against manipulative choice architectures, attention-grabbing techniques, and the spread of false information.


                Author and article information

                Journal
                Psychological Science in the Public Interest (Psychol Sci Public Interest)
                SAGE Publications (Sage CA: Los Angeles, CA)
                ISSN: 1529-1006, 2160-0031
                16 December 2020 (issue: December 2020)
                Volume 21, Issue 3, pp. 103-156
                Affiliations
                [1] Center for Adaptive Rationality, Max Planck Institute for Human Development
                [2] School of Psychological Science, University of Bristol
                [3] School of Psychological Science, University of Western Australia
                Author notes
                [*] Anastasia Kozyreva, Center for Adaptive Rationality, Max Planck Institute for Human Development. E-mail: kozyreva@mpib-berlin.mpg.de
                Article
                DOI: 10.1177/1529100620946707
                PMCID: PMC7745618
                PMID: 33325331
                © The Author(s) 2020

                This article is distributed under the terms of the Creative Commons Attribution 4.0 License (https://creativecommons.org/licenses/by/4.0/), which permits any use, reproduction, and distribution of the work without further permission, provided the original work is attributed as specified on the SAGE and Open Access page (https://us.sagepub.com/en-us/nam/open-access-at-sage).


                Keywords: attention economy, behavioral policy, boosting, choice architecture, cognitive tools, decision aids, disinformation, false news, media literacy, nudging
