
      On the Cruelty of Computational Reasoning


      Politics of the Machines - Art and After (EVA Copenhagen)

      Digital arts and culture

      15 - 17 May 2018

Keywords: Human Rights, Artificial Intelligence, Big Data, Society


          Abstract

We seek, first, to demonstrate the cruelty of current computational reasoning artefacts when applied to decision making in human social systems. We see this cruelty as unintended, not a direct expression of the motives and values of those creating or using algorithms, but in our view it is a certain consequence nevertheless. Second, we identify the key aspects of some exemplar AI products and services that demonstrate these properties and consequences, and relate them to the form of reasoning they embody. Third, we show how the reasoning strategies developed, and now increasingly deployed, by computer and data science have necessary, special and damaging qualities in the social world. We briefly note how the narrative underpinning the creation and use of AI and other tools grants them power in neoliberal economies, creating a disempowered data ‘subject’ in an inferior, economically and politically supine position from which they must defend themselves if they can.


Author and article information

Conference: Politics of the Machines - Art and After (EVA Copenhagen), Aalborg University, Copenhagen, Denmark, 15–17 May 2018
Pages: 1–6
Affiliations:
[1] ETIC Lab, Newtown, UK
[2] ETIC Lab, Milton Keynes, UK
DOI: 10.14236/ewic/EVAC18.3
© Hogan et al. Published by BCS Learning and Development Ltd. Proceedings of EVA Copenhagen 2018, Denmark

This work is licensed under a Creative Commons Attribution 4.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

Series: Electronic Workshops in Computing (eWiC), ISSN 1477-9358, BCS Learning & Development
Journal page: https://ewic.bcs.org/
