


      On the Cruelty of Computational Reasoning

      proceedings-article
      Politics of the Machines - Art and After (EVA Copenhagen)
      Digital arts and culture
      15 - 17 May 2018
      Human Rights, Artificial Intelligence, Big Data, Society

            Abstract

            We seek, firstly, to demonstrate the cruelty of current computational reasoning artefacts when applied to decision making in human social systems. We see this cruelty as unintended and not a direct expression of the motives and values of those creating or using algorithms, but, in our view, a certain consequence nevertheless. Secondly, we seek to identify the key aspects of some exemplars of AI products and services that demonstrate these properties and consequences, and to relate these to the form of reasoning that they embody. Thirdly, we show how the reasoning strategies developed and now increasingly deployed by computer and data science have necessary, special and damaging qualities in the social world. Finally, we briefly note how the narrative underpinning the creation and use of AI and other tools lends them power in neoliberal economies, creating a disempowered data ‘subject’ in an inferior, economically and politically supine position from which they must defend themselves if they can.


            Author and article information

            Contributors
            Conference
            May 2018
            Pages: 1-6
            Affiliations
            [1] ETIC Lab, Newtown, UK
            [2] ETIC Lab, Milton Keynes, UK
            Article
            10.14236/ewic/EVAC18.3
            © Hogan et al. Published by BCS Learning and Development Ltd. Proceedings of EVA Copenhagen 2018, Denmark

            This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

            Politics of the Machines - Art and After
            EVA Copenhagen
            7
            Aalborg University, Copenhagen, Denmark
            15 - 17 May 2018
            Electronic Workshops in Computing (eWiC)
            Digital arts and culture
            History
            Product

            1477-9358 BCS Learning & Development

            Self URI (article page): https://www.scienceopen.com/hosted-document?doi=10.14236/ewic/EVAC18.3
            Self URI (journal page): https://ewic.bcs.org/
            Categories
            Electronic Workshops in Computing

            Applied computer science, Computer science, Security & Cryptology, Graphics & Multimedia design, General computer science, Human-computer interaction
            Human Rights, Artificial Intelligence, Big Data, Society

