      Open Access

      Algorithmic Injustices: Towards a Relational Ethics

      Preprint


          Abstract

          It has become trivial to point out how decision-making processes in various social, political and economic spheres are assisted by automated systems. Improved efficiency, the hallmark of these systems, drives the mass-scale integration of automated systems into daily life. However, as a robust body of research in the area of algorithmic injustice shows, algorithmic tools embed and perpetuate societal and historical biases and injustice. In particular, a persistent recurring trend within the literature indicates that society's most vulnerable are disproportionately impacted. When algorithmic injustice and bias are brought to the fore, most of the solutions on offer 1) revolve around technical solutions and 2) do not centre disproportionately impacted groups. This paper zooms out and draws the bigger picture. It 1) argues that concerns surrounding algorithmic decision making and algorithmic injustice require fundamental rethinking above and beyond technical solutions, and 2) outlines a way forward that centres vulnerable groups through the lens of relational ethics.


          Author and article information

          Date: 16 December 2019
          arXiv: 1912.07376
          License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/

          Presented at the Black in AI workshop, @NeurIPS2019
          Subject: cs.CY (Computers and Society)

          Applied computer science
