

      Racial Data in Identity Construction of ‘Intelligent Agents’: Examining Conversations with BINA48 and Mythiccbeing

      Published
      proceedings-article
      Proceedings of Politics of the Machines - Rogue Research 2021 (POM 2021)
      debate and devise concepts and practices that seek to critically question and unravel novel modes of science
      September 14-17, 2021
      Artificial Intelligence, Conversational AI, Humanoids, Chatbots, Virtual Identity, Race, Gender, Bias

            Abstract

            Conversational Artificial Intelligence (AI) has been utilized to create interactions between humans and machines. When conversational AI is used as an artistic medium to mediate an artificial being, its training dataset, as well as its algorithmic model, plays a significant role in constructing the identity of ‘intelligent agents’. In this paper, two artworks, Conversation with BINA48 (2014) by Stephanie Dinkins and Mythiccbeing (2018) by Martine Syms, are used as case studies for a critical look at the use of racial data in AI identity construction. Through a close reading of Dinkins’ performative interaction with BINA48, a customized AI, and of Teenie, the chatbot created by Syms, this research locates the current discourse on constructing virtual identities through a comparative textual analysis of the conversations between BINA48 and Dinkins and between Teenie and the audience.
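            To make the abstract's premise concrete, the following is a minimal, hypothetical sketch (not part of the paper) of a retrieval-based chatbot in Python: every reply is drawn verbatim from a small training corpus, so whatever voice, vocabulary, and assumptions are encoded in that corpus become the only ‘identity’ the agent can express. The corpus contents, function names, and example prompts are illustrative assumptions, not material from the artworks discussed.

# Minimal sketch (hypothetical, not from the paper): a toy retrieval-based
# chatbot whose every reply comes verbatim from its training corpus,
# illustrating how the dataset shapes the "identity" a conversational agent
# can perform.

from collections import Counter

# Hypothetical (prompt, response) pairs the agent was "trained" on; the agent
# can only ever speak in the voice encoded here.
CORPUS = [
    ("who are you", "I am a digital person built from recorded interviews."),
    ("where are you from", "My memories come from the person whose data I was trained on."),
    ("what do you think about identity", "I can only repeat what my training data says about it."),
]

def _tokens(text: str) -> Counter:
    # Bag-of-words representation: lowercase whitespace tokens with counts.
    return Counter(text.lower().split())

def reply(user_utterance: str) -> str:
    """Return the stored response whose prompt shares the most words with the input."""
    user = _tokens(user_utterance)
    scores = []
    for prompt, response in CORPUS:
        overlap = sum((user & _tokens(prompt)).values())
        scores.append((overlap, response))
    # max() returns the first maximal element, so with zero overlap the agent
    # falls back to the first stored response.
    best = max(scores, key=lambda pair: pair[0])
    return best[1]

if __name__ == "__main__":
    print(reply("Who are you?"))
    print(reply("Tell me where you are from"))

            Swapping in a different corpus changes the agent's apparent identity without touching the retrieval logic at all, which is the dynamic between dataset and constructed identity that the two case studies examine.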


            Author and article information

            Contributors
            Conference
            September 2021
            Pages: 14–22
            Affiliations
            [0001] Media Arts Cultures EMJMD Program, Aalborg University, Denmark
            Article
            DOI: 10.14236/ewic/POM2021.2
            ID: 7c247e86-2a40-42ce-a5be-d37937211476
            © Lee. Published by BCS Learning & Development Ltd. Proceedings of Politics of the Machines - Rogue Research 2021, Berlin, Germany

            This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

            Proceedings of Politics of the Machines - Rogue Research 2021
            POM 2021
            3
            Berlin, Germany
            September 14-17, 2021
            Electronic Workshops in Computing (eWiC)
            debate and devise concepts and practices that seek to critically question and unravel novel modes of science
            History
            Product
            ISSN: 1477-9358, BCS Learning & Development

            Self URI (article page): https://www.scienceopen.com/hosted-document?doi=10.14236/ewic/POM2021.2
            Self URI (journal page): https://ewic.bcs.org/
            Categories
            Electronic Workshops in Computing

            Applied computer science, Computer science, Security & Cryptology, Graphics & Multimedia design, General computer science, Human-computer interaction
            Humanoids, Race, Chatbots, Conversational AI, Artificial Intelligence, Bias, Gender, Virtual Identity

