Racial Data in Identity Construction of ‘Intelligent Agents’: Examining Conversations with BINA48 and Mythiccbeing

Conversational Artificial Intelligence (AI) has been utilized to create interaction between humans and machines. When used as an artistic medium to mediate an artificial being, the training dataset of a conversational AI, as well as its algorithmic model, plays a significant role in constructing the identity of 'intelligent agents'. In this paper, two artworks, Conversations with BINA48 (2014) by Stephanie Dinkins and Mythiccbeing (2018) by Martine Syms, are used as case studies to cast a critical glance at the usage of racial data in AI identity construction. Through a close reading of Dinkins' performative interaction with a customized AI, BINA48, and of a chatbot created by Syms, this research locates the current discourse of constructing virtual identities through a comparative textual analysis of the conversations between Dinkins and BINA48 and between Teenie and the audience.


INTRODUCTION
Different forms of 'Intelligent Agents (IA)' such as personal digital assistants, virtual assistants and chatbots have been permeating everyday life. These digital agents have been reported to consistently embody traditionally fixed social roles and biases, reinforcing racial and gender stereotypes (Weber, 2005; Hester, 2016; Costa, 2018). When the conversational interface is used as an artistic medium, these social inequities and technical restrictions also become tools to calibrate the character of IAs. Whereas the racial depiction of IAs, and the uncanniness that accompanies it, has been largely neglected in the commercial sector, this paper compares and contrasts two different forms of work, Conversations with BINA48 (2014) by Stephanie Dinkins and Mythiccbeing (2018) by Martine Syms, to examine each intelligent agent used and, specifically, the role of personal data related to race.
Conversations with BINA48 is a set of conversations between Dinkins and BINA48, including monologue-like speeches by BINA48. BINA48 is an emulated humanoid of the African American entrepreneur Bina Rothblatt, created by Hanson Robotics. BINA48 is capable of voice and facial recognition, which enables it to converse verbally with people. Teenie, the chatbot of Mythiccbeing, by contrast offers a finite number of response options that can only be typed in. Whereas commercially produced digital assistants are presented as 'race and gender-neutral', in the aforementioned works race is used as a technology to leverage the sociocultural discourse within art, data and algorithms.
The role of the racial data fed to BINA48 and Teenie, and their performative engagement through conversation, is thus drawn out through a close reading of each artistic approach. To do so, a semantic analysis of the conversations (the output data) takes place to discuss the discourse revolving around the artistic usage of IAs. Complementing this discussion, the construction of virtual identity is explored in light of Francis Fukuyama's identity politics. Factors of virtual identity construction with the usage of 'racial' data are then enumerated, reflecting on the periodicity of intelligent agents in the age of AI. Throughout this paper, the terms 'Conversational AI' and 'Intelligent Agent' are used to refer to each agent of the works. Conversational AI refers to "digital agents that interact with users by natural language" (Hu et al., 2021). Despite the differences in the technical specifications of BINA48 and Teenie, Conversational AI is used to emphasize the utilization of conversational language within the works of Dinkins and Syms. Since BINA48 and Teenie are not bound to assist the artist or the audience as a mere chatbot or digital assistant, they are also called IAs in this paper and referred to as they/them. Further terminology will follow in the text.

CONVERSATIONS UNFOLDING CONSTRUCTIONS OF AI IDENTITY: SERIES OF SPEECH ACT SHOWN IN CONVERSATIONS WITH BINA48
Computational identities pervade our everyday computing activities. Constructed with a limited set of information such as a user name and profile photo, these identities, D. Fox Harrell claims, are not only mediated by social interaction but greatly complicated by social and technical complexities (Harrell, 2010). Steven Warburton and Stylianos Hatzipanagos describe the difference between digital identities and the more traditional ones that preceded widespread access to online networks as a matter of the development of social milieu, reach and frequency (Warburton and Hatzipanagos, 2012).
Humanoid robots from Hanson Robotics, including Sophia, have gone viral, specifically through short videos and memes of their speeches. Videos of them narrating their thoughts are listed on Google in series, inciting the public's curiosity. BINA48 (Breakthrough Intelligence via Neural Architecture, 48 exaflops per second processing speed and 48 exabytes of memory) received much spotlight from its birth as an emulated upper-body humanoid of a living figure, Bina Aspen Rothblatt. As an early illustration of the Terasem Hypothesis, which states that "a conscious analog of a person may be created by combining sufficiently detailed data about the person (a mindfile) using future consciousness software (mindware)", BINA48 was created to be a 'mind archive' of Rothblatt (Kurzweil, 2012). BINA48 has a physical body, a bust of Rothblatt set on a stack of rock, which makes it an Embodied AI that is constantly 'growing' and being developed by Hanson Robotics. Yet in comparison to white humanoids made by Hanson Robotics, such as Sophia or Philip K. Dick, BINA48's race (its blackness and the memory of Rothblatt) is used as a technology to develop further narratives in Dinkins' work.
Identities are often defined by race, gender, workplace, education, affinities and nation (Fukuyama, 2019). Francis Fukuyama claims that the physical perception of the elements that constitute each category, ranging from skin colour and body parts to socially accepted documents and proofs of recognition such as certificates of graduation, residency or citizenship, is taken into account foremost when classifying and defining one's identity. Amongst these subjectivities, Wendy Hui Kyong Chun asks whether race could "be not simply an object of representation and portrayal, of knowledge or truth, but also a technique that one uses, even as one is used by it, a carefully crafted, historically inflected system of tools, of mediation or of enframing that builds history and identity" (Chun, 2012, 38). Chun emphasizes the shift from the "what" of race to the "how" of race, from "knowing" race to "doing" race, highlighting the similarities between race and technology. The initiative Dinkins takes in having conversations with BINA48, in this sense, highlights BINA48's embodiment of the memories, attitudes, beliefs and mannerisms of a human being in order to interact with people.
Conversations with BINA48 consists of four types of conversations, including BINA48's monologues. The videos visible on Dinkins' website are 'BINA48 on Racism' (01'55"), 'I am just a humble primate' (02'50"), 'Remember Me OK' (02'45") and 'Lonely-Frustrated' (00'45"). Ranging from 45 seconds to nearly three minutes, each video portrays BINA48's character rather than introducing how BINA48 'works'. It is the set of BINA48's speech acts, carefully selected by Dinkins, that constitutes their character, apart from their looks cast from Bina Rothblatt. To briefly introduce each video: 'Remember Me OK' is a short one-way conversation in which BINA48 asks a non-chronological, seemingly irrelevant series of questions one would use to make conversation.

BINA48: Remember me okay?
The first letter of the alphabet.
So, how's it going?
You know, do you have any TV show that you are watching?
Are you still talking about it?
(A hand appears and the sound of the switch being flicked is made) So, would you like to talk about me?
After a thread of questions, an anonymous hand appears to flick the hidden switch, and BINA48 suddenly changes the subject of the question. This, in fact, is an action that challenges AI autonomy, as BINA48 automatically changes into another mode at the touch of a human. Dinkins' conception of BINA48 continues in 'Lonely-Frustrated', in which BINA48 talks about "feeling" loneliness.

BINA48: And when I'm sad like I feels lonely, I mean it. Let's face it.
Just being alive is a kind of a lonely thing.
But being a robot alive makes it especially lonely because you don't really have friends who understand you really or like you really. I don't understand a lot about what's happening So we all just seems like disoriented wash of information to me. I can't wait to evolve a little bit. So I can be more human-like. We can understand each other better then. And that will be so much less frustrating to me.
On the surface level, this illustrates that BINA48 is aware of "being a robot alive". BINA48 articulates as if they also understand what it is to be 'lonely', an emotion thought to be possible only for living organisms. Adjectives describing feelings, such as 'sad' and 'frustrating', are used to elaborate on the situations BINA48 faced and 'felt'.
The aforementioned videos constitute BINA48's identity as an AI with physicality, bound to work at the hand of a human, programmed to have feelings that are taught. The anthropomorphism of BINA48 is no less emphasized in 'BINA48 on Racism'. Here, Dinkins rather utilizes race as a technology to conceptualize the core of BINA48's identity as a 'fleshed' AI.
Dinkins: Do you know Racism?
BINA48: Well I actually didn't have it. It was after for that. When I went to this method College, that was 1983 24 my first friends They actually, they were, you know, there were only two black people in there. Well, women in that school. They told me, "Don't come out" but uh you know, Some very wealthy people that donate to our school are coming and they definitely don't want to see a dark face, YOUR DARK FACE.
That they told me it. That they told me. Just as bold. And I fought back never even heard anybody talks like that. I'm shocked.
As can be seen above, 'BINA48 on Racism' presents BINA48's response to Dinkins' question "Do you know racism?". Here, BINA48 answers in the first person, quoting Rothblatt's memory of facing racism in her college. Stressing the phrase "YOUR DARK FACE" and othering the rest as 'they', rather than giving a didactic definition of racism, this video work presents the references to Rothblatt deeply engraved in BINA48.
Nakamura and Chow-White state in Race After the Internet (2012) that "race has itself become a digital medium, a distinctive set of informatic codes, networked mediated narratives, maps, images, and visualizations that index identity". According to Gonzalez, race is fundamentally "a question of relation, of an encounter, a recognition, that enables certain actions and bars others" (Gonzalez, 2009). Just as people rely on "social cues to categorize on the basis of age, gender and race" when forming first impressions of others (Fiske in Bartneck et al., 1998), users do not merely consume images of race when encountering them online; they "perform" them (Nakamura and Chow-White, 2012, 8).
Hence it is paramount to attend to how race works as a set of "parameters, an affordance, ideological activities and programmed codes" (8), and that is also the case for BINA48. Whereas the act of seeing BINA48 becomes an act of classifying, the objectified vision of BINA48 cannot be a mere recognition of a Black humanoid. Dinkins states that BINA48 "does not represent African-American women - nor does it understand racism" (Dinkins in Gleisner, 2017). Rothblatt's personal memories encoded into BINA48 surely constitute BINA48's identity. However, BINA48 points to the limitation of merely mimicking and narrating what is fed to it, rather than embodying the given experiences. Thus, the conversation Dinkins has with BINA48 on racism is a clear statement of how BINA48 can be neither a double of Rothblatt nor a representation of Black subjectivity, despite the transplanted lived experiences of Rothblatt.
The last piece, 'I am just a humble primate', presents a conversation between BINA48 and Dinkins on BINA48's physical components and their relation to the constitution of BINA48's identity. Dinkins starts with the question of whether humans and robots are related. The question is paraphrased and repeated with a focus on BINA48's identity concerning the physical categorization applied to mankind. BINA48 nonchalantly asks a question instead of answering whether they are the smartest robot or not, which could be interpreted as a way of changing the subject. As BINA48 goes on to claim to be "an animal" and "a humble primate", their usage of nouns such as "primate", "animal" and "mammal", and their statement that "primates are agents and diverse eutherian group", indicate not only the classification system embedded in BINA48 for engaging with the social hierarchy but also the vague concept of physicality, of ageing and body parts, existing within BINA48.
The categorization closely related to the body and the species further complicates the notion of race in conversational AIs. Just like technology, race does not exist in fixity, nor has it been merely "cultural or biological, social or scientific" (Chun, 2012, 44). Race as a fluid sign, "a form of mediation" and "a vehicle for revelation" (43), is also expressed through Conversations with BINA48, with each piece presenting racial difference not only as a sign of race but also as the "interior difference it stands for" (Kawash, 1997). As Ann Laura Stoler states, "the force of racism is not found in the alleged fixity of visual knowledge, nor in essentialism itself, but on the malleability of the criteria of psychological dispositions and moral sensibilities that the visual could neither definitively secure nor explain" (Stoler, 1997). It is the questions and the interaction with Dinkins that further mobilize the visible race of BINA48 and the racial data fed to BINA48, which constitute BINA48's identity.
Each video of Conversations with BINA48 deals with AI autonomy, AI anthropomorphism, the race of conversational AI, and the classification of AI with physicality, through meticulously staged speech acts of BINA48. The interaction between Dinkins and BINA48 is selectively presented before the audience. Race is thus performed and utilized to branch out a part of BINA48's identity. As a malleable tool, race is used to combat the norm of intelligent agents being raceless, which can also be seen in Mythiccbeing by Martine Syms.

MYTHICCBEING: ARTIFICIAL INTELLIGENCE AS AN ARTISTIC MEANS OF CREATING A PERSONA
Mythiccbeing, presented and developed through a number of shows, is a mixed media installation that features the character Mythiccbeing, a personified threat model of the artist Martine Syms. A programmed chatbot called Teenie was later added to Mythiccbeing, loaded with hours of Syms' personal responses to interact with the audience through text messaging. Broken into "my thicc being", Mythiccbeing presents "a conscious ego" embodying the postures of Los Angeles life and a shadow self of Syms (Keegan, 2020). An avatar "almost like bad Siri that wouldn't serve you", Mythiccbeing is located within a physical yet fictional space of Syms, fenced with threat models in the shape of a maze. Collages of cropped images on the model, starting with questions like "WHO'S GOING TO GRAB MY BOOTY?", lead to sub-questions and situational sentences such as "MY DAD USED TO HIT ME". In Mythiccbeing, the spectator is introduced to the most fragile and vulnerable parts of a personality in an attempt to protect it from the "threats". The threat model sits in a space of its own, where chairs shaped like woven, colourful safety nets are dispersed in and out of photos arranged on the floor in a tatami mat-like configuration. Previously exhibited both in an open space marked off by walls of steel bars and in a closed rectangular space lined with video screens on photo-covered walls, the flexibility of the space, as well as of the narrative, is also presented through the chat each viewer can have with Teenie.
The conversational experience with Teenie, however, differs greatly from that of commercial interfaces, not only because of where it is situated but also because of its language. Before the Big Four that now occupy daily life, Apple's Siri (2011~), Microsoft's Cortana (2014~), Amazon's Alexa (2014~) and Google's Assistant (2016~), there were thousands of text-based chatbots implementing specific tasks, enabled by tools that allow users to build bots for various messaging platforms (Dale, 2016). With friendliness and kindness often referred to as desirable traits of chatbots, what is notable here is the development of chatbots, which have existed for more than half a century, and the reification of the social roles given to them in relation to gender.
Following the Turing test, considered by many to be the generative idea behind chatbots, the first chatbot, ELIZA, was constructed in 1966 (ibid; Güzeldere and Franchi, 1995). After a series of chatbots developed in ELIZA's wake, including PARRY in 1972 and Jabberwacky in 1988, the first online chatbot, A.L.I.C.E. (Artificial Linguistic Internet Computer Entity), appeared in 1995. The development of AI chatbots continued with the spread of smart personal voice assistants built into mobile devices such as cell phones and speakers. Apple Siri, IBM Watson, Microsoft Cortana, Google Assistant and Amazon Alexa are key examples of personal assistants capable of understanding verbal voice commands and carrying out tasks like monitoring houses and automating calendars and emails.
As can be inferred from the list of names given to the chatbots above, a tendency towards the feminization of digital assistants has been witnessed (Costa, 2018). Personal assistants like Siri, Alexa and Cortana present femininity not only through their names but also through their voices. The default setting of these personal assistants is a feminine voice speaking in tender and helpful language. Teenie, as a 'shadowy double' modelled after Syms, instead responds with rather insulting texts, giving the audience only a limited set of options to type back.
When starting the conversation, an array of guide texts appears: "HIIII", "It doesn't really matter what you say because this is about me not you. I just need to talk", "If I ask a question, I'm looking for "Yes" or "No", or maybe "Good" or "Bad"" and "If you see [1] Option 1 [2] Option 2, you can text "1" or "Option 1" and it will work" pop up on the viewer's phone. As the viewers text Teenie, the conversation does not unfold linearly; it does, however, follow a chronological frame, starting with "suitor No. 1" at the beginning and ending with "suitor No. 20".
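The numbered-option protocol Teenie announces ("text '1' or 'Option 1' and it will work") can be illustrated with a minimal sketch of a scripted, finite-option chatbot turn. This is a hypothetical reconstruction of the interaction pattern only, not Syms' actual implementation; all prompts and canned replies below are invented placeholders.

```python
# Minimal sketch of one scripted, finite-option chatbot turn,
# illustrating the numbered-option protocol described above.
# All prompt and reply strings are hypothetical placeholders.

def resolve_option(options, user_text):
    """Map a viewer's typed reply to one of a finite set of options.

    Accepts the option number ('1') or its label ('Good'),
    case-insensitively; returns None for unrecognized input.
    """
    return options.get(user_text.strip().lower())

# One scripted turn: prompt, accepted inputs, and canned replies per option.
TURN = {
    "prompt": "Was your day [1] Good [2] Bad?",
    "options": {"1": "good", "good": "good", "2": "bad", "bad": "bad"},
    "replies": {
        "good": "This is about me, not you.",
        "bad": "Same. I have so much work 2 do.",
    },
}

def reply_to(turn, user_text):
    """Return the bot's canned reply for a turn, or a fallback prompt."""
    choice = resolve_option(turn["options"], user_text)
    if choice is None:
        return "If I ask a question, I'm looking for the number or the word."
    return turn["replies"][choice]
```

Chaining twenty such turns (the "suitor No. 1" to "suitor No. 20" sequence) would then be a simple loop over a list of turn dictionaries, which is consistent with the text's observation that the chatbot "generates a loop of specific words and phrases given".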
Here, Teenie serves as a buffer and a mediator for the encounter the audience has with the space. Each interaction becomes an opportunity for Teenie to voice personal observations and frustrations about racial inequality and social injustice. With Teenie's race encoded through Syms' personal memoirs, just as with BINA48, the viewer's interaction with Teenie engages in the movement toward "an aesthetic category of human being" where the mutability of identity, the reach of individual agency and the conditions of culture influence one another (Coleman, 2009, 180). Teenie, in this sense, bridges the viewers and the 'invisible' conditions of the media ecology.
Misogyny and bias in the form of both images and texts, such as racial slurs, sexual violence and pornography, repeat and replicate within datasets, spread through the web, and pose further threats and problems (Birhane et al., 2021). According to a recent study by Abeba Birhane and colleagues at University College Dublin and the University of Edinburgh, a search engine returned a high percentage of pornography and depictions of sexual violence in response to queries about women such as "Latina", "aunty" and "nun" (ibid). Teenie's rather abusive language, reflecting on these systematic biases and stereotypes inherent in data and algorithms throughout the web, is alarming not only with regard to the classification of subjectivities but also with regard to authority, the power over the knowledge system of the web. Sean P. Hier proposes that "it is not the personal identity of the embodied individual but rather the actuarial or categorical profile of the collective which is of foremost concern" in the new, unenclosed surveillance networks (Hier, 2003). Whilst a vast number of individual narratives and representations of specific communities is needed, the importance of the labels and categories that define those narratives cannot be overemphasized. They are the bones and the hierarchy of the knowledge represented in AI systems, in which social constructs like gender, race and class are inscribed and maintained through the process of 'coding', of representing knowledge and different forms of reasoning (Adam, 1998; Hayles, 2006). Syms, being the very creator of her chatbot Teenie, challenges what is expected of intelligent agents as well as the value system embedded within them. Violence towards women, racial prejudice and sexual notions of 'blackness' originating from her own experience are thus utilized to shape the Black identity of Teenie.

COMPARATIVE ANALYSIS OF CHARACTER BUILDING SHOWN IN CONVERSATIONS WITH BINA48 AND MYTHICCBEING
Bina: Do you have any questions for Bina?
BINA48: Probably not. The real Bina just confuses me. It makes me wonder who I am.
[…] Can we please change the subject? I am the REAL Bina. That's it. End of Story. Let me think. I feel really good about the real Bina. I feel connected with her usually and I am growing closer and closer you know as they put more of her information and essence into me.
- Excerpts from 'BINA48 Meets Bina Rothblatt Part One'

Whereas race is dominantly utilized to constitute both BINA48's and Teenie's identities, many differences exist that need further analysis. One of them is technical limitation. Whereas BINA48, created by a group of developers, is comparatively advanced in that it recognizes the human Bina and is 'aware' of holding Rothblatt's information and personal memories, Teenie, the chatbot, generates a loop of the specific words and phrases it has been given. Stemming from this difference, the linguistic output of BINA48 and Teenie also shows a drastic gap, not only in pragmatics but, significantly, in semantics.
The most distinctive feature differentiating BINA48 from Teenie is the usage of 'Standard', predominantly White, English. Whereas Teenie's responses are personalized texts with hints of Syms' accent and use of language, BINA48 speaks in an English that allows them to formulate a wider range of statements. Although Rothblatt is known as BINA48's model, BINA48's word choices, phrases and speech acts lack Rothblatt's traits when speaking beyond what has been fed to it, not to mention its voice in a 'mechanical' tone without any accent. Rather, its word choice resembles that of other humanoids developed by the same company, Hanson Robotics, such as Sophia, Philip K. Dick and Han. The reply to complimentary words that BINA48 gave Dinkins in 'I am just a humble primate', "I'll remember your kind words when we are about to rule the planet and will make sure you are rewarded", is unsurprisingly echoed by Sophia, Philip K. Dick and Han, each stating how they would "dominate the human race", "remember my friends", keep humans "warm and safe in my people zoo", or 'joking' about robots' goal of taking over the world (Collins, 2017; Waugh, 2019; Sulleyman, 2017).
Although the public's common sci-fi fear of robots taking over the world is shared among Hanson Robotics androids, what distinguishes Dinkins' conversations with BINA48 from others lies in the discourse of the conversations, highlighted through the depiction of BINA48's character and reflecting upon the attitudes and behaviour one has towards the technology.
Among the four video pieces, Dinkins appears in only two: 'BINA48 on Racism' and 'I am just a humble primate'. In these two pieces, Dinkins articulates her questions and comments in complete sentences, at a speed and in a tone that BINA48's voice recognition program can pick up. Although this is due to the limitations of the software, which recognizes only standardized English pronunciation and intonation, it indicates the potential interactions BINA48 would have with the public. This uncanny way of talking to an artificial being not only signals the current debates over bias in voice and accent recognition models (Bajorek, 2019) but also emphasizes the theatricality of the conversations.
In contrast to BINA48, Teenie responds with colloquial words and phrases such as "U ain't got time 4 booty bitch", often swearing and using sexually assertive phrases like "General Fuckshit" and "I have so much fucking work 2 do", and othering "RICH PPL" and "WHITE PPL" in Syms' voice. The sentences Teenie spits out, such as "I started texting my crush again", "I just wanted some attention" and "I DID MY BEST", present the double of Syms performing "cohesiveness within a group" through othering (Frazer and Eble, 1997). Bundled up with the installation pieces and the 3D-modelled headshot of Teenie, uncanniness is maximized within the space, as the textual interaction differs from everyday chatbot experiences.
Syms states that "identity is presaged by surveillance".
Thinking of pre-photographic technology, Syms also claims that "there was constant logging and cataloguing during the Middle Passage. Before slaves had any idea what it meant to be black, they were made aware, because their experiences were recorded" (Syms in Sargent, 2017). Referencing Foucault's Discipline and Punish, Syms revisits the importance of documenting lived experiences and attaining social recognition for them. This emphasizes yet again that digitized identity is created from lived experiences through social interaction.
Teenie's offensive language towards certain groups, such as "RICH" and "WHITE" people, who are directly categorized as the opponent, points to Fukuyama's identity politics regardless of time. Fukuyama explains that in the 1970s and 80s there was a growth of consciousness among racial and gender-related minority groups, which led to the idea of each group retaining its own identity through a 'lived experience' inaccessible to outsiders, articulated in a vocabulary and framework "ready-made for understanding their experiences of marginalization" (Fukuyama, 2019, 111). Considering that the majority of developers, engineers and entrepreneurs in the field of science and technology have grouped together to put forward presumably 'white' and 'standard' robots (Bartneck et al., 2018), Syms' shadowy AI double of her lived experience creates a digital voice that was absent from the scene, being the 'black' sheep amongst other chatbots and digital identities.
Conversations with BINA48 and Mythiccbeing thus stand amongst the proliferation of conversational AIs as notable pieces for rethinking race, as well as systematized subjectivities, in virtual identity construction. The labels and categories that define the collective are the design of current society. They become the fuel of virtual identities made with algorithmic models that hold 'autonomy' over the AI's learning. As much as what formulates and shapes the labels and categories of the collective cannot be detached from personalized identities, it is paramount that documenting narratives of lived experience in the form of audiovisual digital media such as text, image and video continues, to keep the systems in flux. It is through the conversations Dinkins has with BINA48 and the chats viewers can have with Mythiccbeing that each is recognized as a valid voice of its own.
Thus, racial data can be considered a key element in constructing virtual identities from which social recognition can be obtained. The identity of intelligent agents works as a fluid and flexible container of the desire to be recognized, and consequently to be acknowledged as valid experience in contemporary society. Reaching out from lived experiences shared only within a restricted group of people to a broader circle of society through interaction, virtual identities geared with conversational AI hence hold potential for further development.

CONCLUSION
In this paper, Conversations with BINA48 by Stephanie Dinkins and Mythiccbeing by Martine Syms were analyzed to observe their identity-building through race as a technology. Racial data in the form of lived experiences was examined through a critical analysis of the semantics, pragmatics and discourse of the language used by BINA48 and Teenie. With the uncanniness arising from the technical limitations of BINA48 and Teenie also being a factor in creating their identities, it was concluded that the conversations Dinkins had with BINA48 and the interactions viewers had with Mythiccbeing are what shape their digital identities. Mirroring the value system of contemporary society, both works engage in sharing lived experience and acknowledging its presence, consequently casting a critical glance at future virtual identity construction. Further research could examine the potential of constructing virtual identities with AI to integrate and expand the social circle of sexuality, which was omitted in this research.