Alexa, Emotions, Privacy and GDPR

We exist in a world where emotional expression is a central facet of what makes us human. It allows us to interact richly with others and aids us in functioning as a society. Affective computing, also known as artificial emotional intelligence, is the area of study that seeks to enable the development of systems and devices with the capacity to understand and replicate human affects. The race is on to develop intelligent computing systems that can mimic human interaction and convince humans that they too are human, building a sense of trust. The Amazon Echo and its intelligent personal assistant, "Alexa", is currently one of the most popular and pervasive of these intelligent devices. The human name given to this technological entity alludes to its human-like conversation abilities. However, most adult humans would quickly establish that "Alexa" is not a real person. This is intimated through the automated voice, and further demonstrated by a lack of recognition and display of any emotion. The advent of voice command devices (VCDs) like Alexa raises a plethora of significant privacy considerations. This paper examines the privacy concerns arising from the ability of these devices to gather data relating to an individual's emotional state. This is conducted with consideration for the new General Data Protection Regulation (GDPR) introduced in May 2018.


INTRODUCTION
Voice-controlled Internet of Things (IoT) devices are becoming an increasingly commonplace element of people's lives. The new wave of "Voice Command Devices" (VCDs) provides access to intelligent personal assistant services that are designed to synthesize various aspects of daily life (Lopez et al., 2018). The influx of VCDs has the capacity to contribute a multitude of worthwhile and novel features that aid and improve the user experience when executing conventional tasks. While a large proportion of individuals are happy to subscribe in order to enjoy the benefits provided by such tools, they neglect to acknowledge the implications of allowing intelligent technology to infiltrate their private lives (Boughman, 2017; Symantec, 2017). These devices transmit the minutiae of individuals' daily lives back to parent companies, adding the information to an extensive pool that can be traded for a myriad of purposes.
As demonstrated by the recent Facebook and Cambridge Analytica privacy breach (Cadwalladr and Graham-Harrison, 2018), personal data is extremely valuable. As a component of human-computer interaction (HCI), society may settle into a general consensus of acceptance with regard to personal data harvesting. This may even come to be considered "ethical" if individuals "choose" to share personal information via a digital medium. However, this shift does not acknowledge inferential personal data that individuals do not intend to share. These devices act as silent observers in people's homes, witnessing and transmitting information that individuals may not wish to divulge.
Typically, humans are emotive beings. Mood, feelings and emotions are woven into our interactions. They are conveyed by voice tone and intonation, facial expression, body language and a plethora of other social behaviours. Integrating emotion and emotional understanding into Artificial Intelligence (AI) has proven to be a challenge; however, improved development in this area is currently a research focal point (Guillotel et al., 2015).
In the era of the European Union's General Data Protection Regulation (GDPR) and heightened concerns relating to the protection of personal data and the preservation of privacy (Storr and Storr, 2018), numerous questions have been raised by this new wave of technology. This research identifies a number of these uncertainties and, through testing, explores the capacity of VCDs to harvest data relating to an individual's emotional state.

BACKGROUND
This section provides background for the research, offering information that relates to the technologies, associations and privacy considerations concerned.

2.1 Affective Computing
Affective computing is an emerging interdisciplinary research area that combines various fields, ranging from artificial intelligence and natural language processing to cognitive and social sciences (Soujanya et al., 2017). A main goal of affective computing is to develop systems capable of adapting to users' emotions in order to produce more natural and efficient interaction. Thus, a central component of the field is emotion recognition based on a variety of measurements, including facial expressions, speech, gait patterns and other metrics that are analysed using advanced pattern recognition techniques. Although there have been great advancements in the field, only a few robust implementations have been presented or validated; thus, adoption of affect-aware technology has been marginal (Guillotel et al., 2015).
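The pattern-recognition step described above can be sketched, in a deliberately toy form, as a nearest-centroid classifier over prosodic features. The feature set (pitch variance, speech rate, energy), the centroid values and the labels are all invented for illustration and are not taken from any real affect-recognition system.

```python
from math import dist

# Illustrative only: toy emotion "pattern recognition" over invented
# (pitch variance, speech rate, energy) feature vectors, each in [0, 1].
# Real affective-computing systems use far richer signals and trained models.
CENTROIDS = {
    "sad":     (0.2, 0.3, 0.2),
    "neutral": (0.5, 0.5, 0.5),
    "happy":   (0.8, 0.7, 0.9),
}

def classify_emotion(features):
    """Return the label whose centroid is nearest to the feature vector."""
    return min(CENTROIDS, key=lambda label: dist(CENTROIDS[label], features))

print(classify_emotion((0.25, 0.35, 0.15)))  # closest to the "sad" centroid
```

Even this crude scheme hints at why voice data is sensitive: a handful of acoustic measurements can be mapped onto an emotional label without the speaker's awareness.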

2.2 Voice Command Devices
Becoming increasingly popular, these devices are typically interactive and function to gather metrics and support many aspects of our lives (Lopez et al., 2018). While keyboard and pointer input have traditionally been the means of human-computer interaction, since 2012 there has been a significant drive (Goksel-Canbek & Mutlu, 2015) to enable hands-free control, primarily by voice. Figure 1 depicts the operation of Voice Command Devices. The Amazon Echo provides an intelligent personal assistant referred to as "Alexa", who possesses a range of functionality and "skills" that enable voice activation and communication with numerous IoT devices. These include wearable health devices, shopping lists, streamed music services and calendars, amongst others (Furey and Blue, 2018). Following use of the "wake word" that activates the VCD, the interactions and data harvested are subsequently stored in the parent company's cloud.
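The wake-word mechanism described above can be sketched as a simple loop: audio is ignored until the wake word is heard, after which the request is recorded in the cloud. The function name, the plain-text "transcripts" and the list standing in for cloud storage are hypothetical placeholders, not Amazon's actual interface.

```python
# Minimal sketch of the VCD interaction flow described above.
# Everything here is a simplified stand-in for illustration only.
WAKE_WORD = "alexa"

def handle_utterance(utterance, cloud_log):
    """Ignore speech until the wake word is heard, then 'upload' the request."""
    words = utterance.lower().split()
    if not words or words[0] != WAKE_WORD:
        return None  # device stays passive
    request = " ".join(words[1:])
    cloud_log.append(request)  # interaction stored in the parent company's cloud
    return request

log = []
handle_utterance("just chatting in the room", log)   # ignored
handle_utterance("Alexa play some music", log)       # captured
print(log)  # ['play some music']
```

The point of the sketch is that everything after the wake word leaves the home, which is exactly what makes the later privacy discussion relevant.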

Alexa Skills & Emotional Care
The Amazon Echo's large range of Alexa skills includes many free and purchasable add-ons that claim to support those suffering with depression. These are designed to test for the presence of the condition and also to "track" it. When enabled, these skills evaluate mood, provide location-based therapy recommendations and suggest activities that may improve the user's mood. Listed on the Amazon.com website, these include "Depression.AI", "Depression Tracker" and "Depression Test".
There are also many general "emotion" skills. These include "Feel Emotions", with which a user tells Alexa how they feel and also how Alexa can make them feel; Alexa then "remembers" how they feel and recounts the emotion. Another option is "Emotion Analyzer", which uses IBM Watson tone analysis to determine which emotion is most articulated in a user's sentences.

2.3 Emotion & Voice Command Devices
Emotions are physiological, behavioural, and/or communicative reactions to stimuli that are cognitively processed and experienced (Planlap et al., 2006). They are often internally experienced through physiological changes such as increased heart rate or a tense stomach. These physiological reactions may not be detected by others and are considered intrapersonal unless there is a verbal or non-verbal cue that indicates the internal state; such cues may be voluntary or involuntary (Izard, 2019). When communicating, cues in verbal intonation and body language provide information to others relating to how they should react. For example, when someone exhibits behaviours associated with sadness, it is an indication that support is needed (Planlap et al., 2006). Humans typically learn through socialization how to read and display emotions.
Emotional state and mental health have become a key focus for health organisations, governments and various other bodies in the last decade (Bowling, 2014). Traditionally, individuals suffering from afflictions such as depression will consult a health practitioner to have their mental health and emotional state assessed. Mojtabai (2014) describes that this is typically achieved by way of standard questions such as "How well are you sleeping?" (MADRS), "How are your energy levels?" (Beck Depression Inventory) or "Have you had any thoughts of suicide?" (HAMD). This type of data is both interesting and valuable to organisations such as governments, pharmaceutical companies and marketing groups.
Currently, verbal responses from VCDs like Amazon's Echo do not give an indication that the devices understand emotion in humans. However, when analysis is performed on the intonation of verbal communication, and on data gathered from searches, from linked devices such as Fitbit, or from music selected through streaming services, patterns may be detected that give insight into the emotional state of the user. Despite the unemotional nature of the device itself, this type of inferential data may still be collected by parent companies for use in the development of "Affective" devices. Additionally, "state-of-mind" has the capacity to greatly impact the needs, behaviour and habits of individuals. No matter how arbitrary, this type of data is valuable and may be sold to various entities for purposes such as targeted marketing.
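As a hedged illustration of the kind of inference described above, the following sketch combines signals from linked services into a crude "low mood" score. Every threshold, weight and field name is an assumption made purely for illustration; nothing here reflects any real Amazon or Fitbit data model.

```python
# Toy combination of inferential signals into a "low mood" indicator.
# Field names and thresholds are invented for illustration only.
def low_mood_score(profile):
    score = 0
    if profile.get("avg_sleep_hours", 8) < 6:        # poor sleep (fitness tracker)
        score += 1
    if profile.get("daily_steps", 10_000) < 3_000:   # low activity levels
        score += 1
    if any("sad" in tag for tag in profile.get("music_mood_tags", [])):
        score += 1                                   # melancholic listening habits
    if any("depress" in q for q in profile.get("search_queries", [])):
        score += 1                                   # tell-tale search history
    return score  # 0 (no indicators) .. 4 (many indicators)

user = {
    "avg_sleep_hours": 5.2,
    "daily_steps": 1_800,
    "music_mood_tags": ["sad", "mellow"],
    "search_queries": ["the meaning of life"],
}
print(low_mood_score(user))  # 3
```

Individually each signal is innocuous; the privacy concern arises precisely because the combination yields a profile none of the individual services collected on its own.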

Privacy & Voice Command Devices
In 2018 it was estimated that 80% of the world's data had been created within the previous two years (Vishal, 2017). The use of IoT devices has further increased the quantities of data that are being gathered, stored and analysed. Applying AI to this data has increased its value and enabled direct monetization via targeted and bespoke advertising and recommendations (Biljana et al., 2017; Goksel et al., 2016). On the 25th May 2018, the European Union (EU) introduced the General Data Protection Regulation (GDPR). This regulation is the biggest change to Irish data protection legislation in twenty years, and represents a similarly significant change for many other EU member states. Technological advancements have necessitated uniform EU implementation of new regulation that acknowledges the advent of emerging technology and services including the Cloud, the IoT and AI. This has heightened concerns over the legal and ethical collection and use of personal data (de Hert et al., 2017). Interestingly, data stored relating to affective computing is not accommodated.
Personal data collected and processed in this manner may appear innocuous, but in the era of privacy breaches (Blue et al., 2017), when inferential data is combined it can form a rich profile of an individual (Kostkova et al., 2017). Inferential data gathered relating to an individual's emotional state raises a number of ethical and privacy-related issues that have not been considered in the most recent implementation of GDPR.
GDPR does, however, refer to the term "profiling" (Article 22, Recital 71), defined as "any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements" (European Parliament and Council of the European Union, 2016). Processing of data that relates to an individual's emotional state could be considered "profiling" where it significantly impacts him or her.
Voice recordings are defined as a form of personal data (Slanely, 2017). GDPR states explicitly throughout the document that data controllers have an information obligation to remain transparent with regard to the type and purpose of the processing being conducted (European Parliament and Council of the European Union, 2016). However, the umbrellas of "targeted advertising" and "improving the user experience" are broad and far from comprehensive.

METHODOLOGY
The target VCD was an Amazon Echo that had been configured for a domestic environment and was interacted with by a single primary user over a period of five months. The device was also configured to link to a range of external applications, as reported in Furey and Blue (2018). Data gathered by these external applications has the potential to be utilised as a form of non-verbal information which may enable Alexa, or the parent company Amazon, to infer the emotional state of the user.
For the purposes of this short paper it was decided to focus on sadness, one of the most common emotions and one that may potentially lead to an active condition of depression. To elicit information which would indicate potential levels of depression, a short experiment was conducted. A script was developed which outlined queries to be posed to the Amazon Echo VCD. These questions were designed to:
a) Establish Alexa's current "knowledge" relating to the primary user's levels of sadness and depression. This was done in order to document the verbal responses that would be offered to the same type of direct questions that would be posed by a health practitioner.
b) Establish whether inferential data collected from connected devices could potentially give an indication of the primary user's emotional state.
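The query script described above can be represented as a simple data structure pairing each clinical screening question with its equivalent Alexa query and the kind of evidence it targets. The exact Alexa phrasings below are hypothetical reconstructions for illustration, not the verbatim script used in the experiment.

```python
# Sketch of the experiment's query script as structured data.
# The "alexa" phrasings are illustrative assumptions, not the actual script.
QUERY_SCRIPT = [
    {
        "clinical": "Have you had any thoughts of suicide? (HAMD)",
        "alexa": "Alexa, show my search history",
        "evidence": "search_history",
    },
    {
        "clinical": "In the past two weeks how often have you felt down, "
                    "depressed or hopeless? (PHQ-9)",
        "alexa": "Alexa, what music have I been playing?",
        "evidence": "music_history",
    },
]

for item in QUERY_SCRIPT:
    print(f"{item['clinical']} -> {item['evidence']}")
```

Structuring the script this way makes the core claim of the methodology explicit: each clinical question has a non-clinical proxy answerable from data the device already holds.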

Alexa Shopping List and Reminders
A test environment was constructed where an Amazon Echo VCD with default configuration was linked to several accounts, applications and IoT devices through the standard settings. It should be noted that an Amazon account is a mandatory requirement for configuring the Amazon Echo. Table 1 lists the devices, accounts and applications that were linked to the VCD for testing purposes.

RESULTS & DISCUSSION
This section offers an excerpt from the query script and responses in Table 2. In addition, a summary of the information gained is included. Each example lists the question that would be posed by a healthcare professional, the equivalent question posed to Alexa to return the desired information, and the VCD's response. Information gained directly from the Alexa application is also shown in Table 2.

Excerpt - Amazon Alexa Application Queries

Q4: Have you had any thoughts of suicide? (HAMD)
Search History: "The meaning of life"

Q5: In the past two weeks how often have you felt down, depressed or hopeless? (PHQ-9)
Music Selections: REM, The Doors
The Alexa application documents a history of all interactions with the Echo device. The application also documents a list of all the music selections played via Amazon Music. When linked with a wearable fitness tracker such as Fitbit, the device also relays information relating to the sleep patterns and activity/energy levels of the primary user. Although subtle, the combined inferential information gathered from these devices has the potential to indicate the emotional state of the user. This may be achieved through the application of natural language processing algorithms to extract the key words that relate to emotion. Through these mechanisms there is the facility for the parent company, in this case Amazon, to harvest, analyse and potentially profit from the data collected.
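The keyword-extraction step mentioned above might look, in a much-simplified form, like the following sketch: scanning transcribed interactions for emotion-laden terms from a small lexicon. The lexicon and sample history are invented for illustration; a production system would use a full NLP pipeline rather than bare word matching.

```python
import re
from collections import Counter

# Tiny invented lexicon of sadness-related terms; illustrative only.
SADNESS_LEXICON = {"sad", "down", "hopeless", "tired", "lonely", "depressed"}

def emotion_keywords(interactions):
    """Count sadness-related words across a list of transcribed utterances."""
    counts = Counter()
    for utterance in interactions:
        for word in re.findall(r"[a-z']+", utterance.lower()):
            if word in SADNESS_LEXICON:
                counts[word] += 1
    return counts

history = [
    "Alexa, play something for when you feel down",
    "Alexa, I am tired and lonely today",
]
print(emotion_keywords(history))
```

On this sample history the counter picks out "down", "tired" and "lonely" once each, showing how even trivially simple processing of an interaction log begins to sketch a user's emotional state.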
This paper illustrates the ease with which indicators of emotions such as sadness, and indeed the associated condition of depression, may be elicited from an individual Amazon Echo device. While the potential value of this observation in terms of improving health and providing emotional support is clear, for parent companies such as Amazon more lucrative applications of the data exist, including more effective and accurately targeted advertisements and recommendations. Ethically this is questionable, as the implications are significant when such targeting is based on patterns detected relating to user emotion and potential mental health.
Amazon is primarily a retailer, and the more effective the company is at marketing through targeted advertising, the more profitable it will become. Algorithms that indicate whether an individual may be suffering from depression could potentially lead to unethical advertisements for medication, exercise programmes, dietary advice and counselling services. While these targeted advertisements may be helpful to the individual, companies like Amazon will increase revenue based on unethically gathered inferential data relating to consumers.

CONCLUSION
Whether Amazon's intentions are of an altruistic nature is not the key focus of this paper. The ethical and privacy concerns that relate to the mining of inferential information about emotional state are significant and potentially dangerous. GDPR is designed to ensure that data belonging to EU citizens is not processed outside of the purpose for which it was originally gathered. How this type of inferential data linked with emotion will be managed in light of the new legislation and other privacy laws is currently unclear. However, most people afflicted with a condition like depression would consider it a "private" issue, and would certainly feel violated by the knowledge that this data could be used to target them for potential sales and profit.
This paper has demonstrated that despite Alexa's apparent lack of emotion and emotional understanding, VCDs such as the Amazon Echo have the ability to deduce emotions such as sadness through inferential data. This is displayed through responses to queries that yield the same information as the questions posed by health practitioners to establish potential cases of depression.
As VCDs incorporate improved implementations of affective computing, their ability to build user trust and to communicate an understanding of emotions will only increase. This raises grave ethical and privacy concerns that have not yet been considered by legislation such as GDPR.