Evaluating Mapping Designs for Conveying Data Using Auditory and Tactile Displays

One of the most fundamental interactions between a computing system and its user(s) is the transmission of information between the two. A common method for this transmission is parameter mapping: mapping data parameters to sensory parameters. This research focuses on non-visual information transmission, including auditory, vibrotactile and electrotactile stimulation, and on understanding what users expect different values of data to sound and feel like. These expectations can rarely be established a priori; the aim of this research is therefore to fill this gap and begin to provide a framework for designing parameter mappings for a variety of data values conveyed through a variety of non-visual parameters, such that these mappings are congruent with users' expectations of how the data should be represented.


INTRODUCTION
One of the most fundamental interactions between a computing system and its user(s) is the transmission of information between the two. Information is often conveyed using an abstract representation rather than a direct one; for example, a secure internet connection is represented by a green padlock symbol rather than a message literally explaining the connection's safety. Abstract representations have several advantages: they are independent of language, they may contain more information than literal representations, and they can be implemented in different modalities such as visual, auditory or tactile.
One strategy for conveying information using an abstract representation is parameter mapping: using some sensory parameter to represent a data value. Visual mappings are the most common, but there are many situations where non-visual displays are preferable or necessary. Sonification, "the use of non-speech audio to convey information" (Kramer et al. 1999), and Tactons, "structured, abstract, [vibro]tactile messages which can be used to communicate information" (Brewster and Brown 2004), are two of the most common non-visual parameter-mapped representations of information.
An important factor when designing parameter mapping interfaces is how the user expects a certain value of data or piece of information to be displayed through whichever modality is being used in the mapping. For example, artificial and natural alarm sounds, such as a fire alarm or a person screaming, are typically rough or disharmonious (Arnal et al. 2015), so it is reasonable to assume that users may expect a sonification relating to some value of danger to sound rougher as the value increases. In these types of displays, user expectations are rarely considered and even more rarely investigated, resulting in interaction designs that may conflict with how a user expects data to sound, feel or look (depending on the modality in use).
This work investigates how to map data parameters to sensory parameters such that the mapping is congruent with how a user expects the data value to be conveyed to them. It has a particular focus on non-visual information transmission, including auditory, vibrotactile and electrotactile stimulation. Understanding how users expect various data values to sound or feel is not possible a priori, so the goal of this research is to fill this gap and begin to provide a framework for designing parameter mappings for a variety of data values conveyed through a variety of sensory parameters. This paper discusses the background and motivation of the work, the progress made towards these goals in the first 18 months of the project, and the plan for the remaining 18 months.

CONTEXT AND BACKGROUND
In a parameter mapping sonification system, data values are used to manipulate acoustic parameters, such as frequency (pitch) or tempo, which facilitate the communication of the data value (Hermann et al. 2011). Many of these data:sound mappings fail to account for the listener's mental model of how the data value should sound during sonification, a mental model being defined as "a representation of some domain or situation that supports understanding, reasoning, and prediction" (Gentner 2001). This deficit may result in auditory displays that are incongruent with how a user expects a particular data value to sound.
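To make the idea concrete, a minimal sketch of a parameter-mapped sonification is a function that linearly maps a data value onto an acoustic range. The function name, value ranges and default frequency range below are illustrative assumptions, not part of any system described in this paper:

```python
def map_to_pitch(value, vmin, vmax, fmin=220.0, fmax=880.0, positive_polarity=True):
    """Linearly map a data value in [vmin, vmax] onto a frequency range in Hz.

    With positive polarity, higher data values produce higher pitch;
    negative polarity inverts the mapping.
    """
    norm = (value - vmin) / (vmax - vmin)  # normalise to 0..1
    if not positive_polarity:
        norm = 1.0 - norm
    return fmin + norm * (fmax - fmin)

# e.g. sonifying a temperature reading between 0 and 40 degrees:
print(map_to_pitch(20.0, 0.0, 40.0))  # mid-range value -> 550.0 Hz
```

The returned frequency would then drive a synthesiser or tone generator; the mapping function itself is where designer choices about polarity and scale are encoded.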
Early in the development of the field of sonification, one of its first proponents discussed some general principles for designing auditory displays (Kramer 1994). This work established a number of perceptual factors that may be practically implemented in a sonification system, including affective associations and metaphorical associations. A metaphorical association can be as simple as louder = more: a larger object generally makes a louder sound, so a user may perceive increasing loudness of a sonified value as indicating an increase in that value. Affective associations were described as "the association of feelings about data (if such feelings exist) with feelings aroused by changes in the sound". An example was given in the context of an ecologist: to such a researcher, data indicating an increase in rain acidity would generally be undesirable, and may therefore cause a subtle negative affect. A sonification mapping utilising this affect could be described as: an increase in "auditory ugliness" = an increase in an undesirable data variable.
Although there are sporadic studies investigating the information transmission capacity and accuracy of various data:sound pairings (Pollack 1954; Bly 1982), Walker and Kramer's study, originally presented in 1996 and published in 2005 (Walker & Kramer 2005), was the first to directly investigate the importance of mappings between data and sound parameters in sonification. They used a number of common acoustic parameters such as pitch, onset, loudness and tempo to convey common data variables like temperature, pressure, size and rate in a simple process-monitoring task. They found that the mappings the sound designers believed to be optimal, e.g. temperature:pitch, resulted in neither the most accurate responses nor the fastest response times. However, using slower acoustic changes such as onset to represent size improved performance over other size mappings. A post hoc explanation posited by the authors was that larger objects in physical space move more slowly (i.e. due to inertia). The authors suggest that users' mental models of data:sound mappings, such as the association between large objects and slow movement, affect how a mapping is perceived.
Walker continued this work, applying magnitude estimation, a common experimental method in psychophysics, to assess which acoustic parameter may be most effective for a given data value (Walker 2002, 2007). These studies established a polarity consensus and a scale for each mapping: polarity being the direction the user perceives (e.g. increase in pitch = increase in temperature) and scale being how much perceived increase or decrease in the data value corresponds to a given increase or decrease in the acoustic parameter. Polarity was used as a predictor of the "naturalness" of a mapping: if a given polarity obtained a majority of all responses by participants in a block, it was predicted to be a "good" polarity choice, and the mapping itself could therefore be predicted to be effective. These works provide a solid basis for the study of data:sound mappings, as they exemplify both the complex nature of users' perceptions of mappings (which are not always perceived as the designer expects) and the importance of the user's mental model in this perception. However, these studies remain introductory, as they examine a small number of data:sound mappings in which the acoustic and data variables are quite simple. More complex data:sound mappings must be investigated in order to provide designers with a wider palette of acoustic parameters to choose from when designing sonification systems that are accurate and comfortable.

Similarly to auditory displays, vibrotactile displays have been utilised in human-computer interaction for a number of years. In many cases, vibrotactile feedback is simply used to attract attention (e.g. to notify users of a text message or incoming call), or to give interaction feedback (such as confirming a key press on a touchscreen device). However, vibration has a number of dynamic properties, such as frequency, amplitude, rhythm and location on the body, that can be structured into messages, or Tactons, which allow more complex information to be encoded in vibration. The advantage of Tactons is that they not only notify the user of information but also convey it. In some contexts this may be all that is required, or it may provide some information and the user may choose to engage another modality to access more. Work investigating the potential of Tactons for affective feedback has shown they can be a useful tool to convey emotion during communication (Yoo 2015; Wilson 2017; Seifi 2013), but there has been no attempt to evaluate data:vibration mappings for use in Tactons.
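Walker's polarity-consensus criterion, as described above, can be sketched in a few lines: take each participant's (stimulus magnitude, magnitude estimate) pairs, classify the participant's polarity by the sign of the least-squares slope, and check whether one polarity obtains a majority. This is an illustrative reconstruction of the criterion, not Walker's actual analysis code, and the function names are invented:

```python
def polarity(responses):
    """Sign of the least-squares slope of (stimulus, estimate) pairs: +1, -1 or 0."""
    n = len(responses)
    mean_x = sum(x for x, _ in responses) / n
    mean_y = sum(y for _, y in responses) / n
    # Only the numerator of the slope is needed, since the denominator is positive.
    num = sum((x - mean_x) * (y - mean_y) for x, y in responses)
    return (num > 0) - (num < 0)

def polarity_consensus(all_participants):
    """Majority polarity across participants; 0 means no consensus,
    i.e. the mapping is predicted to be a poor choice."""
    votes = [polarity(r) for r in all_participants]
    if votes.count(1) > len(votes) / 2:
        return 1
    if votes.count(-1) > len(votes) / 2:
        return -1
    return 0

# Two participants hear rising pitch as rising temperature, one as falling:
increasing = [(100, 10), (200, 25), (300, 45)]
decreasing = [(100, 45), (200, 25), (300, 10)]
print(polarity_consensus([increasing, increasing, decreasing]))  # -> 1
```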

RESEARCH APPROACH
Based on this previous work, the focus of this research is to further investigate parameter mappings in sonification, expanding to more complex acoustic parameters and data variables than were previously investigated. With current technology it is much easier and less resource intensive to provide rich auditory feedback from devices, so it is necessary to study how this feedback can be used to convey data effectively through sound. Furthermore, this project aims to apply the methods of evaluating parameter mapping designs to modalities which have the capacity to convey information in a parameter-mapped design but have been less studied, such as vibrotactile and electrotactile displays. As with auditory displays, complex vibrotactile technology beyond a simple rumble is increasingly present in phones such as the iPhone and in videogame and virtual reality controllers such as the Nintendo Switch and HTC Vive, but there is a deficit in research that can guide designers in displaying information using the rich vibrotactile feedback these devices offer. The approach in this work is to use the methods evaluated by Walker (Walker 2002, 2007) to broaden the space of data:sound mappings that have been researched, as well as applying these methods to vibrotactile and electrotactile stimulation to create a set of data:tactile mappings.

COMPLETED WORK
Preliminary Study: Evaluation of Psychoacoustic Sound Parameters for Sonification (Ferguson & Brewster 2017)
This study evaluated data:sound mappings based on psychoacoustic sensations (rather than the simpler pitch or tempo mappings used in previous works), in an attempt to move towards data:sound mappings that are aligned with the listener's perception of the data value's auditory connotations. The acoustic parameters used were roughness, noise and sharpness, with pitch as a comparison to previous studies. Roughness, noise and sharpness were chosen based on the task in this study, which was to assess the quality of an astronomical image using sound. The study was a collaboration with Yerkes Observatory, who were working with visually impaired students to create non-visual astronomy tools. We chose roughness, noise and sharpness because they are often indicators of quality across the senses ("a sharp image", "rough sounding"), so participants would potentially relate these parameters more strongly to the quality of an image than pitch. The task was simply to grade the quality of an image using a sonification of the amount of blur present in the image. A second version of this study added a visual condition, as the observatory wished to investigate whether the tool could be combined for sighted and visually impaired students. The results showed that using noise to convey the quality of an image was effective and performed close to the visual equivalent (within 7%), suggesting that participants associated acoustic noise with reduced visual quality.

Study 1: Investigating Perceptual Congruence between Data and Display Dimensions in Sonification (Ferguson & Brewster 2018)
As the preliminary study indicated that psychoacoustic sound parameters have potential for use in sonification, a new study was designed based on Walker's evaluated magnitude estimation method (Walker 2002), in which these parameters were mapped to a variety of common data variables and the mappings evaluated. The acoustic parameters studied were roughness, noise and pitch (again to compare to previous work), and the data variables, or concepts, were stress, error and danger. These data concepts were chosen because of the predicted association between musically "undesirable" parameters such as roughness and noise and "undesirable" data qualities like stress, error and danger. Polarity results from the study were as predicted: increasing the roughness of a sound increased the perceived magnitude of stress, error and danger, with similar results for the noise condition.

Study 2: Evaluating Mapping Designs for Conveying Data through Tactons
This study was similar to the previous one, but focused on vibrotactile stimuli. The vibrotactile parameters used were frequency, roughness, tempo and duration, and the data concepts were accuracy, error, danger, size, distance, current and stress. These were chosen to ensure a mix of somewhat polarising variables (i.e. accuracy is desirable, error is not) and more general variables like size or distance. Key results from this study show that duration and tempo are not mapping dependent, meaning that for all mappings the majority of polarities were positive: increasing tempo or duration resulted in increased perceived magnitude of the data concept.

FUTURE WORK
Moving forward from the work already completed, one of the next areas of this research is to investigate parameter mappings of electrotactile stimulation, in a similar fashion to the vibrotactile study. The rationale for studying electrotactile stimulation is that it is rarely used in HCI, particularly in an information transmission capacity, although it has a history of use in assistive technology for prosthesis users. The aim is therefore to investigate a number of mappings in this space, to establish whether electrotactile stimulation can be an additional non-visual information transmission modality alongside auditory and vibrotactile displays.
Finally, to round off the project, a more ecologically valid experiment will be designed in which the lab-based results from the project so far can be validated. The current approach is to design a study in which users carry out a task using information provided by auditory, vibrotactile and/or electrotactile stimulation without a priori knowledge of what the parameters represent, so that the true effectiveness of the mapping can be established.
An example scenario could be a mid-air gesture task in which the participant must gesture to a point in 2D space, with their movement tracked by a Leap controller. The parameter being tested would increase or decrease as they move closer to or further from the target. In this example, if acoustic roughness decreased as the user approached the target, this would evaluate a roughness:accuracy mapping, where accuracy is how close the user is to the target point. The ease or difficulty of finding the target point with no instruction may provide an ecologically valid evaluation of these data:sound mappings. This is an initial plan to further solidify the project's thesis, and feedback from the community will be valuable in shaping this study.
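The core of such a scenario is a function that turns the hand's distance from the target into the acoustic parameter under test. The sketch below is a hypothetical illustration of the roughness:accuracy example, with invented names and an assumed clamped linear mapping; the real study design may differ:

```python
import math

def roughness_for_distance(pos, target, max_dist, r_min=0.0, r_max=1.0):
    """Map 2D distance-to-target onto a roughness control value in [r_min, r_max].

    Roughness decreases as the hand approaches the target, i.e. a negative
    polarity with respect to accuracy (closer = more accurate = smoother).
    """
    d = math.dist(pos, target)
    norm = min(d / max_dist, 1.0)  # clamp to the tested interaction range
    return r_min + norm * (r_max - r_min)

# At the target the sound is smooth; at the edge of the range it is fully rough.
print(roughness_for_distance((0.0, 0.0), (0.0, 0.0), 1.0))  # -> 0.0
print(roughness_for_distance((1.0, 0.0), (0.0, 0.0), 1.0))  # -> 1.0
```

In an experiment this value would be polled each frame from the tracker and fed to the roughness parameter of the synthesis engine, with no explanation given to the participant.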

EXPECTED CONTRIBUTIONS
Rich auditory and haptic experiences are becoming increasingly common in consumer technologies such as phones, wearables, videogame controllers and virtual reality systems. As the hardware in these systems affords more complex auditory and haptic parameters to be controlled and manipulated, designers working with these systems require guidance on how to design effective interactions, so that information can be conveyed to the user effectively. The goal of this research is to provide data on a number of potential data:sensory parameter mappings in the auditory, vibrotactile and electrotactile modalities, to guide system designers working with similar variables on how to design interactions that are easily understood by a user. Whether it is a videogame designer trying to convey an in-game damage value through a VR controller's haptic feedback, or the level of danger being conveyed through sound during an alarm in a process-monitoring system, the goal of this research is to provide a framework which designers can use to guide their design of non-visual information transmission scenarios such as these.