Touch and Go: On the use of proprioception to convey a meaningful experience in virtual reality

Proprioception is the usually unconscious ability to monitor the position of our body in the world. Two aspects of proprioception may be of particular interest in the construction of virtual reality environments. The first, proprioceptive memory, occurs when a recalled position of our body overrides the actual position of our body or limb. The second, proprioceptive drift, occurs when the sense of ownership of part of our body is transferred externally. The abstract navigable virtual reality environment The Sonic Stage employs these two aspects of proprioception to enhance the place illusion and plausibility of the environment.


INTRODUCTION
Alice started to her feet, for it flashed across her mind that she had never before seen a rabbit with either a waistcoat-pocket, or a watch to take out of it, and burning with curiosity, she ran across the field after it, and was just in time to see it pop down a large rabbit-hole under the hedge. In another moment down went Alice after it, never once considering how in the world she was to get out again (Carroll 1865).

My current practice revolves around the translation of analogue experiences to digital equivalents. As I sit in the 'here' and 'now', there is a very clear notion in my mind of where my body is in space and time, and which things around me are parts of my body. The bodily function responsible for that awareness is proprioception. The role of proprioception is to stabilise the body and protect it from over-extension of muscles and joints, necessary for secure, efficient locomotion. In addition to this real-time reflexive function, I am able to recall past proprioceptive information.

Virtual Reality Environments (VREs) aim to create the illusion of being in the 'there' and 'then' and, if very ambitious, of being 'them' (being another). VREs are kinaesthetic and multi-sensory in nature, intended to mimic our interaction with the physical world with an ever-growing fidelity. Consequently, proprioception is key to the cognitive processes within a VRE as well.
Day to day, proprioception is as unconscious as breathing, and just as essential to keeping us safely mobile in the world. Yet under certain conditions, a displacement can occur between the actual position of our body in time and space and the proprioceptive image formed in our mind: we may think of our body as somewhere it no longer is, or perceive an element external to us to be part of our body.
By understanding these proprioceptive phenomena and applying them properly in a VRE, it is possible to enhance the quality of the VRE experience.

PROPRIOCEPTION AND ITS UNDERLYING MECHANISMS
Proprioception originates from sensors: neurons located in the muscle spindles at the interface between muscles and tendons, and at the joints. Most often, proprioception functions reflexively. When someone hands us an object, proprioceptor neurons sense the elongation of the muscles caused by the added weight in our hand, resulting in an adjustment of the muscle tension in our arm to support the additional weight. The response/action cycle is very short, because proprioceptive information feeds directly into the autonomic nervous system, the system responsible for controlling vital bodily functions that have no need for conscious intervention (Collet et al. 2013). As a result, we are generally unaware of the role proprioception plays in our interaction with the world (Jänig 2015).

PROPRIOCEPTIVE MEMORY
Even in the absence of visual cues, we can retrace our steps using proprioceptive recollection (Proske & Gandevia 2012). Habitual actions, such as opening a door, require little conscious attention: we can reach towards the handle without looking, because we remember the position our hand needs to be in. Humans are able to do this because we simultaneously hold in our minds two proprioceptive body models: a real-time and an offline model.
The real-time body model is composed of the three sources of input into the motor system: input from the sensors monitoring the body's internal workings (interoception), input from the environment surrounding the body, including its surface (exteroception), and input from sensors attached to skeletal muscles and joints (proprioception). The combination of these inputs to the motor system creates an 'online', real-time body map in our minds, describing where our body is in space, together with a sense of ownership: the assurance that we are moving our own limb even when it is obscured from view. This last important point will be further elaborated in the subsequent section. The 'offline' representation of the body is constructed from a combination of the real-time sensory inputs augmented by stored memories (Collet et al. 2013).
An extreme example of the discrepancy between a real-time body model and an offline, remembered body model is the medical condition referred to as 'Phantom Limb Syndrome'. A majority of patients with limb amputation at some point experience a sensation of their missing limb, most commonly of pain. Some describe an ability to voluntarily 'move' the missing limb, to wave or shake hands, and experience a sense of involuntary movement, as if the phantom hand spontaneously moved to a new position (Anderson-Barnes et al. 2009). Treatment of phantom limb pain consists of replacing the pre-amputation proprioceptive memories with new ones. Mirror box therapy entails placing the residual limb behind a mirror reflecting the existing limb. New memories are created of actions that replace those of past trauma (Zweighaft et al. 2012).

PROPRIOCEPTIVE DRIFT
Proprioceptive drift is a spatial displacement between the perceived and actual position of a part of the body. This can lead to an illusory sense of body ownership, where an external object appears to be part of the physical body and is incorporated into the proprioceptive body model (Tuthill & Azim 2018).
One example of this is the 'Rubber Hand Illusion'. In this often-repeated experiment, one hand of the subject is obscured from view and an artificial limb is placed close to it, within sight. The same physical action is applied to the obscured hand and the visible artificial limb, typically stroking with a soft brush. Faced with a contradiction between the proprioceptive information of the physical hand's location and the visual evidence of the position of the artificial limb, subjects will form an intermediate hand percept, locating it spatially between the actual locations of the real and artificial hands (Fuchs et al. 2016). The feeling of ownership of the artificial limb may become so strong that, when a threat to the artificial limb is perceived, a measurement of cortical activity indicates feelings of anxiety (Ehrsson et al. 2007).
Other experiments have shown an extension of the feeling of ownership from a single limb to the entire body, creating an illusion of an 'Out of Body Experience' (Petkova & Ehrsson 2008). In one experiment of this kind, participants wore immersive head mounted displays showing the view from cameras placed on the head of a mannequin, pointing downwards towards the mannequin's abdomen. When participants looked down towards their abdomen, they saw an image of the mannequin's abdomen instead of their own. Identical short rods were then used to simultaneously touch the abdomen of the participant and of the mannequin. When the mannequin's body was threatened with a knife, Skin Conductance Response measurement showed an anxiety response in the subjects. As with the rubber hand illusion, visual input overrides proprioceptive information to create a proprioceptive drift of the participants' entire body model to that of the mannequin.

APPLICATION IN WORKS OF VIRTUAL REALITY
Proprioceptive drift is an important consideration in the work of BeAnotherLab -"an interdisciplinary multinational group dedicated to understanding, communicating and expanding subjective experience" (The Lab -BeAnotherLab 2020). The team uses embodied simulation mechanisms to explore notions of identity and empathy.
One of their pieces is The Machine to be Another. De Olivera et al. (2016) describe it as an installation consisting of a head mounted display worn by the beholder, connected to a camera mounted on the body of a performer/storyteller. Both beholder and storyteller hold the same objects, which are props in a narrative told by the storyteller. The field of view of the camera changes to correspond with the head motion of the beholder, but translated to point to the hands of the performer. The performer mirrors the hand motions of the beholder while manipulating the same object. The combination of haptic and visual cues causes a proprioceptive drift from the body of the beholder towards the body of the performer.
In a case study based on this piece, Ainsley Sutherland (2016) at MIT describes her experience alternately in the roles of participant and performer. She reports feeling a "'proprioceptive transference', a sensation of disruption, or shock: I felt it as well, when I 'moved' a hand that was not mine and yet still felt the apple that was handed to me." Yet the stated mission of BeAnotherLab for this work is to Put Yourself in the Shoes of the Other. Her failure to feel that, says Sutherland, points to the limits of the immersive experience. BeAnotherLab's goal in all of their work is to evoke empathy for the other. In another instance of their work, in the Calais refugee camp in France, the same technology was used, this time with refugees as storytellers, to promote a better understanding of the refugee crisis (Jarvis 2017).
Psychologists recognise two forms of empathy:
• Emotional Empathy - the feeling of emotions ensuing from perceived sensations that are not happening to oneself but feel as if they are.
• Cognitive Empathy - the acquired ability to recognise and identify the emotions of others.
Neurologically, a state of emotional empathy activates the same areas in the brain as do direct personal emotions. A state of cognitive empathy activates a different area of the brain, associated with mentalisation (Nummenmaa et al. 2008). Proprioceptive drift, or bodily empathy, serves as an amplifier of emotional empathy (Nummenmaa et al. 2008).
To achieve their goal, BeAnotherLab needed to evoke both emotional and cognitive empathy. As observed in Sutherland's experience, they achieve emotional empathy, but not the perspective-taking of cognitive empathy (Sutherland 2016). The success of the first, an integration of an external stimulus into the self-body map, is the reason for the failure of the second: the disappearance of an 'other' to have empathy for.

THE SONIC STAGE
I started my research into proprioception as part of my interest in physical interaction within a virtual digital environment.
The Sonic Stage (Figure 1) was inspired by a performance by composer and sound artist Ken Ueno (2020) at Osage gallery in Hong Kong in October 2019. The performance, titled Bread, was a structured improvisation led by Ueno with four additional musicians. It highlighted the acoustics of the space: the musicians were stationed in different areas of the gallery, with the audience free to roam as they pleased, interacting with the acoustics around them and curating their own sound perspective (HKACT! 2019). I was fascinated by this relationship between audience and performance, which allowed audience members to make choices during the performance about their listening perspective.
The challenge I undertook was not to replicate the live performance as a digital experience, but rather to translate the essence of attending the live performance into an experience within an abstract digital medium. It differs from the experiment undertaken by Bergström et al. (2017), where a VRE performance of a string quartet was recreated using animated avatars. It is also important to note that, unlike The Machine to Be Another, the aim here is not to be in the body of an audience member in the gallery, but to translate as closely as feasible the essence of the experience at the gallery: listening to sound works while having the ability to select an acoustic perspective within the digital space.
An experience, in the John Dewey sense of the term, is a totality isolated in time, proceeding until it runs its course to fulfilment: The experience, like that of watching a storm reach its height and gradually subside, is one of continuous movement of subject-matters. Like the ocean in the storm, there are a series of waves; suggestions reaching out and being broken in a clash or being carried onwards by a cooperative wave. If a conclusion is reached, it is that of a movement of anticipation and culmination, one that finally comes to completion. A "conclusion" is no separate and independent thing; it is the consummation of a movement (Dewey 1934).
In the real-world event in the gallery, the experience starts when entering, a separation from the exterior world to a defined space. This is equally a moment of separation from the pedestrian flow of time to an anticipated event. The experience proceeds as we navigate the space of the gallery while listening to the sounds created by the participating artists. The consummation of the experience occurs when the artists complete their playing and we have exited the gallery.
My personal experience as both creator and user of The Sonic Stage started with donning the hardware components that enable the VRE. These consist of a swivelling stool, an arcade controller topped by a large red ball, strapped to the knee when seated, a Google Cardboard immersive headset and sound isolating headphones. The digital world I initially saw was momentarily reassuring. I was floating above a light-filled landscape, in front of me a red ball, similar in size, colour and texture to the one on top of the controller in my hand. This was quickly followed by an instance of disorientation, as I watched the ball quickly fall towards the ground, through a skylight of a dome-like structure, followed closely by my own point-of-view camera.
As I descended through the rabbit hole and landed on the ground, I found the red ball in front of me, embedded in the soft sand beneath me. The sound of a string quartet surrounded me. The binaural nature of the sound enabled me to easily identify the direction from which the sound came as I turned my head or moved around in space.
When I pushed forward on the red tracking ball controller in my hand, I found myself moving forward in space. Looking down through the headset, I saw a red ball in front of me, pushing through the sand on the ground. Tilting my body to the left, and the red ball in my hand with it, I saw that the red ball in my field of vision also moved to the left. My point of view kept following the ball in this way, always keeping an arm's length between the ball and me.
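The ball-led locomotion described above can be sketched in a few lines of code. The following is a minimal illustration, not the actual implementation of The Sonic Stage (which was presumably built in a VR engine); the names, arm's-length constant and speed are hypothetical. Trackball input advances the virtual ball on the ground plane, and the point-of-view camera is then pulled along so that it always trails the ball at a fixed distance.

```python
import math

ARM_LENGTH = 0.7  # metres kept between camera and ball (hypothetical value)
SPEED = 1.5       # metres per second of ball travel at full deflection (hypothetical)

def step(ball_pos, cam_pos, input_x, input_y, dt):
    """Advance the ball by the trackball input, then pull the camera
    along so it always trails the ball at ARM_LENGTH."""
    # Move the ball on the ground plane according to the trackball.
    bx = ball_pos[0] + input_x * SPEED * dt
    by = ball_pos[1] + input_y * SPEED * dt
    # Vector from the new ball position back towards the current camera.
    dx, dy = cam_pos[0] - bx, cam_pos[1] - by
    dist = math.hypot(dx, dy)
    if dist > 1e-9:
        # Place the camera ARM_LENGTH behind the ball along that vector.
        cx = bx + dx / dist * ARM_LENGTH
        cy = by + dy / dist * ARM_LENGTH
    else:
        cx, cy = bx - ARM_LENGTH, by
    return (bx, by), (cx, cy)
```

Because the camera only ever follows the ball, the beholder's gaze remains free: head rotation changes the view, while travel is always led by the object held in the hand.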
As I turned my head to look around, I identified four landmarks in the dome. As I continued to move around the space, the soundscape changed according to my position relative to the four landmarks. I discovered that each landmark is associated with an instrument from a string quartet ensemble. The uniformly blended sound heard at the centre was deconstructed and reconstructed in many ways as I moved around. When I was finished, I simply glided towards a door marked 'EXIT' to end the experience of The Sonic Stage.
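The position-dependent mix described above can be expressed as distance-based gains: each landmark acts as a point source for one instrument, and the listener's distance to each landmark sets that instrument's level. The sketch below is illustrative only; the coordinates, the inverse-distance falloff and the function names are my own assumptions, standing in for whatever spatial-audio engine the piece actually used.

```python
import math

# Hypothetical landmark positions in the dome, one per instrument.
LANDMARKS = {
    "violin_1": (10.0, 0.0),
    "violin_2": (0.0, 10.0),
    "viola":    (-10.0, 0.0),
    "cello":    (0.0, -10.0),
}

def instrument_gains(listener_pos, rolloff=1.0):
    """Return a normalised gain per instrument based on the listener's
    distance to each landmark: at the centre all four blend evenly,
    while near one landmark that instrument dominates."""
    raw = {}
    for name, (lx, ly) in LANDMARKS.items():
        d = math.hypot(listener_pos[0] - lx, listener_pos[1] - ly)
        raw[name] = 1.0 / (1.0 + rolloff * d)  # simple inverse-distance falloff
    total = sum(raw.values())
    return {name: g / total for name, g in raw.items()}
```

At the centre of the dome every instrument receives a gain of 0.25, reproducing the uniform blend; moving towards one landmark raises its instrument above the others, which is the deconstruction and reconstruction of the quartet described above.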

DISCUSSION
A real-world experience is anchored by a sense of place. In the virtual world, the participant should also experience an illusion of having been to a place, not just a series of consecutive perceptions of a space. Perceptions are sensations bound together into coherent objects in our mind. In an immersive VRE, real world stimuli are replaced by computer generated ones. They are perceptually bound together into the illusion of computer-generated objects and a sense of volumetric space. But this in itself is not yet sufficient to create a sense of presence, the experience of having been to a particular place (Jerald 2015).
Place illusion occurs when sufficient sensorimotor contingencies are supported by the digital environment. For example, by turning our head to the left we see what is to the left of us (Slater 2009). For this to occur, the exteroceptive information we receive from our five senses needs to match the proprioceptive information describing the position and movement of our body. When we move our head in the physical world, our field of view in the virtual world changes accordingly, giving us a correct stereoscopic rendering of the virtual world around us. When we turn around in the chair we are sitting on, we see the virtual world behind us. These combine to create the proprioceptive drift that will transpose our body map into the virtual world, anchoring us to the virtual place. As this is a translation, not a simulation of the gallery experience, architectural fidelity is not important. The key is to create a 'place' as a locus of an experience, as mentioned above.
The binaural sound in the digital world further reinforces the visual sensorimotor contingencies. It behaves according to the same intuitive physicality we are accustomed to in the world.
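The directional behaviour of the sound can be sketched as follows: compute the azimuth of each source relative to the listener's head yaw, and derive a left/right level difference from it. This is a deliberate simplification offered for illustration; true binaural rendering convolves the signal with head-related transfer functions (HRTFs), and the function name and yaw convention here are my own assumptions.

```python
import math

def stereo_pan(listener_pos, head_yaw, source_pos):
    """Crude directional cue: compute the source's azimuth relative to the
    listener's facing direction and convert it to left/right channel gains
    using constant-power panning. This captures only the left/right level
    difference, not the full binaural (HRTF) cue set."""
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    # Azimuth of the source relative to where the head is pointing
    # (yaw 0 assumed to face the +y direction).
    azimuth = math.atan2(dx, dy) - head_yaw
    # Map azimuth to a pan position: -1 (hard left) .. +1 (hard right).
    pan = max(-1.0, min(1.0, math.sin(azimuth)))
    angle = (pan + 1.0) * math.pi / 4.0  # 0 .. pi/2
    return math.cos(angle), math.sin(angle)  # (left gain, right gain)
```

A source dead ahead yields equal gains in both ears; turning the head towards a source off to one side re-centres it, which is exactly the contingency that lets the listener locate an instrument by rotating on the stool.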
The plausibility of motion is a more difficult problem. Essential to the experience of motion is not just a sense of 'being there' but also a sense of being in different locations 'there'. To prevent breaking place illusion, a sense of motion requires a transition from one location to the other in the VRE. Recalled proprioceptive data and our memory of the non-virtual world around us place us seated in a chair. The motion of our legs in the world causes us to rotate around, but not to travel laterally.
At the same time, we are holding in our hand a ball, which we remember to be red, very similar in colour, texture and size to the ball we see in front of us.
Pushing the physical ball with our hand corresponds to the virtual ball rolling forward. As the virtual ball comes to represent the ball in our hand, a metaphorical link is established between the motion of the virtual ball and our motion in virtual space. Once the causality between our actions in the world and their consequences in the virtual world is established, we can anticipate the motion in the virtual world and motion sickness is minimised (Kuiper et al. 2019).
The appearance of friction as the ball pushes through the sand is a remembered real-world behaviour; it leads us to anticipate the slow movement of the ball. The controlled traces we leave in the sandy floor anchor us further in the illusion of place, since they are further evidence of us being there. They also allow us to return to places in the dome we particularly liked.

NEXT STEPS
So far, the sound used within the VRE has been a recording of a live performance. Research has shown that response to music is situationally dependent (Liljeström, Juslin & Västfjäll 2013). Future work will look to integrate current research into the differences between real-world and VRE responses to music when creating musical VRE experiences.
Interaction with the virtual space has so far been limited to locomotion of the beholder and passive listening. An additional avenue of future work may explore the role of beholders as collaborators within the soundscape they are navigating, as creators of sound.