Our daily interactions and perceptions involve multiple sensory modalities; most exchanges are inherently multimodal, and this multimodality plays an important part in how we decode and understand our environments. Events often engage several senses at once. Musical experiences in particular can be highly multisensory, combining auditory stimulation, the visual elements of a live performance, and physical stimulation of the body. In this paper, we propose a means of incorporating an additional somatic channel of communication into live performance and compositional practice, further augmenting the physical nature of live performance.
This work explores the integration of augmented vibratory, or haptic, stimulation for audiences in live performance. The vibration interface is presented as an expressive and creative tool for composers in live performance. Vibrations, or haptics, are implemented as an additional instrumental line alongside auditory musical gestures, expanding the composer's expressive palette through augmented somatic engagement. This paper describes the background, design, and development of a haptic interface for audio-haptic listening-feeling, and focuses on a study of motor latency for informing multimedia simultaneity.