Simulacra: Using game development software to create audio-visual performance tools

INTRODUCTION
Game development technology offers an extraordinary amount of flexibility and thus has applications beyond the creation of video games. For example, Weinel (2018) discusses the use of game technologies as an audio-visual performance medium, and the work of multimedia artists such as Lawrence Lek and Clifford Sage provides excellent examples of this. In my current project Simulacra, I am utilising the Unity game engine and the audio middleware FMOD to create an audio-visual (AV) performance using a PlayStation 4 (PS4) controller as an input device. I undertook this as my final major project for my undergraduate sound design degree at London South Bank University, and as an extension of my musical practices and creative coding endeavours.
The conceptual origins of Simulacra are twofold. First, it is a continuation of my creative coding practices. Between 2018 and 2019 I created 'The Hill That Bleeds', an interactive sound art piece made in the Unity game engine, exploring themes of spatial audio and surrealist landscapes (Figure 1). It comprises a number of 'sonic sculptures' for participants to wander between, each with its own unique sonic identity (Holtum 2019a). It was with this project that I began to realise the artistic possibilities game development technology could offer my sonic practices.
Second, since 2017 I have been performing my electronic music as part of an audio-visual performance under the moniker Sub Denizen (Holtum 2019b). For this I created a Max/MSP patch to manipulate video clips based on incoming audio. The patch uses FFT (Fast Fourier Transform) analysis to trigger random number generators attached to various video effect parameters, each frequency band controlling a separate set of parameters (Figure 2).
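The band-to-parameter logic described above can be sketched outside Max/MSP. The following Python sketch (my own illustration, not the original patch; the band ranges, parameter names, and threshold are hypothetical) computes the average spectral magnitude in each frequency band with a naive DFT, then randomises only the video-effect parameters belonging to bands whose energy crosses a threshold:

```python
import math
import random

def band_energies(samples, sample_rate, bands):
    """Average DFT magnitude inside each (low_hz, high_hz) band.
    A naive DFT stands in for the patch's FFT object."""
    n = len(samples)
    mags = []
    for k in range(n // 2):
        re = sum(s * math.cos(2 * math.pi * k * t / n) for t, s in enumerate(samples))
        im = sum(-s * math.sin(2 * math.pi * k * t / n) for t, s in enumerate(samples))
        mags.append(math.hypot(re, im) / n)
    energies = []
    for low, high in bands:
        bins = [k for k in range(n // 2) if low <= k * sample_rate / n < high]
        energies.append(sum(mags[k] for k in bins) / len(bins) if bins else 0.0)
    return energies

def modulate(params_per_band, energies, threshold=0.05, rng=random):
    """When a band's energy crosses the threshold, randomise that band's
    video-effect parameters; quiet bands leave theirs untouched."""
    new_values = {}
    for band_params, energy in zip(params_per_band, energies):
        if energy >= threshold:
            for name in band_params:
                new_values[name] = rng.random()
    return new_values
```

For example, a 100 Hz tone analysed against low/mid/high bands would excite only the band containing 100 Hz, so only that band's parameters receive new random values.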
Whilst I am happy with the results, the patch quickly becomes computationally taxing, which limits its scope. By adopting game development tools I gain a far greater number of parameters and interactions with which I can forge an efficient audio-visual ecosystem of cause and effect, akin to Dolphin's (2014) notion of 'sound toys'.

STRUCTURE
Simulacra is currently structured in a linear fashion, although in time I intend to have multiple narrative strands with separate endings so as to be able to alter the piece for each performance. The audience observes as I travel between and explore the various landmarks placed around the c. 4 km² environment, visually inspired by the surrealist art of Dalí and Beksiński (Figure 4).
The landmarks each represent the equivalent of a musical track in the performance, and each contains a large structure intended as the cynosure of that particular area. Each has its own sonic characteristics and provides unique audio-visual interactions. Sonically, these landmarks consist of two sections: an ambient (environmental) section and a beat-driven section that introduces percussive layers, which I can cross-fade between and manipulate during performances.

CODING AND CONTROL MAPPING
The underlying performance system I am creating for Simulacra revolves around the control mapping capabilities of the PlayStation 4 controller and the audio middleware FMOD.
Unity receives two types of input message from the PS4 controller: button presses and axis input. Button presses can be registered either 'on-press' or 'on-hold', whilst axes return a floating-point value. With the addition of FMOD I have been able to create a network of audio events and parameters using a PS4 controller as an input device.
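The distinction between the two input types can be sketched in an engine-agnostic way. The following Python sketch (an illustration under my own assumptions, not the project's actual Unity code; all names are hypothetical) dispatches button callbacks either once on press or repeatedly while held, and forwards each axis's floating-point value every frame:

```python
class ControllerMapper:
    """Minimal dispatch table for the two input types: buttons fire a
    callback on press (or every frame while held), axes forward a float."""

    def __init__(self):
        self._buttons = {}   # name -> (callback, fire_on_hold)
        self._axes = {}      # name -> callback(value)
        self._held = set()   # buttons down last frame

    def bind_button(self, name, callback, on_hold=False):
        self._buttons[name] = (callback, on_hold)

    def bind_axis(self, name, callback):
        self._axes[name] = callback

    def update(self, pressed, axis_values):
        """Call once per frame with the set of currently pressed button
        names and a dict of axis name -> float."""
        for name, (callback, on_hold) in self._buttons.items():
            # 'on-press' fires only on the frame the button goes down;
            # 'on-hold' fires every frame the button remains down.
            if name in pressed and (on_hold or name not in self._held):
                callback()
        self._held = set(pressed)
        for name, value in axis_values.items():
            if name in self._axes:
                self._axes[name](value)
```

Binding a button 'on-press' and holding it across two frames fires its callback once; the axis callback receives a fresh value each frame.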
The buttons on the PS4 controller are coded to trigger events when pressed; the function triggered depends on predetermined variables, such as location (physical or musical) and orientation. The axes are chiefly used for navigating the world and rotating the camera in the style of a first-person shooter (FPS) game. The "L2" and "R2" triggers are also recognised as axes by Unity and provide a floating-point number within a user-defined range. With the aid of FMOD these can be used for cross-fading between different sonic material and be given separate attack and release times, resulting in a highly expressive performance capability.
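The attack/release behaviour of the triggers can be illustrated with a simple smoothing sketch. This Python sketch (my own simplified model; the rates and the equal-power curve are assumptions, not the project's FMOD settings) has the fade level chase the raw trigger value at one rate while rising and another while falling, then derives two cross-fade gains from it:

```python
import math

class SmoothedFade:
    """Fade level that chases a target trigger value, rising at the
    attack rate and falling at the release rate (units per second)."""

    def __init__(self, attack_rate=4.0, release_rate=1.0):
        self.attack_rate = attack_rate
        self.release_rate = release_rate
        self.level = 0.0

    def update(self, target, dt):
        rate = self.attack_rate if target > self.level else self.release_rate
        step = rate * dt
        if abs(target - self.level) <= step:
            self.level = target          # close enough: snap to target
        elif target > self.level:
            self.level += step           # attack phase
        else:
            self.level -= step           # release phase
        return self.level

def crossfade_gains(level):
    """Equal-power cross-fade: one fade level yields two layer gains."""
    return math.cos(level * math.pi / 2), math.sin(level * math.pi / 2)
```

Squeezing the trigger fully thus blends the layers in over the attack time, while letting go releases them more slowly, independent of how quickly the trigger itself moves.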
The ambient sections consist of various texture layers that can be cross-faded between as the scene demands, and of sounds localised to game objects in the vicinity. Often this is done by traversing the environment; however, in some cases the sonic balance of the scene can be changed by pointing the camera at different objects, utilising Unity's ray-casting system, allowing me to play parts of the environment like audio-visual instruments (Figure 3).
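The idea of re-balancing the mix by where the camera points can be sketched without an engine. Rather than a true ray-cast, the following Python sketch (my own approximation; object names and the sharpness exponent are hypothetical) weights each object's mix contribution by how closely it sits to the centre of the camera's view:

```python
import math

def gaze_weights(camera_pos, camera_forward, objects, sharpness=4.0):
    """Per-object mix weights from view alignment: objects near the
    centre of view dominate; objects off-axis fall away. 'objects'
    maps names to (x, y, z) positions; camera_forward is a unit vector."""
    weights = {}
    for name, pos in objects.items():
        offset = [p - c for p, c in zip(pos, camera_pos)]
        length = math.sqrt(sum(d * d for d in offset)) or 1.0
        direction = [d / length for d in offset]
        # Cosine of the angle between the view direction and the object.
        alignment = sum(f * d for f, d in zip(camera_forward, direction))
        weights[name] = max(alignment, 0.0) ** sharpness
    total = sum(weights.values()) or 1.0
    return {name: w / total for name, w in weights.items()}
```

Pointing the camera directly at one object pushes its weight towards 1.0 while perpendicular objects fall towards 0.0, which is the behaviour a ray-cast-driven mix would produce with a narrower focus.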
In the percussion-driven sections, audio-visual interaction has the additional option of being controlled by a 'beat call-back' script. This enables beat and bar synchronisation for audio and visual events in addition to input from the PS4 controller, allowing me to defer a great deal of compositional nuance until runtime.
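The essence of a beat call-back can be sketched as follows. FMOD delivers beat and bar notifications through its timeline callback mechanism; this Python sketch (my own polling-based stand-in, with hypothetical names throughout) derives the same events from the playback position, firing subscribed callbacks whenever a beat or bar boundary is crossed:

```python
class BeatCallback:
    """Fire on_beat/on_bar callbacks as playback crosses beat and bar
    boundaries. Poll update() once per frame with the song position."""

    def __init__(self, bpm, beats_per_bar=4):
        self.seconds_per_beat = 60.0 / bpm
        self.beats_per_bar = beats_per_bar
        self._last_beat = -1      # last beat index already announced
        self.on_beat = []         # callbacks receiving the beat index
        self.on_bar = []          # callbacks receiving the bar index

    def update(self, position_seconds):
        beat = int(position_seconds / self.seconds_per_beat)
        # Catch up on every boundary crossed since the last frame, so
        # slow frames never silently drop a beat event.
        while self._last_beat < beat:
            self._last_beat += 1
            for fn in self.on_beat:
                fn(self._last_beat)
            if self._last_beat % self.beats_per_bar == 0:
                for fn in self.on_bar:
                    fn(self._last_beat // self.beats_per_bar)
```

Visual events subscribed to on_bar then land exactly on the downbeat regardless of when the performer's controller input arrives.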

FUTURE
As previously mentioned, Simulacra will eventually contain multiple narrative strands, each with its own unique landmarks and music, giving me run-time control over both of Roads' (2004) micro and macro compositional timescales.
I will also continue to refine my system of audio-visual interaction. In this regard, two things are of particular interest. The first is Unity's new Shader Graph, which would allow me to more easily create audio-reactive materials with which to dress the project. The second is the use of ambisonic audio for performance. This is something I was not able to achieve for The Hill That Bleeds, but my early tests of Simulacra in 5.1 and 7.1 surround sound have been successful.