We address the challenge of supporting collaborators who access a shared interactive space through different sets of modalities. To do so, we designed a cross-modal tool that combines a visual diagram editor with auditory and haptic views, allowing simultaneous visual and non-visual interaction. The tool was deployed in workplaces where visually impaired and sighted co-workers access and edit diagrams as part of their daily jobs. We use our observations and analyses of the recorded interactions to outline preliminary design recommendations for supporting cross-modal collaboration.
Author and article information
Contributors
Oussama Metatla
Nick Bryan-Kinns
Tony Stockman
Fiore Martin
Conference
Publication date: September 2012
Publication date (Print): September 2012
Pages: 109-118
Affiliations
School of Electronic Engineering and Computer Science
Queen Mary University of London
Mile End Road, London, E1 4NS, UK