
      A Gestural Interaction Design Model for Multi-touch Displays

      People and Computers XXIII Celebrating People and Technology (HCI)

      1 - 5 September 2009

      Keywords: Touch interaction, gestural recognition, mapping rules, interaction model, tabletop, PDA

          Abstract

          Media platforms and devices that accept input from a user's finger or hand touch, such as Microsoft Surface and DiamondTouch, as well as numerous experimental systems in research labs, are becoming increasingly ubiquitous. Currently, the definition of touch styles is application-specific, and each device or application has its own set of touch types that it recognizes as input. In this paper we attempt a comprehensive understanding of all possible touch types for touch-sensitive devices by constructing a design model for touch interaction and clarifying the characteristics of those touch types. The model is composed of three structural levels (the action level, the motivation level and the computing level) and the mappings between them.

          At the action level, we construct a unified definition and description of all possible touch gestures: we first analyze how a finger or hand touching a surface can cause a particular event that can be recognized as a legitimate action, and then use this analysis to define all possible touch gestures, resulting in a touch gesture taxonomy. At the motivation level, we analyze and describe the direct interaction motivations arising from applications. We then define general principles for the mapping between the action and motivation levels. At the computing level, we realize the motivations and the responses to gestural inputs in programming languages.
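          As an illustration only, the following minimal Python sketch shows one way the three levels and the action-to-motivation mapping rules could be expressed in code. The gesture names, motivation names and the handle function are hypothetical placeholders and are not taken from the paper's taxonomy.

```python
from enum import Enum, auto

# Action level: a small illustrative gesture vocabulary (placeholders,
# not the paper's actual taxonomy).
class Gesture(Enum):
    SINGLE_TAP = auto()
    DOUBLE_TAP = auto()
    ONE_FINGER_DRAG = auto()
    TWO_FINGER_PINCH = auto()
    TWO_FINGER_ROTATE = auto()

# Motivation level: the direct interactive motivations an application exposes.
class Motivation(Enum):
    SELECT = auto()
    OPEN = auto()
    MOVE = auto()
    SCALE = auto()
    ROTATE = auto()

# Mapping between the action and motivation levels, expressed as a rule table.
MAPPING_RULES = {
    Gesture.SINGLE_TAP: Motivation.SELECT,
    Gesture.DOUBLE_TAP: Motivation.OPEN,
    Gesture.ONE_FINGER_DRAG: Motivation.MOVE,
    Gesture.TWO_FINGER_PINCH: Motivation.SCALE,
    Gesture.TWO_FINGER_ROTATE: Motivation.ROTATE,
}

# Computing level: a handler that realizes the mapped motivation in code.
def handle(gesture: Gesture, target: str) -> str:
    motivation = MAPPING_RULES[gesture]
    return f"{motivation.name} applied to {target}"
```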

          The model is then used to illustrate how it can be interpreted in the context of a photo management application based on DiamondTouch and the iPod Touch. It allows touch types to be reused across platforms and applications in a more systematic and generic manner than touch has been designed so far.
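          Continuing the hypothetical sketch above, reuse across platforms might look as follows: the mapping table and handler are shared, and only the platform-specific recognizer that turns raw touch events into Gesture values would differ. The platform labels and file name are illustrative.

```python
# The same mapping table serves both platforms; only the gesture
# recognizer feeding it is platform-specific.
for platform in ("DiamondTouch tabletop", "iPod Touch"):
    print(platform, "->", handle(Gesture.TWO_FINGER_PINCH, "photo_042.jpg"))
# Both platforms print: SCALE applied to photo_042.jpg
```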

          Author and article information

          Contributors
          Conference
          September 2009
          Pages: 440-446
          Affiliations
          National University of Defence Technology
          Article
          DOI: 10.14236/ewic/HCI2009.55
          © Songyang Lao et al. Published by BCS Learning and Development Ltd. People and Computers XXIII Celebrating People and Technology, Churchill College Cambridge, UK

          This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

          People and Computers XXIII Celebrating People and Technology
          HCI
          Churchill College Cambridge, UK
          1 - 5 September 2009
          Electronic Workshops in Computing (eWiC)
          Product
          Product Information: ISSN 1477-9358, BCS Learning & Development
          Self URI (journal page): https://ewic.bcs.org/
          Categories
          Electronic Workshops in Computing
