

      Is Open Access

      GestureNet: A Common Sense Approach to Physical Activity Similarity

      Published
      proceedings-article
      Electronic Visualisation and the Arts (EVA 2014)
      8 - 10 July 2014
      Commonsense reasoning, Animation, Activity recognition algorithms

            Abstract

            Generalizing knowledge about physical movement typically requires capturing large amounts of data. Even after the considerable effort of collecting and processing activity examples, such systems can still fail to classify movements for many reasons. Our system, GestureNet, uses a very small dataset of activity templates to produce useful query results for a generalized set of movements. Many more movement profiles can thus be generated for activity recognition systems and gesture synthesis algorithms. We demonstrate a system that supports a large set of computer animations from a small set of base animations. A user can input any motion word recognized by GestureNet, and the system responds with the closest animation match. GestureNet also describes the degree to which the new activity is similar to the template profiles. For example, if the user inputs “baseball”, the system shows the animation for Run: the commonsense database associates baseball with jogging, which is a type of running. Although the example gesture matrix is small, we demonstrate that our techniques can extend the system to describe variations of these activities (e.g. sitting and squatting) that are not currently represented. We expect this solution to be useful in application domains where sensor data capture and activity profiles are costly to acquire (e.g. activity classification, animation and visualisation).
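            The lookup the abstract describes — mapping an input motion word to the closest animation template, with a degree of similarity — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the template names and the association scores below are made up for the example, standing in for the commonsense-derived gesture matrix the authors describe.

```python
# Hypothetical sketch of a GestureNet-style query: an input motion word
# is scored against a small set of animation templates, and the best
# match is returned together with its similarity degree.
# All scores here are illustrative placeholders, not data from the paper.

TEMPLATES = ["Run", "Walk", "Sit", "Jump"]

# word -> {template: similarity in [0, 1]} (made-up association scores)
ASSOCIATIONS = {
    "baseball": {"Run": 0.8, "Walk": 0.3, "Jump": 0.4},
    "squat":    {"Sit": 0.7, "Jump": 0.2},
}

def closest_animation(word):
    """Return (best-matching template, similarity), or (None, 0.0)."""
    scores = ASSOCIATIONS.get(word.lower(), {})
    if not scores:
        return None, 0.0
    best = max(scores, key=scores.get)
    return best, scores[best]

anim, degree = closest_animation("baseball")
print(anim, degree)  # Run 0.8
```

            With these placeholder scores, “baseball” resolves to the Run template, matching the example in the abstract; the returned degree is what lets the system report how similar the new activity is to the template profile.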


            Author and article information

            Contributors
            Conference
            July 2014
            Pages: 89-94
            Affiliations
            [0001]MIT Media Lab

            20 Ames St.,

            Cambridge, MA 02139, USA
            [0002]MIT Architecture

            77 Massachusetts Ave.,

            Cambridge, MA 02139, USA
            Article
            10.14236/ewic/EVA2014.15
            50d5fd73-ebec-4b57-b1e0-91e2b16b2553
            © Angela Chang et al. Published by BCS Learning and Development Ltd. Electronic Visualisation and the Arts (EVA 2014), London, UK

            This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

            Electronic Visualisation and the Arts (EVA 2014)
            EVA
            London, UK
            8 - 10 July 2014
            Electronic Workshops in Computing (eWiC)
            History
            Product

            1477-9358 BCS Learning & Development

            Self URI (article page): https://www.scienceopen.com/hosted-document?doi=10.14236/ewic/EVA2014.15
            Self URI (journal page): https://ewic.bcs.org/
            Categories
            Electronic Workshops in Computing

            Applied computer science, Computer science, Security & Cryptology, Graphics & Multimedia design, General computer science, Human-computer interaction
            Commonsense reasoning, Activity recognition algorithms, Animation
