
      GestureNet: A Common Sense Approach to Physical Activity Similarity

      Electronic Visualisation and the Arts (EVA 2014)

      8 - 10 July 2014

      Keywords: Commonsense reasoning, Animation, Activity recognition algorithms


          Abstract

          Generalizing knowledge about physical movement often requires significant amounts of data capture. Despite the large effort to collect and process activity examples, these systems can still fail to classify movements for many reasons. Our system, called GestureNet, uses a very small dataset of activity templates to return useful query results for a generalized set of movements. As a result, many more movement profiles can be generated for activity recognition systems and gesture synthesis algorithms.

          We demonstrate a system that supports a larger set of computer animations based on a small set of base animations. A user can input any motion word recognized by GestureNet, and the system will respond with the closest animation match. GestureNet will also describe the degree to which the new activity is similar to the template profiles. For example, if the user inputs “baseball”, the system will show the animation for Run: the commonsense database associates baseball with jogging, which is a type of running. Although the example gesture matrix is small, we demonstrate that our techniques can extend the system to describe variations of these activities (e.g. sitting and squatting) which are not currently represented. We expect this solution to be useful in application domains where sensor data capture and activity profiles are costly to acquire (e.g. activity classification, animations and visualisations).
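          The lookup described above — following commonsense associations from a query word until a template animation is reached, and reporting how indirect the match is — can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the template set, the `ASSOCIATIONS` map, and the `closest_template` helper are all invented for this sketch, standing in for the paper's commonsense database and gesture matrix.

```python
# Hypothetical sketch of a GestureNet-style query. The templates and
# association links below are hand-coded assumptions; the paper draws
# them from a commonsense database instead.

# A small set of base animation templates.
TEMPLATES = {"run", "walk", "wave", "sit"}

# Commonsense associations: query word -> related activity words.
ASSOCIATIONS = {
    "baseball": ["jog"],
    "jog": ["run"],
    "stroll": ["walk"],
    "squat": ["sit"],
    "greet": ["wave"],
}

def closest_template(word, max_hops=3):
    """Follow association links until a template activity is reached.

    Returns (template, hops), where hops roughly measures how indirect
    the match is (the "degree of similarity" to the template), or
    (None, None) if no template is reachable within max_hops links.
    """
    frontier = [word]
    seen = set(frontier)
    for hops in range(max_hops + 1):
        # Any template word reached at this depth is the closest match.
        for w in frontier:
            if w in TEMPLATES:
                return w, hops
        # Expand one association hop outward, skipping words already seen.
        frontier = [n for w in frontier
                      for n in ASSOCIATIONS.get(w, ())
                      if n not in seen]
        seen.update(frontier)
    return None, None

print(closest_template("baseball"))  # -> ('run', 2): baseball -> jog -> run
```

          Under this toy data, “baseball” resolves to the Run template in two hops (via jogging), mirroring the example in the abstract, while a word with no associations returns no match.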

          Author and article information

          Conference
          July 2014
          pp. 89-94
          Affiliations
          MIT Media Lab

          20 Ames St.,

          Cambridge, MA 02139, USA
          MIT Architecture

          77 Massachusetts Ave.,

          Cambridge, MA 02139, USA
          Article
          DOI: 10.14236/ewic/EVA2014.15
          © Angela Chang et al. Published by BCS Learning and Development Ltd. Electronic Visualisation and the Arts (EVA 2014), London, UK

          This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

          Electronic Visualisation and the Arts (EVA 2014)
          EVA
          London, UK
          8 - 10 July 2014
          Electronic Workshops in Computing (eWiC)
          Product
          Product Information: ISSN 1477-9358, BCS Learning & Development
          Self URI (journal page): https://ewic.bcs.org/
          Categories
          Electronic Workshops in Computing
