Tool Use as Gesture: new challenges for maintenance and rehabilitation


Proceedings of HCI 2010 (HCI)
Human Computer Interaction
6 - 10 September 2010
Keywords: Ubiquitous computing, Tangible User Interface, Sensor-based interaction, Rehabilitation, Maintenance Support


          Abstract

There are many ways to capture human gestures. This paper considers an extension of the growing trend of using sensors to capture movements and interpret them as gestures. However, rather than placing sensors on people, the focus is on attaching sensors (i.e., strain gauges and accelerometers) to the tools that people use. By instrumenting a set of handles, which can be fitted with a variety of effectors (e.g., knives, forks, spoons, screwdrivers, spanners, saws), it is possible to capture both the variation in grip force applied to the handle as the tool is used and the movements made with the handle. These data can be sent wirelessly (using ZigBee) to a computer, where distinct patterns of movement can be classified. Different approaches to the classification of activity are considered. This provides a way of combining the use of real tools in physical space with the representation of actions on a computer. The approach could be used to capture actions during manual tasks, say in maintenance work, or to support the development of movements, say in rehabilitation.
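The pipeline the abstract describes (grip-force and acceleration capture on the handle, wireless transfer to a computer, then classification of movement patterns) can be illustrated with a small sketch. Nothing below comes from the paper itself: the window length, the feature set, and the nearest-centroid classifier are assumptions standing in for whatever the authors actually used, and the class labels are invented; the sketch only shows the general shape of classifying windows of handle-sensor data.

```python
# Hypothetical sketch of the abstract's pipeline: windows of grip-force
# (strain gauge) and accelerometer samples arrive from the instrumented
# handle (e.g., via a ZigBee-to-serial bridge), features are extracted per
# window, and a simple classifier labels the activity. Window length,
# features, and class names are illustrative only.
import numpy as np

WINDOW = 50  # samples per window (assumed 50 Hz, i.e. 1-second windows)

def extract_features(grip, accel):
    """grip: (WINDOW,) strain-gauge readings; accel: (WINDOW, 3) accelerometer axes."""
    return np.array([
        grip.mean(),                               # average grip force
        grip.std(),                                # grip-force variation
        np.abs(np.diff(accel, axis=0)).mean(),     # overall movement "jerkiness"
        np.linalg.norm(accel, axis=1).std(),       # variability of movement energy
    ])

class NearestCentroid:
    """Toy classifier: label a window by the closest class centroid in feature space."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        y = np.array(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.labels_])
        return self

    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return [self.labels_[i] for i in d.argmin(axis=1)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)

    def fake_window(force, shake):
        # synthetic stand-in for one second of handle data
        grip = force + 0.1 * rng.standard_normal(WINDOW)
        accel = shake * rng.standard_normal((WINDOW, 3))
        return extract_features(grip, accel)

    # "sawing" = firm grip, large movement; "stirring" = light grip, gentle movement
    X_train = np.array([fake_window(5.0, 2.0) for _ in range(20)] +
                       [fake_window(1.0, 0.3) for _ in range(20)])
    y_train = ["sawing"] * 20 + ["stirring"] * 20

    clf = NearestCentroid().fit(X_train, y_train)
    print(clf.predict(np.array([fake_window(4.8, 1.9), fake_window(1.2, 0.4)])))
```

In practice the feature set and classifier would be chosen to suit the effector and task; the point of the sketch is simply that grip force and movement from the handle are reduced to per-window features before any activity label is assigned.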


                Author and article information

Conference
September 2010
Pages: 241-249
Affiliations
School of Electronic, Electrical and Computer Engineering, The University of Birmingham, Birmingham B15 2TT
Article
DOI: 10.14236/ewic/HCI2010.30
                © Manish Parekh et al. Published by BCS Learning and Development Ltd. Proceedings of HCI 2010, University of Abertay, Dundee, UK

This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

                Proceedings of HCI 2010
                HCI
                24
                University of Abertay, Dundee, UK
                6 - 10 September 2010
                Electronic Workshops in Computing (eWiC)
                Human Computer Interaction
Product Information: ISSN 1477-9358, BCS Learning & Development
Journal page: https://ewic.bcs.org/
                Categories
                Electronic Workshops in Computing
