
      A study of the adaptive gesture interface for the severely physically handicapped


      Science Impact, Ltd.



          Technology has already been developed that does not require a person to physically touch a mouse or a screen. Devices such as the Kinect rely on motion sensors to detect movements and gestures and interpret them as inputs. However, these gestures are pre-programmed and require the same degree of coordination as input through any physical device, which means they are not immediately useful for people with motor dysfunction.

          Dr Ikushi Yoda of the National Institute of Advanced Industrial Science and Technology (AIST), Japan, is working to resolve this problem. He aims to provide motion sensors that allow disabled people to smoothly access computers, among other devices. Yoda and his interdisciplinary team are analysing and categorising a wide variety of gestures from disabled people with motor dysfunction. Their aim is to programme motion-sensing devices to recognise these gestures, and to build enough flexibility into the devices to allow a wider range of detection.

          In order to devise new input systems for users with various motor disabilities, Yoda and his team first had to gather data on the sorts of gestures those users could produce. To document these accurately, the team monitored a pool of disabled volunteers, collecting movement data for classification and analysis. Yoda chose participants representing a wide range of potential disabilities, so that many gestures of different types, and from different parts of the body, could be acquired and analysed. These data were classified according to body part and used to develop a modularised gesture recognition engine.

          In addition to gathering this important data, Yoda is also trying to keep the research practical by using relatively cheap motion-detection cameras: any system eventually employed must at least be accessible to the majority of its target disabled users.
          Yoda and his team have previously developed a head-gesture interface for individuals with severe cerebral palsy who are unable to operate their motorised wheelchairs with conventional controls.
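          The modular, body-part-specific engine described above could be sketched as follows. This is an illustrative assumption of how such an engine might be structured, not Yoda's actual implementation: the class names, the toy recognition rules, and the thresholds are all hypothetical, standing in for the team's classified gesture data.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

# A recorded gesture: the body part it came from plus a 2-D trajectory,
# standing in for the movement data collected from volunteers.
@dataclass
class GestureSample:
    body_part: str                       # e.g. "head", "hand"
    trajectory: List[Tuple[float, float]]

def recognise_head(traj: List[Tuple[float, float]]) -> str:
    # Toy rule: a mostly horizontal sweep is a turn, otherwise a nod.
    dx = traj[-1][0] - traj[0][0]
    dy = traj[-1][1] - traj[0][1]
    return "head_turn" if abs(dx) > abs(dy) else "head_nod"

def recognise_hand(traj: List[Tuple[float, float]]) -> str:
    # Toy rule: total path length above a threshold counts as a wave.
    length = sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
                 for (x1, y1), (x2, y2) in zip(traj, traj[1:]))
    return "wave" if length > 1.0 else "rest"

class GestureEngine:
    """Dispatches each sample to the module registered for its body part,
    mirroring the idea of a recognition engine modularised by body part."""

    def __init__(self) -> None:
        self._modules: Dict[str, Callable[[List[Tuple[float, float]]], str]] = {}

    def register(self, body_part: str, recogniser: Callable) -> None:
        self._modules[body_part] = recogniser

    def classify(self, sample: GestureSample) -> str:
        recogniser = self._modules.get(sample.body_part)
        if recogniser is None:
            return "unknown"   # fail soft for body parts with no module yet
        return recogniser(sample.trajectory)

engine = GestureEngine()
engine.register("head", recognise_head)
engine.register("hand", recognise_hand)

nod = GestureSample("head", [(0.0, 0.0), (0.0, 0.5), (0.0, 1.0)])
print(engine.classify(nod))   # head_nod
```

          The design point the sketch illustrates is flexibility: because each body part gets its own recogniser module, new gesture classes collected from new participants can be added without touching the existing ones.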


          Author and article information

          Science Impact, Ltd.
          June 15 2018
          2018 (3): 41-43
          © 2018

          This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

          Earth & Environmental sciences, Medicine, Computer science, Agriculture, Engineering

