There are many ways to capture human gestures. This paper considers an extension of the growing trend of using sensors to capture movements and interpret them as gestures. However, rather than placing sensors on people, the focus is on attaching sensors (i.e., strain gauges and accelerometers) to the tools that people use. By instrumenting a set of handles, which can be fitted with a variety of effectors (e.g., knives, forks, spoons, screwdrivers, spanners, saws, etc.), it is possible to capture both the variation in grip force applied to the handle as the tool is used and the movements made with the handle. These data can be sent wirelessly (using Zigbee) to a computer, where distinct patterns of movement can be classified. Different approaches to the classification of activity are considered. This provides a way of combining the use of real tools in physical space with the representation of actions on a computer. The approach could be used to capture actions during manual tasks, say in maintenance work, or to support the development of movements, say in rehabilitation.
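As a minimal sketch of the kind of classification the abstract refers to (not the authors' implementation), windows of strain-gauge and accelerometer samples from an instrumented handle could be reduced to simple features and labelled with a nearest-neighbour vote. The window length, feature set, channel count, and activity labels below are illustrative assumptions.

```python
# Sketch only: classify windows of handle data (grip force + 3 accel axes).
# All parameters (WINDOW, k, labels) are assumptions, not taken from the paper.
import numpy as np
from collections import Counter

WINDOW = 64  # samples per window (assumed)

def features(window: np.ndarray) -> np.ndarray:
    """Per-channel mean, standard deviation, and RMS for one window
    of shape (WINDOW, n_channels)."""
    mean = window.mean(axis=0)
    std = window.std(axis=0)
    rms = np.sqrt((window ** 2).mean(axis=0))
    return np.concatenate([mean, std, rms])

def knn_predict(train_X, train_y, x, k=3):
    """Label x by majority vote of its k nearest training feature vectors."""
    dists = np.linalg.norm(train_X - x, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-in for labelled training windows (4 channels).
    labels = ["cutting", "screwing", "sawing"]
    train_X, train_y = [], []
    for i, label in enumerate(labels):
        for _ in range(20):
            w = rng.normal(loc=i, scale=0.5, size=(WINDOW, 4))
            train_X.append(features(w))
            train_y.append(label)
    train_X = np.vstack(train_X)

    # A new window resembling the "screwing" class.
    test = rng.normal(loc=1, scale=0.5, size=(WINDOW, 4))
    print(knn_predict(train_X, train_y, features(test)))
```

In practice, any of several classifiers could sit behind this feature step; the paper itself compares different approaches to classifying the activity patterns.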
Author and article information
Contributors: Manish Parekh, Chris Baber
Conference
Publication date: September 2010
Publication date (print): September 2010
Pages: 241-249
Affiliations
[0001] School of Electronic, Electrical and Computer Engineering, The University of Birmingham, Birmingham B15 2TT