Ergonomic Gesture Recognition
Supervisors: Denis Lalanne, Rolf Ingold
Student: Tom Forrer
Project status: Finished
Year: 2011
This master thesis proposes the use of ergonomic hand gestures for application control. Hand gesture recognition systems often have the disadvantage of being fatiguing over prolonged use. To be practical for application control, the gestures have to share the ergonomic features of a keyboard or a mouse: a limited action space, wrist or arm support, precision, and comfort.
This project presents a vision-based architecture in which these ergonomic gestures can be recognized. Following a general overview of gesture recognition phases and techniques, the architecture implements the phases of tracking, model mapping, training, and classification, with an emphasis on real-time execution. The tracking module is implemented with the high-performance OpenCV library, whereas the training and classification of hand postures use the machine learning functions of the dlib C++ library.
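As a rough illustration of the tracking phase, the sketch below thresholds one glove marker color in HSV space and extracts blob centroids with OpenCV. The color range, camera setup, and blob-size threshold are placeholder assumptions, not values taken from the thesis, and the OpenCV API shown is the modern C++ interface rather than the version used in 2011.

// Minimal sketch of color-marker blob tracking with OpenCV.
// All threshold values are hypothetical; one marker color is shown.
#include <opencv2/opencv.hpp>
#include <vector>

int main() {
    cv::VideoCapture cap(0);                 // default camera
    if (!cap.isOpened()) return 1;

    cv::Mat frame, hsv, mask;
    while (cap.read(frame)) {
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
        // Threshold one glove marker color (placeholder range for a green marker).
        cv::inRange(hsv, cv::Scalar(45, 80, 80), cv::Scalar(75, 255, 255), mask);

        // Extract blob contours and use their centroids as finger positions.
        std::vector<std::vector<cv::Point>> contours;
        cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
        for (const auto& c : contours) {
            cv::Moments m = cv::moments(c);
            if (m.m00 < 50.0) continue;      // ignore tiny noise blobs
            cv::Point center(cvRound(m.m10 / m.m00), cvRound(m.m01 / m.m00));
            cv::circle(frame, center, 5, cv::Scalar(0, 0, 255), -1);
        }
        cv::imshow("tracking", frame);
        if (cv::waitKey(1) == 27) break;     // ESC quits
    }
    return 0;
}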
To limit the scope of the project, color-marked gloves are used to aid the tracking of finger positions. Three one-handed gestures are selected for their suitability for gestural application control: the pointing gesture, the zooming gesture (a pinching motion with index finger and thumb), and the horizontal swiping gesture. These gestures are decomposed into their key postures. For the recognition of hand postures, the tracked finger blobs are mapped onto an abstract hand model, which is classified using a one-versus-one multiclass ν-Support Vector Machine (ν-SVM). The decision function of the ν-SVM is trained on hand model samples collected from annotated recordings of the three gestures.
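The posture classification step could be sketched with dlib roughly as follows. The feature vector dimensionality, kernel, ν parameter, and training data below are placeholder assumptions; in the actual pipeline the samples come from the annotated gesture recordings rather than the dummy loop shown here.

// Minimal sketch of one-versus-one multiclass nu-SVM posture classification with dlib.
#include <dlib/svm_threaded.h>
#include <vector>
#include <iostream>

typedef dlib::matrix<double, 8, 1> sample_type;               // abstract hand model features (dimension assumed)
typedef dlib::radial_basis_kernel<sample_type> kernel_type;   // kernel choice assumed

int main() {
    std::vector<sample_type> samples;   // hand model samples from annotated recordings
    std::vector<double> labels;         // key posture labels, e.g. 0 = point, 1 = pinch, 2 = swipe

    // Placeholder training data standing in for the annotated recordings.
    for (int cls = 0; cls < 3; ++cls)
        for (int i = 0; i < 10; ++i) {
            sample_type s;
            for (long d = 0; d < s.size(); ++d)
                s(d) = cls + 0.01 * i;
            samples.push_back(s);
            labels.push_back(cls);
        }

    // Binary nu-SVM trainer used for every pair of posture classes.
    dlib::svm_nu_trainer<kernel_type> nu_trainer;
    nu_trainer.set_kernel(kernel_type(0.1));
    nu_trainer.set_nu(0.1);

    // One-versus-one multiclass wrapper around the binary trainer.
    dlib::one_vs_one_trainer<dlib::any_trainer<sample_type>> trainer;
    trainer.set_trainer(nu_trainer);

    dlib::one_vs_one_decision_function<
        dlib::one_vs_one_trainer<dlib::any_trainer<sample_type>>> df = trainer.train(samples, labels);

    // Classify a mapped hand model with the trained decision function.
    std::cout << "predicted posture class: " << df(samples[5]) << std::endl;
    return 0;
}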
For validation, a small demonstration program is implemented on top of the gesture recognition architecture. It shows that the architecture responds with reaction delays of under 0.4 seconds, that fine-grained spatial aspects of a gesture are still recognized, and that the gestures are easy for other users to learn.
Document: report.pdf