Using Microsoft Kinect Sensor To Perform Commands On Virtual Objects
Supervisors: Denis Lalanne, Matthias Schwaller
Student: Simon Brunner
Project status: Finished
Gestural user interfaces are becoming omnipresent in our daily lives, with varying degrees of success. The Microsoft Kinect is currently the leading hands-free recognition device. Its gestures are mostly designed for leisure applications, such as mini-games that require standing in a large environment, far from the screen. The Kinect for Windows sensor reduces this distance for desktop users with its near mode. This project studies the possibility of developing subtle gestures that perform basic gestural interactions to operate a computer accurately. Two different sets of gestures were designed for close range and compared during the course of this project. The goal is to use them in small repetitive tasks and evaluate their performance against each other, to determine whether one or several types of gestures work better than others. The designed gestures operate four commands that are evaluated with users: selection, drag and drop, rotation, and resizing of objects.
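As an illustration of how such a command could be recognized from skeleton-tracking data, the sketch below detects a hypothetical "push-to-select" gesture from the depth coordinate of the tracked hand. The window size, threshold, and gesture definition are assumptions for illustration only, not the gestures actually designed in this project.

```python
from collections import deque

class PushToSelect:
    """Hypothetical selection detector: fires when the tracked hand moves
    toward the sensor (decreasing depth) by more than min_push metres
    within the last `window` frames. Illustrative sketch only; not the
    project's actual gesture definition."""

    def __init__(self, window=15, min_push=0.10):
        self.depths = deque(maxlen=window)  # recent hand-depth samples (metres)
        self.min_push = min_push

    def update(self, hand_z):
        """Feed one depth sample per frame; returns True when a push is seen."""
        self.depths.append(hand_z)
        # A push is a net decrease in depth across the buffered frames.
        return (len(self.depths) == self.depths.maxlen
                and self.depths[0] - self.depths[-1] > self.min_push)
```

In practice, a recognizer like this would be fed joint positions from the Kinect skeleton stream each frame, with smoothing to suppress sensor jitter.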
The first part of this master's project report presents the technological features and how to go from Kinect data acquisition to the creation of functional gestures. The following part concerns the design of the gestures. A description of all the designs, with their pros and cons, is provided in tables showing the evolution of the gestures. The gestures are divided into two groups: technological and iconic. On the one hand, the technological gestures aim at efficiency and reliable recognition, regardless of users' expectations. On the other hand, the iconic gestures also aim to be efficient, but priority is given to their naturalness, memorability, and ergonomics for users.
Another important part of this project was the creation of a full application for testing each gesture with users, first in four simple activities and then all together in a final activity. This report ends with the results of a within-subject user evaluation conducted with 10 participants, along with their analysis. Results show that the iconic selection performs quantitatively as well as the technological one, but is perceived as more comfortable and usable by users. Further, the iconic zoom and rotation achieve significantly better results according to statistical tests. Finally, iconic gestures are individually better and/or favored by users over the technological gestures, while the two sets performed similarly in the final tasks combining the four commands.