Two-Handed Gestures for Human-Computer Interaction

Author: Agnès Just

DOI: 10.5075/EPFL-THESIS-3683

Abstract: The present thesis is concerned with the development and evaluation (in terms of accuracy and utility) of systems using hand postures and hand gestures for enhanced Human-Computer Interaction (HCI). In our case, these systems are based on vision techniques, thus requiring only cameras and no other specific sensors or devices.

When dealing with hand movements, it is necessary to distinguish two aspects of these movements: the static aspect and the dynamic aspect. The static aspect is characterized by a pose or configuration of the hand in an image, and is related to the Hand Posture Recognition (HPR) problem. The dynamic aspect is defined either by the trajectory of the hand or by a series of hand postures in a sequence of images. This second aspect is related to the Hand Gesture Recognition (HGR) task.

Given the recognized lack of common databases in the HGR field, our first contribution was the collection and public distribution of databases containing both one- and two-handed gestures, on which part of the results reported here will be based. On these databases we compare state-of-the-art models on the task of HGR. As a second contribution, we propose an HPR technique based on a new feature extraction method. This method has the advantage of being faster than conventional methods while yielding good performance. In addition, we provide a comparison of this technique with existing approaches. Finally, our most important contribution lies in a thorough study of the state of the art, not only in HPR and HGR but also, more generally, in the field of HCI.

The first chapter provides an extended state of the art. The second chapter contributes to HPR: we apply a technique employed with success in face detection, the Modified Census Transform (MCT), to extract relevant features, and evaluate it on an existing benchmark database against other approaches. The third chapter describes the recorded database of gestures performed in 3D space, and the models used for HGR, namely Hidden Markov Models (HMM) and the Input-Output Hidden Markov Model (IOHMM). The fourth chapter is focused more precisely on two-handed gesture recognition. For that purpose, a database has been recorded with two cameras; the goal of the gestures is to manipulate virtual objects on a screen. We investigate the processing techniques of the previous chapter, and then discuss the results obtained with the different features. In conclusion, we review the major techniques, give a survey of recent work and some possible applications in interaction, and hope this research will open new directions in posture and gesture recognition.
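The Modified Census Transform mentioned above can be sketched briefly. The idea (following Fröba and Ernst's face-detection work) is to describe each pixel by comparing every pixel of its 3×3 neighbourhood, centre included, against the neighbourhood mean, concatenating the nine comparison bits into an index in [0, 511]. The NumPy layout and function name below are illustrative, not the thesis's implementation:

```python
import numpy as np

def mct(image):
    """Modified Census Transform on a grayscale image.

    For each interior pixel, compare all 9 pixels of its 3x3
    neighbourhood against the neighbourhood mean; each comparison
    contributes one bit, giving a 9-bit descriptor per pixel.
    Returns an (h-2, w-2) array of values in [0, 511].
    """
    img = image.astype(np.float64)
    h, w = img.shape
    # The nine shifted views covering the 3x3 neighbourhood of each
    # interior pixel, enumerated row by row (centre is index 4).
    patches = [img[dy:h - 2 + dy, dx:w - 2 + dx]
               for dy in range(3) for dx in range(3)]
    mean = sum(patches) / 9.0
    out = np.zeros((h - 2, w - 2), dtype=np.uint16)
    for bit, p in enumerate(patches):
        # Set this neighbour's bit wherever it exceeds the local mean.
        out |= (p > mean).astype(np.uint16) << bit
    return out
```

Because each pixel is compared to a purely local statistic, the descriptor is invariant to monotonic illumination changes over the neighbourhood, which is what makes it attractive for detecting hands and faces under varying lighting.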
