Look where you're going [robotic wheelchair]

Authors: Yoshinori Kuno, Nobutaka Shimada, Yoshiaki Shirai

DOI: 10.1109/MRA.2003.1191708

Keywords: Wheelchair, Gesture, Artificial intelligence, Human–computer interaction, Mobile robot, Motion control, Robotics, Computer vision, Engineering, Visual servoing, Gesture recognition, User interface

Abstract: We propose a robotic wheelchair that observes both the user and the environment. It can understand the user's intentions from his/her behaviors and from environmental information. It can also respond when he/she is off the wheelchair by recognizing commands indicated by hand gestures. Experimental results show our approach to be promising. Although the current system uses face direction, for people who find it difficult to move their faces it can be modified to use movements of the mouth, eyes, or any other body parts they can move. Since such movements are generally noisy, integrating them with observation of the environment will be effective in understanding the user's intentions, making this a useful technique for better human interfaces.
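The abstract's key idea is that a noisy user signal (e.g., face direction) becomes a reliable command once it is combined with environmental information. A minimal sketch of that fusion step follows; the function name, angle convention, and tolerance threshold are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: treat the user's face direction as a noisy steering
# command and accept it only when the environment offers a matching free path.
# All names and thresholds are assumptions for illustration.

def choose_heading(face_angle_deg, free_directions, tolerance=20.0):
    """Return the free direction (degrees) closest to the user's gaze,
    or None if no safe heading lies within the tolerance."""
    candidates = [d for d in free_directions
                  if abs(d - face_angle_deg) <= tolerance]
    if not candidates:
        return None  # gaze points nowhere safe; do not move
    return min(candidates, key=lambda d: abs(d - face_angle_deg))

print(choose_heading(10.0, [-30.0, 0.0, 45.0]))  # gaze near the straight-ahead free path
print(choose_heading(90.0, [-30.0, 0.0, 45.0]))  # gaze far from any free path
```

Rejecting gazes that match no free direction is one simple way to absorb the noise the abstract mentions: a brief glance toward a wall produces no motion command at all.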

References (13)
Yoshinori Kuno, Teruhisa Murashima, Nobutaka Shimada, Yoshiaki Shirai, Understanding and learning of gestures through human-robot interaction. Intelligent Robots and Systems, vol. 3, pp. 2133-2138 (2000), 10.1109/IROS.2000.895286
Takashi Gomi, Ann Griffith, Developing Intelligent Wheelchairs for the Handicapped. Lecture Notes in Computer Science, pp. 150-178 (1998), 10.1007/BFB0055977
I. Kweon, Y. Kuno, M. Watanabe, K. Onoguchi, Behavior-based mobile robot using active sensor fusion. International Conference on Robotics and Automation, pp. 1675-1682 (1992), 10.1109/ROBOT.1992.220137
R.C. Simpson, S.P. Levine, Adaptive shared control of a smart wheelchair operated by voice control. Intelligent Robots and Systems, vol. 2, pp. 622-626 (1997), 10.1109/IROS.1997.655076
David P. Miller, Marc G. Slack, Design and Testing of a Low-Cost Robotic Wheelchair Prototype. Autonomous Robots, vol. 2, pp. 77-88 (1995), 10.1007/BF00735440
T. Nishimura, T. Mukai, R. Oka, Spotting recognition of gestures performed by people from a single time-varying image. Intelligent Robots and Systems, vol. 2, pp. 967-972 (1997), 10.1109/IROS.1997.655126
Y. Adachi, Y. Kuno, N. Shimada, Y. Shirai, Intelligent wheelchair using visual information on human faces. Intelligent Robots and Systems, vol. 1, pp. 354-359 (1998), 10.1109/IROS.1998.724645
S. Okazaki, Y. Fujita, N. Yamashita, A compact real-time vision system using integrated memory array processor architecture. IEEE Transactions on Circuits and Systems for Video Technology, vol. 5, pp. 446-452 (1995), 10.1109/76.473557
V.I. Pavlovic, R. Sharma, T.S. Huang, Visual interpretation of hand gestures for human-computer interaction: a review. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, pp. 677-695 (1997), 10.1109/34.598226
B. Moghaddam, A. Pentland, Probabilistic visual learning for object representation. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, pp. 696-710 (1997), 10.1109/34.598227