Authors: Yoshinori Kuno, Nobutaka Shimada, Yoshiaki Shirai
Keywords: Wheelchair, Gesture, Artificial intelligence, Human–computer interaction, Mobile robot, Motion control, Robotics, Computer vision, Engineering, Visual servoing, Gesture recognition, User interface
Abstract: We propose a robotic wheelchair that observes both the user and the environment. It can understand the user's intentions from his/her behaviors together with environmental information. It can also observe the user when he/she is off the wheelchair, recognizing commands indicated by hand gestures. Experimental results show our approach to be promising. Although the current system uses face direction, for people who find it difficult to move their faces, it can be modified to use movements of the mouth, the eyes, or any other body parts they can move. Since such movements are generally noisy, integrating them with observation of the environment will be effective in understanding the user's real intentions, a useful technique for better human interfaces.