Authors: Andre Lemme, Ananda Freire, Guilherme Barreto, Jochen Steil
DOI: 10.1016/J.NEUCOM.2012.12.040
Keywords:
Abstract: Pointing at something refers to orienting the hand, arm, head or body in the direction of an object or event. This skill constitutes a basic communicative ability for cognitive agents such as humanoid robots. The goal of this study is to show that approximate and, in particular, precise pointing can be learned as a direct mapping from the object's pixel coordinates in the visual field to hand positions or joint angles. This highly nonlinear mapping defines the pose and orientation of the robot's arm. It underlines that pointing is possible without calculating the depth or 3D position of the target explicitly, since only the pixel coordinates are required. To this aim, three state-of-the-art neural network paradigms (multilayer perceptron, extreme learning machine and reservoir computing) are evaluated on real-world data gathered from the robot iCub. Training data are interactively generated and recorded via kinesthetic teaching in the case of precise pointing. Successful generalization is verified on the iCub using a laser pointer attached to its hand.
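To make the kind of mapping described in the abstract concrete, below is a minimal sketch of an extreme learning machine (ELM) regressor that maps 2D pixel coordinates to joint angles. This is not the authors' implementation: the hidden-layer size, regularization, data shapes and the synthetic placeholder data are all assumptions chosen for illustration only.

```python
# Illustrative sketch (not the authors' code): an ELM regressor mapping
# 2D pixel coordinates of a target to arm joint angles. All dimensions
# and the synthetic data below are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: N samples of (u, v) pixel coordinates -> 4 joint angles.
N, n_in, n_hidden, n_out = 500, 2, 100, 4
X = rng.uniform(0.0, 1.0, size=(N, n_in))    # normalized pixel coordinates
Y = rng.uniform(-1.0, 1.0, size=(N, n_out))  # placeholder joint-angle targets

# ELM principle: random, fixed input weights and biases; only the linear
# readout is trained (here via ridge-regularized least squares).
W_in = rng.normal(0.0, 1.0, size=(n_in, n_hidden))
b = rng.normal(0.0, 1.0, size=(n_hidden,))

def hidden(X):
    """Nonlinear random projection of the inputs."""
    return np.tanh(X @ W_in + b)

H = hidden(X)
lam = 1e-3
W_out = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ Y)

# Prediction: joint angles for a new pixel coordinate.
u_v = np.array([[0.3, 0.7]])
joint_angles = hidden(u_v) @ W_out
print(joint_angles.shape)  # (1, 4)
```

Because only the readout weights are solved for in closed form, such a model can be retrained quickly as new kinesthetic-teaching samples arrive, which matches the interactive data-collection setting described in the abstract.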