Authors: Amir Sadeghipour, Ramin Yaghoubzadeh, Andreas Rüter, Stefan Kopp
DOI: 10.1007/978-3-642-10403-9_20
Keywords: Focus (computing), Human–robot interaction, Embodied cognition, Probabilistic logic, Artificial intelligence, Gesture, Psychology, Representation (systemics), Motor learning, Motor program
Abstract: In this paper we present a biologically inspired model for social behavior recognition and generation. Based on a unified sensorimotor representation, it integrates hierarchical motor knowledge structures, probabilistic forward models for predicting observations, and inverse models for learning. With a focus on hand gestures, results of initial evaluations against real-world data are presented.
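The abstract describes recognition via probabilistic forward models that predict upcoming observations. The following is a minimal illustrative sketch of that general idea, not the authors' implementation: each stored gesture hypothesis acts as a hypothetical forward model that predicts the next hand position, and a belief over hypotheses is updated from how well each prediction matches the observation. All class and function names here are invented for illustration.

```python
import numpy as np


class GestureForwardModel:
    """Hypothetical forward model: a stored prototype trajectory plus Gaussian noise."""

    def __init__(self, name, prototype, noise_std=0.05):
        self.name = name
        self.prototype = np.asarray(prototype)  # (T, D) expected hand positions
        self.noise_std = noise_std

    def likelihood(self, t, observation):
        """Likelihood of the observed position under this model's prediction at time t."""
        t = min(t, len(self.prototype) - 1)
        err = np.linalg.norm(observation - self.prototype[t])
        return np.exp(-0.5 * (err / self.noise_std) ** 2)


def recognize(models, observed_trajectory):
    """Bayesian update of a belief over gesture hypotheses as observations arrive."""
    belief = np.full(len(models), 1.0 / len(models))  # uniform prior
    for t, obs in enumerate(observed_trajectory):
        likes = np.array([m.likelihood(t, obs) for m in models])
        belief *= likes
        total = belief.sum()
        # Renormalize; fall back to the uniform prior if all hypotheses fail.
        belief = belief / total if total > 0 else np.full(len(models), 1.0 / len(models))
    return {m.name: p for m, p in zip(models, belief)}


if __name__ == "__main__":
    wave = GestureForwardModel("wave", [[0.0, 0.0], [0.1, 0.2], [0.0, 0.4]])
    point = GestureForwardModel("point", [[0.0, 0.0], [0.2, 0.0], [0.4, 0.0]])
    observed = np.array([[0.0, 0.0], [0.09, 0.21], [0.02, 0.38]])  # noisy "wave"
    print(recognize([wave, point], observed))
```

This toy version omits the hierarchical motor knowledge structures and the inverse-model learning mentioned in the abstract; it only illustrates prediction-based recognition under stated assumptions.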