Authors: Ali Mollahosseini, Hojjat Abdollahi, Mohammad H. Mahoor
DOI: 10.1109/ROMAN.2018.8525777
Keywords:
Abstract: Social robots are becoming an integrated part of our daily lives, with the goal of understanding humans' social intentions and feelings, a capability often referred to as empathy. Despite significant progress towards the development of empathic agents, current agents have yet to reach full emotional capabilities. This paper presents a recent effort to incorporate an automated Facial Expression Recognition (FER) system, based on deep neural networks, into a spoken dialog robot (Ryan) to extend and enrich its capabilities by integrating the user's affect state into the robot's responses. In order to evaluate whether this incorporation can improve Ryan, we conducted a series of Human-Robot-Interaction (HRI) experiments. In these experiments, subjects watched some videos while Ryan engaged them in a conversation driven by the facial expressions perceived by the robot. We measured the accuracy of FER when interacting with different human subjects, as well as three social/interactive aspects, namely task engagement, empathy, and likability. The results of the HRI study indicate that subjects rated the empathy of the affect-aware Ryan significantly higher than the non-empathic (control condition) Ryan. Interestingly, we found that FER accuracy was not a limiting factor: an agent equipped with a low-accuracy FER system was still rated likable when its perception of the user's expression was recognized by the observer.