Authors: Jun-Wen Tan, Adriano O. Andrade, Hang Li, Steffen Walter, David Hrabal
DOI: 10.1371/journal.pone.0146691
Keywords:
Abstract:

Background: Research suggests that interaction between humans and digital environments characterizes a form of companionship in addition to technical convenience. To this effect, researchers have attempted to design computer systems able to demonstrably empathize with the human affective experience. Facial electromyography (EMG) is one such technique enabling machines to access human affective states. Numerous studies have investigated the effects of valence emotions on facial EMG activity captured over the corrugator supercilii (frowning muscle) and the zygomaticus major (smiling muscle). The arousal dimension of emotion, however, has not received much research attention. In the present study, we sought to identify intensive affective states via facial EMG activity.

Methods: Ten blocks of affective pictures were separated into five categories: neutral valence/low arousal (0VLA), positive valence/high arousal (PVHA), negative valence/high arousal (NVHA), positive valence/low arousal (PVLA), and negative valence/low arousal (NVLA), and the ability of each to elicit the corresponding affective state was tested at length. One hundred thirteen participants were subjected to these stimuli and provided facial EMG. A set of 16 features based on the amplitude, frequency, predictability, and variability of the EMG signals was defined and classified using a support vector machine (SVM).

Results: We observed highly accurate classification rates based on the combined EMG, ranging from 75.69% to 100.00% for the baseline and five affective states (0VLA, PVHA, PVLA, NVHA, NVLA) across all individuals. There were significant differences in classification rate accuracy between senior and young adults, but there was no difference between female and male participants.

Conclusion: Our study provides robust evidence for the recognition of intensive affective states in young and senior adults. These findings contribute to the successful future application of facial EMG for identifying user affective states in human-machine interaction (HMI) or companion robotic systems (CRS).
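The methods described above (amplitude-, frequency-, and variability-based features extracted from EMG signals and fed to an SVM) can be sketched in a minimal form. The exact 16 features and the signal data are not given in the abstract, so everything below is illustrative: the feature functions are common EMG stand-ins (RMS, mean absolute value, zero-crossing rate, variance), the signals are synthetic noise bursts, and only two of the five affective classes are simulated.

```python
# Hedged sketch of feature-based SVM classification of EMG, assuming
# scikit-learn and NumPy. All feature choices and signal parameters here
# are illustrative, not the study's actual pipeline.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def emg_features(signal):
    """Four illustrative features per trial (the study defined 16)."""
    rms = np.sqrt(np.mean(signal ** 2))            # amplitude
    mav = np.mean(np.abs(signal))                  # amplitude
    zcr = np.mean(np.diff(np.sign(signal)) != 0)   # frequency proxy
    var = np.var(signal)                           # variability
    return np.array([rms, mav, zcr, var])

def make_trial(gain):
    """Synthetic 1-s EMG burst at 1 kHz; `gain` mimics activation level."""
    return gain * rng.normal(size=1000)

# Two synthetic classes, e.g. low vs. high zygomaticus activation.
X = np.array([emg_features(make_trial(g)) for g in [1.0] * 100 + [3.0] * 100])
y = np.array([0] * 100 + [1] * 100)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"held-out accuracy: {acc:.2f}")
```

On this deliberately easy synthetic task the classifier separates the two classes almost perfectly; the study's reported 75.69-100.00% range reflects the harder real five-class problem.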