Cooperative dancing with an industrial manipulator: Computational cybernetics complexities

Authors: Figen Özen, Dilek Bilgin Tükel, Kübra Tural

DOI: 10.1109/SMC.2016.7844526

Abstract: The synchronization of music and dance for industrial manipulators is studied, simulated, and realized, and the results are presented in this paper. To formulate the dance, a modified Labanotation software has been developed, and a user-friendly human-machine interface has been designed to let the user program the robot easily. The system incorporates a music analyzer, employing the Fast Fourier Transform, to track the beats, and the robot dances in synchrony with the input from the analyzer. Experiments were carried out on a six degree-of-freedom robot.
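The paper does not give implementation details of its FFT-based beat tracker, but the general idea can be sketched as follows: frame the audio, take the magnitude FFT of each frame, compute the positive spectral flux between consecutive frames (energy increases mark onsets), and find the dominant periodicity of the flux curve by autocorrelation. All function names and parameters below are illustrative assumptions, not the authors' code.

```python
import numpy as np

def estimate_beat_period(signal, sr, frame=1024, hop=512):
    """Estimate the beat period in seconds via FFT-based spectral flux.

    A minimal sketch of one common beat-tracking approach; the paper's
    actual analyzer may differ.
    """
    n_frames = 1 + (len(signal) - frame) // hop
    # Magnitude spectrum of each windowed frame.
    mags = np.array([
        np.abs(np.fft.rfft(signal[i * hop:i * hop + frame] * np.hanning(frame)))
        for i in range(n_frames)
    ])
    # Positive spectral flux: summed energy increases between frames.
    flux = np.maximum(np.diff(mags, axis=0), 0.0).sum(axis=1)
    flux -= flux.mean()
    # Autocorrelation of the flux reveals the beat period.
    ac = np.correlate(flux, flux, mode="full")[len(flux) - 1:]
    # Search lags corresponding to 0.25-2.0 s (30-240 BPM).
    lo, hi = int(0.25 * sr / hop), int(2.0 * sr / hop)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return lag * hop / sr

# Synthetic check: clicks every 0.5 s (120 BPM) in 8 s of "audio".
sr = 8000
signal = np.zeros(8 * sr)
signal[::sr // 2] = 1.0                                   # impulse train
signal = np.convolve(signal, np.hanning(64), mode="same")  # soften clicks
period = estimate_beat_period(signal, sr)
```

With a hop of 512 samples the estimate is quantized to hop-sized lags, so the recovered period lands near 0.5 s rather than exactly on it; a real system would interpolate the autocorrelation peak and feed the beat phase to the robot controller.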
