Authors: Myriam Desainte-Catherine, Nadine Couture, Joseph Larralde, Alexis Clay, Pierre-Henri Vulliard
DOI:
Keywords:
Abstract: The augmented ballet project aims at gathering research from several fields and directing it towards a single application case: adding virtual elements (visual and acoustic) to a live dance performance, allowing the dancer to interact with them. In this paper, we describe a novel interaction that was used in the frame of this project: using the dancer's movements to recognize the emotions he expresses, and using these emotions to generate musical audio flows that evolve in real time. The originality of this interaction is threefold. First, it covers the whole cycle from input (the movements) to output (the generated music). Second, the interaction isn't direct but goes through a high level of abstraction: the emotional expression recognized in the movements is the source of the music generation. Third, it has been designed and validated in constant collaboration with a choreographer, culminating in a performance in front of an audience.