
Jul 14, 2007

The EyesWeb Project

Via Networked Performance



From InfoMus Lab (Laboratorio di Informatica Musicale), Genova, Italy

The EyesWeb Project - The EyesWeb research project aims to explore and develop models of interaction by extending music language toward gesture and visual languages, with a particular focus on understanding affect and expressive content in gesture. For example, EyesWeb aims to develop methods that can distinguish the differing expressive content of two instances of the same movement pattern, e.g., two performances of the same dance fragment. The research addresses the fields of KANSEI Information Processing and the analysis and synthesis of expressiveness in movement. More.

The EyesWeb open platform (free download) was originally conceived to support research on multimodal expressive interfaces and multimedia interactive systems. EyesWeb has also been widely used for designing and developing real-time dance, music, and multimedia applications. It supports the user in experimenting with computational models of non-verbal expressive communication and in mapping gestures from different modalities (e.g., human full-body movement, music) onto multimedia output (e.g., sound, music, visual media). It allows fast development and experimentation cycles for interactive performance set-ups by including a visual programming language that supports mapping, at different levels, of movement and audio into integrated music, visual, and mobile scenery.
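To make the idea of such a mapping concrete, here is a minimal sketch of the kind of gesture-to-output mapping an EyesWeb patch expresses visually: a movement feature is clamped and scaled onto a sound-synthesis parameter. All names here are illustrative assumptions, not part of the actual EyesWeb API.

```python
def map_movement_to_amplitude(motion_energy, lo=0.0, hi=1.0):
    """Hypothetical mapping stage: clamp a movement-energy feature
    into [0, 1], then scale it linearly onto an output amplitude
    range [lo, hi]. In a real patch this would feed a synthesizer
    or visual parameter rather than return a number."""
    e = max(0.0, min(1.0, motion_energy))
    return lo + e * (hi - lo)

# An energetic gesture drives the amplitude toward the top of the range;
# values outside [0, 1] are clamped before scaling.
print(map_movement_to_amplitude(0.5))            # mid-range energy
print(map_movement_to_amplitude(2.0, 0.2, 0.8))  # clamped, then scaled
```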

EyesWeb has been designed with a special focus on the analysis and processing of expressive gesture in movement, MIDI, audio, and music signals. It was the basic platform of the EU-IST Project MEGA and has been employed in many artistic performances and interactive installations. More.
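As a rough illustration of the kind of expressive-movement feature such a platform might extract from video, the sketch below approximates overall motion activity as the mean absolute difference between two consecutive grayscale frames. This is an assumption for illustration only: frames are plain lists of pixel intensities, whereas a real system would work on camera input with silhouette extraction.

```python
def motion_activity(prev_frame, curr_frame):
    """Mean absolute pixel difference between two equal-length
    grayscale frames (each a flat list of intensities). Larger
    values indicate more movement between the frames."""
    if len(prev_frame) != len(curr_frame):
        raise ValueError("frames must have the same number of pixels")
    total = sum(abs(a - b) for a, b in zip(prev_frame, curr_frame))
    return total / len(curr_frame)

# A still scene yields zero activity; a uniform brightness jump of 10
# across all pixels yields an activity of 10.
print(motion_activity([5, 5, 5, 5], [5, 5, 5, 5]))
print(motion_activity([0, 0, 0, 0], [10, 10, 10, 10]))
```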

16:15 Posted in Future interfaces | Tags: cybermusic
