

Dec 06, 2009

The iPhone Orchestra

The Stanford Mobile Phone Orchestra (MoPhO) is a new repertoire-based ensemble that uses mobile phones as musical instruments. MoPhO's interactive musical works take advantage of the unique technological capabilities of today's hardware and software, transforming multi-touch screens, built-in accelerometers, built-in microphones, GPS, data networks, and onboard computation into powerful yet portable chamber meta-instruments.
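As a rough illustration of the kind of sensor-to-sound mapping such an ensemble relies on (this is not MoPhO's actual code; the sensor range and the choice of scale are assumptions), tilting a phone could select pitches from a fixed scale:

```python
# Hypothetical sketch: map accelerometer tilt to a pentatonic pitch.
# Not actual MoPhO code; the sensor range (-1.0..1.0 g on one axis)
# and the pentatonic scale are illustrative assumptions.

PENTATONIC = [60, 62, 64, 67, 69, 72]  # MIDI note numbers, C major pentatonic

def tilt_to_note(accel_x: float) -> int:
    """Map an accelerometer x reading in [-1.0, 1.0] to a scale degree."""
    # Clamp to the expected sensor range.
    accel_x = max(-1.0, min(1.0, accel_x))
    # Normalize to [0, 1] and round to an index into the scale.
    index = int((accel_x + 1.0) / 2.0 * (len(PENTATONIC) - 1) + 0.5)
    return PENTATONIC[index]

def midi_to_freq(note: int) -> float:
    """Standard equal-temperament conversion (A4 = MIDI 69 = 440 Hz)."""
    return 440.0 * 2.0 ** ((note - 69) / 12.0)
```

In a real instrument the resulting frequency would drive a synthesis engine on the phone; the point here is only the continuous-sensor-to-discrete-pitch mapping.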

The researcher behind the idea, Ge Wang, believes cell phones are becoming so powerful that we “cannot ignore them anymore as platforms for creativity. . . . It levels the playing ground in some ways, because everyone has a cell phone.”

 



The Stanford Mobile Phone Orchestra’s performance on December 3 in Palo Alto, CA used Apple iPhones amplified by speakers attached to small fingerless gloves. Here is a video of the concert.


Jul 14, 2007

The EyesWeb Project

Via Networked Performance


From InfoMus Lab (Laboratorio di Informatica Musicale), Genova, Italy:

The EyesWeb Project - The EyesWeb research project aims to explore and develop models of interaction by extending music language toward gesture and visual languages, with a particular focus on understanding affect and expressive content in gesture. For example, EyesWeb aims to develop methods that can distinguish the expressive content of two instances of the same movement pattern, e.g., two performances of the same dance fragment. The research addresses the fields of KANSEI Information Processing and the analysis and synthesis of expressiveness in movement. More.

The EyesWeb open platform (free download) was originally conceived to support research on multimodal expressive interfaces and multimedia interactive systems. EyesWeb has also been widely employed in designing and developing real-time dance, music, and multimedia applications. It supports the user in experimenting with computational models of non-verbal expressive communication and in mapping gestures from different modalities (e.g., human full-body movement, music) onto multimedia output (e.g., sound, music, visual media). It allows fast development and experimentation cycles for interactive performance set-ups through a visual programming language that maps movement and audio, at different levels, into integrated music, visual, and mobile scenery.
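The core idea of such a patch can be sketched in plain code (EyesWeb itself is a visual programming environment, so this is only an illustration of the mapping concept; the feature name and value ranges are assumptions): extract an expressive feature from movement data, then map it onto a sound parameter.

```python
# Hypothetical sketch of the mapping idea an EyesWeb patch implements
# visually: derive a movement-energy feature from tracked points and
# map it to an amplitude. Not EyesWeb's API; ranges are assumptions.

def quantity_of_motion(frames):
    """Mean frame-to-frame displacement of tracked coordinates — a
    simple movement-energy feature over a sequence of frames."""
    total = 0.0
    for prev, cur in zip(frames, frames[1:]):
        total += sum(abs(a - b) for a, b in zip(prev, cur)) / len(cur)
    return total / max(len(frames) - 1, 1)

def motion_to_amplitude(qom, qom_max=10.0):
    """Map movement energy onto an amplitude in [0.0, 1.0]."""
    return min(qom / qom_max, 1.0)
```

More energetic movement drives a louder output; a real patch would chain many such feature extractors and mappings at different levels.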

EyesWeb was designed with a special focus on the analysis and processing of expressive gesture in movement, MIDI, audio, and music signals. It was the basic platform of the EU-IST Project MEGA and has been employed in many artistic performances and interactive installations. More.
