June 3rd (Thursday): Claudia Robles (Colombia, resident in Cologne – Germany)
The Use of Bio-interfaces in My Interactive Multimedia Performances.
5.15 pm – Parker Gallery (ground floor), London Metropolitan University, 41-71 Commercial Road, London E1:
There will be a live performance of the piece INsideOUT (2009), which uses two computers, both running Max/MSP/Jitter, two screens, an EEG (electroencephalogram) interface and a quadraphonic sound field. The performer's brain waves control the video and sound throughout the performance.
After the performance, she will discuss this piece and another of her works that also uses biofeedback.
Claudia Robles on her work: “I am particularly interested in the interaction between media (audio and visual) and bio-data from performers or from an audience. This is done using biofeedback – the process of measuring physiological data from a subject, analyzing the data, and feeding it back to the subject. This presentation is about two interactive performances that I created using bio-interfaces: an EMG (electromyogram) and an EEG (electroencephalogram). Both pieces, programmed as real-time media works in Max/MSP/Jitter, are described below:
Seed/Tree (2005), installation: a Butoh performance created during an artist residency at the ZKM (Center for Art and Media) in Karlsruhe, Germany. The installation involves two types of interactivity. The first is the interaction between dance and sound: the performers have microphones and EMG electrodes attached to their bodies, and the breathing and heartbeat of two of the performers produce sounds that are continuously modified by the muscular tension of the third dancer. The second is the interaction between the installation space and the visitors: during the performance, visitors can walk freely around the virtual forest, and their presence interacts with the video projections.
INsideOUT (2009): this project was created during an artist-in-residence program at the Academy of Media Arts in Cologne, Germany. The performer, who is surrounded by sound and images, interacts with them using an EEG (electroencephalogram) interface that measures the performer's brain activity. The sounds and images – already stored in the computer – are continuously modified, via Max/MSP/Jitter, by the values from two electrode combinations.”
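For readers curious about the underlying mechanics, the biofeedback loop described above (measure a physiological signal, analyze it, feed it back into the media) can be sketched in a few lines of code. This is a purely illustrative Python sketch, not the artist's actual Max/MSP/Jitter patch; all function names, parameter mappings, and calibration values are hypothetical.

```python
# Hypothetical sketch of one step of a biofeedback mapping loop:
# a raw sensor reading (e.g. an EEG band amplitude) is normalized
# against a calibration range, then mapped onto media parameters.
# This is NOT the actual Max/MSP/Jitter implementation of INsideOUT.

def normalize(value, lo, hi):
    """Clamp a raw sensor reading into the 0.0-1.0 range."""
    if hi <= lo:
        raise ValueError("invalid calibration range")
    clamped = min(max(value, lo), hi)
    return (clamped - lo) / (hi - lo)

def map_to_media(norm):
    """Map one normalized bio-signal onto two media parameters."""
    return {
        "sound_amplitude": norm,         # more activity, louder sound
        "video_brightness": 1.0 - norm,  # inverse mapping, for contrast
    }

# Example: a raw reading of 30 (arbitrary units), calibrated to 10-50
params = map_to_media(normalize(30.0, 10.0, 50.0))
```

In a real performance system this step would run continuously, once per incoming sensor frame, with the resulting parameters sent on to the audio and video engines.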