o.m.2 – g.i. – p.p.

Composition for BodySuit and Interactive Video


The sound for the original composition, for instruments and computer, was created at IRCAM from April to July 1997. Some of the sections were later modified and adapted for use with BodySuit, creating this new version.


Initially, Max with the ISPW on a NeXT machine was used to generate the computer sound. The sound synthesis methods programmed for this composition are based on additive synthesis, FM synthesis, and granular synthesis; these have since been ported to Max/MSP. The algorithm is built on the idea of creating a mechanical texture that gradually transforms as time progresses. The parameters are determined by controlled random data sent through many levels of hierarchy. The granular synthesis was specifically programmed to interpolate the sound continuously.
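The original patches are in Max/MSP and are not reproduced here, but the idea of controlled random data passed through several hierarchical levels, with granular parameters constantly interpolated from one texture toward the next, can be sketched in a few lines of Python. The function names, the number of levels, and the halving spread are illustrative assumptions, not the original algorithm:

    import random

    def controlled_random(center, spread, rng):
        # Draw one value near `center`, within +/- `spread`.
        return center + rng.uniform(-spread, spread)

    def grain_parameter(section_center, rng, levels=3, spread=0.5):
        # Cascade controlled randomness through several hierarchical
        # levels: each level re-centers the next draw and halves its range.
        value = section_center
        for _ in range(levels):
            value = controlled_random(value, spread, rng)
            spread *= 0.5
        return value

    def interpolated_grain_stream(start_pitch, end_pitch, n_grains, rng):
        # Constant interpolation: each grain's center drifts from the
        # starting texture toward the target texture.
        for i in range(n_grains):
            t = i / (n_grains - 1)          # 0.0 -> 1.0 across the stream
            center = (1 - t) * start_pitch + t * end_pitch
            yield grain_parameter(center, rng)

    rng = random.Random(1997)
    for pitch in interpolated_grain_stream(220.0, 440.0, 8, rng):
        print(f"{pitch:7.2f} Hz")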


This composition is based on the density of texture and on the alternation between the dynamic and the statistical aspects of movement. The ideas of the composition are summarized in the title, whose initials stand for: o.m. = onomatopoeia and montage, both of which can be heard clearly in this composition; 2 = second version; g. = granular; i. = interpolation; p.p. = poly-phase.


The mechanical textures are superimposed on one another, which at the same time creates polytempo. In each section the texture starts in one shape and then gradually transforms into another. Not only within each section, but also across the whole piece, the overall phase gradually transforms and intensifies.
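As a rough illustration (the tempi and the drift formula below are invented, not taken from the score), superimposing layers that tick at unrelated speeds yields polytempo, while a slowly accumulating offset makes the overall phase of the texture transform over time:

    def layer_onsets(tempo_bpm, duration_s, phase_drift=0.0):
        # Onset times (in seconds) for one mechanical layer; a slowly
        # accumulating offset transforms the layer's phase over time.
        period = 60.0 / tempo_bpm
        onsets, n, t = [], 0, 0.0
        while t < duration_s:
            onsets.append(round(t, 3))
            n += 1
            t = n * period + phase_drift * n * n * 0.001
        return onsets

    # Three superimposed layers at unrelated tempi -> polytempo.
    for bpm in (90, 112, 140):
        print(bpm, layer_onsets(bpm, 3.0, phase_drift=0.5))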


The form is intentionally simplified, resembling a succession of "block-type" sections. Static sections anticipate the kinetic sections that always follow, and the two alternate abruptly throughout the piece. This idea of form was originally experimented with in a previous composition; here it is developed toward further possibilities.


To perform this composition, a performer wears BodySuit, which has 12 sensors attached at the joints of the body. BodySuit functions as a gestural interface: depending on the movement, sound and video images change in real time. In this it differs from a traditional instrument or a controller. The player performs with larger movements, such as stretching and bending the joints, twisting the arms, and so on. This gesture does not function as dance or theater; it contains, however, an element of "performance" within the live musical context. The gestures are not fixed in advance in a strict sense. An audience may observe an obvious difference in the intensity of movement between the static and kinetic sections of the composition.


BodySuit does not produce any sound by itself. Sound is generated by a program written in Max/MSP, so the result can be changed widely according to how the mapping is programmed. In the same manner, a similar gesture may produce a very different result in other sections of the piece.
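As a minimal sketch of this remappability, the following fragment assumes a hypothetical stream of 12 normalized joint-sensor values and shows how the same gesture can be routed to very different synthesis parameters in different sections. The section table, joint indices, and parameter names are invented for illustration, since the actual mapping lives in the Max/MSP patch:

    SECTIONS = {
        "static": {
            "grain_rate": lambda s: 5 + 20 * s[3],      # elbow sensor -> sparse grains
            "pitch":      lambda s: 200 + 100 * s[7],   # knee sensor  -> narrow band
        },
        "kinetic": {
            "grain_rate": lambda s: 50 + 400 * s[3],    # same joints, very different result
            "pitch":      lambda s: 100 + 1500 * s[7],
        },
    }

    def map_gesture(sensors, section):
        # Translate 12 joint-sensor values into synthesis parameters
        # according to the mapping of the current section.
        return {name: fn(sensors) for name, fn in SECTIONS[section].items()}

    sensors = [0.2] * 12
    sensors[3], sensors[7] = 0.8, 0.5       # bend the elbow, flex the knee
    print(map_gesture(sensors, "static"))
    print(map_gesture(sensors, "kinetic"))  # similar gesture, different sound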

BodySuit was developed by the engineer Patrice Pierrot at IRCAM.


The interactive video part was realized with Max/Jitter, which allows a previously prepared movie file and a picture file to be modified in real time. These can affect each other by being juxtaposed as foreground, background, and displacement. In this composition, these effects are controlled by BodySuit.
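A rough sketch of this kind of compositing, with NumPy standing in for Max/Jitter; the blend and displacement formulas, like the frame sizes, are assumptions for illustration only:

    import numpy as np

    def composite(movie_frame, picture, sensor):
        # Blend the movie frame (foreground) over the picture (background)
        # and shift the result horizontally; `sensor` in 0..1 controls
        # both the mix and the displacement.
        blended = sensor * movie_frame + (1 - sensor) * picture
        shift = int(sensor * blended.shape[1] * 0.25)   # up to a quarter frame
        return np.roll(blended, shift, axis=1)

    movie_frame = np.ones((4, 8))     # tiny stand-in frames
    picture = np.zeros((4, 8))
    print(composite(movie_frame, picture, 0.5))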

   

 


