20:00 and 21:00
Aula, KHM, Filzengraben 2, 50676 Cologne, Germany
1. Description of performance
In this performance, we link the real world to the online reality of Second Life. We do this at the physical level: the bodily movements of a person in the real world control an avatar in Second Life, while an avatar’s movements guide a human being’s.
Second Life is a 3D online digital platform created by its residents. Here, each flesh-and-blood human creates a unique identity – an avatar – to inhabit the digital community environment. The contact between these realities is usually effected through a monitor, keyboard and mouse. In this work, the whole body takes part; we develop technology that makes communication between the two realities two-way and physical.
A movement-registering mechanism built into the BodySuit allows a person’s movements to directly control the behavior of an avatar in Second Life. Conversely, the Powered Suit contains motors that control the human body like a marionette. It is a sort of robot you can wear, controlled by an avatar.
With this technology, geography, location and space no longer hinder physical interaction between bodies. A Second Life avatar becomes a vehicle for physically connecting the individual to society. This could make it possible for people all over the world with the correct hardware to share each other’s bodies over the Internet.
We play with our perception of an individual’s body as his or her identity. In this project, we do not know exactly who is controlling the Powered Suit or where in the world they are. Our bodies become a combination of ‘real’ and computer-generated information and are thereby augmented. Perhaps this could lead to new ways of using our bodies, ways we can discover only by controlling them from outside, through the Internet.
2. How to participate in or see the performance with your avatar in Second Life
You can see or participate in this Interactive Installation or Performance with your avatar:
Download a customized version of the Second Life Viewer program from one of the following locations:
Go into the building and go up to the top where there is a concert hall.
The performance will take place on the stage in the concert hall. On the stage, there will be two avatars, one that is controlled by the BodySuit and another that is connected to the Powered Suit.
If you would like to participate in the performance by controlling the Powered Suit (the robot) with your avatar, go up to the stage and enter the zone on the right. You then become the Powered Avatar and can control the Powered Suit in real time: depending on your avatar’s gestures, the dancer wearing the Powered Suit will perform as you direct. You can leave at any time and let other avatars try. You can also try to play some musical instruments with the Powered Suit. The Powered Suit can be controlled by only one avatar at a time.
If you would like to participate in the performance by having your avatar controlled by the BodySuit, go up to the stage and enter the zone on the left. You then become the BodySuit Avatar and are controlled by the BodySuit in real time: depending on the gestures of the BodySuit in real life, the dancer wearing the BodySuit will control your avatar. Although your avatar is under this control, you can leave at any time and let other avatars try. You can also try to play some virtual musical instruments on this stage. Only one avatar can enter this zone at a time.
Behind the stage, there is a screen showing the real stage, so you can watch how the real stage, with the BodySuit and Powered Suit, and the virtual stage, with our avatars and yours, perform together.
3. Credits
Suguru Goto (Initiator of project, Concept, Inventor of BodySuit and Powered Suit, Music, Design of images)
Bernd Voss (Technical support, Academy of Media Arts Cologne)
Martin Nawrath (Electronic Technique, Academy of Media Arts Cologne)
Robert O'Kane (Unix Systems Administrator)
Sam Blanchard (Elbows of Powered Suit Technique, Ohio University)
Japan Electronics College (3D images support and cooperation)
Shinji Sasada (3D image direction)
Yuuta Ishii (3D images, Second Life 3D Images)
Tetsuya Sasaki (3D images)
Miho Odagiri (Design)
Assistants:
Jochen Arne Otto (Powered Suit Technique)
Lorenz Wortmann (Powered Suit Technique)
Julia Wewers (Powered Suit Technique)
Anne-Marlen Gaus (Artistic Assistant)
Marianne Heinz (Artistic Assistant)
Veronika Schyra (Second Life Interface)
Very Special Thanks to: Anthony Moore, Heide Hageboelling, Academy of Media Arts Cologne
Special Thanks to: Klanglabor at KHM, Katherine Milton, Nathaniel Berger, The Aesthetic Technologies Lab - Ohio University, Christopher Keesey, IRCAM, Fuminori Yamasaki (iXs Research Corp.), Nir Bakshy, Cornelius Pöpel
4. Introduction of the System
This description introduces the system combining the “BodySuit”, the “Powered Suit”, and “Second Life”, as well as its possibilities and its uses in an artistic application.
The proposed system contains both a gesture controller and a robot. The data suit, “BodySuit”, controls the avatar in “Second Life”, and “Second Life” controls the exoskeleton, “Powered Suit”, in real time; the two suits are linked to each other through “Second Life” over the Internet. The “BodySuit” has no hand-held controller: a performer, for example a dancer, wears the suit, and sensors transform gestures into electronic signals.
The “Powered Suit” is another suit worn by a dancer, but here the gestures are generated by motors. It is a sort of wearable robot.
“Second Life” is software developed by Linden Lab. It allows users to create a virtual world and a virtual human, an avatar, on the Internet.
The idea behind combining the “BodySuit”, the “Powered Suit”, and “Second Life” is that the human body is augmented by electronic signals and reflected in a virtual world, so that it can perform interactively.
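As a rough illustration of this loop (all function, joint, and value names below are hypothetical, not the project’s actual code), one cycle of the system could be sketched as:

```python
# Hypothetical sketch of one netBody cycle:
# BodySuit sensors -> avatar in Second Life -> Powered Suit motors.

def bodysuit_read():
    """Stand-in for reading the BodySuit's bend sensors through the
    A/D interface; returns assumed raw 10-bit values (0-1023) per joint."""
    return {"left_elbow": 512, "right_knee": 768}

def to_avatar_pose(raw):
    """Normalize raw sensor values to 0.0-1.0 joint positions for the avatar."""
    return {joint: value / 1023.0 for joint, value in raw.items()}

def to_motor_targets(pose):
    """Turn avatar joint positions into Powered Suit motor targets,
    clamped so the wearable robot never receives an out-of-range command."""
    return {joint: min(1.0, max(0.0, p)) for joint, p in pose.items()}

# Dancer's gesture -> avatar -> wearable robot on another dancer.
pose = to_avatar_pose(bodysuit_read())
targets = to_motor_targets(pose)
```

The point of the sketch is only the direction of data flow: gestures become signals, signals become a virtual pose, and the virtual pose becomes physical motion again.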
5. Detailed Description of the Existing System, Gesture Controller – “BodySuit”
The “BodySuit” has 12 sensors placed on the joints of the body: the wrist, elbow, and shoulder of the left and right arms, and the ankle, knee, and top of the leg on each side. The bending sensors are placed on the outer sides of the arms and the front sides of the legs and are fixed to the suit. Each sensor is connected by a cable to a box, which is in turn connected to an A/D interface.
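As an illustration of how such readings might be handled (the sensor names, calibration values, and angle range here are assumptions, not the project’s actual firmware), raw A/D values from the 12 bending sensors could be converted to joint bend angles like this:

```python
# Hypothetical sketch: convert raw A/D readings from the 12 bending sensors
# into bend angles in degrees, using a per-sensor calibration taken once
# while the dancer holds a straight pose and a fully bent pose.

SENSORS = [  # one bending sensor per joint, left and right
    "l_wrist", "l_elbow", "l_shoulder", "r_wrist", "r_elbow", "r_shoulder",
    "l_ankle", "l_knee", "l_hip", "r_ankle", "r_knee", "r_hip",
]

# (straight_reading, bent_reading) per sensor; identical here for simplicity.
CALIBRATION = {name: (100, 900) for name in SENSORS}

def bend_angle(name, raw, max_angle=90.0):
    """Linearly map a raw reading onto 0..max_angle degrees, clamped."""
    lo, hi = CALIBRATION[name]
    t = (raw - lo) / (hi - lo)
    return max(0.0, min(1.0, t)) * max_angle

angles = {name: bend_angle(name, 500) for name in SENSORS}
```

Per-sensor calibration matters because flex sensors drift and no two joints produce the same raw range.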
The performer’s gestures therefore do not have to be based on playing an instrument, but can be liberated into larger gestures, like a mime’s. This allows for collaboration with people from other fields, for instance dancers or actors.
The audience can easily observe these larger movements; the system is thus well suited to performance and musical-theater situations.
Since this is not a physical controller or instrument held in the hands, it lends itself to the ideas of the “Augmented Body” or “Extended Body” in the work: the performer’s body is amplified by electric signals, either to control something remotely or to extend an abstract gesture into a meaningful one.
6. “Powered Suit” / Robotic Suit
The idea of the “Powered Suit” is to have motors installed in each major joint of a suit that corresponds to a major joint on a human body: shoulders, elbows, wrists, hips, knees, ankles, and head / neck.
Using software, each joint motor could be controlled, and the body of the performer wearing this suit could therefore be remotely controlled, as if he or she were a marionette.
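Remotely driving a human body demands safety margins. A minimal sketch of one way this might be handled (the joint limits and step size are invented for illustration, not the Powered Suit’s actual parameters): each joint-motor command is clamped to the joint’s range and rate-limited before it reaches the motors.

```python
# Hypothetical sketch: limit both the range and the speed of each Powered
# Suit joint motor, so a sudden avatar movement cannot jerk the dancer.

JOINT_LIMITS = {"elbow": (0.0, 140.0), "knee": (0.0, 120.0)}  # degrees, assumed
MAX_STEP = 5.0  # max change in degrees per control tick, assumed

def next_command(joint, current, target):
    """Move `current` toward `target`, respecting joint range and max speed."""
    lo, hi = JOINT_LIMITS[joint]
    target = max(lo, min(hi, target))                       # clamp to range
    step = max(-MAX_STEP, min(MAX_STEP, target - current))  # limit the speed
    return current + step

# A wild avatar gesture (elbow commanded to 500 degrees) is tamed
# into small, safe steps toward the joint's physical maximum.
cmd = next_command("elbow", current=90.0, target=500.0)
```

The same filter runs every control tick, so extreme network-borne commands arrive at the body only as smooth motion.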
7. About “Second Life”
“Second Life” is written using C++, and the “Second Life” Viewer is now an Open Source program, so the source code is available and fully customizable.
“Second Life” supports the creation of objects – customizable and controllable items which are basic building blocks of all the things you see in the world (that aren't avatars) – and provides its own in-program scripting language called the Linden Scripting Language (LSL) that allows for control of objects and some aspects of avatars (with the proper permissions). Beyond C++ programming, some in-program LSL programming may be used to implement this project.
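How the custom software exchanges pose data with an in-world LSL script is implementation-specific. As a hedged sketch (the comma-separated wire format, the field order, and the size budget are assumptions for illustration), an external bridge program might pack a pose update into one compact string that an LSL script could split apart with `llParseString2List`:

```python
# Hypothetical sketch of a bridge wire format: 12 joint angles packed into
# one small comma-separated string, easy for an LSL script to parse.

JOINTS = ["l_wrist", "l_elbow", "l_shoulder", "r_wrist", "r_elbow",
          "r_shoulder", "l_ankle", "l_knee", "l_hip", "r_ankle",
          "r_knee", "r_hip"]

def pack_pose(angles):
    """Serialize {joint: degrees} into 'joint=deg,joint=deg,...'."""
    return ",".join(f"{j}={angles[j]:.1f}" for j in JOINTS)

def unpack_pose(message):
    """Parse the string back into {joint: degrees}; the in-world script
    would do the equivalent with LSL list functions."""
    pairs = (field.split("=") for field in message.split(","))
    return {j: float(v) for j, v in pairs}

pose = {j: 45.0 for j in JOINTS}
msg = pack_pose(pose)  # round-trips losslessly through unpack_pose
```

Keeping the message short and flat is deliberate: LSL scripts have tight memory and string-handling budgets, so a simple delimited format is safer than a rich one.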
This new system is used in the project entitled "netBody - Augmented Body and Virtual Body II." In particular, it deepens the project’s theme, exploring this dualism and the relationship between the artificiality and reality of the human body in a context of dance/musical theater. Different realities are connected using physical interfaces: the virtual (Internet) world and the human body (augmented by a robot). The world we usually call the “real” world connects with the Internet-based world called “Second Life”. The project involves developing hardware and software to achieve deeper communication between these worlds via the Internet, so that ultimately the actions of one world are reflected in the other. Specifically, the avatar (a unique character or identity) in Second Life is controlled by the movements of a human body, and a human body is controlled by the movements of an avatar. This is done using two types of physical interfaces - a motion-capture suit called “BodySuit” and a robotic, controllable suit called “Powered Suit” - together with custom software that enables communication between the “Second Life” software and the physical interfaces. These developments will be further explored in performance contexts in artistic works. The Internet is used to accomplish physical communication between bodies that would normally be impeded by geography and space constraints.