Second version of our "Rephlexion" interactive installation. The team (Phillipe, Mag, Roy) got back together for a week-long residency aimed at finalising a project started a year ago. Probably the only way to be really efficient, considering each of us lives in a different place.

We ended up with a fully spatialized audiovisual response to the users' presence and movements. Four speakers on each side of the interactive area allow us to manipulate the sound in 3D. One side of the tunnel is used for rear projection, and the other is lit from above to optimise the background subtraction for the camera vision.
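
For the curious, background subtraction boils down to comparing each camera frame against a stored shot of the empty tunnel. Here is a minimal Processing sketch of the idea, assuming the standard video library; the camera size and threshold value are made up, and the real vision patch is of course far more involved.

import processing.video.*;

Capture cam;
PImage bgFrame;        // reference shot of the empty tunnel
float threshold = 40;  // arbitrary brightness-difference threshold

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
}

void draw() {
  if (!cam.available()) return;
  cam.read();
  if (bgFrame == null) {
    bgFrame = cam.get();  // grab the empty scene once, on the first frame
    return;
  }
  cam.loadPixels();
  bgFrame.loadPixels();
  loadPixels();
  for (int i = 0; i < pixels.length; i++) {
    // a pixel is foreground if it differs enough from the stored background
    float diff = abs(brightness(cam.pixels[i]) - brightness(bgFrame.pixels[i]));
    pixels[i] = diff > threshold ? color(255) : color(0);
  }
  updatePixels();
}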

The space is divided into four zones. Each zone triggers a generative graphic and sound response, itself influenced by body position (arms, head and hands). Once a second person enters an interactive area, symbolic links are displayed and the composition becomes collaborative. It is then up to the public to use it as they wish.
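
The zone logic itself is simple quantisation. Here is a rough Processing sketch of how tracked positions could be mapped to the four zones and how a second visitor might switch on the links; the even spacing, fake positions and drawing are our own simplifications, not the actual patch.

float[] visitorX = { 0.15, 0.62 };  // fake normalised positions of two visitors

void setup() {
  size(400, 100);
}

void draw() {
  background(0);
  // each visitor lights up their zone
  for (float x : visitorX) {
    int z = zoneFor(x);
    fill(80);
    rect(z * width / 4, 0, width / 4, height);
  }
  // a second visitor switches the composition to collaborative mode:
  // draw a symbolic link between them
  if (visitorX.length >= 2) {
    stroke(255);
    line(visitorX[0] * width, height / 2, visitorX[1] * width, height / 2);
  }
}

// Map a normalised x position (0..1 across the tunnel) to one of four zones.
int zoneFor(float x) {
  return constrain(int(x * 4), 0, 3);
}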

Below is a clumsy video of the display projected on one side of the interactive area.

The absence of a physical interface makes it a very spontaneous experience. It was pleasing to see that people of different ages and backgrounds would immerse themselves with such ease in our own little creative galaxy.

We also found out we can use this space as an instrument, using custom video or light input to generate audiovisual output regardless of people's positions, or even presence.

We’re now hoping to show it to a larger public very soon.

On the technical side, we used four pieces of software running on three computers.
EyesWeb on computer A takes care of the computer vision, mapping points along each person's limbs.

EyesWeb then sends the coordinates via OSC messages to Pure Data on computer B. Pure Data's job is to allocate the points to new custom OSC events and send them on to either Processing (running on the same computer) or Reaktor (on computer C).
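
A Pure Data patch doesn't paste well into a blog post, but the routing it performs is easy to sketch in text. Here is the same idea written in Processing with the oscP5 library; all the addresses and ports are invented for illustration.

import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress graphicsApp;  // the Processing sketch, same machine as the router
NetAddress reaktor;      // Reaktor on computer C

void setup() {
  osc = new OscP5(this, 9000);                      // EyesWeb sends here
  graphicsApp = new NetAddress("127.0.0.1", 9001);
  reaktor = new NetAddress("192.168.0.3", 9002);
}

// An incoming message, e.g. "/body/hand x y", gets re-addressed
// and forwarded to whichever application needs it.
void oscEvent(OscMessage in) {
  if (in.checkAddrPattern("/body/hand") && in.checkTypetag("ff")) {
    float x = in.get(0).floatValue();
    float y = in.get(1).floatValue();

    OscMessage graphics = new OscMessage("/rephlexion/graphics/hand");
    graphics.add(x);
    graphics.add(y);
    osc.send(graphics, graphicsApp);

    OscMessage sound = new OscMessage("/rephlexion/sound/hand");
    sound.add(x);
    sound.add(y);
    osc.send(sound, reaktor);
  }
}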

Pure Data and Processing are happy to run together on the same computer because they are super cool.

Reaktor isn't that cool, so we left it on its own. It tends to develop an allergy to OSC, causing it to crash at random.

Splitting the complex camera vision into simple OSC nodes allowed us to get the system ready before physically meeting up. We then had time to fine-tune the system and, most importantly, make it sound and look the way we wanted.
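
It also meant each of us could develop against fake tracking data. A stand-in sender like this Processing sketch (again with oscP5, and an invented address and port) is enough to exercise the whole chain without a camera or EyesWeb.

import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress router;

void setup() {
  osc = new OscP5(this, 9100);
  router = new NetAddress("127.0.0.1", 9000);  // wherever the router listens
  frameRate(25);                               // roughly camera rate
}

// Stand in for EyesWeb: emit a plausible, smoothly wandering hand position.
void draw() {
  float x = noise(frameCount * 0.01);
  float y = noise(1000 + frameCount * 0.01);
  OscMessage m = new OscMessage("/body/hand");
  m.add(x);
  m.add(y);
  osc.send(m, router);
}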

It has proven to be a strong and flexible architecture, letting us map new sounds and graphics onto it. So if you have any ideas or suggestions, be my guest.
