Resonance – Trailer
Resonance is a live, motion-scored performance that attempts to express the tone of memory. A dancer performs alongside the recorded motion of another dancer, and together their movements generate a musical piece.
Interactive sound design
For this project, I designed and implemented a music-generating algorithm in Max/MSP, driven by motion-capture data sent over OSC from Unreal Engine. The resulting MIDI data was then transmitted to Ableton Live to generate sound.
Max/MSP motion to sound mapping
The MIDI information is generated inside Max/MSP: the higher both hands are, the higher the note; the faster the movement, the louder the sound. The MIDI data is then passed to Ableton Live to create the music.
Hand height -> pitch [A3, C4, E4, A4, …]
Hand speed -> volume [0–100]
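The mapping above can be sketched in Python (the actual implementation is a Max/MSP patch; the height and speed ranges here are illustrative assumptions, not the patch's calibration):

```python
# MIDI note numbers for A3, C4, E4, A4 -- the pitch set listed above.
NOTES = [57, 60, 64, 69]

def height_to_note(height, lo=0.5, hi=2.0):
    """Map average hand height (meters, assumed range lo..hi) to a note."""
    t = min(max((height - lo) / (hi - lo), 0.0), 1.0)  # normalize to 0..1
    index = min(int(t * len(NOTES)), len(NOTES) - 1)   # pick a scale step
    return NOTES[index]

def speed_to_volume(speed, max_speed=3.0):
    """Map hand speed (m/s, assumed max max_speed) to a volume in 0-100."""
    t = min(max(speed / max_speed, 0.0), 1.0)
    return round(t * 100)
```

For example, hands held low produce the lowest note (A3) at a soft volume, while fast movement with hands raised overhead produces A4 near full volume.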
Hearing “resonance” between the dancer and her memory is another design goal. To achieve this, the similarity between the two dancers’ movements is measured and mapped to the “resonance” quality of the sound: when their dancing resonates, the sound becomes more distant, tranquil, and powerful.
100 - abs(v1 - v2) -> resonance
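As a minimal sketch of this metric (assuming v1 and v2 are the two dancers' movement speeds already scaled to 0–100, as in the volume mapping):

```python
def resonance(v1, v2):
    """Resonance metric from the formula above: 100 - |v1 - v2|.

    v1, v2 are the two dancers' movement speeds on the same 0-100 scale;
    the closer their speeds, the higher the resonance value.
    """
    return 100 - abs(v1 - v2)
```

When the dancers move at matching speeds the metric peaks at 100, pushing the sound toward its most distant, tranquil, and powerful character.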
(image courtesy of Andrew T. Foster)
This is a project for an ITP class Bodies in Motion, taught by Todd Bryant and Javier Molina.
Motion capture dancers: Sylvana Tapia and Kyla Ernst-Alper.
Group members: Dana Abrassart, Wangshu Sun, Wenbo Lan.