Visuano — Real-Time Projection-Mapped Visuals + Piano + Live Performance
Visuano is a project I created together with Yinan Zhang and Qiuyi Wu.
The objective is to incorporate the pianist's gestures and real-time motion graphics into classical piano performance, for both solo and orchestral settings.
Two different experiments and live performances have been staged: at a Boston University piano recital, and at Mannes The New School for Music in New York.
The project hybridizes an "air instrument" with piano visualization: a gesture-capture system for the performer, plus a live visualization of the music played on piano/keyboard. There are two interaction modes:
1. Gesture capture for real-time control and manipulation.
2. Audio signals to trigger visual events or generate visual effects.
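The two modes above can be sketched as a tiny controller. This is a minimal Python illustration, not the actual Max/MSP/Jitter patch; the class and method names (`VisualController`, `on_gesture`, `on_audio`) are hypothetical, and the threshold is an assumed value.

```python
import math

def rms(samples):
    """Root-mean-square amplitude of an audio buffer (values in [-1, 1])."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

class VisualController:
    """Toy dispatcher for the two interaction modes.

    Mode 1: a captured gesture continuously drives a visual parameter.
    Mode 2: a loud audio buffer triggers a discrete visual event.
    """
    def __init__(self, threshold=0.3):
        self.threshold = threshold   # assumed trigger level
        self.gesture_param = 0.0     # continuous control (mode 1)
        self.events = []             # discrete triggers (mode 2)

    def on_gesture(self, hand_y):
        # Mode 1: clamp the hand position into [0, 1] and use it directly.
        self.gesture_param = max(0.0, min(1.0, hand_y))

    def on_audio(self, samples):
        # Mode 2: fire an event when the buffer's level crosses the threshold.
        level = rms(samples)
        if level > self.threshold:
            self.events.append(level)
```

In the real system the gesture input came from a Kinect (via Synapse) and the audio from the piano, but the split between continuous control and event triggering is the same.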
Original Concept Video (Credit — Vocal: Qiuyi Wu; Pianist: Yinan Zhang):
Research on music and gesture movements:
This project involved extensive research into pianists' gestures. Many pianists believe that "every gesture you make while playing should reflect the character of the music, its emotional and dramatic content". Why is this necessary?
Pianistic gestures influence not only the quality of the sound and the character of the music as heard; they also create visual illusions, backing up (or canceling) the dramatic content of the performed work. The purpose of piano gesture is therefore not only to enhance the quality and character of the sound, but also to offer convincing "visual support" for the musical image the performer communicates to the public.
We analysed the natural body movements of piano performance and created a matrix of the different possible gestures of a pianist.
Techniques used in the project:
Many techniques were explored for this project, although not all of them made it into the final performance.
In the end, there are four parts: live input and output using Max/MSP/Jitter and a Kinect; projection mapping using MadMapper; visual design using After Effects; and interactive audio visualization using openFrameworks.
Real Time Input and Output
We used Max/MSP/Jitter for visual-effect interpolation, connected to a Kinect to capture the silhouette of the pianist, recording the body movement while creating a live mask of the pianist's body.
We used an iPad running TouchOSC for real-time manipulation, and Synapse to route the Kinect image into Max/MSP/Jitter.
Interactive Audio Visualization + Projection Mapping
openFrameworks is the main platform used to generate and control the animation. The volume of the piano sound wave controls the speed of motion and the brightness of the lines, and the MacBook trackpad controls the direction of the lines' movement. http://www.openframeworks.cc
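The volume-to-motion mapping can be sketched in a few lines. This is an illustrative Python version of the logic, not the original openFrameworks code; the function name and the speed/brightness ranges are assumptions, not values from the actual app.

```python
import math

def rms(buffer):
    """Root-mean-square level of an audio buffer (values in [-1, 1])."""
    return math.sqrt(sum(x * x for x in buffer) / len(buffer))

def map_amplitude(level, speed_range=(0.2, 4.0), max_brightness=255):
    """Map an audio level in [0, 1] to line speed and brightness.

    A louder buffer makes the lines move faster and glow brighter.
    The ranges here are illustrative defaults.
    """
    level = max(0.0, min(1.0, level))
    lo, hi = speed_range
    speed = lo + (hi - lo) * level            # linear speed mapping
    brightness = int(max_brightness * level)  # linear brightness mapping
    return speed, brightness
```

Each animation frame would call `map_amplitude(rms(latest_buffer))` and feed the result into the line renderer; a perceptual (e.g. logarithmic) mapping would be a natural refinement.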
MadMapper is an accessible tool for mapping projects like ours. Once the projector is aimed at the piano, its structure appears on screen automatically, and we manually draw masks and set positions for the videos.
Syphon is a powerful open-source framework for sharing video between applications in real time, connecting different creative VJ tools for audiovisual performances.
Mapping demo for performance:
Sections of Visual Design
We created several different visual pieces based on our research into the music and our own impressions of it.
1. Collaboration with Novatrio at Boston University
We connected with NovaTrio, based at Boston University, and a lot of technical progress was made during that collaboration. We were in charge of the reactive media design and live visual manipulation in this performance, developing visuals for two piano pieces from different genres in music history:
English Suite No.5 in E Minor by J.S. Bach
Chaconne by Sofia Gubaidulina
Installation Flow ( click the image to see details )
Interactive Video Demo using OpenFrameworks:
We use geometric shapes to indicate the inner structure of Bach's music; the tension of the motion speed synchronizes with the expression of the music.
Live Performance documentation videos:
Video 1 —
Chaconne in metaMorphosis
During the performance, the pianist's silhouette was captured and projected back onto the background screen. Real-time visual effects, triggered by the piano's sound wave, were then added to the gesture silhouette. We used an iPad to manipulate the visuals so that the effects matched the various moods across the music's timeline.
The visual effects were multiplied onto the captured figure and gestures of the pianist.
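The multiply composite can be illustrated pixel by pixel. This is a minimal Python sketch of what a multiply blend does (the real compositing happened inside Jitter); frames are represented here as rows of 8-bit grayscale values.

```python
def multiply_blend(silhouette, effect):
    """Multiply-blend an effect layer onto a silhouette frame.

    Both frames are lists of rows of grayscale pixels in 0..255.
    Multiplying keeps the effect visible only where the silhouette is
    bright, so the effects appear "on" the pianist's figure.
    """
    return [
        [(a * b) // 255 for a, b in zip(row_s, row_e)]
        for row_s, row_e in zip(silhouette, effect)
    ]
```

Because black (0) times anything is black, the background stays dark while the effect layer lights up the captured figure.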
Kinect, TouchOSC, Max/MSP/Jitter, Synapse
Video 2 —
J.S. Bach – English Suite No. 5:
Sarabande, Passepied, Gigue
We designed dynamic patterns and geometric shapes floating in the background of the stage as projected animation, interpreting the logical structures of Bach's music. The piano music controls the speed and brightness of the geometric shapes generated by the computer program.
2. Collaboration with Mannes The New School for Music, New York
We collaborated with Mannes The New School for Music and invited piano major Shulin Guo from Mannes to play the piano. She prepared three pieces of music for the visualization testing:
Debussy – Etude (Pour les arpèges composés)
Scriabin- Sonata No.3 (3rd Movement)
Prokofiev- Sonata No.7 (3rd Movement)
A Kinect camera captures the silhouette of the pianist. We import videos and the Kinect mask into Max/MSP/Jitter, use Jitter to combine them, transmit the Jitter video via Syphon to MadMapper, and use MadMapper to do the projection mapping.
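The first two steps of this pipeline (extracting the silhouette from depth data, then masking the video with it) can be sketched as follows. This is an illustrative Python version, not the actual Jitter patch; the depth band limits are assumed defaults, not values from the original setup.

```python
def silhouette_mask(depth_frame, near_mm=500, far_mm=1500):
    """Extract a binary silhouette mask from a Kinect-style depth frame.

    depth_frame: rows of depth readings in millimetres (0 = no reading).
    Pixels whose depth falls inside the [near_mm, far_mm] band (roughly
    where the pianist sits) become 255; everything else becomes 0.
    """
    return [
        [255 if near_mm <= d <= far_mm else 0 for d in row]
        for row in depth_frame
    ]

def apply_mask(frame, mask):
    """Keep video pixels only where the mask is set, as when combining in Jitter."""
    return [
        [p if m else 0 for p, m in zip(frow, mrow)]
        for frow, mrow in zip(frame, mask)
    ]
```

The masked frame is what gets handed (via Syphon, in the real system) to MadMapper for projection onto the piano.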
3. Collaboration with Fashion Designer Gabi Asfour, Baltimore Symphony Orchestra and Parsons Festival 2013
We collaborated with fashion designer Gabi Asfour and the Baltimore Symphony Orchestra. Gabi designed a garment for the project's pianist, and we gave a real-time live performance at the Parsons Festival 2013.
Some news reports on this project:
The test in ThreeAsfour Studio:
The real-time live performance in Parsons Festival 2013:
And this is the full video of the Parsons Festival 2013; this project's real-time live performance runs from 0:26:15 to 0:32:00.
Special Credits for Visuano: