Visuano — Real-Time Projection-Mapped Visuals + Piano + Live Performance


Visuano is a project by me, Yinan Zhang, and Qiuyi Wu.

The objective is to incorporate the pianist's gestures and real-time motion graphics into classical piano performance, for both solo and ensemble settings.

Two different experiments and live performances were staged: at a Boston University piano recital, and at Mannes, The New School for Music, New York.


A hybrid of the air instrument and piano visualization:

the performer's gesture-capture system + a live music-visualization process driven by the piano/keyboard.

1. Gesture capture for real-time control and manipulation.
2. Audio signals that trigger visual events or generate visual effects.
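As a sketch of the second mechanism, here is a minimal RMS-threshold trigger in Python. This is illustrative only; the actual performance used Max/MSP/Jitter, and the threshold value is an assumption.

```python
# Illustrative sketch (not the original Max/MSP patch): fire a visual event
# when the piano signal peaks, re-arming only after it falls back down.
def rms(samples):
    """Root-mean-square level of one audio frame."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def detect_events(frames, threshold=0.3):
    """Return indices of audio frames loud enough to trigger a visual event."""
    events = []
    armed = True  # re-arm only after the signal drops below the threshold
    for i, frame in enumerate(frames):
        level = rms(frame)
        if armed and level >= threshold:
            events.append(i)
            armed = False
        elif level < threshold:
            armed = True
    return events
```

The re-arming flag keeps one sustained chord from firing the same event on every frame.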

Original concept video (credits — vocals: Qiuyi Wu; piano: Yinan Zhang):

Research on music and gesture movements:

This whole project involved extensive research into pianists' gestures. Many pianists believe that 'every gesture you make while playing should reflect the character of the music, its emotional and dramatic content'. Why is this necessary?

Pianistic gestures influence not only the quality of the sound and the character of the music as objectively heard; they also create visual illusions, backing up (or canceling) the dramatic content of the performed work. That is why the purpose of piano gesture is not only to enhance the quality and character of the sound, but also to offer convincing 'visual support' for the musical image the performer is trying to communicate to the audience.

We analyzed the natural body movements of piano performance and created a matrix of a pianist's different possible gestures.


Techniques used in the project:

We explored many techniques for this project, although not all of them made it into the final performance.

In the end, there are four parts: live input and output using Max/MSP/Jitter and Kinect, projection mapping using MadMapper, visual design using After Effects, and interactive audio visualization using openFrameworks.


Real Time Input and Output

We used Max/MSP/Jitter for visual-effect interpolation, connected to a Kinect that captured the pianist's silhouette, recording the body movement while creating a live mask of the pianist's body.

We used an iPad running TouchOSC for real-time manipulation, and Synapse to pipe the Kinect image into Max/MSP/Jitter.


Interactive Audio Visualization + Projection Mapping

openFrameworks is the main platform for generating and controlling the animation. The volume of the piano's sound wave controls the speed of motion and the brightness of the lines, while the MacBook trackpad controls the direction of the lines' movement.
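A minimal Python sketch of that volume mapping (the real version lives in openFrameworks; the parameter names and ranges here are illustrative assumptions):

```python
# Illustrative mapping: louder piano input -> faster line motion and
# brighter lines. Ranges are assumptions, not the performance's values.
def map_volume(volume, max_speed=12.0, min_brightness=40, max_brightness=255):
    """Map a normalized volume (0..1) to (line speed, line brightness)."""
    v = max(0.0, min(1.0, volume))                 # clamp to 0..1
    speed = v * max_speed                          # silence -> stillness
    brightness = int(min_brightness + v * (max_brightness - min_brightness))
    return speed, brightness
```

Keeping a non-zero minimum brightness means the lines never vanish completely in quiet passages.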

MadMapper is an approachable tool for a mapping project like ours: once the projector was aimed at the piano, its structure appeared on screen, and we manually drew masks and set positions for the videos.
Syphon is a powerful open-source framework for sharing video between applications in real time, connecting different creative VJ tools for stunning audiovisual performances.


Mapping demo for performance:

Sections of Visual Design

We created several different visual pieces based on our research into the music and our own impressions of it.


Live Performances 

1. Collaboration with Novatrio at Boston University


We connected with NovaTrio, based at Boston University, and made a great deal of technical progress during that collaboration. We were in charge of the reactive media design and live visual manipulation for this performance, developing visuals for two piano pieces from different eras of music history:
English Suite No. 5 in E Minor by J.S. Bach
Chaconne by Sofia Gubaidulina

Installation flow (click the images for details)


Interactive Video Demo using OpenFrameworks:

Geometric shapes indicate the inner structure of Bach's music, and the tension of the motion speed synchronizes with the music's expression.

Live Performance documentation videos:

Video 1 —

Chaconne in metaMorphosis

During the performance, the pianist's silhouette was captured and projected back onto the background screen. Real-time visual effects, triggered by the piano's sound wave, were added to the gesture silhouette, and we used the iPad to manipulate the visuals to match the shifting moods of the music.
The visual effects were multiplied over the captured figure and gestures of the pianist.

Kinect, TouchOSC, Max/MSP/Jitter, Synapse

Video 2 —

J.S. Bach – English Suite No. 5:
Sarabande, Passepied, Gigue

We designed dynamic patterns and geometric shapes floating on the stage background as projected animation, interpreting the logical structures of Bach's music. The piano music controls the speed and brightness of the computer-generated geometric shapes.

OpenFrameworks, trackpad

2. Collaboration with Mannes, The New School for Music, New York


We collaborated with Mannes, The New School for Music, and invited Shulin Guo, a piano major at Mannes, to play. She prepared three pieces for the visualization testing:
Debussy – Étude (Pour les arpèges composés)
Scriabin – Sonata No. 3 (3rd movement)
Prokofiev – Sonata No. 7 (3rd movement)

The Kinect camera captures the pianist's silhouette. The videos and the Kinect mask are imported into Max/MSP/Jitter, which composites them together; Syphon transmits the Jitter output to MadMapper, which handles the projection mapping.
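The compositing step can be sketched as a per-pixel multiply of the effect frame by the silhouette mask. In Jitter this is done with matrix operators; the Python below is only an illustration of the idea.

```python
# Illustrative sketch of silhouette compositing: multiply an effect frame by
# a 0..1 silhouette mask so pixels outside the pianist's body go black.
# Frames are row-major lists of rows of grayscale values.
def composite(effect, mask):
    """Return effect * mask, pixel by pixel."""
    return [[e * m for e, m in zip(erow, mrow)]
            for erow, mrow in zip(effect, mask)]
```

With a live Kinect mask, this is what keeps the generated effects glued to the performer's moving outline.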


3. Collaboration with Fashion Designer Gabi Asfour, Baltimore Symphony Orchestra and Parsons Festival 2013

We collaborated with fashion designer Gabi Asfour and the Baltimore Symphony Orchestra. Gabi designed a garment for the project's pianist, and we gave a real-time live performance at Parsons Festival 2013.

Some news coverage of this project:

Testing at the threeASFOUR studio:



The real-time live performance in Parsons Festival 2013:





Real-Time Performance in Parsons Festival 2013:

This is the full video of Parsons Festival 2013; this project's real-time live performance runs from 0:26:15 to 0:32:00.

Special Credits for Visuano:

AAPP — All Around Promotion Platform


AAPP is my thesis project at Parsons The New School for Design, New York.

It is an interactive 3D platform: a gesture-based natural user interface that lets people interact with digital content and 3D models using gestures alone, without any physical controllers or wearable devices. The platform aims to provide an alternative to traditional mouse-and-keyboard interaction, increasing realism and enhancing the user experience.

All Around Promotion Platform (AAPP) uses strategies from human-computer interaction, brand experience, and natural-user-interface design to create a dimensional 3D display, controlled by hand gestures, that can market a range of commercial applications. My thesis chose car customization at a car show as the example scenario. The project draws on product-customization strategies already used by online car-configurator sites, but presents the information as an immersive, dimensional 3D display controlled by hand gestures. To manipulate the 3D car model, customers at the show use gestures to rotate, zoom, select, and view details of the car; explore its interior; and customize their own cars, changing colors and accessories and previewing the effect in real time. With this gestural platform, the interaction and customization process becomes more immersive and engaging.

The basic features / functions of this project:

1. Through this platform, users can manipulate the 3D car model directly with gestures, such as rotating and zooming, without any physical controllers.

2. Users can use gestures to select and click different details of the car, such as the wheels and tires. Clicking a wheel or tire plays an intro animation so users can get familiar with that detail.

3. Users can use gestures to customize their own cars, for example changing the car's colors and accessories.

4. Users can use a gesture to click the car door, open it, and get inside to explore the car's interior.
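The features above can be sketched as a simple gesture-to-action dispatch. All names, gestures, and ranges here are hypothetical; the real platform's gesture recognizer and model API are not shown.

```python
# Hypothetical dispatch from recognized gestures to model operations,
# sketching how the platform's features route to state changes.
def handle_gesture(state, gesture, amount=0.0):
    """Apply one recognized gesture to the 3D-model view state."""
    if gesture == "swipe":          # rotate the car model
        state["rotation"] = (state["rotation"] + amount) % 360
    elif gesture == "pinch":        # zoom, clamped to a sensible range
        state["zoom"] = max(0.25, min(4.0, state["zoom"] * amount))
    elif gesture == "tap_door":     # toggle interior exploration
        state["inside"] = not state["inside"]
    return state
```

A real implementation would feed this from a skeleton tracker and drive a renderer, but the dispatch shape stays the same.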

More features are being developed.


This project was exhibited in Parsons The New School for Design's Design & Technology 2013 Thesis Exhibition.

During the exhibition, many people played and interacted with it. Here are some photos of the platform in the exhibition and of how people interacted with it:





In fact, this platform is not limited to the car-show and car-customization scenario; it can be applied to many other areas. All of the content in the platform is rendered in 3D, more features are in development, and both the 3D content and the features can be customized to fit different uses and scenarios.

The platform fits commercial scenarios such as car shows, providing a more immersive and engaging interaction and customization experience. Because this gesture-based platform lets people interact with things that do not yet physically exist, it could also be used in many other areas: surgical training for doctors, previewing 3D models intended for 3D printing, or previewing architectural and urban-planning models. It might also assist the design process for engineered mechanical products such as cars or boats, letting people interact with demo models, explore them, and even look inside them. Mathematicians could use the platform to manage and interact with large data sets.


The tutorial video:

This tutorial video shows how the project works; the interface and look in the video may differ from the actual platform.


Presentation PDF

Flowing Wall — A project I made at Microsoft

This summer I interned with Microsoft Research Asia's Human-Computer Interaction group. This is one of the projects I made during my internship at Microsoft.

The project is called “Flowing Wall”.

In this project, I explore full-body interaction for big-wall presentation and public-display scenarios such as interactive advertising.

The setup aims to create an immersive user experience by exaggerating visual effects and feedback, such as view perspective and motion rhythm, when the user walks toward or away from the wall or moves from one side to the other.

I want to explore natural interaction in presentations, removing all physical controllers.

In this project, I use the Kinect and a Windows 8-style UI (modern, clean, fast, and in motion) to make the presentation more alive, responsive, and emotional.

I am designing and developing a platform that combines body position and hand gestures to control and interact with digital content on the big wall.

The content on the big screen looks alive and responds to you directly and naturally, unlike 'dead' traditional PowerPoint slides.

The body's position serves as local panning, changing the perspective of the flowing wall, while hand gestures serve as global scrolling, moving the content along the wall.
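That split between local panning and global scrolling can be sketched as follows (the wall width and parallax depth are illustrative assumptions):

```python
# Illustrative sketch: body position adds a local parallax offset on top of
# the global scroll position set by hand gestures.
def wall_offset(body_x, scroll, wall_width=1920, depth=0.2):
    """Horizontal content offset for a viewer at normalized body_x (0..1)."""
    parallax = (body_x - 0.5) * wall_width * depth  # centered body -> no shift
    return scroll + parallax
```

Standing centered leaves only the gesture-driven scroll; stepping sideways shifts the perspective without losing your place in the content.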

This platform can be used in many applications / scenarios. Here are some examples.

1. Product-line keynote

This demo video shows one application scenario for the platform: a Microsoft product-line keynote.

In this kind of keynote, the platform highlights the performance aspect of the presentation and engages the audience, making the process more emotional.

2. Interactive Map
Another example application for this platform is an interactive map.

When we look up a map on the web, we can only view it with our eyes; the place on the map remains somewhere else.

With this platform, however, we can feel the map with our bodies. The scale is 1:1, and it simulates the real scene so users feel as if they were actually there, which greatly enhances the experience and makes it compellingly engaging.

When you move close to the wall, the map turns from 2D into a 3D street view. As you walk along the wall, the street-view perspective changes with your body position. The experience is like walking around that place in reality.

Example video (simulation) :

3. Interactive TV

The platform could be used for TV too: gestures and body position could drive interaction with menus and content on the TV.

This is a demo video for Microsoft Xbox TV.

Presentation PDF

Facial Energy (Smile Can Change The World)

The concept of this project is “Facial Energy”:
using facial expression as an energy source, an engine, a power.
The face becomes the energy source that controls the music and other things.
At this stage, my starting point for exploring facial expression is the smile, because a smile is sometimes a power in itself.
So the name of this final project is “Smile Can Change The World”.
The project includes a series of three projects/applications related to the smile.
In these projects, the smile is the engine, the energy source, the power that makes everything happen.
Using only our smile, we can make a difference; using only our smile, we can change everything and make everything happen.
Just smile, and it will be different.
First, I planned to start in the virtual world:
a virtual world controlled by the smile.
This is a virtual world I built, and the smile is the only energy source that controls it.
1. We can play music using only our smile (the pitch and note of the music depend on the extent of our smile).
2. Many objects fall according to the rules of gravity, as usual.
However, when we smile, we can change the gravity of this world; we can beat the gravity.
When we smile, everything stops falling and flies around our smile, and at the same time we can hear the objects' laughter. Our smile is their energy.
The harder we smile, the stronger the energy they get.
If we do not smile, nothing happens; if we smile, it can change the rules of this world.
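The smile-to-music mapping in the first world can be sketched like this (the MIDI note range is an assumption, not the project's actual values):

```python
# Illustrative sketch: wider smile -> higher pitch, mapped onto MIDI notes.
def smile_to_note(smile, low=60, high=84):
    """Map smile extent (0..1) to a MIDI note number."""
    s = max(0.0, min(1.0, smile))  # clamp the detector's output
    return low + round(s * (high - low))
```

A neutral face sits at the bottom of the range, and the widest smile reaches the top.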
A game related to Smile
In the first virtual world, the smile's power changed and beat the gravity, making it disappear.
In this second virtual world, on the other hand, the smile makes gravity happen:
we use the smile as an energy source to trigger gravity.
When we smile, we hear the laughter and gravity is produced; the objects in this world fall because of our smile, the force that produces gravity.
It plays like a game: as long as we keep smiling, objects keep appearing and falling.
There is a scoreboard in the upper-left corner of the screen: the longer we smile, the higher the score.
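A minimal sketch of the second world's loop, assuming one spawned object and one score point per smiling step (all constants are illustrative):

```python
# Illustrative sketch of the smile-triggers-gravity game loop: smiling spawns
# an object, applies gravity so everything falls, and raises the score.
def step(world, smiling, dt=1.0, gravity=9.8):
    """Advance the game world by one step."""
    if smiling:
        world["objects"].append({"y": 0.0, "vy": 0.0})  # spawn at the top
        world["score"] += 1                              # longer smile, higher score
        for obj in world["objects"]:
            obj["vy"] += gravity * dt                    # smile produces gravity
            obj["y"] += obj["vy"] * dt
    return world
```

When the smile stops, nothing moves and the score freezes, matching the rule that the smile is the only force in this world.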
Smile Energy / Engine.
This project uses the smile as an energy source and engine to control something in reality.
In this case, the smile controls and triggers the light and sound.
The light and sound fire only when we smile; the smile is the only energy and power that controls and triggers them.
For this piece, I used an Arduino.
Using the smile to generate power (future direction!)
When we make facial expressions, we produce energy. I wonder whether we could use this energy as a power source to control things or contribute to some industries, for example by converting it into electricity, and whether it could become a more environmentally friendly energy source that everyone can produce and use.
This project is the start of that research.
So the future direction of this project is:
to explore the possibility of using the energy generated by the movement of facial expressions as an energy source, and to see whether we can collect this energy and convert it into other forms, such as electricity. The facial expression would then be the original energy, becoming a truly green energy that everyone can produce and use, since everyone knows how to smile.
Presentation PDF is here:

Glowtrix — Presentation of my project in Eyebeam Art + Technology Center, New York, NY

This is my fashionable-technology project, which I presented at Eyebeam Art + Technology Center, New York, NY.

The project is a jacket I made called “Glowtrix”: a waterproof jacket that interacts with water.

The concept of this Jacket is “wearing water”, and —

“The water, the Nature illuminates the color of the city.”

I used hydrochromic ink to print the names of cities such as New York, London, Tokyo, Hong Kong, and Paris on the jacket. The 'cities' are white by default, but when water or raindrops touch them, they turn blue, the color of the ocean, as if we were wearing the ocean on the jacket, and we can feel the movement of the water. When the water leaves and the fabric dries, the 'cities' turn back to white.

It represents my concept: The water, the Nature illuminates the color of the city.

The jacket :

The presentation in Eyebeam Art + Technology Center, New York, NY :

For more details, such as the inspiration and progress of this project, please see the presentation PDF here:


Love Will Guide You

Concept and Problem Statement:

The concept and idea of this project were inspired by one of my friends. He is very shy and always feels embarrassed about expressing his real emotions to the girl he likes.

Some shy people feel too embarrassed to express themselves directly, especially to the people they like; it is very hard for them to say their real feelings and emotions out loud. Men in particular often find it hard to say sorry, or to express real love and feeling to the people they care about. I wanted to help my friend, and shy people like him, express their real emotions to the people they like. So I wondered whether shy people could express themselves in more efficient, interactive, interesting, and cute ways.

Initial Prototype:

My initial idea was an interactive instrument played with body gestures: when the body moves in different directions, it plays different notes, using music to express emotion.
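A sketch of that direction-to-note idea (the note values and dead zone are illustrative assumptions; the prototype's actual motion detection is not shown):

```python
# Illustrative sketch: pick a note from the dominant direction of body motion.
# Note numbers are hypothetical MIDI pitches, not the prototype's values.
DIRECTION_NOTES = {"left": 60, "right": 64, "up": 67, "down": 72}

def note_for_motion(dx, dy, deadzone=0.05):
    """Return a note for a motion vector, or None if the motion is too small."""
    if max(abs(dx), abs(dy)) < deadzone:
        return None                       # ignore jitter while standing still
    if abs(dx) >= abs(dy):
        return DIRECTION_NOTES["right" if dx > 0 else "left"]
    return DIRECTION_NOTES["up" if dy > 0 else "down"]
```

The dead zone keeps small involuntary movements from constantly retriggering notes.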


Since my initial prototype was not straightforward, efficient, or interesting enough for expressing emotions, I came up with a new interaction for a webcam-based application: we see ourselves on the screen, and our movements interact with and affect the motion of the 'hearts' displayed there.

I also came up with a new idea for people who prefer to express themselves with text: a Magic 8 Ball with custom-designed texts. They can write their own messages, put them inside the Magic 8 Ball, and send it as a gift to the person they love. I made the official website of this project, which covers both the webcam-based interactive application and the Magic 8 Ball plan, offering different plans so people can choose which way they want to use.

The official website of this project:

Final Version: 

The final version of this project includes the completed webcam-based interactive application, the concept for the custom-text Magic 8 Ball plan, and the project's official website.

Example Video:

User Test Video:


Processing code:

Max/Msp/Jitter patch:




This project aims to provide interesting, cute, straightforward, and interactive ways for shy people to express their real emotions to the people they love. I built a webcam-based interactive application; shy people who tested it felt comfortable using it and found it interesting. Because some people prefer to express themselves with specific words of their own, I added the custom-text Magic 8 Ball plan: people write the message they want to tell the one they love, put it in the Magic 8 Ball, and send it as a gift. I also made an official website for the project. Shy people who tested the project felt comfortable with both the interactive application and the custom-text Magic 8 Ball.

Final presentation PDF:

Love Will Guide You Final Presentation PDF

Interactive Noise Wall

This project is inspired by the broken screen of my MacBook Pro…

The broken screen keeps showing patterns and effects that look as if they were made in openFrameworks (it's very openFrameworks-style).

I decided to simulate the broken-screen effect; that was the starting point.

Then I extended the idea: since the effect of my broken screen looks like a 'noise wall', I decided to make an interactive noise wall.

Basically, I made a pattern of moving points to simulate the broken-screen effect. It looks like a moving 'noise wall', and it switches between different states when the mouse is pressed or different keys ('a' and 'b') are held.

This is a noise wall that people can interact with. I combined it with sound: clicking or touching different positions on the wall shows a 'noise' effect that depends on the position clicked, and at the same time generates different 'noise' sounds/music I made.
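A sketch of how a click position could map to the wall's noise parameters (the actual openFrameworks mapping differs; these formulas and ranges are illustrative):

```python
import random

# Illustrative sketch: map a click position on the 'noise wall' to a jitter
# amplitude and a noise pitch, so each spot looks and sounds different.
def noise_at(x, y, width=800, height=600, seed=0):
    """Return (jitter amplitude, pitch in Hz) for a click at (x, y)."""
    random.seed(seed + x * height + y)       # same spot -> same noise
    amplitude = (x / width) * 20 + random.random()  # further right -> denser jitter
    pitch = 100 + (1 - y / height) * 900            # higher up -> higher pitch
    return amplitude, pitch
```

Seeding the generator from the click position makes the wall feel consistent: revisiting a spot reproduces its noise.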