Projection Mapped Music Performance

I've been fascinated by projection mapping for a long time, and for my senior project, I lit up the walls with a surprise audiovisual performance.

I've always been drawn to technological experiences that create a sense of awe, and the first time I saw a video of someone changing the appearance of real-life objects using projection mapping, I knew I had to tinker with it; just seeing boxes shape-shift between different colors blew my mind. I also had a small obsession with the impressive musical mashups people made with the Launchpad controller, and I wanted to combine these two interests into a live music and visuals performance. My senior project gave me the opportunity to do so, and while there was so much more I wanted to add to this project, I'm still proud of what I was able to do in just my spare time over the course of a month. I was extremely secretive about what I was working on, and on the day of the performance, I brought everyone into a dark room and triggered the first clip. Hearing the audience's "woah!" when I hit the button that lit up the walls was a huge thrill.

The control flow starts with a Novation Launchpad connected to my laptop running Ableton Live, which drives all of the audio. The laptop is connected to a more powerful desktop over a private wired network, and sends it OSC messages when actions are performed in Live.
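OSC messages are just small UDP packets with a well-defined binary layout: a null-terminated address string padded to a multiple of four bytes, a type-tag string padded the same way, then the arguments (32-bit big-endian floats here). As a rough illustration of what goes over the wire, here's a minimal encoder in pure Python; the `/live/clip/fire` address and the IP/port are made-up placeholders, not the actual addresses my setup used:

```python
import socket
import struct

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message with float arguments as bytes."""
    def pad(s: bytes) -> bytes:
        # OSC strings are null-terminated, padded to a multiple of 4 bytes.
        return s + b"\x00" * (4 - len(s) % 4)

    type_tags = "," + "f" * len(args)        # e.g. ",ff" for two floats
    msg = pad(address.encode()) + pad(type_tags.encode())
    for a in args:
        msg += struct.pack(">f", a)          # 32-bit big-endian float
    return msg

# Hypothetical example: tell the visuals machine that clip 3 on track 1 fired.
packet = osc_message("/live/clip/fire", 1.0, 3.0)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(packet, ("192.168.0.2", 9000))  # placeholder desktop address
```

In practice you'd use an existing OSC library on both ends rather than hand-rolling the encoding, but the format is simple enough that this is all a message really is.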

The desktop is connected to two projectors and renders the visuals in real time with Unity. Different camera path animations are triggered by actions performed in Live with the Launchpad. Unity's camera output textures are shared via Spout with custom projection mapping software I wrote to actually warp the images and project them onto the walls (off-the-shelf options existed, but were generally very expensive).
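The core math behind this kind of corner-pin warping is a homography: a 3x3 perspective transform fit from four source corners to the four corners you've dragged onto the wall. rlProjection's internals aren't shown here, but as a minimal sketch of the idea in Python (the actual software does the equivalent on the GPU), fitting one reduces to solving an 8x8 linear system:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def homography(src, dst):
    """Fit a 3x3 perspective transform mapping 4 src corners to 4 dst corners."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear equations in the
        # 8 unknown homography coefficients (h22 is fixed to 1).
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(A, b) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def warp(H, x, y):
    """Map one point through the homography (with perspective divide)."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# Map the unit square (the rendered texture) onto an arbitrary quad
# representing the projector's view of the wall (made-up coordinates).
H = homography([(0, 0), (1, 0), (1, 1), (0, 1)],
               [(10, 12), (90, 8), (95, 100), (5, 95)])
```

In the real pipeline this transform is applied per-pixel in a shader rather than per-point on the CPU, but the fitting step is the same either way.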

Everything was in place for a continuous-looking immersive image spanning both walls, but due to the very last-minute nature of the project and a design flaw in the projection software, I had to fall back on mirroring one picture to both walls right before presenting. For some reason parts of the visuals didn't render, causing the very dark/empty scenes. I'm still not sure why; it never happened during testing. I was kind of bummed, but it looked cool, I guess ¯\_(ツ)_/¯. The project came together in such a short timeframe that I never had a chance to do a full rehearsal of the final version, so I'm still proud of what I was able to pull off.

During development I had limited time in the performance space and limited access to the full set of hardware, so I wrote a simple Unity app that let me preview what the real performance would look like, based on a model of the room, in VR with an HTC Vive. The real-world projection software, titled rlProjection, was developed shortly afterwards and is available at https://github.com/GiantSox/rlProjection.

Next Project

VR-Sculpted Figurines