The Leap Motion Controller gives musicians an enormous amount of freedom in creating their sounds. Since we released our technology to the world last year, we’ve seen an incredible evolution in UX design and musical experimentation. Here are four ways that motion control can transform how we create and perform music – in the words of the musicians themselves.

1. You can access many more dimensions of control

Because the sensor is always on, the Leap Motion Controller can track your hands anywhere within the total interaction space. With just pitch, yaw, and 3D position for each of two hands, you can access 10 dimensions of control – and that’s just the beginning. That’s how Hagai Davidoff was able to play a full orchestra with one hand on a keyboard and the other in the air:

I’ve always felt a huge gap between hardware and software, the latter being far superior in terms of possibilities and details. Sometimes tactile resistance is a musical must; sometimes it just kills your natural (MIDI) curves. Hand gestures are an intuitive and much more natural way to create music. When I play my current setup, I feel like I’m dynamically conducting an orchestra.

— Hagai Davidoff, composer
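To make the "10 dimensions" concrete: each hand contributes two rotations (pitch and yaw) plus three position axes, for five dimensions per hand. Here is a minimal sketch of how those values might be scaled into MIDI CC messages. The value ranges and sample numbers are assumptions for illustration; a real setup would read them from the Leap Motion SDK (e.g. a hand's direction and palm position) rather than hard-coding them.

```python
def to_cc(value, lo, hi):
    """Clamp and scale a raw sensor value into the 0-127 MIDI CC range."""
    scaled = (min(max(value, lo), hi) - lo) / (hi - lo)
    return round(scaled * 127)

def hand_dimensions(pitch, yaw, x, y, z):
    """Five control dimensions per hand: two rotations, three positions.
    Ranges below are illustrative guesses, not SDK constants."""
    return [
        to_cc(pitch, -1.57, 1.57),  # pitch in radians
        to_cc(yaw,   -1.57, 1.57),  # yaw in radians
        to_cc(x, -200, 200),        # palm x in millimeters
        to_cc(y,   50, 500),        # palm y (height above the sensor)
        to_cc(z, -200, 200),        # palm z (toward/away from you)
    ]

# Two hands -> 10 simultaneous dimensions of control.
left  = hand_dimensions(pitch=0.2,  yaw=-0.5, x=-80, y=230, z=10)
right = hand_dimensions(pitch=-0.1, yaw=0.3,  x=120, y=310, z=-40)
controls = left + right
print(len(controls))  # 10
```

Each of those ten values could be routed to a separate synth parameter – filter cutoff, reverb send, orchestral section volume – which is the kind of one-to-many mapping Davidoff describes above.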

2. Electronic music can be smooth, organic, and improvised

You can really feel the parameters you’re changing with your hands. You can flow with the music and express it with your body. Controlling more than 10 parameters simultaneously with both hands provides you with full control over the construction of the song, the emotion, and the energy. When you try it, you won’t leave your studio for a week.

— Uriel Yehezkel, electronic musician/producer

Whether you’re playing your own music or playing someone else’s music, it’s a written track. It’s a done deal, it’s going to come out the way it was produced. Leap Motion lets you add a layer on top of that – so that every time you do it, it’s going to be different. You’re not just controlling the music, you’re now part of the music.

— DJ SelArom

3. It transforms music into a 3D exploratory space

I don’t really see architecture, music, and programming as separate disciplines. You curate a series of events in a sequence to produce unlimited effects and interesting outcomes…. I believe the emotional power of architecture relies on the sequence of spaces and journey of inhabitants as they move through it.

— Felix Faire, creator of Contact

As we saw last week, design metaphors like virtual buttons can reflect real-world interactions for an easy learning curve. At the same time, they can feel unsatisfying. Designed by UX prototyper Pohung Chen, the Unity DJ demo below uses virtual buttons and sliders that users can move with their fingers, which appear as spheres.

Pohung quickly realized that while the demo is easy to learn, it’s difficult to use, because you have to be careful with each finger. It’s too easy to make a mistake or misjudge a distance, so users end up concentrating on the interaction rather than on what they want to achieve. This fundamental insight informed much of our later work.

Instead of mimicking physical instruments and making you focus on the position of your hands in 3D space, apps like GecoMIDI and MUSE create whole new spaces designed for music. GecoMIDI uses clear visual feedback with tiles that light up to show what your hands are doing, so that you don’t need to focus too hard on where your hands are. MUSE lets you hover, swipe, and trigger your way through a rotating world of cubes.

Both apps are unlike anything you’d find in real life, with abstract visual designs and interactions geared toward continuous movement. Binary triggers are easy to activate, and the lack of tactile response feels freeing rather than clumsy. This design makes it possible to quickly learn and create without getting caught up in the interactions.

4. You can tweak and transform other instruments

You might be thinking that – although this is an innovation that helps when playing live – the same effects can already be achieved when post-editing a recording. But this is only partly true. The dance of embodied and feedback-driven action is more suited for artistic expression than the carefully pondered tweaking of symbolic representations petrified into a recording.

— Nicolás Earnshaw, guitarist and creative coder

With tracking beta features like fingerbone-level tracking and occlusion robustness, we can’t wait to hear what the next generation of 3D interactive music will sound like. (Though based on this awesome 3D smart piano, we’re guessing Freddie Mercury.) What do you think of these musical experiments? How would you reimagine your favorite instrument for motion control?


Alex is the head writer at Leap Motion, where he stands as the final bulwark against bad grammar.
