// V2 Tracking Beta

I have a recurring dream that starts as a nightmare but turns into something else altogether. Imagine the stage of a monumental concert hall. The auditorium is packed, and as the audience notices you, thousands of conversations turn into a deep, imposing silence that sends a chill down your spine. Spotlights on a majestic grand […]

Old habits can be hard to break. When I’m building Leap Motion prototypes, I often find myself slipping into designing for cursors and touchscreens – paradigms based on one-handed interactions. By remembering to think outside the mouse, we can open ourselves up to interacting with virtual objects using both hands. But when are two-handed interactions the right approach?

Immersion is everything in a VR experience. Since your hands don’t actually float in space, we created a new Forearm API that tracks your physical arms. This makes it possible to create a more realistic experience with onscreen forearms.
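
To give a sense of what that looks like in practice, here's a minimal sketch of reading the forearm data through the V2 beta's C++ SDK, where each tracked hand exposes its arm via Hand::arm(); the accessor names follow the beta documentation as we understand it, so treat this as an illustration rather than production code.

```cpp
#include <iostream>
#include "Leap.h"  // V2 beta C++ SDK

int main() {
    Leap::Controller controller;

    // Poll the most recent frame; a real app would register a Listener
    // and wait for the controller to connect before reading data.
    Leap::Frame frame = controller.frame();

    for (const Leap::Hand& hand : frame.hands()) {
        // Hand::arm() exposes the tracked forearm: elbow, wrist, and direction.
        Leap::Arm arm = hand.arm();
        Leap::Vector elbow = arm.elbowPosition();  // millimeters, device coordinates
        Leap::Vector wrist = arm.wristPosition();

        std::cout << (hand.isLeft() ? "Left" : "Right") << " forearm: "
                  << "elbow (" << elbow.x << ", " << elbow.y << ", " << elbow.z << "), "
                  << "wrist (" << wrist.x << ", " << wrist.y << ", " << wrist.z << ")\n";
    }
    return 0;
}
```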

The way we interact with technology is changing, and what we see as resources – wood, water, earth – may one day include digital content. At last week’s API Night at RocketSpace, Leap Motion CTO David Holz discussed our evolution over the past year and what we’re working on. Featured speakers and V2 demos ranged from Unity and creative coding to LeapJS and JavaScript plugins.

As any interaction designer knows, there’s rarely only one right answer, but there are usually about a million wrong answers. That’s why we’ve been experimenting with a variety of interaction models, including camera controls for Three.js.

The opposable thumb gives human beings a hand up in grabbing and holding objects. At Leap Motion, we want to achieve the same grabbing ability in the virtual world. To demonstrate, we’ve developed a simple ragdoll game in the Unity game engine where you can grab and throw ragdolls around a room.
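
The game itself lives in Unity, but the core of the grab detection is simple enough to sketch against the C++ SDK: V2's Hand::grabStrength() reports how closed the hand is, from 0 (open) to 1 (fist). The attachRagdollTo and releaseRagdoll calls below are hypothetical stand-ins for the game's own physics hooks.

```cpp
#include "Leap.h"

// Hypothetical hooks into the game's physics; not part of the Leap SDK.
void attachRagdollTo(const Leap::Vector& palmPosition);
void releaseRagdoll(const Leap::Vector& palmVelocity);

// Called once per tracking frame with the current hand and the previous grab state.
void updateGrab(const Leap::Hand& hand, bool& wasGrabbing) {
    // grabStrength() ranges from 0.0 (open hand) to 1.0 (closed fist) in V2.
    const float kGrabThreshold = 0.8f;
    bool isGrabbing = hand.grabStrength() > kGrabThreshold;

    if (isGrabbing && !wasGrabbing) {
        attachRagdollTo(hand.palmPosition());   // pick up the nearest ragdoll
    } else if (!isGrabbing && wasGrabbing) {
        releaseRagdoll(hand.palmVelocity());    // throw it with the hand's velocity
    }
    wasGrabbing = isGrabbing;
}
```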

When combined with auditory feedback and other visual cues, onscreen hands can create a real sense of physical space and complete the illusion created by VR interfaces like the Oculus Rift. But rigged hands also involve several intriguing challenges.

The next generation of Leap Motion tracking (Version 2) is now in public beta for developers. V2 retains the speed and positional accuracy found in V1, while the software now also tracks the actual joints and bones inside each of the user’s fingers.
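
For a sense of how that skeletal data is exposed, here's a short sketch against the beta C++ SDK, where each finger is modeled as four bones (metacarpal, proximal, intermediate, distal) that can be queried joint by joint; the names follow the beta documentation and may shift before release.

```cpp
#include <iostream>
#include "Leap.h"

// Print every bone of every finger in the current frame.
void printSkeleton(const Leap::Frame& frame) {
    for (const Leap::Hand& hand : frame.hands()) {
        for (const Leap::Finger& finger : hand.fingers()) {
            // V2 models each finger as four bones, metacarpal through distal.
            for (int b = Leap::Bone::TYPE_METACARPAL; b <= Leap::Bone::TYPE_DISTAL; ++b) {
                Leap::Bone bone = finger.bone(static_cast<Leap::Bone::Type>(b));
                Leap::Vector start = bone.prevJoint();  // joint closer to the wrist
                Leap::Vector end   = bone.nextJoint();  // joint closer to the fingertip
                std::cout << "finger " << finger.type() << " bone " << b
                          << ": (" << start.x << ", " << start.y << ", " << start.z
                          << ") -> (" << end.x << ", " << end.y << ", " << end.z << ")\n";
            }
        }
    }
}
```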

In any 3D virtual environment, selecting objects with a mouse becomes difficult when the scene is densely populated and structures are occluded. This is a real problem with anatomy models, where there is no true empty space and organs, vessels, and nerves always sit flush with adjacent structures.
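
One way out is to drop the 2D cursor entirely and let the fingertip act as a 3D cursor, selecting whichever structure it sits closest to. The sketch below illustrates the idea against the C++ SDK; the Structure type and scene list are hypothetical stand-ins for the anatomy model's own scene nodes.

```cpp
#include <vector>
#include <cfloat>
#include "Leap.h"

// Hypothetical scene node; in a real viewer this would be a mesh for an organ or vessel.
struct Structure {
    const char* name;
    Leap::Vector center;  // center point in the same coordinate space as the tracking data
};

// Return the structure nearest to the index fingertip, or nullptr if no hand is tracked.
const Structure* pickNearest(const Leap::Frame& frame,
                             const std::vector<Structure>& scene) {
    if (frame.hands().isEmpty()) return nullptr;

    // Use the index finger of the first tracked hand as a 3D cursor.
    Leap::Vector tip;
    bool found = false;
    for (const Leap::Finger& finger : frame.hands()[0].fingers()) {
        if (finger.type() == Leap::Finger::TYPE_INDEX) {
            tip = finger.tipPosition();
            found = true;
            break;
        }
    }
    if (!found) return nullptr;

    // Pick the structure whose center is closest to the fingertip in 3D.
    const Structure* best = nullptr;
    float bestDistance = FLT_MAX;
    for (const Structure& s : scene) {
        float d = tip.distanceTo(s.center);
        if (d < bestDistance) {
            bestDistance = d;
            best = &s;
        }
    }
    return best;
}
```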