I have a recurring dream that starts as a nightmare but turns into something else altogether. Imagine the stage of a monumental concert hall. The auditorium is packed, and as the audience notices you, thousands of conversations turn into a deep, imposing silence that sends a chill down your spine. Spotlights on a majestic grand […]
Old habits can be hard to break. When I’m building Leap Motion prototypes, I often find myself slipping into designing for cursors and touchscreens – paradigms based on one-handed interactions. By remembering to think outside the mouse, we can open ourselves up to interacting with virtual objects using both hands. But when are two-handed interactions the right approach?
The way we interact with technology is changing, and what we see as resources – wood, water, earth – may one day include digital content. At last week’s API Night at RocketSpace, Leap Motion CTO David Holz discussed our evolution over the past year and what we’re working on. Featured speakers and V2 demos ranged from Unity and creative coding to LeapJS and JavaScript plugins.
At Leap Motion, we want to make interaction with technology as seamless and natural as the real world. V2 skeletal tracking, which we released into public developer beta yesterday, was built to make hand tracking far more robust and to expose the full degrees of freedom of every moving part of the hand.
The next generation of Leap Motion tracking (Version 2) is now in public beta for developers. V2 retains the speed and positional accuracy of V1, while also tracking the individual joints and bones inside each of the user’s fingers.
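To make the skeletal model concrete, here is a minimal sketch of the kind of per-bone data this style of tracking exposes. The `Bone` and `Finger` classes, their fields, and the helper functions are illustrative stand-ins written for this post, not the actual Leap Motion SDK types; the four bone names mirror standard hand-anatomy terminology.

```python
import math
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

# Each finger is modeled as four bones, ordered from wrist to fingertip.
BONE_TYPES = ["metacarpal", "proximal", "intermediate", "distal"]

@dataclass
class Bone:
    type: str
    prev_joint: Vec3   # joint end closer to the wrist
    next_joint: Vec3   # joint end closer to the fingertip

    def length(self) -> float:
        """Straight-line distance between the bone's two joints."""
        return math.sqrt(sum((a - b) ** 2
                             for a, b in zip(self.prev_joint, self.next_joint)))

@dataclass
class Finger:
    name: str
    bones: List[Bone]  # ordered metacarpal -> distal

def finger_length(finger: Finger) -> float:
    """Total finger length, summed over its individual bones."""
    return sum(bone.length() for bone in finger.bones)

# Example: an index finger laid out along the x-axis, one unit per bone.
index = Finger("index", [
    Bone(t, (float(i), 0.0, 0.0), (float(i + 1), 0.0, 0.0))
    for i, t in enumerate(BONE_TYPES)
])
```

Because every joint position is exposed, higher-level quantities like finger length, bend angle, or pinch distance fall out of simple vector math over the bones.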
In any 3D virtual environment, selecting objects with a mouse becomes difficult when the scene is densely populated and structures occlude one another. This is a real problem with anatomy models, where there is no true empty space and organs, vessels, and nerves always sit flush with adjacent structures.
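One common workaround for this occlusion problem (sketched here as an assumption, not as the approach the post describes) is depth-sorted picking: collect every structure the cursor ray passes through, sort the hits by distance, and let the user cycle past the front-most occluder. The `ray_sphere_hit` and `pick` names below are hypothetical, and the anatomy is reduced to spheres for brevity.

```python
import math
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]

def ray_sphere_hit(origin: Vec3, direction: Vec3,
                   center: Vec3, radius: float) -> Optional[float]:
    """Distance along a unit-length ray to its nearest sphere
    intersection, or None if the ray misses."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None
    t = -b - math.sqrt(disc)
    return t if t >= 0 else None

def pick(origin: Vec3, direction: Vec3,
         spheres: List[Tuple[str, Vec3, float]], cycle: int = 0) -> Optional[str]:
    """Depth-sort every structure under the cursor ray; `cycle`
    steps past occluders to reach hidden structures."""
    hits = []
    for name, center, radius in spheres:
        t = ray_sphere_hit(origin, direction, center, radius)
        if t is not None:
            hits.append((t, name))
    hits.sort()
    if not hits:
        return None
    return hits[cycle % len(hits)][1]

# Example scene: an organ sits directly in front of a vessel along the view ray.
scene = [("organ",  (0.0, 0.0, 5.0),  1.0),
         ("vessel", (0.0, 0.0, 10.0), 1.0)]
```

With a single click (`cycle=0`) the front structure is selected; repeated clicks increment `cycle` to reach each occluded structure in turn, which is exactly the interaction a flat mouse cursor struggles to express.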