
Immersion is everything in a VR experience. Since your hands don’t actually float in space, we created a new Forearm API that tracks your physical arms. This makes it possible to create a more realistic experience with onscreen forearms.
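Rendering a forearm comes down to placing a mesh between two tracked joints. As a rough, library-agnostic sketch (the function and parameter names here are illustrative, not the actual Forearm API), the transform for a forearm segment can be derived from the tracked elbow and wrist positions:

```python
import math

def forearm_transform(elbow, wrist):
    """Given tracked elbow and wrist positions as (x, y, z) tuples,
    return the midpoint, length, and unit direction used to place
    a forearm mesh between the two joints."""
    mid = tuple((e + w) / 2.0 for e, w in zip(elbow, wrist))
    delta = tuple(w - e for e, w in zip(elbow, wrist))
    length = math.sqrt(sum(d * d for d in delta))
    direction = tuple(d / length for d in delta)
    return mid, length, direction

# Example: elbow at the origin, wrist 300 mm away along z
mid, length, direction = forearm_transform((0, 0, 0), (0, 0, 300))
```

In an engine like Unity, the equivalent step would scale a cylinder to `length`, position it at `mid`, and orient it along `direction` each frame.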


The way we interact with technology is changing, and what we see as resources – wood, water, earth – may one day include digital content. At last week’s API Night at RocketSpace, Leap Motion CTO David Holz discussed our evolution over the past year and what we’re working on. Featured speakers and v2 demos ranged from Unity and creative coding to LeapJS and JavaScript plugins.

Art imitates life, but it doesn’t have to be bound by its rules. While natural interactions always begin with real-world analogues, it’s our job as designers to craft new experiences geared towards virtual platforms. Building on last week’s game design tips, today we’re thinking about how we can break real-world rules and give people superpowers.

What can virtual environments teach us about real-world issues? At last month’s ImagineRIT festival, Team Galacticod’s game Ripple took visitors into an interactive ocean to learn about threats facing coral reefs.


The opposable thumb gives human beings a hand up in grabbing and holding objects. At Leap Motion, we want to achieve the same grabbing ability in the virtual world. To demonstrate, we’ve developed a simple ragdoll game in the Unity game engine where you can grab and throw ragdolls around a room.
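The demo's source isn't shown here, but a common pattern for grab-and-throw interactions is to carry the object while a grab gesture is held, then hand it the tracked palm velocity on release so it flies as if thrown. A minimal, engine-agnostic sketch, with an illustrative threshold on a 0-to-1 grab strength value:

```python
GRAB_THRESHOLD = 0.8  # illustrative: 0 = open hand, 1 = closed fist

class Ragdoll:
    def __init__(self):
        self.held = False
        self.velocity = (0.0, 0.0, 0.0)

def update(ragdoll, grab_strength, palm_velocity):
    """Hold the ragdoll while the hand is closed; on release,
    inherit the palm's velocity so the ragdoll is thrown."""
    if grab_strength >= GRAB_THRESHOLD:
        ragdoll.held = True
        ragdoll.velocity = (0.0, 0.0, 0.0)  # carried by the hand, not by physics
    elif ragdoll.held:
        ragdoll.held = False
        ragdoll.velocity = palm_velocity    # thrown: keep the hand's motion

r = Ragdoll()
update(r, 0.9, (0.0, 0.0, 0.0))   # fist closes: ragdoll grabbed
update(r, 0.1, (2.0, 3.0, 0.0))   # hand opens mid-swing: ragdoll thrown
```

In a physics engine, the released velocity would be applied to the object's rigid body, letting gravity and collisions take over from there.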

Built in Unity over the course of five weeks, Volcano Salvation lets you change your perspective in the game by turning your head from side to side. This approach to camera controls – facial recognition via webcam – makes it easy for the player to focus on the hand interactions.


When combined with auditory and other forms of visual feedback, onscreen hands can create a real sense of physical space, as well as complete the illusion created by VR interfaces like the Oculus Rift. But rigged hands also involve several intriguing challenges.

In any 3D virtual environment, selecting objects with a mouse becomes difficult when the scene is densely populated and structures are occluded. This is a real problem with anatomy models, where there is no true empty space and organs, vessels, and nerves always sit flush with adjacent structures.
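One common workaround for occluded selection (a general technique, not necessarily the approach taken in this project) is to collect every structure a pick ray passes through, sort the hits by depth, and let repeated clicks at the same point cycle from the nearest structure to the ones hidden behind it:

```python
def pick(hits, click_count):
    """hits: list of (depth, name) pairs returned by a ray cast.
    Repeated clicks at the same screen point cycle through the
    intersected structures from nearest to farthest."""
    if not hits:
        return None
    ordered = sorted(hits)  # nearest structure first
    return ordered[click_count % len(ordered)][1]

# Hypothetical ray-cast result through a torso model
hits = [(4.0, "liver"), (1.0, "skin"), (2.5, "rib")]
```

Here `pick(hits, 0)` selects the skin; a second click reaches the rib behind it, a third the liver, and a fourth wraps back to the skin.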

Game developer James Blaha, who suffers from crossed eyes, has been working to overcome his condition and retrain his brain by creating a game that forces his eyes to work together, harnessing the power of gamification.