Free and Open Source


Following my tutorial on controlling the Sphero using the Leap Motion, I thought I would keep on converting my Node.js projects to Cylon.js and work on controlling an AR.Drone with Leap Motion.


One of the most powerful things about the Leap Motion platform is its ability to tie into just about any creative platform. That’s why we created a Platform Integrations & Libraries showcase where you can discover the latest wrappers, plugins, and integrations.

Cylon.js is a JavaScript framework for robotics, physical computing, and the Internet of Things (IoT) that makes it easy to network 36 different platforms (and counting). On our Developer Gallery, you can find example projects to help you get started with wirelessly controlled Arduino boards and Parrot AR.Drones. Recently, we got in touch with Ron Evans, the creator of Cylon.js and other open source robotics frameworks, about the emerging IoT revolution.


In my personal time, I love to play around with hardware and robots. I started in Node.js, but recently I discovered Cylon.js. After a quick play with it, I found it pretty awesome and decided to rewrite my projects using this framework.

As a starting point, I decided to rewrite the project to control the Sphero with the Leap Motion Controller.
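A Cylon.js project like this is mostly declarative: you wire up connections and devices, then react to events in a `work` function. The sketch below shows the general shape using the cylon-leapmotion and cylon-sphero adaptors; the Sphero port, the palm-position property, and the speed/heading mapping are assumptions for illustration and will vary with your setup and driver version.

```javascript
var Cylon = require("cylon");

Cylon.robot({
  connections: {
    leapmotion: { adaptor: "leapmotion" },
    // Port is an assumption -- on Linux it is often /dev/rfcomm0,
    // on macOS something like /dev/tty.Sphero-XXX-AMP-SPP.
    sphero: { adaptor: "sphero", port: "/dev/rfcomm0" }
  },
  devices: {
    leapmotion: { driver: "leapmotion", connection: "leapmotion" },
    sphero: { driver: "sphero", connection: "sphero" }
  },
  work: function (my) {
    my.leapmotion.on("hand", function (hand) {
      // Hypothetical mapping: steer left or right depending on
      // which side of the sensor the palm is on. The exact hand
      // property (palmX vs. palmPosition[0]) depends on the
      // cylon-leapmotion version you have installed.
      var heading = hand.palmX > 0 ? 90 : 270;
      my.sphero.roll(70, heading);
    });
  }
}).start();
```

The `work` callback only runs once both connections are up, so the hand handler can safely assume the Sphero is ready to receive roll commands.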


This weekend, Team Leap Motion made the trip from San Francisco to join over 1500 students at the Pauley Pavilion. Amidst the sleeping bags, Red Bulls, and bleary-eyed jamming sessions, we watched as hundreds of hacks came to life.

What if you could disassemble a robot at a touch? Motion control opens up exciting possibilities for manipulating 3D designs, with VR adding a whole new dimension to the mix. Recently, Battleship VR and Robot Chess developer Nathan Beattie showcased a small CAD experiment at the Avalon Airshow. Supported by the School of Engineering, Deakin University, the demo lets users take apart a small spherical robot created by engineering student Daniel Howard.

Nathan has since open sourced the project, although the laboratory environment is only available in the executable demo for licensing reasons. Check out the source code at github.com/Zaeran/CAD-Demo.


The “Augmented Hand Series” (by Golan Levin, Chris Sugrue, and Kyle McDonald) is a real-time interactive software system that presents playful, dreamlike, and uncanny transformations of its visitors’ hands. It consists of a box into which the visitor inserts their hand, and a screen which displays their ‘reimagined’ hand—for example, with an extra finger, or with fingers that move autonomously. Critically, the project’s transformations operate within the logical space of the hand itself, which is to say: the artwork performs “hand-aware” visualizations that alter the deep structure of how the hand appears.


Menu interfaces are a vital aspect of most software applications. For well-established input methods – mouse, keyboard, game controller, touch – there are a variety of options and accepted standards for menu systems. For the array of new 3D input devices, especially in virtual reality, the lack of options and standards can create significant development challenges.

Yesterday, I introduced you to Hovercast – a hand-controlled menu interface for virtual reality environments. In this post, we’ll take a closer look at the development process behind Hovercast, including some insights on usability and design for virtual reality.


Hovercast is a menu interface for virtual reality environments. Built as a tool for developers, it’s highly customizable, and can include many nested levels of selectors, toggles, triggers, and sliders. All menu actions – including navigation between levels – are controlled by simple hand movements and reliable gestures.

With input from a Leap Motion Controller, Hovercast radiates from the palm of your hand – becoming a powerful, versatile extension of your virtual self. As you rotate your palm toward your eyes, the Hovercast menu fades into view. A wide arc of menu items extends just beyond your fingertips, and follows your hand’s every movement. You can interact with menu items using the index finger of your opposite hand. To select an item, simply move your fingertip (the cursor) nearby, and hover there for a short time.
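The hover-to-select behavior described above is essentially a dwell timer: the cursor must stay within some radius of an item for a minimum time before the selection fires. This is a minimal sketch of that idea (the class name, radius, and dwell threshold are illustrative assumptions, not Hovercast's actual API):

```javascript
// Dwell-based selection: a cursor hovering near an item for long
// enough triggers a select, as Hovercast's hover interaction does.
class DwellSelector {
  constructor(radius, dwellMs) {
    this.radius = radius;   // how close the fingertip must be
    this.dwellMs = dwellMs; // how long it must stay there
    this.hoverStart = null; // timestamp when the hover began
  }

  // Call once per frame with the cursor-to-item distance and the
  // current time. Returns true on the frame the selection fires.
  update(distance, nowMs) {
    if (distance > this.radius) {
      this.hoverStart = null; // cursor left: reset the dwell timer
      return false;
    }
    if (this.hoverStart === null) this.hoverStart = nowMs;
    if (nowMs - this.hoverStart >= this.dwellMs) {
      this.hoverStart = null; // reset so the item isn't re-fired instantly
      return true;
    }
    return false;
  }
}
```

One nice property of dwell selection is that it needs no extra gesture vocabulary: moving away simply cancels, which suits the "reliable gestures" goal the post describes.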


Hi, I’m Wilbur Yu! You might remember me from such webcasts as Let’s Play! Soon You Will Fly and Getting Started with VR. In this post, we’ll look at how we structured Widgets to be as accessible and comprehensive as possible.

Daniel here again! This time around, I’ll talk a bit about how we handled integrating the UI Widgets into the data model for Planetarium, and what this means for you.

The first iteration of Widgets we released to developers was cut almost directly from a set of internal interaction design experiments. They're useful for quickly setting up a virtual reality interface, but they're missing some pieces to make them usable in a robust production application. When we sat down to build Planetarium, the need for an explicit event messaging and data-binding layer became obvious.
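To make the idea of such a layer concrete, here is a minimal sketch: widgets write through a bound value, and views subscribe to change events instead of polling the widget directly. All names here (`EventHub`, `BoundValue`) are hypothetical, not the actual Widgets API.

```javascript
// Minimal event-messaging layer: listeners subscribe by event name.
class EventHub {
  constructor() { this.listeners = {}; }
  on(event, fn) {
    (this.listeners[event] = this.listeners[event] || []).push(fn);
  }
  emit(event, payload) {
    (this.listeners[event] || []).forEach(fn => fn(payload));
  }
}

// A data-bound value: setting it notifies subscribers, and unchanged
// writes are suppressed so views only re-render on real changes.
class BoundValue {
  constructor(hub, name, initial) {
    this.hub = hub;
    this.name = name;
    this.value = initial;
  }
  set(next) {
    if (next === this.value) return; // no-op writes emit nothing
    this.value = next;
    this.hub.emit(this.name + ":changed", next);
  }
}
```

Usage is one-directional: a slider widget calls `zoom.set(2)`, and the scene code reacts via `hub.on("zoom:changed", ...)` without either side knowing about the other.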