18 October 2013 by leapmotiondeveloper
0 Comments
3D Printer Finger Painting at Maker Faire

Visitors at this year’s Toronto Mini Maker Faire were able to effortlessly turn air drawings into plastic sculptures, thanks to a 3D printer and the Leap Motion Controller. Created by Hot Pop Factory, the exhibit brought to life the possibilities of 3D design for people of all ages. We’ll let Hot Pop Factory tell you the full story below:

We had such a blast at this year’s Toronto Mini Maker Faire! For this special weekend event, we revisited one of our favorite childhood activities, finger painting, and updated it to our digital age with a Leap Motion Controller, some 3D printers (of course) and a little bit of home-made code. The result is the 3D Printer Finger Painting booth!

Over the course of two days, we watched the bewildered faces of hundreds of kids, parents, grandparents, and friends as they waved their fingers over the Leap Motion micro sensor and saw their doodles instantly appear on a digital monitor. The best part of all was seeing those doodles come to life in three dimensions on our 3D printers as funky, abstract sculptures. Read on for a review of our Leap Motion experience.

Inspiration

We are passionate about finding creative applications for 3D printing. We’re enamored by how 3D printing can empower people from all walks of life to shape the world around them and the products they own. Right now, the tools used to create content for 3D printers are often archaic and unintuitive for anyone who isn’t already an experienced 3D designer. The projects that excite us most at the moment involve finding new ways of interacting with these machines – ways that break down those barriers.

Why Finger Painting?

We knew that Maker Faire would be a busy place with many kids in attendance. Our mission was to create a fun and easy way for people to generate their own 3D content in just a few seconds without any training. Since 3D printing is a new technology, we find it’s very helpful to get people comfortable with it by tying it to a familiar metaphor like finger painting or kissing booths. This allows people to approach 3D printing on familiar terms without being overwhelmed by the technology itself.

Hardware + Software

We used the Leap Motion Controller, Makerbot Replicator 3D printers, and a couple of regular old desktop computers. We tied them all together by writing our own software in Processing. The program we wrote allows people to generate colorful 3D models on the screen using their fingers and the Leap Motion Controller. We then exported these models to the 3D printers using standard file formats so that they could be reconstructed in real life.
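Our Processing source isn’t reproduced here, but the core sampling idea is simple: on every frame, record the fingertip position into the current stroke. Here’s a minimal sketch of that idea – written in JavaScript with the Leap.js client library rather than Processing, purely for illustration:

var strokes = [];   // each stroke is an array of [x, y, z] points
var current = null; // the stroke being drawn right now

Leap.loop(function (frame) {
  if (frame.pointables.length > 0) {
    var tip = frame.pointables[0].tipPosition; // [x, y, z] in millimeters
    if (!current) {
      current = [];
      strokes.push(current);
    }
    current.push([tip[0], tip[1], tip[2]]);
  } else {
    current = null; // finger left the field of view: end the stroke
  }
});

// The recorded strokes can later be thickened into printable geometry
// (e.g. tubes swept along each path) and exported in a standard format like STL.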

Highlights

The response at Maker Faire was fantastic! The booth drew crowds throughout the weekend. I think the most rewarding part was seeing how many people were able to use it without any instruction – especially kids. Our favorite finger painter was a little boy who was convinced that clapping his hands and spreading his little fingers out like a fan resulted in an explosion of cubes on the doodle screen. It was really the speed of his motion that the program responded to, but he was having so much fun creating with his big clapping gestures that all we could do was sit back and smile. We loved seeing how different people approached the sensor and learned to navigate our program on their own terms, each bringing their own personality to it.

Future Applications

What’s nice about the Leap Motion Controller is that casually interacting with the computer becomes a very low-cost investment of time and attention for passersby at a busy venue like Maker Faire. People don’t have to waste time fumbling around with a keyboard and mouse – this is a big part of why we chose it for this event. For 3D printing in particular, the Leap Motion Controller is especially interesting because, unlike a mouse, which moves around on a 2D plane, it’s specifically equipped to interpret 3D input. This helps alleviate one of the major pain points in getting new users to create 3D content: navigating 3D space on a 2D screen with a 2D input device.

3D Printed Leap Motion

3D Printed Heart Leap Motion

3D Printing Finger Painting Leap Motion

Hot Pop Factory Maker Faire Leap Motion

Hot Pop Factory experiments with 3D printing, software, and form to create printed jewelry. This post originally appeared on their blog, where they showcase their ongoing 3D art innovations.

15 October 2013 by leapmotiondeveloper
0 Comments
Thinking as a Designer: What’s a Good Leap + Three.js Boilerplate?

As a truly 3D human interface, the Leap Motion Controller opens up a lot of possibilities for developers of all stripes. For modern designers, it means that we have to constantly rethink and tinker with a new way of interacting with computers. It can be frustrating.

At this point, you might expect me to say that it doesn’t have to be frustrating. While you’d be wrong – constantly running into walls is a part of any experimental process – it is possible to lay down a solid foundation. You need to be bold. And you need a boilerplate. So put on your design hat and dive down into the rabbit hole – it’s time to get messy.
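Before diving in, here’s a minimal sketch of what the core of such a boilerplate might contain – a Three.js scene with a single cursor object driven by the hand. This is an illustrative starting point under simple assumptions, not the actual boilerplate discussed below:

var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(45, window.innerWidth / window.innerHeight, 1, 1000);
camera.position.z = 400;

var renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// a simple cursor to visualize the hand position
var cursor = new THREE.Mesh(new THREE.SphereGeometry(10, 16, 16), new THREE.MeshNormalMaterial());
scene.add(cursor);

Leap.loop(function (frame) {
  if (frame.hands.length > 0) {
    var pos = frame.hands[0].palmPosition; // [x, y, z] in millimeters
    cursor.position.set(pos[0], pos[1] - 200, pos[2]); // recenter vertically
  }
  renderer.render(scene, camera);
});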

If you can’t see the boilerplate in action above, check out this demo video:

Read More

15 October 2013 by leapmotiondeveloper
0 Comments
Blue Estate’s Viktor Kalvachev on Building a Leap Motion FPS

This week, indie developer HESAW’s Mafia-themed PC rail shooter Blue Estate Prologue made a big splash in the Airspace Store. Free for a limited time and designed exclusively for the Leap Motion Controller, it puts players into the shoes of a trigger-happy mobster who shoots his way through outlandish locations including a burlesque show and a steam bath.

Read More

8 October 2013 by leapmotiondeveloper
0 Comments
Submit Your Web Links to the Airspace Store

When we launched the Airspace Store, we heard from lots of developers excited about getting their web apps and other creations into the app store. Today, we’re happy to announce that we’re introducing support for browser-based experiences alongside native apps.

We’re ready to take your submissions for the new All Links category. Each link has its own app detail page, giving users a better opportunity to learn about it, along with a link to the external site for that particular app. As an example, check out HelloRun, a 3D runner game that runs entirely in your browser.

We’re excited to take this first step in welcoming apps built around our JavaScript API into the Airspace Store – and we’re stoked to see what you’ve built!

To submit your link:

  • Send an email to submissions@leapmotion.com
  • Include the URL to your app
  • Include a brief description of how you’ve used Leap Motion’s software and what users can do (so our team knows what to check out)
  • Following that, our submissions team will work with you to collect additional information for the app details page

In case you’ve not yet dabbled with JavaScript-based apps, one of our top web resources is js.leapmotion.com, where you’ll find a variety of examples and tutorials, along with our full JavaScript API. We’ve seen that web apps are popular among our users – it’s great that they’re just one click away.
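If you want a sense of how little code it takes, here’s about the simplest possible Leap.js snippet – include leap.js in your page, then report what the controller sees on every frame:

Leap.loop(function (frame) {
  document.title = frame.hands.length + ' hand(s), ' +
                   frame.pointables.length + ' pointable(s)';
});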

Happy coding – we can’t wait to see what you’ll create.

8 October 2013 by leapmotiondeveloper
0 Comments
Creating a Hand-Controlled Orchestra with GecoMIDI

Hi, I’m Hagai Davidoff. I specialize in music for theater, film, media, and MIDI mockups. Recently, I performed a full orchestra using just a keyboard, orchestral VSTIs, and GecoMIDI with the Leap Motion Controller. Check out the full video below, or skip to 6:00 to hear the music.

My main passions are orchestral music and music technology, and I love to create acoustic simulations. Over the course of my daily work, I always try to find the most natural way to input MIDI data into my digital audio workstation (DAW). That’s why my studio is filled with different kinds of MIDI gadgetry and controllers – pad controllers, MIDI keyboards, breath controllers, various iPhone apps, etc.

I’ve always felt a huge gap between hardware and software, the latter being far superior in terms of possibilities and detail. My MIDI keyboard couldn’t reach all the velocity layers present in the sample libraries I use, and I couldn’t glide naturally between different musical articulations (I hate keyswitches).

I could go on, but you get the idea. What good is a sample library with 20 velocity layers if your keyboard only ever transmits roughly the same 10?

So I tried to reorganize my hardware life by buying an electric piano. These are usually much more responsive, yet costly. Since most digital pianos don’t come with knobs and faders, I hacked my old M-Audio Keystation Pro 88 into a MIDI control box and fused it with my breath controller.

Thus, I’ve built my perfect MIDI freak controlling extravaganza, which you can see in my full photoblog. Here’s my spaceship:

image

But there was still something missing. Tactile control has a price – resistance. Sometimes this resistance is a musical must, sometimes it just kills your natural (MIDI) curves.

When I first saw a video demo of the Leap Motion Controller, the obvious wow factor hit me. I immediately thought in musical terms and tried to figure out how I could implement Leap Motion into my workflow.

Then came GecoMIDI – the answer was there. The possibilities overwhelmed me, and I started experimenting. The result you see in the video at the top of this post is one of the first ideas I came up with, and already it feels very rewarding.

Suddenly, you can control XYZ with one open hand, and another set of XYZ with the same hand, closed. Now double that for two hands, and by the time I finish writing this, the tech boffins out there will probably make gestures of individual fingers a MIDI reality (go Geert go), taking things to a whole new level.
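GecoMIDI handles all of this mapping natively, but to illustrate the underlying idea – a hand axis becoming a continuous controller – here’s a rough sketch using the browser’s Web MIDI API. This is an illustration only, not part of my actual GecoMIDI setup:

navigator.requestMIDIAccess().then(function (midi) {
  var output = midi.outputs.values().next().value; // first available MIDI out
  if (!output) return;

  Leap.loop(function (frame) {
    if (frame.hands.length === 0) return;
    var height = frame.hands[0].palmPosition[1]; // millimeters above the device
    // map roughly 100-400 mm of hand height onto the 0-127 CC range
    var value = Math.max(0, Math.min(127, Math.round((height - 100) / 300 * 127)));
    output.send([0xB0, 1, value]); // CC #1 (mod wheel) on MIDI channel 1
  });
});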

Hand gestures are an intuitive and much more natural way to create music. When I play my current setup, I feel like I’m dynamically conducting an orchestra. It helps me forget the techy stuff and get down to the musical ideas and the joy of composing.

When you combine good hardware technology with excellent software (like GecoMIDI and the Spitfire Audio “Albion I” sample library, which I’ve used in the video), the results can be stunning.

There are still apps and features I would love to see from the developers out there in Leap Motion land:

  1. A hand glide gesture that sends actual MIDI notes. This would enable me to create natural glisses for virtual harps (I know, I know, but AirHarp doesn’t send MIDI out). A rough sketch of this idea follows the list.
  2. An app that translates conducting into actual tempo tracking in my DAW (like a real conductor).
  3. Generally better and more stable tracking.
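As a thought experiment for that first item, a gliss could be as simple as quantizing the hand’s horizontal position into note numbers and sending a note-on whenever the note changes. A rough, hypothetical sketch, reusing the Web MIDI output from the earlier snippet:

var lastNote = null;

function glide(frame, output) {
  if (frame.hands.length === 0) return;
  var x = frame.hands[0].palmPosition[0];           // millimeters, roughly -200..200
  var note = 48 + Math.round((x + 200) / 400 * 24); // quantize to C3..C5
  if (note !== lastNote) {
    if (lastNote !== null) output.send([0x80, lastNote, 0]); // note off
    output.send([0x90, note, 96]);                           // note on
    lastNote = note;
  }
}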

I’ve never seen anyone use the Leap Motion Controller like this before, and I feel honored to be an early adopter of this creative technique.

Hagai Davidoff is a critically acclaimed composer, producer, and arranger specializing in orchestral and acoustical simulations. He has also produced sample libraries for Sonokinetic, and teaches Cubase, production, and virtual orchestration at BPM College in Tel-Aviv, Israel. For more info and music, check out hagaid.com.

5 October 2013 by leapmotiondeveloper
0 Comments
Manipulating a Rigged Hand with Leap Motion in Three.js

Using Three.js and the Leap Motion Controller, Ukrainian developer Roman Liutikov was able to create a rigged hand that runs in your browser. Check out the demo below with your device or watch the video, then read about how Roman was able to apply rigged geometry to the Leap Motion JavaScript API.

The demo shown in this video is actually the initial version, which is slightly different from the current one.

API

In this section, I’ll describe only the data relevant to this particular case; for the full API, check the official docs.

Leap Motion works in a snapshot manner: it sends a block (frame) of data with all the info about the current state of the scene at a particular moment in time. Here’s the data required for a hand model with an armature.

{
  hands: [
    {
      direction: [0, 0, 0],
      palmPosition: [0, 0, 0]
    }
  ],
  pointables: [
    {
      direction: [0, 0, 0]
    }
  ]
}

There is, of course, a lot more output data, but this is enough to implement rigged-hand manipulation.

The hands array includes objects describing each detected hand. The direction array is a unit vector pointing from the palm position toward the fingers. The palmPosition array is the center position of the palm in x, y, z format. The pointables array is a list of Pointable objects, and the direction array of each Pointable describes the unit vector in which the finger or tool is pointing.
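To make that concrete, here’s a minimal sketch of reading those fields on every frame:

Leap.loop(function (frame) {
  if (frame.hands.length > 0) {
    var hand = frame.hands[0];
    console.log('palm at', hand.palmPosition); // [x, y, z] center of the palm
    console.log('pointing', hand.direction);   // unit vector toward the fingers
  }
  frame.pointables.forEach(function (pointable) {
    console.log('finger direction', pointable.direction); // unit vector per finger
  });
});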

Hand rigging

This is how the armature of the hand should look.

And here are vertex groups assigned to appropriate bones.

(If you’re wondering how to rig mesh in Blender, check out my blog post about rigging and skeletal animation.)

Preparing the scene

  1. Export the model and make sure the required export options are checked: skinning, bones, and skeletal animation.
  2. Set up a basic Three.js scene with model loader code from my rigging article.
  3. Grab the latest Leap.js client lib from its repo and include it in your html.

Setting up and passing Leap Motion data

You might know about the Leap Motion Controller object, which is used to manually connect to the device, but this isn’t necessary when using the frame loop, as it will set up the controller and connect by itself.

// manual connection (not needed when using Leap.loop below)
var leap = new Leap.Controller({host: 'localhost', port: 6437});
leap.connect();

Now run the frame loop. The Leap.loop() function passes a frame of data to the callback 60 times per second using requestAnimationFrame(). Add this function call at the very end of the model load function.

Leap.loop(function (frame) {
  animate(frame, hand); // pass frame and hand model
});

The core function called in the callback extracts and structures the data describing the position of the hand and fingers in 3D space, and defines the functions that update those positions.

function animate (frame, handMesh) {
  if (frame.hands.length > 0) { // do stuff if at least one hand is detected
    var leapHand = frame.hands[0], // grab the first hand
        leapFingers = frame.pointables, // grab fingers
        handObj, fingersObj;
    // grab, structure and apply hand position data
    handObj = {
      position: {
        z: -leapHand.palmPosition[0]/4,
        y: leapHand.palmPosition[1]/6-30,
        x: -leapHand.palmPosition[2]/4+10
      },
      rotation: {
        z: leapHand.palmNormal[2],
        y: leapHand.palmNormal[0],
        x: -Math.atan2(leapHand.palmNormal[0], leapHand.palmNormal[1]) + Math.PI
      },
      update: function() {
        var VectorDir = new THREE.Vector3(leapHand.direction[0], -leapHand.direction[1]+.6, leapHand.direction[2]); // define direction vector
        handMesh.lookAt(VectorDir.add(handMesh.position)); // setup view
        handMesh.position = this.position; // apply position
        handMesh.bones[1].rotation.set(this.rotation.x, this.rotation.y, this.rotation.z); // apply rotation
      }
    };
    // grab, structure and apply fingers position data
    fingersObj = {
      update: function (boneNum, fingerNum, isThumb) {
        var bone = handMesh.bones[boneNum], // define main bone
            phalanges = [handMesh.bones[boneNum+1], handMesh.bones[boneNum+2]], // define phalanges bones
            finger = leapFingers[fingerNum], // grab finger
            dir = finger.direction; // grab direction
        // if current finger is thumb, use only one additional phalange
        if (!!isThumb) {
          phalanges = [handMesh.bones[boneNum+1]];
        }
        // make sure fingers won't go into weird positions
        for (var i = 0, length = dir.length; i < length; i++) {
          if (dir[i] >= .1) {
            dir[i] = .1;
          }
        }
        bone.rotation.set(0, -dir[0], -dir[1]); // apply rotation to the main bone
        // apply rotation to additional phalanges
        for (var i = 0, length = phalanges.length; i < length; i++) {
          var phalange = phalanges[i];
          phalange.rotation.set(0, 0, -dir[1]);
        }
      },
      // define each finger and update its position
      // passing main bone number and finger number
      fingers: {
        pinky: function() {
          fingersObj.update(3, 3);
        },
        ring: function() {
          fingersObj.update(7, 1);
        },
        mid: function() {
          fingersObj.update(11, 0);
        },
        index: function() {
          fingersObj.update(15, 2);
        },
        thumb: function() {
          fingersObj.update(19, 4, true);
        }
      },
      // update all fingers function
      updateAll: function() {
        var fingers = this.fingers;
        for (var finger in fingers) {
          fingers[finger]();
        }
      }
    };
    handObj.update(); // update hand position
    // update finger positions only if all five fingers are detected
    if (leapFingers.length == 5) {
      fingersObj.updateAll();
    }
  }
}

Basically, it’s easy to set up and run something with Leap Motion, but when the goal is to achieve the best possible results, all the pitfalls immediately show up. For example, I used the magical Math.atan2 for one of the hand rotation axes instead of the palmNormal value. As it turned out, there are no nicely represented pitch, roll, and yaw values; instead, you need to calculate some of them manually – check this Leap demo to see what’s wrong with the rotation data. I also tweaked almost all the data to make the model behave nicely on the screen.
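For reference, the angles can be derived manually from the raw vectors along these lines – roughly the conversions described in the Leap docs, with angles in radians:

function handAngles(hand) {
  var d = hand.direction, n = hand.palmNormal;
  return {
    pitch: Math.atan2(d[1], -d[2]), // nose up/down
    yaw:   Math.atan2(d[0], -d[2]), // nose left/right
    roll:  Math.atan2(n[0], -n[1])  // palm tilt around the arm axis
  };
}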

One of the most important things to remember when building the armature for the model in Blender (or other software) for Three.js: do not move/rotate the armature, but always align it using the bones’ head/tail positions. (This is true for Three.js r60; it seems this wasn’t required in r56.)

Roman Liutikov is a freelance front-end developer from Ukraine. Check out his website at romanliutikov.com.

2 October 2013 by leapmotiondeveloper
0 Comments
Dr. Guillermo Rosa on Touchless Control During Surgery

Last month, we heard from Dr. Guillermo Rosa, who made Leap Motion history at his family’s private dental practice in Resistencia, Argentina. As the first person to use the Leap Motion Controller during dental implant surgery, Dr. Rosa is also the first person ever confirmed to use our technology under sterile operating conditions for a human patient.

Recently, we caught up with Dr. Rosa to ask him about his love of technology and experience with the Leap Motion Controller.

Technology in the Operating Room

I’ve been an enthusiast of computers since childhood. My father bought me my first computer at the age of 11 – a Timex Sinclair 2068. I wrote some programs and, with my brother, built a gun to shoot at the monitor.

Right after I graduated 17 years ago, I started looking for better ways to use computers in clinical situations – trying to integrate a PC (and all digital media) as much as possible into dental operations in clinical and surgical settings, with digital photographs to document cases, digital simulations of cosmetic changes, and improved communication with patients and the dental laboratory.

This helped us a lot in many aspects of our practice. We modified the dental equipment and developed an integration between the dental chair and a PC workstation, as the options in the market were very limited.

Sterile Conditions and Messy Peripherals

Every day, the number of digital images at the dental office goes up (for the better!), but interacting with this technology in a clinical/surgical situation is not easy. For proper ergonomics, we need an appropriate input device – one requiring minimal (or ideally no) physical contact with surfaces that would otherwise need to be regularly sterilized. The problem of sterility is one of the greatest challenges in modern healthcare technology.

We started using trackballs, followed by touchpads, among others. But these input devices were not made for surgery – unless we used an assistant outside the surgical field, they were not practical.

Then, last year, I saw a demo video on the Internet about the Leap Motion Controller. I was amazed, and I thought, “This technology may be great for the touchless clinical input interface we need!”

Using the Leap Motion Controller

To perform the surgery and take the video, I was helped by my colleagues at C.O.R.E. Clinic, Dr. Maria Lidia Elizondo and Dr. Daniel Rosa. Using the Touchless for Windows app with dental imaging software during surgery, I navigated through Cone Beam Computed Tomography (CBCT) 2D-slice images, a planned implant surgery 3D model, and intraoperative digital X-ray images. I also used the device to manipulate (zoom in, zoom out, enhance contrast) intraoperative digital X-ray images – new images obtained during surgery – to confirm the correct position of the surgical guide and the final position of the implant.

The system allowed touchless control throughout the surgery. I was able to navigate through the windows, zoom in and out, move between the different images and slices, and use different imaging tools. It was also possible to move the 3D implant surgery planning model. In the end, a dental implant placement was accomplished simultaneously with a guided bone regeneration procedure.

Little Training Needed

The combined system performed very well, and I found it very useful to control the software without touching anything while maintaining the sterile surgical environment. The Leap Motion technology worked fantastically during surgery and was extremely useful.

The habituation period is not very long – after a few minutes, one can easily interact with the system. But for use during surgery, I recommend that the user spend several training sessions – assuming that s/he has previously mastered the software with standard input devices (e.g. mouse, touchpad).

Ultimately, it depends on how the user is accustomed to natural user interfaces like touchscreens. With a little training by the user, without a doubt, it is easier and faster than changing sterile gloves. I think it has enormous potential in our field, and in general surgery too.

Dr. Rosa is continuing his work with other computer control apps like GameWAVE and Pointable. What do you think about the use of Leap Motion technology in clinical settings? Where else can you imagine it being used – and what sorts of apps would be needed?

2 October 2013 by leapmotiondeveloper
0 Comments
Leap Motion By the Numbers

image

20 September 2013 by jaltschuler
0 Comments
Designing Your Menus for Optimal Usability

Over the past two months, I’ve been impressed with the variety and creativity of Leap Motion-enabled applications we’re seeing in Airspace, as well as the responses we’ve received from our users. We’ve been listening, and one thing our users want more of is – wait for it – consistency! Especially when it comes to application menu systems.

Menus may be the last part of your app experience that you want to think about, but they should be the first. A menu is often the first thing a user sees – if they don’t have a good experience right away, they may not go deeper into your app. The menu should set the tone for the rest of the experience.

The following are a few best practices to keep in mind as you begin to design and develop your app experience.

Menu Design and Layout

However you organize your menu to accommodate your experience and artwork, always keep usability, legibility, and simplicity of interaction in mind. Be sure to space the buttons appropriately so that it’s easy for a user to select and tap a particular button without accidentally hitting another.

image

In this example menu you can see a number of best practices at work.

  1. Buttons are large, well-organized, and include a clear highlight/depressed state.
  2. Buttons use high-contrast colors and the text/font is very legible.
  3. The Exit button is easily accessible and clearly indicated.
  4. Required gestures are displayed using easy-to-read iconography. The recommended gesture for menus is the Leap Motion Touch Zone API, or “poke” (see the sketch after this list).
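Here’s a minimal sketch of that “poke” pattern – a pointable crossing from the hovering zone into the touching zone fires a single tap. onTap is a hypothetical handler that would activate the button under the cursor:

var wasTouching = false;

Leap.loop(function (frame) {
  var touching = frame.pointables.some(function (pointable) {
    return pointable.touchZone === 'touching';
  });
  if (touching && !wasTouching) {
    onTap(); // hypothetical: activate the button currently indicated
  }
  wasTouching = touching;
});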

Proximity-Based Highlighting

Another way of simplifying the menu experience for your users is to provide a proximity-based highlighting scheme. This highlights the item closest to the user’s cursor, without the cursor actually having to be over it. In the example below, the five possible actions are outlined to show how this might work.

image

The user’s cursor is in the upper-left quadrant, so the Play button is lit. Performing a tap gesture would activate the Play button. Anticipating what the user might want in these contexts can save time and eliminate frustration.
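In code, proximity-based highlighting boils down to a nearest-neighbor search over the menu items. A sketch, where buttons is a hypothetical array of {x, y, highlight()} items in screen space:

function highlightNearest(cursorX, cursorY, buttons) {
  var nearest = null, best = Infinity;
  buttons.forEach(function (button) {
    var dx = button.x - cursorX, dy = button.y - cursorY;
    var dist = dx * dx + dy * dy; // squared distance is enough for comparison
    if (dist < best) { best = dist; nearest = button; }
  });
  if (nearest) nearest.highlight();
  return nearest; // a tap gesture would then activate this button
}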

Menu Access and Exit

With this in mind, accessing these menus as well as exiting your application should be handled in the same manner – simple and foolproof.

  • For most games and applications, removing your hands from the field of view should pause the interaction and display a “Continue, Main Menu, Quit” dialog (see the sketch after this list).
  • Providing an explicit “Settings” or “Menu” button is another option.
  • There should be an explicit “Exit” or “Quit” button.
  • The Escape key should exit the app (on Mac and Windows).
  • Command-Q (Mac) or Alt-F4 (Windows) should also exit the app.
  • You may also want to make your menus accessible via mouse.
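The first point above might look like this in practice – count empty frames and pause once the hands have been gone for a brief grace period. pauseGame and showMenu are hypothetical app functions:

var emptyFrames = 0;

Leap.loop(function (frame) {
  if (frame.hands.length === 0) {
    emptyFrames++;
    if (emptyFrames === 30) { // roughly half a second at 60 fps
      pauseGame();
      showMenu(['Continue', 'Main Menu', 'Quit']);
    }
  } else {
    emptyFrames = 0;
  }
});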

image

Above: The first-person shooter Blue Estate Prologue displays this simple menu when a user removes their hands from the field of view.

Feedback, Feedback, Feedback

For any selection approach you utilize, giving users proper cues and feedback is integral to ensuring they feel in complete control. It should be immediately clear which elements are interactive – and it never hurts to give users unobtrusive graphical or textual cues, e.g. a simple illustration and “Tap to select an article.”

Once interacting with the element, it should respond fluidly with appropriate visual and auditory feedback. Buttons should be highlighted when hovered over, and respond with a “click” and indent as they are depressed; sliders should move freely; etc. The more information you can give to help orient the user and signal their selections, the easier it will be for them to complete each task.

image

Above: Frog Dissection uses large buttons that highlight and magnify when indicated by the cursor, along with the Leap Motion Touch Zone API to handle the button “tap.”

(For more on the implementation of the Touch Zone API, please review this section of our documentation and this excellent blog post from one of our developers.)

I hope that you find these examples and best practices helpful as you create your application user experience. Over the coming weeks, we’ll be posting more best practices on Developer Labs. Let us know what you think, and what you’d like to see next. I’m looking forward to hearing from you and seeing more of your great work.

Jon Altschuler is the Director of Creative Technology at Leap Motion.

17 September 2013 by leapmotiondeveloper
0 Comments
3 Questions: Rom’s Interactive LED Displays

Today on the Leap Motion blog, we’ve featured 7 visual experiments created by our developer community. Last week, we caught up with creative developer Rom, who designed the interactive LED wall and architectural model in the videos below.



Read More