Leap Motion and Blender
Wow, that is really something.
I noticed that you kind of have a skeletal structure for this hand, so if you could get the joints of each finger, assign them to the fingers of the model, and move their coordinate frames, that would be awesome!
Best,
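(For anyone wanting to try the joint-assignment idea above: here is a minimal sketch of the math side, computing per-joint flexion angles from Leap-style fingertip/joint positions. The finger data and names are made up for illustration; in Blender you would apply the resulting angles as rotations on the matching pose bones of the hand armature, e.g. via `bpy` and `pose.bones[...].rotation_euler`, which is omitted here so the snippet runs standalone.)

```python
import math

# Hypothetical Leap-style data: joint positions (x, y, z) along one
# finger, knuckle -> tip. Values are invented for illustration only.
index_finger = [
    (0.0, 0.0, 0.0),   # knuckle (metacarpal head)
    (0.0, 1.0, 0.0),   # proximal joint
    (0.0, 1.7, 0.7),   # intermediate joint
    (0.0, 2.0, 1.4),   # fingertip
]

def _sub(a, b):
    return tuple(ai - bi for ai, bi in zip(a, b))

def _norm(v):
    return math.sqrt(sum(c * c for c in v))

def flexion_angles(joints):
    """Angle in degrees between consecutive bone segments of a finger.

    Each angle is what you would feed into the corresponding pose
    bone's rotation in the hand armature.
    """
    # Bone direction vectors between consecutive joints.
    bones = [_sub(b, a) for a, b in zip(joints, joints[1:])]
    angles = []
    for u, v in zip(bones, bones[1:]):
        dot = sum(ui * vi for ui, vi in zip(u, v))
        # Clamp to guard against floating-point drift outside [-1, 1].
        cos_a = max(-1.0, min(1.0, dot / (_norm(u) * _norm(v))))
        angles.append(math.degrees(math.acos(cos_a)))
    return angles

print(flexion_angles(index_finger))
```

A straight finger gives angles of 0, and each bend shows up as the flexion at that joint; driving the armature is then just copying these angles onto the pose bones every Leap frame.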
Hi Tyler,
I have a robotics background (currently finishing my PhD) and thought the Leap could be helpful for recording grasping movements/postures, so I fiddled with that. However, IMHO the Leap is not suited for this kind of use case because the fingers "disappear" if you bend them too far. Anyway, awesome device, just not designed for grasping.
To answer your question, no I didn't, but I could do that. Are you interested?
Definitely interested in the Blender code! I am pretty certain many others are as well.
Wow, robotics is a field I've always dreamed of taking up as an alternative path!
I also had ideas for grasping gestures (amongst many others) when I first saw the Leap in action, namely in the point cloud demo. It has been a bit of a letdown that the fingers disappear so much, but they say a full skeletal hand model and a revamp of the detection algorithms are in the works and a priority, so I'm hoping for the best. Also, I don't know if you saw, but somebody posted today about the possibility of getting raw data (https://www.leapmotiontechnology.com/forums/forums/6/topics/1579?page=2#post-10048), I imagine on Linux. It is coming, I believe.
Hey Florian,
A quick question: what middleware did you use in your second video? Also, any more thoughts on posting the code somewhere?
Thanks,
Tyler