Leap Motion Watcher + a hand skeleton decoder
-
I am collaborating with a dancer who uses ASL gestures in their work. The Leap Motion Watcher gives us access to whole-hand movements to influence the audio and visual outputs of our work, but ASL gestures also incorporate finger positions and movement. I'm wondering if anyone is currently developing, or interested in collaborating with me on developing, an Isadora user actor or a modification of the Leap Motion Watcher that "sees" and decodes each hand into its skeleton and outputs the x,y,z data for the fingers and thumb of each hand.
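To sketch what such a decoder would output: it would essentially flatten each tracked hand into a fixed set of numeric values, one x,y,z triple per digit, which an actor could then expose as outputs. A minimal Python sketch of that decoding step, assuming a hypothetical frame format (all names and structures here are illustrative, not the actual Leap SDK or Isadora API):

```python
# Hypothetical hand-frame decoder: flattens a tracked hand skeleton into
# per-finger x,y,z outputs of the kind an Isadora user actor could expose.
# The input format is an assumption, not the real Leap Motion SDK structure.

FINGER_NAMES = ["thumb", "index", "middle", "ring", "pinky"]

def decode_hand(hand):
    """Return a flat {'thumb_x': ..., 'thumb_y': ..., ...} dict for one hand.

    `hand` is assumed to look like:
        {"side": "left", "fingers": {"thumb": (x, y, z), ...}}
    Untracked fingers default to (0.0, 0.0, 0.0).
    """
    outputs = {}
    for name in FINGER_NAMES:
        x, y, z = hand["fingers"].get(name, (0.0, 0.0, 0.0))
        outputs[name + "_x"] = x
        outputs[name + "_y"] = y
        outputs[name + "_z"] = z
    return outputs

def decode_frame(frame):
    """Decode every hand in a frame, keyed by 'left'/'right'."""
    return {hand["side"]: decode_hand(hand) for hand in frame["hands"]}

# Example frame with one right hand and two tracked fingers:
frame = {"hands": [{"side": "right",
                    "fingers": {"thumb": (0.1, 0.2, 0.3),
                                "index": (0.4, 0.5, 0.6)}}]}
outputs = decode_frame(frame)
print(outputs["right"]["index_y"])  # 0.5
```

A real implementation would pull the per-finger positions from the Leap frame data instead of a dict, but the output side (15 values per hand, 30 for both) would look much the same.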
-
The Kinect Version 2 does finger tracking. Not sure it's made it to Isadora yet, but it's definitely possible in the Kinect SDK, and web searches turn up many implementations.
It might be better to go down that route than Leap Motion. Kinect V2, with its ability to track over a much bigger space than Leap Motion, has got to be more flexible in real-world situations, I'd think. Especially if it's a dancer who is signing! -
@bill-cottman said:
I'm wondering if anyone is currently developing or interested to collaborate with me in developing an Isadora user actor or a modification of the Leap Motion Watcher that "sees" and decodes each hand into its skeleton and outputs the x,y,z data for the fingers and thumb of each hand.
It's a good idea @bill-cottman. Let me look into what it would take to add an output for that data. How urgent is your need?
Best wishes,
Mark -
-
@mark Thank you for your reply. My need is not urgent. It is part of my Isadora toolkit development process to create a collection of user actors and associated control panels that are responsive to ambient sound & movement. Their outputs will be used to generate projections in real time that I call New Social Landscapes.