Hand Tracking with RealSense D435
VJMC
I'm playing around with a variety of hand-tracking methods for an interactive book I'm designing for a museum. One of the options I'm exploring is the Intel RealSense D435. If I understand correctly, the D435 has built-in hand-tracking capabilities, but the OpenNI actor seems to only expose the full-body skeleton data. I'm curious if anyone has succeeded, either directly in Isadora or through a third-party app, in capturing the D435's hand data in a useful and stable manner.
Alternatively, is there another device anyone's played with that you'd deem superior?
My goals:
- Detect when a user's hand collides with a reactive object. (The plan is to use time-based selection vs. touch-based.)
- Visualize hand tracking with a floating cursor. (Dot, glowy effect, etc.)
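For the time-based selection idea, this is the kind of dwell logic I have in mind, sketched in plain Python (not Isadora-specific; the class name, region format, and timings are all my own invention):

```python
import time

class DwellSelector:
    """Fires a selection when the tracked hand stays inside a target
    region for `dwell_s` seconds (time-based selection, no touch)."""

    def __init__(self, region, dwell_s=1.5, clock=time.monotonic):
        self.region = region        # (x, y, w, h) in normalized coords
        self.dwell_s = dwell_s
        self.clock = clock          # injectable clock, handy for testing
        self._entered_at = None

    def _inside(self, x, y):
        rx, ry, rw, rh = self.region
        return rx <= x <= rx + rw and ry <= y <= ry + rh

    def update(self, x, y):
        """Call once per tracking frame with the cursor position;
        returns True on the frame the dwell time is reached."""
        if not self._inside(x, y):
            self._entered_at = None   # leaving the region resets the timer
            return False
        now = self.clock()
        if self._entered_at is None:
            self._entered_at = now
            return False
        if now - self._entered_at >= self.dwell_s:
            self._entered_at = None   # reset so it fires once per dwell
            return True
        return False
```

In Isadora terms this would just be a comparator on the cursor position feeding a Trigger Delay, but the sketch makes the reset-on-exit behavior explicit.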
Here's what I've tried so far:
- Leap Motion sending OSC using Brekel Pro Hands: https://brekel.com/hands/ (The Leap actor in Isadora doesn't work with the latest Gemini release. Gemini is a huge improvement over previous Leap platforms, and I'm unconvinced I could achieve the level of stability and usability I'm looking for with previous releases.)
- Leap Motion's TouchFree.
- Monica Lim's browser-based tracking using MediaPipe (Handmate, on YouTube).
- Blob tracking with the eyes++ actor, using the D435 depth feed coming from the OpenNI Tracker actor.
- Collision detection using multiple cropped depth-feed streams and the Calculate Brightness actor.
DusX replied:
I am not aware of any third-party plugin for Intel RealSense. I did a little searching for addons for openFrameworks, since I sometimes work in that framework, but the available addons are limited and haven't been updated recently.
Did you test with the v3 Leap Motion drivers and the Isadora Leap Motion actor?
I updated the SDK links just the other day in our Solutions Article and during my testing found everything very responsive.
I can't think of any more precise devices. I have seen code for hand tracking with the Kinect v2, and although functional, it's not nearly as tight as the Leap Motion.
VJMC
@dusx I have used the v3 drivers with the Leap actor before. In my experience, it works well under tightly controlled conditions with a user that has a base knowledge of how to interact with the Leap. This installation will be out in the wild. In my testing I found the v5 platform to be much better at guessing in low-confidence situations. Things like tucking the thumb into a fist and rotating the hands track pretty well.
But I might just go with blob tracking. I ran through Mark's eyes++ tutorial yesterday using the depth feed from the D435, and I'm kind of stunned at how well it's working. I don't actually need gesture recognition or digit tracking in this instance, so... blobs FTW?
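For reference, the core of the blob-from-depth idea can be sketched in a few lines of Python (an illustration only, with my own function name; eyes++ itself does far more, e.g. connected-component labeling of multiple blobs, while this just takes the centroid of everything inside a depth band):

```python
def depth_blob_centroid(frame, near, far):
    """Centroid of pixels whose depth value falls inside the
    [near, far] band -- a rough stand-in for thresholding the D435
    depth feed and tracking the resulting blob. `frame` is a list of
    rows of depth values; returns (x, y) or None if nothing is in band."""
    sx = sy = n = 0
    for y, row in enumerate(frame):
        for x, d in enumerate(row):
            if near <= d <= far:
                sx += x
                sy += y
                n += 1
    if n == 0:
        return None
    return (sx / n, sy / n)
```

The near/far band is what makes the depth feed so much more robust than RGB blob tracking: anything outside the interaction volume simply disappears before tracking happens.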
Thanks for the response!