Motion/depth sensor camera - Mac User
-
Anyone having success using a motion/depth sensor camera for a gallery installation that is native to, or at least works very well with, macOS? I would prefer to use Isadora over TouchDesigner, but every camera I look into seems to run into compatibility issues. Looking into the Orbbec Astra 2 or Mini now, but I would be very grateful for recommendations (and success stories)! I am an artist and am new to this motion/depth sensor tech but highly motivated to figure it out. Currently on an Intel Mac, but I may upgrade to Apple Silicon sooner rather than later if that makes a difference.
Thanks in advance!
-
My go-to is a Model 1414 Microsoft Kinect for Xbox 360.
-
@jessicacohen said:
every camera I look into seems to run into issues with compatibility
The list of depth cameras compatible with Isadora's implementation of OpenNI can be found in the plugin's description here: https://troikatronix.com/add-ons/openni-tracker/
Compatible Cameras: This plugin is currently compatible with the Kinect v2 (Kinect for Xbox One), the Kinect v1 (Kinect for Xbox 360, Models 1414 and 1473), the Orbbec Astra, and the Intel RealSense D435.
-
@jessicacohen Hi there! I’m also working with motion tracking for the first time in a project and have had great results using VisionOSC: https://github.com/LingDong-/V...
It’s based on Apple’s Vision framework and runs quite smoothly. The only downside is that it requires a fair amount of computing power, but for single-type tracking it performs really well. I’ve also built a User Actor for the Pose Detection feature, in case you’re interested! ☺️
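For anyone curious what VisionOSC's output looks like on the wire: it streams its tracking results as OSC messages, which you can inspect with a few lines of Python. Below is a minimal, dependency-free sketch of decoding a single OSC message; the `/pose/nose` address and two-float payload are invented placeholders, so check VisionOSC's README for its actual address patterns.

```python
import struct

def _read_padded_string(buf, offset):
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    end = buf.index(b"\x00", offset)
    s = buf[offset:end].decode("ascii")
    length = end - offset + 1
    offset += length + (-length) % 4
    return s, offset

def parse_osc_message(packet):
    """Parse one OSC message into (address, [args]); handles float/int args only."""
    address, offset = _read_padded_string(packet, 0)
    typetags, offset = _read_padded_string(packet, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "f":
            (val,) = struct.unpack_from(">f", packet, offset)
        elif tag == "i":
            (val,) = struct.unpack_from(">i", packet, offset)
        else:
            raise ValueError(f"unsupported OSC type tag: {tag}")
        offset += 4
        args.append(val)
    return address, args

# Hypothetical packet: address "/pose/nose" with two float arguments.
packet = b"/pose/nose\x00\x00" + b",ff\x00" + struct.pack(">ff", 0.5, 0.25)
print(parse_osc_message(packet))  # ('/pose/nose', [0.5, 0.25])
```

In practice you would receive these packets on a UDP socket and forward the values into Isadora (or Isadora can listen for the OSC directly).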
-
-
@2250watt Also, I just saw this: https://community.troikatronix...
-
Hi,
I have been unable to get the new release of Orbbec Astra cameras to work with the OpenNI module in Isadora. I made some enquiries to Orbbec support, and they informed me that their current Astra models are not compatible with Isadora's OpenNI implementation. Essentially, they are not the same cameras as the earlier models, as evidenced by the change in model number. Although the current Astra and Astra Mini look identical to the previous devices, they will not work with OpenNI in Isadora.
The Astra Mini I purchased direct from Orbbec sales works with the Orbbec TOP in TouchDesigner, but not with OpenNI in Isadora.
I was able to find an older Astra on the second-hand market, and it works great with OpenNI and Isadora in Rosetta mode on an M-series Mac.
Best wishes
Russell
-
@jessicacohen I think this depends on what you want to do with the depth camera. Overall I would go with something newer; check out the Orbbec Femto Bolt: https://www.orbbec.com/product...
This is basically the Kinect Azure hardware repackaged. It supports Windows, Linux, and macOS, and it seems to have good support in TouchDesigner (link below), but it will not work with Isadora.
If you want skeleton tracking, the only Isadora option is OpenNI, which is a long-dead and no-longer-updated library. Although some legacy code and resuscitation may keep it going for a while, I would look into other options.
The Kinect Azure did come with native skeleton tracking for Windows and Linux, again available in TouchDesigner but not Isadora (Isadora's large Mac user base makes this a difficult implementation).
For skeleton, face, and body tracking, there is a thread about using Python and MediaPipe: https://community.troikatronix...
The solutions discussed there do not need a depth camera, and they can be quite good. I have not used MediaPipe in Isadora, but I have used it in openFrameworks via this addon: https://github.com/design-io/o... Not that I expect you to go with this code; it is just an example of what it can do.
The only real missing piece is precise distance data (MediaPipe makes a good guess, but it is essentially 2D). Although a depth camera is not needed to use MediaPipe, in TouchDesigner, which gives you access to a depth stream from the Femto (or maybe in Isadora using Python), there are helper functions that let you take a pixel coordinate (a joint coordinate) from the RGB camera and look up its real-world position in the depth camera's data.
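That pixel-to-real-world lookup boils down to standard pinhole deprojection: given a depth value at pixel (u, v) and the camera intrinsics (focal lengths fx, fy and principal point cx, cy), the camera-space position follows directly. A minimal sketch, assuming the depth frame is already aligned to the RGB frame and ignoring lens distortion; the intrinsic values below are invented, as real cameras expose calibrated ones through their SDK:

```python
def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth in metres to camera-space
    (x, y, z), using a simple pinhole model with no lens distortion."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Invented example intrinsics for a 640x480 depth stream (real values
# come from the camera SDK's calibration data):
fx = fy = 570.0
cx, cy = 320.0, 240.0

# A joint landed by MediaPipe at pixel (480, 240), with 1.14 m of depth
# read from the aligned depth image at that same pixel:
print(deproject(480, 240, 1.14, fx, fy, cx, cy))
```

So with a depth stream plus any 2D skeleton tracker you can recover 3D joint positions yourself, which is essentially what the TouchDesigner helper functions do for you.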
It feels like we are at a bit of a crossover point with this tech: depth cameras are not so necessary for skeleton tracking anymore and will become less so, but they do still have advantages. Likely because RGB skeleton tracking is getting so advanced, no one is really pushing the depth stuff.
Using the Orbbec SDK, it would be trivial (for a programmer) to make a viewer that streamed the RGB, depth, and point-cloud data over Syphon/Spout. I don't have the hardware, but if someone lends me something I would give it a go.
The other issue to think about is connectivity: good recent depth cameras use USB 3.0, which is expensive to extend over long distances.
There are also the ZED cameras, with skeleton tracking in the SDK (https://www.stereolabs.com/en-...), also supported by TouchDesigner (https://derivative.ca/UserGuid...), but again with no macOS support.
For situations where I would previously have reached for a depth camera automatically, right now I would take a closer look at what kinds of tools can actually solve the problem. Google's MediaPipe (https://github.com/google-ai-e...) is kind of winning at the moment and can be used within Isadora via Python.
If you have a specific use case that you need to solve maybe I can give more pointed advice.
-
@woland Thanks!
-
@2250watt Thank you!
-
@gapworks Thank you!
-
@bonemap - Ah, yes, this explains the confusion. Thank you! Very helpful.
-
@fred Had been circling mediapipe as a solution so good to see you mention it here. Thank you!