Ghost hunting, pornography and interactive art: the weird afterlife of Xbox Kinect
-
Interesting article in today's Guardian (a UK, vaguely left-wing, serious newspaper; free to read) about the lasting effect of the Kinect. It also contains links to things I knew nothing about, like the Stereolabs cameras. Worth a couple of minutes of your time:
https://www.theguardian.com/ga...
-
Is someone going to make a plug-in for buttplug.io?
-
@gavspav said:
Is someone going to make a plug-in for buttplug.io?
Python can probably do that! haha
-
@mark_m I have a Stereolabs camera (a ZED 2) and they're great, since they're the only ones with a 30+ foot range for skeletal tracking; all the rest are limited to small rooms. I also have over a dozen Kinects that I use in my business, and they're super annoying because ~30% of them stop working whenever Windows does a forced update and the drivers need to be reinstalled, which isn't always a smooth process. But I love the ZED 2s; they're great if you have lots of light available, since they filter out IR.
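For anyone curious what that skeletal tracking looks like in code, here's a rough sketch with the Stereolabs Python API (pyzed), written from memory against SDK 4.x; class names changed between SDK versions, so check the official samples before trusting any of this:

```python
# Rough sketch of ZED skeletal (body) tracking with pyzed, assuming SDK 4.x;
# earlier SDKs used ObjectDetectionParameters instead, so check the samples.
import pyzed.sl as sl

zed = sl.Camera()
init = sl.InitParameters()
if zed.open(init) != sl.ERROR_CODE.SUCCESS:
    raise SystemExit("could not open the ZED camera")

# Body tracking needs positional tracking enabled first
zed.enable_positional_tracking(sl.PositionalTrackingParameters())

body_params = sl.BodyTrackingParameters()
body_params.enable_tracking = True  # keep person IDs stable between frames
zed.enable_body_tracking(body_params)

bodies = sl.Bodies()
while zed.grab() == sl.ERROR_CODE.SUCCESS:
    zed.retrieve_bodies(bodies)
    for body in bodies.body_list:
        # body.keypoint holds 3D joint positions in camera space
        print(body.id, body.keypoint[0])
```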
-
Thanks for the article Mark. Well, I loved the Kinect at first (infrared grid) sight. I bought several over the years, and I still believe in it. I did some installations with it that toured quite a bit. So what is next in skeleton tracking? I've been following these developments for years, and here is a summary of what I know.
1) With AI, it is easy to predict that all monocular RGB cameras will become Kinect-like skeleton tracking devices.
We've all seen Mark's FB posts experimenting with that in Python code.
Google MediaPipe, OpenCV, YOLO, MoveNet and other frameworks already provide code for pose estimation with monocular RGB vision (see the sketch after this list).
2) Conversely, since some of these models handle hand and finger tracking, I think Leap Motion will probably die; AI and computing power are probably going to replace it. We've already seen hands tracked for menu and game interactions in the Apple Vision Pro and the Meta Quest 3 MR headset (yes, as of this week even Meta has banned the word VR in favour of MR) (MR = mixed reality)
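To make point 1 concrete, here's a minimal sketch of Kinect-style skeleton tracking from a plain webcam, using the legacy MediaPipe Solutions API (the newer Tasks API works too, the call names just differ):

```python
# Minimal webcam pose estimation with the MediaPipe Solutions API.
# pip install mediapipe opencv-python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
mp_drawing = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # default webcam
with mp_pose.Pose(min_detection_confidence=0.5,
                  min_tracking_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe wants RGB; OpenCV captures BGR
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # 33 body landmarks, Kinect-style, from a plain RGB frame
            mp_drawing.draw_landmarks(frame, results.pose_landmarks,
                                      mp_pose.POSE_CONNECTIONS)
        cv2.imshow("pose", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```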
More to come !!!
Kinect is dead? Long live its (AI) successors !!!
-
Hi @armando,
I have also been following these developments for years and have MediaPipe running in Isadora through the Pythoner plugin. I have hand tracking, face tracking and pose tracking variations as separate Pythoner patches. There has been a fair bit of upkeep to these patches, upgrading them with new versions of MediaPipe, Pythoner and Isadora, which has meant reinvesting in the integration over time.
The BIG QUESTION for me with these new AI and ML approaches, which use flat RGB video, is that they do not allow me to track performers in a theatrical setting, meaning lighting conditions that are not optimal for capturing the body as an RGB image, for example in darkness. It remains critical that body tracking for performance is agnostic to reflected light, i.e. works in darkness or with a variety of lighting and projection sources. AI and ML tracking has not proved itself in theatrical performance because it requires the tracking subject to be clearly represented in a video stream.
Structured-light devices like the Kinect and its OpenNI variations are still important precisely because they operate without a visible light source illuminating the tracking subject.
But please, if there is an AI or ML solution that works in darkness without visible light, I would love to know about it!
Best wishes
Russell
-
@bonemap I think the best solution right now is https://www.move.ai/ BUT disguise has got its hands on it. So the price is high... unfortunately.
-
@bonemap A lot, if not most, of the ML models are fine with black-and-white images, i.e. IR video (as used by the ToF and structured-light cameras you mention). However, you will need to provide the light source and a camera stream that can see the IR-lit subjects: no pixels means no information. The Kinect etc. have their own light source, which of course you can use, feeding its IR video stream straight into these models. That may be a better approach than feeding the images into OpenNI, which is essentially a dead, outdated hack.
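As a quick illustration (a sketch, assuming you already have the IR frames in Python; the acquisition path, e.g. libfreenect2 or a capture card, is the fiddly part), a single-channel IR frame just needs to be replicated to three channels before a model like MediaPipe's pose estimator will take it:

```python
# Sketch: feeding a single-channel Kinect IR frame into an RGB pose model.
# Assumes the IR frame already arrives as a NumPy array; how you capture
# it (libfreenect2, a capture card, etc.) is left out here.
import cv2
import mediapipe as mp
import numpy as np

pose = mp.solutions.pose.Pose(min_detection_confidence=0.5)

def track_ir_frame(ir_frame: np.ndarray):
    # Kinect IR is often 16-bit; squash it down to 8-bit
    ir8 = cv2.normalize(ir_frame, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    # The model expects three channels, so replicate the IR channel
    rgb = cv2.cvtColor(ir8, cv2.COLOR_GRAY2RGB)
    return pose.process(rgb).pose_landmarks  # None if nobody is found
```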
Not sure what ML model you are using, but there are many, and this is a good one: https://github.com/MVIG-SJTU/A...
Fred
-
@bonemap Well, we don't need a special solution for low-light conditions; I did that for years. Take any camera that sees near-IR light, flood the space with infrared that doesn't change, and put a visible-light filter in front of the camera, et voilà. It worked without problems 90% of the time.
-
If you use tungsten PARs with red, green and blue filters in front, they will emit a lot of IR light even at low intensity. This shouldn't change, but you can change all the other lights IF you filter them out of your IR camera.