Motion Tracking Camera
I'm hoping for a bit of advice. I haven't played with motion tracking in Isadora since the original Kinect was the go-to camera. I am embarking on a fairly intensive interactive dance piece and was hoping to get recommendations on a new camera to purchase, as I no longer have my Kinect.
I saw Mark's post about the natively supported cameras in Isadora 3, but I don't have any hands-on experience with them.
There are a number of options for motion tracking in Isadora, so it will depend on what kind of tracking you are going to attempt and what hardware and operating system is being used.
I have found the simplest solution, for my time and money, to be pairing Delicode NI mate with a Microsoft Kinect and streaming the skeleton tracking points it produces into Isadora. This solution is getting harder to accomplish now that the Kinect has been discontinued and operating system updates keep moving on.
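For anyone curious what those streamed points look like on the wire: NI mate speaks plain OSC over UDP, so anything that can parse OSC can read them. Here's a minimal sketch using only Python's standard library that encodes and decodes a single OSC message; the `/Head` address and the x/y/z float layout are assumptions for illustration, not NI mate's exact output (its addresses are configurable).

```python
import struct

def parse_osc_message(data: bytes):
    """Parse a simple OSC message: address, type tags, float32 args."""
    def read_padded_string(buf, offset):
        end = buf.index(b"\x00", offset)
        s = buf[offset:end].decode("ascii")
        # OSC strings are null-terminated and padded to 4-byte boundaries
        offset = end + 1
        offset += (-offset) % 4
        return s, offset

    address, offset = read_padded_string(data, 0)
    type_tags, offset = read_padded_string(data, offset)
    args = []
    for tag in type_tags.lstrip(","):
        if tag == "f":
            (value,) = struct.unpack_from(">f", data, offset)
            args.append(value)
            offset += 4
    return address, args

def osc_message(address: str, floats):
    """Build an OSC message with float32 arguments."""
    def pad(b):
        # pad to a 4-byte boundary, always keeping a null terminator
        return b + b"\x00" * (4 - len(b) % 4)
    out = pad(address.encode("ascii"))
    out += pad(("," + "f" * len(floats)).encode("ascii"))
    for v in floats:
        out += struct.pack(">f", v)
    return out

# Round-trip a fake joint message resembling a skeleton head position.
packet = osc_message("/Head", [0.1, 1.5, 2.0])
addr, xyz = parse_osc_message(packet)
```

In practice you'd receive these packets on a UDP socket and hand the joint positions to whatever is driving your visuals.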
I am also anticipating a solution from Isadora v3, but the market for Kinect alternatives appears fragmented. Mark has suggested the Orbbec Astra as a potential candidate for skeleton tracking in Isadora v3. However, the Orbbec camera is not supported by the Delicode software (NI mate and Z Vector). There is also uncertainty over the legal integration of some of the code in the Isadora NiTE plugin, which we are yet to see as a beta release.
I have projects that use skeleton tracking for museum, stage and street front applications and am starting to get the impression that I might need to invest in a solution and commission the development of custom software to achieve my goals.
mark_m last edited by
My opinion: we don't know when Isadora version 3 will be available, and as @bonemap says, there are potentially some licensing issues around motion capture cameras.
So, what works well at the moment? Isadora Version 2.6 and the Kinect 2. The hardware may have been discontinued but there are hundreds of them in secondhand shops and on eBay, as well as the adaptors to use them with PCs and Macs.
Like Bonemap, I've had good success using NI mate and the Kinect 2 to get skeleton data via OSC into Isadora.
However, there are other tracking methods, using video cameras and the eyes / eyes++ actors.
To some extent, the strategy you use depends on what you want to accomplish... if it's detecting movement into and out of areas on the floor to trigger something, then a USB webcam and the Eyes++ actor might work.
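As an illustration of that zone-trigger idea (outside Isadora itself), here's a sketch in Python: Eyes++-style blob positions are checked against rectangular floor zones, and enter/exit events are reported when a blob crosses between them. The zone names and coordinates are made up for the example.

```python
# Hypothetical floor zones in normalized stage coordinates (0..1),
# as (x_min, y_min, x_max, y_max).
ZONES = {
    "stage_left":  (0.0, 0.0, 0.33, 1.0),
    "centre":      (0.33, 0.0, 0.66, 1.0),
    "stage_right": (0.66, 0.0, 1.0, 1.0),
}

def zone_of(x, y):
    """Return the name of the zone containing point (x, y), or None."""
    for name, (x0, y0, x1, y1) in ZONES.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def zone_events(previous, current):
    """Compare a blob's position across two frames; report enter/exit events."""
    events = []
    prev_zone = zone_of(*previous)
    curr_zone = zone_of(*current)
    if prev_zone != curr_zone:
        if prev_zone is not None:
            events.append(("exit", prev_zone))
        if curr_zone is not None:
            events.append(("enter", curr_zone))
    return events
```

Feeding it a blob that moves from stage left to centre, `zone_events((0.2, 0.5), (0.5, 0.5))` returns an exit event for one zone and an enter event for the other, which is exactly the kind of transition you'd map to a trigger.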
Mark has a whole tutorial using infra-red tracking, which isn't affected by theatre lights. That's here. It's a good watch, not only for the IR technique, but for Isadora tracking in general.
Thank you both! It looks like my best option is picking up a couple of Kinects on eBay. Is it possible to run more than one Kinect sensor and/or track more than one skeleton with a single Kinect? I am interested in tracking up to four people simultaneously. I understand that I would likely lose the ability to differentiate between who is who, but I can work around that limitation.
dbini last edited by
If you use NI mate as a way to get data from the Kinect into Izzy via OSC, you can use the free version to get data from multiple skeletons (I think it's up to six simultaneously) - but you can't use more than one Kinect at a time unless you have a license for NI mate. (Also, the Spout/Syphon feed is watermarked every 10 minutes on the free version.)
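If you're pulling several skeletons into Izzy via OSC, you'll want to route each joint message to the right performer. A small sketch, assuming a `/<skeleton_id>/<joint>` address scheme - that scheme is an assumption for illustration, since NI mate's OSC addresses are configurable, so check what yours actually sends:

```python
import re

# Assumed address scheme: "/<skeleton_id>/<joint>", e.g. "/1/Head".
JOINT_PATTERN = re.compile(r"^/(\d+)/(\w+)$")

def route_joint(address, xyz, performers):
    """File an incoming joint position under its performer id."""
    match = JOINT_PATTERN.match(address)
    if not match:
        return  # not a skeleton message; ignore it
    performer_id, joint = int(match.group(1)), match.group(2)
    performers.setdefault(performer_id, {})[joint] = xyz

# Simulate two performers' head positions arriving over OSC.
performers = {}
route_joint("/1/Head", (0.0, 1.6, 2.1), performers)
route_joint("/2/Head", (0.4, 1.5, 2.3), performers)
```

Keeping a per-performer dictionary like this makes it easy to notice when a skeleton drops out (its joints stop updating) and to remap ids when tracking re-acquires someone.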
The issues with tracking four people with one Kinect are: the range is pretty limited, and it works best if your four bodies are in a line about 5m from the sensor. If your people are moving around a lot, they tend to block each other. Bear in mind that the sensor, like any single camera, sees a cone of the space, so you're going to be working with a wedge of the stage rather than a rectangle. Here's video of a community dance piece I made with a Kinect on the floor at downstage centre.
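The wedge point is easy to put numbers on: the visible width at a given distance from the sensor is `2 * d * tan(FOV / 2)`. A quick calculation, assuming a 70-degree horizontal field of view (close to the Kinect 2's depth camera, but treat the exact figure as an assumption):

```python
import math

def coverage_width(distance_m, fov_degrees=70.0):
    """Width of the visible wedge at a given distance from the sensor."""
    return 2 * distance_m * math.tan(math.radians(fov_degrees / 2))

# At the ~5m distance mentioned above, the wedge is roughly 7m across.
width_at_5m = coverage_width(5.0)
```

So at 5m you get about a 7m-wide slice of stage, narrowing sharply as performers approach the sensor - which is why a line of bodies at a fixed depth works best.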
I recently got a black and white USB camera (on eBay) that is really sensitive in low light and is ideal for placing overhead and blob-tracking bodies using Eyes++. If you have access to halogen stage lighting, you can use multiple filters to create an IR wash to separate your bodies from the environment.
mark_m last edited by
i recently got a black and white USB camera that is really sensitive in low light and is ideal for placing overhead and blob tracking bodies using Eyes++.
Thanks for that link: looks great. Does it come with an IR filter? It looks like it in the pic, but in my experience what's in the pic and what arrives from China aren't necessarily the same!
dbini last edited by
Hi Mark, the colour ELP cameras generally have IR-cut filters, but that particular camera doesn't. It's actually very sensitive to IR - I need to use my IR LED floods perpendicular to the lens, otherwise there's a bright hotspot (which I don't get with an analogue CCTV camera and an IR-pass filter). I think the colour cameras need to cut IR to keep the colours looking natural, whereas the black and white chips don't need to filter it out.
Hope you have a great Christmas, buddy.
One way is to use the Orbbec SDK to send the skeleton data to openFrameworks, and from there you can send it on to Isadora or whatever you're working with. Another option is the Asus Xtion Pro Live, which is supported by NI mate but is unfortunately twice the price of the Orbbec. A new big thing on the rise is the OpenPose library.
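If you do go the OpenPose route, its output is straightforward to consume: by default it writes one JSON file per frame, with each person's keypoints as a flat list of `[x, y, confidence]` triples. A sketch of reading that - the `people` / `pose_keypoints_2d` field names match OpenPose's default output, but verify them against your OpenPose version:

```python
import json

# A fabricated single-frame sample in OpenPose's default JSON shape.
SAMPLE = json.dumps({
    "people": [
        {"pose_keypoints_2d": [320.0, 120.0, 0.9,   # keypoint 0: x, y, conf
                               318.0, 180.0, 0.8]}  # keypoint 1: x, y, conf
    ]
})

def keypoints(frame_json, min_confidence=0.5):
    """Yield (person_index, keypoint_index, x, y) for confident keypoints."""
    frame = json.loads(frame_json)
    for p_idx, person in enumerate(frame.get("people", [])):
        flat = person["pose_keypoints_2d"]
        for k_idx in range(len(flat) // 3):
            x, y, conf = flat[3 * k_idx: 3 * k_idx + 3]
            if conf >= min_confidence:
                yield (p_idx, k_idx, x, y)

points = list(keypoints(SAMPLE))
```

Filtering on the confidence value matters in practice, because OpenPose reports occluded joints with low confidence rather than omitting them.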
As far as I know these are the only options currently. Please, write a follow up when something pops up!
Hi... I have a PTZ HD 1080 camera that works great, and I have everything set up with my presets, but I was wondering how to get the camera to follow movement. For example, I have PTZ presets going from far left across to far right (much like reading across a screen or a book). I want the camera to track any movement and follow it across the presets as needed.
For instance, if it's pointing at a parking lot, a vehicle enters at far left near PTZ preset 1, and the camera tracks it as it moves across presets 2 and 3 while recording video.
Is this at all possible, or have I missed something basic? I've read the manuals and pored over the settings, and I'm puzzled.