Kinect + Isadora Tutorials Now Available
-
-
@ DusX
All is working fine. Thanks for the help, and to those who wrote the tutorial. This is all very exciting -
Thanks all for the wonderful comments and helpful feedback. Get out there and make some cool art :D
-
I am guessing from the screenshots and from my own patch that all the red text (errors?) that shows up in the Processing sketch is normal?
-
"I am guessing from the screenshots and from my own patch that all the red text (errors?) that show up in Processing patch are normal?"
Copy & paste a sample - all library initialisation output is generally red by default... so _shouldn't_ be anything to worry about, but I can take a look.

"Like @mark mentioned in regards to tracking multiple skeletons, switching the video feed would have to be added to the Processing code. I imagine that OSC control of the video type could be added. Perhaps a Processing Pro [user] in the forum might take this one and share the final script."

Can do... but probs not til the weekend. -
It would be a greatly appreciated contribution. -
Oh yessssssss
-
Have hoisted the code over to my GitHub repo, so just keep an eye on that... feel free to report issues via that too as we go along, or fork / contribute etc. Not done anything to it at the moment, just preparing the ground so to speak.
@Mark / @mc_monte / @dusx / @skulpture / @michel et al. - if you want some specific text putting in the README.md acknowledging the original sources etc., just fire it over to me via message or similar and I'll swap it in. At the minute I've left the license type as "none"... I'd suggest 'CC0' seeing as it's all open source anyway. Thoughts?

See https://github.com/PatchworkBoy/isadora-kinect-tutorial-files -
I've also done an (unofficial) video walkthrough:
https://www.youtube.com/watch?v=0HY5U6QSyhM -
@skulpture - Added link to README.md
-
Cheers @Marci
-
OK - 2 things y'all need to be aware of when using these tutorials...
1: natural limitations of SimpleOpenNI

The SimpleOpenNI library used in Processing is one of a few frameworks for interfacing with a Kinect. It wraps up the old workflow, where one had to manually install OpenNIv1 and NiTE etc. to get skeleton / limb and user tracking, thus simplifying the process. The Kinect itself doesn't do any skeleton or limb identification or tracking.

If on a USB3 host (a Macbook Retina for instance), running anything other than the Depth camera at the same time as skeleton tracking may randomly throw an iso_callback() error and/or trigger 'Isochronous transfer error' log messages... this is inherent to SimpleOpenNI, can't be avoided, and will render everything unstable. It could go at any time, whenever it feels like it. It will either plain bomb the Processing sketch (in the case of the iso_callback() error), or cause _everything_ to lock up (in the case of the Isochronous transfer error) until eventually the sketch bombs (which occasionally you have to force by simply pulling the USB cable). The only way to get round this is to use a USB2 host (older Macbook Pro, Mac Mini), or just chance your luck. (I've documented this over on the GitHub repo, yonder: https://github.com/PatchworkBoy/isadora-kinect-tutorial-files/issues/1)

I've noted previously on here that skeletons should only be implemented within a sketch when absolutely necessary / desired... as they are the only feature unique to the SimpleOpenNI libraries, and they introduce a lot of CPU weight. If not using skeletons, turn to the lighter-weight OpenKinect libraries etc. for simple camera feeds & depth point tracking / blob tracking. From the Isadora end of things, dealing with skeletons is the easiest way of handling Kinect data in an obvious way... but from the Kinect middleware point of view, it produces the most unstable results.
This is purely a result of SimpleOpenNI being out of development - hence no Kinect v2 support, and no fix for this particular issue: USB3 hadn't been released when development stopped (aka, when the PrimeSense technology & software rights were snapped up by Apple, then passed to Occipital, and are now part of the Structure.IO SDK). OpenNIv1 and v2 are now in a complete code freeze with no further development. Ultimately, that means this all has rather a limited lifespan - sorry!

And: Mirror mode only affects the RGB / IR output. It has no impact on depth or user output.

2: natural limitations of the Kinect v1 hardware

You can either:
- Start in RGB mode, and switch between that and User & Depth mode.
OR
- Start in IR mode, and switch between that and User & Depth mode.

Once the camera has been initialised in either RGB or IR mode, it can't be switched to the other mode, basically. To change from RGB to IR mode, or from IR to RGB mode, the sketch must be stopped and restarted.

You can't:
- Start in RGB mode and switch between that and IR mode
- Start in IR mode and switch between that and RGB mode

Bearing all of that in mind... the GitHub repo is now updated with an updated Processing sketch and Izzy file (for Mac) with stream switching enabled via OSC, and a few other bits. Please read the Warnings on the GitHub page, the OSC MultiTransmit Actor notes in the Izzy file, & the comments in the Processing sketch code.

https://github.com/PatchworkBoy/isadora-kinect-tutorial-files -
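Editor's note: the switching rules above can be captured in a small guard, sketched here in plain Java. The class and mode names are my own for illustration, not identifiers from the tutorial files.

```java
// Guard encoding the Kinect v1 stream-switching rules described above.
// Illustrative only - StreamGuard and Mode are not names from the repo.
enum Mode { RGB, IR, DEPTH_USER }

class StreamGuard {
    final Mode startMode;  // fixed when the sketch initialises the camera
    Mode current;

    StreamGuard(Mode startMode) {
        if (startMode == Mode.DEPTH_USER) {
            throw new IllegalArgumentException("initialise in RGB or IR mode");
        }
        this.startMode = startMode;
        this.current = startMode;
    }

    // Allowed: start mode <-> User & Depth.
    // RGB <-> IR is never allowed without a sketch restart.
    boolean switchTo(Mode target) {
        if (target == startMode || target == Mode.DEPTH_USER) {
            current = target;
            return true;
        }
        return false; // stop & restart the sketch to change between RGB and IR
    }
}
```

So a sketch started in RGB can bounce between RGB and User & Depth all day, but any request to reach IR simply fails until the sketch is restarted in IR mode.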
this is great, thanks
-
Thanks for the detailed reply @Marci. We are semi-aware that this method has a lifespan, but we get a lot of questions on the forum, inbox messages and emails asking how to get the Kinect sensor working with Isadora - I must get two emails a week to my personal address alone. So we thought it was best to come up with an 'official' method. I must admit I was not aware of the iso_callback() error. I have had the occasional lock-up - nothing too bad - and often wondered why; this may be it. Like you've said, the lifespan of this method is limited given the PrimeSense takeover by Apple, etc., which is a shame of course. Similarly, we get a lot of questions about the Kinect v2 and why it won't work. This is a tough nut to crack really... but thanks for your help and notes on GitHub. Cheers.
-
It's the same discussion as we had previously. The first question should be: skeletons and limbs, or blobs? What tool do I specifically need?
If blobs: the OpenKinect framework, which is still maintained and supports the Kinect ONE & USB3 fully, iirc. If skeletons: the OpenNI framework, as there's no other choice. Skeletons are quicker and easier to get an idea going with, but the same can usually be achieved with blobs too.

If you must use skeletons and it's going to be mission-critical to a show or installation, use an older Mac Mini / MBPro / laptop that only has USB2 support, dedicated to handling the Kinect side of things, and fire the OSC over the network to your 'main' system. Avoids all the risk. -
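Editor's note: to make the "fire the OSC over the network" setup concrete, here is a minimal OSC 1.0 message encoder in plain Java - the kind of packet the dedicated Kinect machine would send in a UDP datagram to the main system. This is a sketch of the wire format only; the address `/kinect/joint` used in the example is illustrative, not the tutorial's actual address space.

```java
import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

// Minimal OSC 1.0 message encoder: address string, typetag string,
// then big-endian 32-bit float arguments.
class OscEncoder {
    // OSC strings are NUL-terminated and padded to a multiple of 4 bytes.
    static byte[] pad(byte[] b) {
        return Arrays.copyOf(b, (b.length + 4) & ~3);
    }

    static byte[] message(String address, float... args) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] a = pad(address.getBytes(StandardCharsets.US_ASCII));
        out.write(a, 0, a.length);
        StringBuilder tags = new StringBuilder(",");     // typetags: ",fff..."
        for (int i = 0; i < args.length; i++) tags.append('f');
        byte[] t = pad(tags.toString().getBytes(StandardCharsets.US_ASCII));
        out.write(t, 0, t.length);
        for (float f : args) {
            byte[] v = ByteBuffer.allocate(4).putFloat(f).array(); // big-endian
            out.write(v, 0, v.length);
        }
        return out.toByteArray();
    }
}
```

The resulting byte array goes into a single `DatagramPacket` aimed at the main machine's OSC input port (check the port configured in your Izzy OSC settings).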
@Marci a ghost image would be simply great!! I mainly need the shape of the performer(s), and I'm not tracking a skeleton or OSC data - only a simple shape, trying to get close to infrared-cam quality.
-
Ghost support added... https://github.com/PatchworkBoy/isadora-kinect-tutorial-files/blob/master/Isadora_Kinect_Tracking_Mac/Isadora_Kinect_Tracking_Mac.pde
Run the sketch, let it get your skeleton, hit the 5 key on the keyboard to switch to ghost view, and the s key to disable the skeleton. -
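Editor's note: the keyboard controls described above boil down to a small dispatch in the sketch's key handler. A schematic plain-Java version, with field names of my own choosing (and an assumed default view - see the actual .pde on GitHub for the real code):

```java
// Schematic of the sketch's keyboard controls described above:
// '5' switches the display to ghost view, 's' toggles skeleton drawing.
// Field names and the "depth" default are illustrative assumptions.
class ViewState {
    String view = "depth";
    boolean drawSkeleton = true;

    // In Processing this logic would live in keyPressed(), reading `key`.
    void keyPressed(char key) {
        if (key == '5') {
            view = "ghost";
        } else if (key == 's') {
            drawSkeleton = !drawSkeleton;
        }
    }
}
```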
To change the ghost color, search the source code for:

// set Ghost color here

...and change the color(255, 255, 255) values: (R, G, B), where each value can be between 0 and 255. Add some blur via Isadora if needed. -
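Editor's note: for anyone wondering what `color(255, 255, 255)` actually produces - Processing packs the three channels into a single int in ARGB order, with alpha forced to full opacity. A plain-Java equivalent (the clamping is my addition for safety; Processing handles out-of-range values via colorMode/constrain):

```java
// Plain-Java equivalent of Processing's color(r, g, b):
// channels packed into one int as 0xAARRGGBB, alpha forced to 255.
class GhostColor {
    static int color(int r, int g, int b) {
        r = Math.max(0, Math.min(255, r));  // clamp each channel to 0..255
        g = Math.max(0, Math.min(255, g));
        b = Math.max(0, Math.min(255, b));
        return 0xFF000000 | (r << 16) | (g << 8) | b;
    }
}
```

So `color(255, 255, 255)` is opaque white, `color(255, 0, 0)` is opaque red, and so on.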
@Marci What do you think would be involved in porting this over to PC?
-
Errrr.... Good question. On PC iirc we're reliant on the proper MS Kinect SDK aren't we? Thought that had nuked out the opportunity to use anything but the proper XBox for Windows models which would preclude me from testing...? Will need to read up on it. I'll look into it...!