How to make the content of a mask follow the mask
-
@gapworks The Kinect/Processing tutorial sketch (I assume that's what you're using) sends torso position over OSC by default... use that to control your text's X/Y position.
From the default Izzy file within the download, add 2x OSC Listeners, listening to channels 7 (x) & 8 (y). Link the value outputs of these to the x/y position of your Text Draw actor. Use Calc actors in between to apply any required offsets or scaling to the values. The torso z position (distance from sensor) should be on OSC channel 9 if needed.
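For reference, the sending side of one of those sketches tends to look roughly like this. It's only a minimal sketch, assuming the oscP5 and SimpleOpenNI libraries; the /torso address, port 1234 and which Isadora listener channels the values land on are assumptions here, so match them to whatever your actual tutorial sketch and Isadora's stream setup use:

import oscP5.*;
import netP5.*;
import SimpleOpenNI.*;

SimpleOpenNI context;
OscP5 osc;
NetAddress isadora;

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  context.enableDepth();
  context.enableUser();                           // older SimpleOpenNI versions take SimpleOpenNI.SKEL_PROFILE_ALL here
  osc = new OscP5(this, 9000);                    // local listen port (not used for sending)
  isadora = new NetAddress("127.0.0.1", 1234);    // Isadora's default OSC input port
}

void draw() {
  context.update();
  int[] users = context.getUsers();
  for (int i = 0; i < users.length; i++) {
    if (context.isTrackingSkeleton(users[i])) {
      PVector torso = new PVector();
      context.getJointPositionSkeleton(users[i], SimpleOpenNI.SKEL_TORSO, torso);
      OscMessage m = new OscMessage("/torso");
      m.add(torso.x);                             // x -> e.g. OSC Listener channel 7 in the Izzy patch
      m.add(torso.y);                             // y -> channel 8
      m.add(torso.z);                             // z (distance from sensor) -> channel 9
      osc.send(m, isadora);
    }
  }
}

void onNewUser(SimpleOpenNI curContext, int userId) {
  curContext.startTrackingSkeleton(userId);       // auto-calibration; older versions use requestCalibrationSkeleton()
}

Scaling can either happen in the sketch with Processing's map() before the values go into the message, or on the Isadora side with the Calc actors mentioned above.
-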
If you remove the following two lines (365 & 367) from the Processing sketch:
canvas.stroke(userClr[ (userList[i] - 1) % userClr.length ]);
drawSkeleton(userList[i]);
...then you can run with skeletons enabled without actually drawing them on screen, which will initialise the sending of the OSC data. If the centre of gravity still shows up (that should say WHEN, cos it will), then delete lines 375 to 390 as well.
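In context, the relevant chunk of the draw loop then ends up looking something like this (a rough sketch; variable names are as they appear in the example sketch, but exact line positions vary between versions):

// loop over currently tracked users, inside draw()
int[] userList = context.getUsers();
for (int i = 0; i < userList.length; i++) {
  if (context.isTrackingSkeleton(userList[i])) {
    // canvas.stroke(userClr[ (userList[i] - 1) % userClr.length ]);   // line 365 - removed
    // drawSkeleton(userList[i]);                                      // line 367 - removed
    // skeleton tracking (and therefore the OSC output) still runs,
    // it just isn't drawn to the canvas any more
  }
  // the centre-of-mass drawing block (roughly lines 375-390) can be
  // deleted in the same way if it still shows up
}
-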
@dbini - I sent them links to the published fixes for the whole USB3 issue probably over a year ago (possibly over 2 years ago now that I think about it), along with links to fixes for the Kinect motor not working. I was told then that they wouldn't fix the issue. These issues have been around since NI-Mate v1 and are all down to the version of libusb that they're utilising - same issue as with the SimpleOpenNI Processing Kinect modules. I'm presuming they don't have the ability to update the version or implementation of libusb included within the OpenNI v1 libs (which are necessary to provide Kinect v1 skeleton support), which means it probably won't ever get fixed in v2 and they'll end up dropping Kinect v1 support. The specific problems are within Delicode_NI_Mate/Contents/Frameworks/libOpenNI.dylib, libusb-1.0.0.dylib and stable_libusb-1.0.0.dylib.
S'why I ended up abandoning several hundred pounds worth of investment in their software (NI-Mate & Z-Vector) - they weren't prepared to fix known bugs, or even add an announcement to warn future purchasers of the unavoidable compatibility issues - a practice they continued when they released v2 with no announcement re: USB2 Kinect & USB3 ports. S'why I moved over to Processing for Kinect stuff in the first place. *shrug* - Joys of commercial software relying on out-of-development open source software. TL;DR: ANY & ALL software relying on OpenNI 1 to provide Kinect rev1 support will have issues with USB3 ports. That means any & all software which provides skeleton output from a rev1 Kinect. -
@Marci - this kind of stuff is all beyond my capabilities. I read your posts and am constantly impressed by your level of detail. Thanks for your contributions to the Isadora community.
I just want to get a toolkit that works, and I don't mind paying a bit for something that's going to be plug and play and solve my problems. I do object to buying a license for something that's going to be useless in 1 or 2 years. Here's what Delicode said in reply to my questions: "OS X El Capitan is still proving to be a huge problem due to the operating system USB system having been changed. We have a few ideas on how to fix this, but this will take a while. For now using Kinect for XBox 360 on El Capitan is not recommended, and staying in Yosemite is a safer bet." Fortunately I just got my MYO working nicely, so am going to focus on that for a while and hope to find a simple solution for Kinect sometime in the future. -
If you want any form of longevity from Kinect, bin the rev1 (or restrict it strictly to uses with no skeleton/user detection and use Freenect-based solutions rather than OpenNI 1) and get a KinectONE instead... OpenNI 2 is maintained & active and has no issues.
-
@Marci I only have KinectONE. Two of them! And please do keep in mind that you are talking to a designer and photographer who is trying his best to learn some programming languages - Processing at the moment. So I failed after rev1... :)
best -
(That was for @dbini)
-
PS: Processing is just the editor software; the language is actually Java.
-
@gapworks - going back to your original question: is this what you're looking for? (I'm only using a Syphon feed from NI Mate as the source image for this.)
-
@dbini Basically this is what I need. I gave it a quick try, the text moves :) but I think I need to do a bit of scaling. Finally :) I hardly used Eyes because I always found it difficult to control its values.