KinectV2 OSX
-
What version of OSX? I have not checked this for a while, as I only had the Kinect 2 briefly. I'll have a chance to check it out soon.
-
Dear All,
I've been working here with @chimerik and I think I have a tip for @Fred's app. It came from various reports in the libfreenect2 community, the library used to enable connectivity with the Kinect v2 on Mac OS:
- Ensure power is applied to the Kinect v2 via its power supply
- Plug in USB 3 connection to computer
- Attempt to run Fred's app... likely it won't work. Quit the app.
- Unplug power to Kinect v2 but do not unplug the USB 3 cable
- Restore power to Kinect.
- Run Fred's app. Voila! It works. (And seemed to work every time thereafter... though we never rebooted the computer nor did we reconnect the Kinect v2.)
Worked for us. ;-)

Best Wishes,
Mark

P.S. We also ran this super simple app on a PC to get skeleton data from the Kinect v2. We then sent the OSC data to the Mac with @chimerik's patch. Very reliable and as simple as can be. https://github.com/microcosm/KinectV2-OSC

P.P.S. Because the Windows app above changes the ID for the body every time a new body appears, I added a new feature to the Stream Setup. You can now code an address like this: /first/*/third. The * means "match anything", so that even if the second part of the address changes, the message will still be recognized. This will be available in the next beta. -
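For anyone curious how that kind of wildcard matching works in principle, here is a minimal C++ sketch. The function names and the segment-by-segment comparison are purely illustrative assumptions; this is not Isadora's internal implementation.

```cpp
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// Split an OSC address such as "/bodies/7/hand_left" into its path segments.
static std::vector<std::string> splitAddress(const std::string& address) {
    std::vector<std::string> parts;
    std::stringstream ss(address);
    std::string part;
    while (std::getline(ss, part, '/')) {
        if (!part.empty()) parts.push_back(part);
    }
    return parts;
}

// Return true if the incoming address matches the pattern, where a "*"
// segment in the pattern matches any single segment in the address.
bool matchesPattern(const std::string& pattern, const std::string& address) {
    const std::vector<std::string> p = splitAddress(pattern);
    const std::vector<std::string> a = splitAddress(address);
    if (p.size() != a.size()) return false;
    for (size_t i = 0; i < p.size(); ++i) {
        if (p[i] != "*" && p[i] != a[i]) return false;
    }
    return true;
}

int main() {
    // The body ID in the second segment changes, but the pattern still matches.
    std::cout << matchesPattern("/bodies/*/hand_left", "/bodies/7/hand_left") << "\n"; // 1
    std::cout << matchesPattern("/bodies/*/hand_left", "/bodies/7/head") << "\n";      // 0
}
```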
Hey, one other thing, from an OF newbie. I tried to compile your app, but am getting millions of errors due to freeimage.a. E.g.,

Undefined symbols for architecture i386:
"_uncompress", referenced from:
Imf_2_2::DwaCompressor::uncompress(char const*, int, Imath_2_2::Box >, char const*&) in freeimage.a(ImfDwaCompressor.o-i386)
Imf_2_2::Pxr24Compressor::uncompress(char const*, int, Imath_2_2::Box >, char const*&) in freeimage.a(ImfPxr24Compressor.o-i386)
Imf_2_2::Zip::uncompress(char const*, int, char*) in freeimage.a(ImfZip.o-i386)
(maybe you meant: _ssl3_do_uncompress)
"_opj_stream_set_user_data", referenced from:
opj_freeimage_stream_create(FreeImageIO*, void*, int) in freeimage.a(J2KHelper.o-i386)
"_opj_stream_set_user_data_length", referenced from:
opj_freeimage_stream_create(FreeImageIO*, void*, int) in freeimage.a(J2KHelper.o-i386)
"_opj_stream_set_skip_function", referenced from:
opj_freeimage_stream_create(FreeImageIO*, void*, int) in freeimage.a(J2KHelper.o-i386)

Any idea why that's happening? I installed FreeImage using Homebrew, but to no avail. If you know of a simple solution, that would be great.

Best Wishes,
Mark -
OK, I updated the GitHub repo to the latest OF and up-to-date versions of all the various libraries and addons; there are some changes in the dependencies for the addons that are used.
It should fix some troubles getting the Kinect recognised, and a few things are cleaned up.

Mark - I am not sure exactly what you are doing with OF. If you download the latest release, then collect the addons needed (check the git for this app and the links for the various addons) and clone them to openframeworks->addons, it should work. Clone this repo into openframeworks->apps->myApps. I did it with the latest branches this morning and had no trouble. -
The * fix is great. That new body address was really driving me nuts. -
Just did some more fixes (finally got hold of a Kinect v2). GitHub is not playing nice with the crappy internet here; the repo is up to date, but not the release. I'll upload it later tonight with better internet.
-
Can't wait to try this with my Kinect 2. Great thread here
-
Just a note that the release is updated. I have a bit of time and access to a Kinect, so it would be great if someone had the time to try it out.
-
Just got hold of a Kinect v2 and followed Mark's steps listed above. Was up and running in less than two minutes!
Thank you so much Fred!
-Alex -
Just a question,
Is the white line that I am seeing in the RGB feed a standard artifact from the Kinect 2? -
Not for me so far, but I have to say my testing is super limited. I don't have a Kinect v2 of my own; they have come and gone for a few projects. I tested this with two laptops and one Kinect and I did not see this... The driver for OSX is an experimental 'hack', so it is also not perfect.
-
Hey Guys,
Just a question for the long term. I'm really looking forward to working with my new Kinect V2. I got it working on a Windows machine, but it is still a hassle to get it to work with a Mac. All these links here look really promising.

I do have a more specific question, though. Does anyone know if there is an app or code to get the heart rate data over OSC? And maybe some other data than the Syphon stream? I saw the Kinect giving me data like state of being, wearing glasses or not, married or not, and so forth.

Here in the Netherlands every gamer is trying to get rid of their Kinect V2, and they go for 50 euros nowadays :p But since I'm not a good coder I depend on GitHub and smart people to get it working in my favour...
Cheers!
Pépé
-
So far there is no middleware for OSX capable of decoding the skeleton data or anything other than the video streams provided by the app in this thread. It may happen in the future, but I would not rely on it. On Windows, using the SDK, you get full access to the higher functions of the camera, and there are some solutions for sending skeleton data over OSC. The app in this thread will only send the image streams, which is marginally useful, but not life changing.
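As a rough illustration of what "skeleton over OSC" usually amounts to on the Windows side, here is a minimal openFrameworks sketch using ofxOsc. The TrackedJoint struct, the address scheme and the idea that you already have joint data from your Kinect addon (e.g. ofxKinectForWindows2) are assumptions for the example, not part of any particular addon's API.

```cpp
#include "ofMain.h"
#include "ofxOsc.h"

// Placeholder joint record: in a real app these values would come from
// whichever Kinect addon you use on Windows (e.g. ofxKinectForWindows2).
struct TrackedJoint {
    int bodyId;
    std::string name;   // e.g. "hand_left"
    ofVec3f position;   // camera-space position in metres
};

class SkeletonOscSender {
public:
    void setup(const std::string& host, int port) {
        sender.setup(host, port);
    }

    // Send one OSC message per joint, e.g. /bodies/3/hand_left x y z
    void send(const std::vector<TrackedJoint>& joints) {
        for (const auto& j : joints) {
            ofxOscMessage m;
            m.setAddress("/bodies/" + ofToString(j.bodyId) + "/" + j.name);
            m.addFloatArg(j.position.x);
            m.addFloatArg(j.position.y);
            m.addFloatArg(j.position.z);
            sender.sendMessage(m, false);
        }
    }

private:
    ofxOscSender sender;
};
```

On the receiving Mac, the wildcard address described earlier (e.g. /bodies/*/hand_left) keeps the stream usable even when the body ID in the second segment changes.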
-
@Fred
Does that include Ni Mate? https://forum.ni-mate.com/t/os-x-test-build-for-kinect-for-xbox-one-and-kinect-for-xbox-360/586

cheers
bonemap -
@bonemap there has been some activity on the libfreenect forums on this front (incorporating OpenNI code to get skeleton data with an updated libfreenect). In the end there may be some kind of fruitful outcome, but I doubt it will catch up to the speed, efficiency and large feature set on Windows. That post pretty much says it is unstable and unreliable, so yes, there is something, but it does not sound ready for shows.
Who knows, it may end up working OK one day, but this has worked perfectly on Windows for quite a long time; MS even took the effort to work with the creative communities of Cinder and openFrameworks to create a set of tools for using it on Windows.

@bonemap did you try this? -
Thanks @Fred,
No, I haven't tried it. It does not seem to be worth the effort at this point. What are your thoughts on Apple in this area? For example, there was the Apple purchase of PrimeSense some years ago now, but nothing has emerged except rumours about depth sensors built into future iterations of the iPad, etc. I don't know if it is worth waiting for Apple to offer development in this arena, or perhaps it will be the next big launch or new technology for Apple?

cheers,
bonemap -
I am trying to align the depth image and the color image in OF (from the Kinect 2). I can't find information on how this alignment should work. I know the color is 1920x1080 and the depth 512x424, but scaling the depth up to the height of the color doesn't align the images. Any knowledge / pointers? I can get it workably close within Isadora... but not perfect. -
@DusX OK, let's go back to some fundamentals: no amount of scaling and quad warping will actually line up two cameras, or a camera and a projector. The lenses, sensors and imaging systems produce differently warped images (no image is unwarped) and have different extrinsics, intrinsics and FOVs. The offset is not linear and needs a more complicated algorithm to transform between one and the other.
Isadora is missing the fundamental tools to do this. It can be done through some calibration (like you can see with camera calibration in OpenCV).

Microsoft have of course prepared this transformation in their SDK through the coordinate mapper, which is accessible in OF in the Windows-only addon ofxKinectForWindows2. If you are on PC in OF you can see a bit of how this works with these functions (this is not the place to go into depth into code, so here are just the method names):

virtual HRESULT STDMETHODCALLTYPE MapCameraPointToDepthSpace(
virtual HRESULT STDMETHODCALLTYPE MapCameraPointToColorSpace(

You could prepare a mesh, or reconstruct the coordinate mapping and use a shader in the GLSL tools now available in Isadora to achieve this, but first you need to reproduce the coordinate map.

I have wanted this kind of intelligent image manipulation in Isadora for a long time; camera calibration and camera-to-world/projector calibration would be a great tool, and it is something that underlies many questions that come up on the forum - projection on tracked objects... -
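To make the coordinate-mapper idea a bit more concrete, here is a minimal Windows-only sketch against the raw Kinect v2 SDK. It assumes you have already obtained an ICoordinateMapper* from the sensor (via IKinectSensor::get_CoordinateMapper) and have a raw 16-bit depth frame; error handling is omitted, and the sampling helper at the bottom is only a hypothetical illustration of how you might use the result.

```cpp
#include <Kinect.h>   // Kinect for Windows v2 SDK
#include <vector>

static const int DEPTH_W = 512,  DEPTH_H = 424;
static const int COLOR_W = 1920, COLOR_H = 1080;

// For every depth pixel, ask the SDK which (x, y) position in the 1920x1080
// color image it corresponds to. This is the per-pixel, non-linear mapping
// that simple scaling cannot reproduce.
std::vector<ColorSpacePoint> mapDepthToColor(ICoordinateMapper* mapper,
                                             const UINT16* depthFrame) {
    std::vector<ColorSpacePoint> colorPoints(DEPTH_W * DEPTH_H);
    mapper->MapDepthFrameToColorSpace(DEPTH_W * DEPTH_H, depthFrame,
                                      (UINT)colorPoints.size(),
                                      colorPoints.data());
    return colorPoints;
}

// Hypothetical usage: look up the registered color pixel for one depth pixel.
// bool lookupColorPixel(const ColorSpacePoint& p, int& cx, int& cy) {
//     cx = (int)(p.X + 0.5f);
//     cy = (int)(p.Y + 0.5f);
//     return cx >= 0 && cx < COLOR_W && cy >= 0 && cy < COLOR_H;
// }
```

ofxKinectForWindows2 wraps this same machinery, which is why its BodyIndexColor example already shows a registered depth/color result.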
Funny, as always... shortly after writing the previous post I found the coordinate mapper function in the ofxKinectForWindows2 addon (that is what I am working with). I see that nearly what I want to do is already done in the BodyIndexColor example, so I think I can simply port that code with some minor changes (previously I was building from the Base example).

Thanks for the info. I had hoped that the images were corrected for alignment up front (in the Kinect hardware, before exposing the images); I simply wasn't sure how the Kinect's structure/logic is set up. -
Hi all,
@Fred thanks so much for "KinectV2_Syphon"!
We are testing the app on an El Capitan MacBook Pro. Works well. We can't make it work on a Yosemite Mac mini though. We tried debugging by unplugging the power cord just like Mark recommends, and still nothing. Only a black image.
We are wondering if the app does not work on Yosemite?

Thanks!