• I am working on another project with Kinect. Once again I am new to working with Kinect, and I would like to know if I can capture motion and see it in Isadora in 3D. I know 3D can be represented by the X, Y, and Z axes, with Z in a way rendering the depth of the 2D space. But how can I see it in Isadora? So far I have a 2D image with Z represented by different colours and perhaps the thickness of the image: for example, when I am far away from the Kinect it goes green and thicker, and when I am closer to the Kinect it goes red and thinner. The idea is to use the motion tracking system and capture the moving body in space. Has anyone worked with two Kinects and Isadora before? If yes, what was the setup, and did you use any supporting software/interfaces? I understand that one Kinect can work with one machine, but I wonder if there is a way around it using only one Kinect and Isadora. I do not want to use 3D modelling software. Any thoughts, anyone? Thank you.

  • @LucieDance

    There are several questions bundled into your question:

    – Mac or PC?

    – Kinect V1 or V2?

    – Use of skeleton or depth map?

    Personally I use Kinect V2 on Mac. For the skeleton I use Delicode NI mate, sending data via OSC and Osculator to Isadora. For the depth map, I use Processing with a shader and send the processed image via Syphon to Isadora. I can use two Kinects on the same computer if I use an external CalDigit box plugged in via Thunderbolt, which adds a real USB 3 port.

    Your project needs to be developed a little further, and you need to run your own tests, borrowing hardware if necessary.

    It's normally easier on PC.


  • @jhoepffner - Hi Jacques. Can I ask: what is the benefit of sending the OSC through Osculator, rather than going straight from NIMate into Isadora?



  • @dbini

    1) NI mate sends its OSC addresses without a leading slash, so Isadora doesn't recognize them

    2) NI mate sends OSC to port 7000

    With the free version of NI mate you cannot save, so you have to redo the settings each time. With Osculator you can make a preset that adds a slash to each address, reroutes to port 1234, and saves the routing. In any case, I think Osculator (€19.99) is necessary to use OSC properly on macOS.
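    As an illustration of what that Osculator preset is doing, here is a small relay sketched in Python. This is not part of the thread's actual setup: it assumes the third-party python-osc package, and the ports are simply the defaults mentioned above (NI mate out on 7000, Isadora in on 1234). The network imports are kept inside `relay()` so the address helper can be used on its own.

    ```python
    def normalize_address(addr: str) -> str:
        """Ensure an OSC address has the leading slash Isadora expects."""
        return addr if addr.startswith("/") else "/" + addr

    def relay():
        """Listen on port 7000, fix each address, forward to port 1234."""
        from pythonosc.dispatcher import Dispatcher
        from pythonosc.osc_server import BlockingOSCUDPServer
        from pythonosc.udp_client import SimpleUDPClient

        client = SimpleUDPClient("127.0.0.1", 1234)  # Isadora's default OSC input port

        def forward(addr, *args):
            # Re-send every incoming message with a normalized address.
            client.send_message(normalize_address(addr), list(args))

        dispatcher = Dispatcher()
        dispatcher.set_default_handler(forward)  # catch all incoming addresses
        BlockingOSCUDPServer(("127.0.0.1", 7000), dispatcher).serve_forever()
    ```

    Calling `relay()` then blocks and forwards messages until stopped; Osculator does the same job with a saved routing and no code.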


  • Beta Platinum

    @jhoepffner do you use the free version of NI mate?

  • Izzy Guru

    Hey; just an FYI - you can add the slash in the Live setup window, but it is a bit of a pain to do!

    You probably know this already, but it may help other users.

  • @Skulpture

    Yes, but with the free version you cannot save it, so having to redo it each time is a real pain, which is a good reason for Osculator…

  • @jhoepffner Thank you for your reply. It is on Mac, and I believe I am working with Kinect V1 (model 1414, which I was advised works better with Isadora than the newer version) and the skeleton. I had not considered, or did not know about, the depth map. What are the advantages of working with the depth map? Do you use the two Kinects with the depth map? Thank you for the info on using Processing. That is something I wanted to explore, and I am glad this is the right way to go about it. To expand on that: do I need to code in Processing to get that result? Yes, Syphon is something I am trying to explore too, though I am still very new to this. The project is a pilot project with G4A funding, so it is R&D in a way: to test, explore, and find ways of working. Lucie

  • @LucieDance

    – V1 (1414) and V2 do not work with Isadora directly at all (for the moment…). V1 is better supported on Mac than V2, but you can send different kinds of information from both to Isadora using other software.

    – For V1, the best way is with Processing (see the Isadora tutorial): you can get skeleton info via OSC and different images via Syphon. With a little Processing knowledge you can prepare the info before sending. The Isadora tutorial uses the SimpleOpenNI library, which works only with Processing 2; with Processing 3 you need the more modern Open Kinect for Processing library, but there is no skeleton tracking there, only with Delicode NI mate.

    – For V2, you can send skeleton info from Delicode NI mate through Osculator to Isadora via OSC, and images (including the depth map) with Processing (Open Kinect for Processing library).

    – I use the depth map because my Kinects are zenithal (mounted overhead) and because I need a map I can analyze. I send the pre-analyzed (thresholded) image from Processing (with a shader) via Syphon to Isadora for tracking, masking, and various spatial and temporal image treatments.
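    To make that thresholding step concrete, the per-pixel logic can be sketched in a few lines of plain Python. This is only an illustration of the idea, not Jacques' actual shader, and the near/far band values (in millimetres) are made up:

    ```python
    def threshold_mask(depth, near=500, far=1500):
        """Return a binary mask from a row of depth samples: 255 where the
        sample falls inside the [near, far] band (e.g. a body standing under
        a zenithal Kinect), 0 for floor and out-of-range readings."""
        return [255 if near <= d <= far else 0 for d in depth]

    row = [0, 400, 800, 1200, 1600, 2047]   # one row of raw depth values
    mask = threshold_mask(row)               # -> [0, 0, 255, 255, 0, 0]
    ```

    In a fragment shader the same comparison runs per pixel on the GPU, which is why it easily keeps up with the Kinect's frame rate before the image is handed to Syphon.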


  • @jhoepffner Thank you, this is all very helpful. I am trying to test this idea, which it says works with Processing 3. So if I understand your reply correctly, if I have Kinect V1 it will only work with Processing 2, right? The video below is something I am trying to do. (The code is available to download.) I am also reading about the depth map, which is used with Kinect V2. So if I wanted to purchase a V2 for Mac, where should I look for one? What model? Do you have any examples of your work with V2?

  • @LucieDance

    Perhaps begin with an easier example; there are so many different and complicated propositions here…

    Corrections: Kinect V1 works with both Processing 2 and 3; it's the SimpleOpenNI library for the skeleton that works only with Processing 2 at the moment (there is an improved version for P3 on GitHub, but it doesn't work for me). All V2 models work, but you need the "Kinect Adapter" from Microsoft and a USB 3 port to use it.

    A work made with V2 depth map is here


  • @jhoepffner Thank you for sharing your work. It is very interesting what you can achieve with the depth map; something to test, perhaps. Regarding the Processing sketch I posted about: I know it is advanced stuff for me at the moment, but it is perfect for the project I am working on. I need some 3D imagery when motion tracking, and this sketch from GitHub looks exactly like what I am looking for. Therefore I am urgently looking for someone who could help me get this up and running. Do you know anyone who knows Processing 3 and Kinect? I logged this on the Processing forum and managed to download all the libraries, but I have an error about the Kinect scanner, which I believe is something to do with the libfreenect manager (installing the drivers via the terminal), right? Can anyone out there help me urgently? I would like to have this up and running, ideally by this Saturday. It would be paid work; I have received funding. I would really, really appreciate the help. Lucie

  • @LucieDance

    Where are you situated on the globe?

  • Tech Staff


    If you're in New York City, I can probably assist you.

  • Hi Jacques and Lucas and everyone,

    In workshops I generally use NI mate to Syphon the User ID (colour-coded figures) Live View out to Isadora (Kinect V2 on Mac). Is it possible to do something like this in Processing? Even if the figures are all the same colour, I just need to separate them out from the background in a nice, stable way (for a three-week always-on installation piece).



  • Tech Staff

    Hi there @dbini,

    Why do you want to use Processing instead of NI mate? It is possible to do this in Processing, but personally I would rather use NI mate than Processing.
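    One common way to do this kind of figure/background separation from a depth map, sketched here in Python purely for illustration (this is an assumption about an approach, not a recommendation from the thread, and the tolerance value is made up), is to capture a reference depth frame of the empty space and then mask anything that comes significantly closer than it:

    ```python
    def capture_background(frames):
        """Average several empty-room depth frames into one reference frame
        (each frame is a flat list of per-pixel depth values in mm)."""
        n = len(frames)
        return [sum(px) / n for px in zip(*frames)]

    def foreground_mask(depth, background, tolerance=100):
        """255 where a pixel is at least `tolerance` mm closer than the
        reference background, i.e. someone is standing in front of it."""
        return [255 if (bg - d) > tolerance else 0
                for d, bg in zip(depth, background)]

    bg = capture_background([[2000, 2000, 2000], [2000, 2000, 2000]])
    mask = foreground_mask([2000, 1200, 1950], bg)  # person at 1.2 m in the middle
    ```

    In a Processing sketch the same per-pixel comparison would run on the Kinect's depth array each frame, with the resulting mask sent out via Syphon; because it uses depth rather than colour, it stays stable under lighting changes, which matters for an always-on installation.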

  • @Juriaan - I'm trying to keep costs down and don't want the bunny logo appearing every 10 minutes.

  • Beta Platinum

    I would also be interested in how to get something like the ghost image from NI mate using cheaper or free software.

  • @crystalhorizon - I had a look at Vuo, but at the moment it seems it only supports Kinect V1.