
    Kinect + Isadora Tutorials Now Available

    • Marci
      Marci last edited by

      @skulpture - Added link to README.md

      rMBP 11,3 (mOS 10.13) / rMBP 11,4 (mOS 10.14) / 3x Kinect + Leap / TH2Go
      Warning: autistic - may come across rather blunt and lacking in humour!

      • Skulpture
        Skulpture Izzy Guru last edited by

        Cheers @Marci

        Graham Thorne | www.grahamthorne.co.uk
        RIG 1: Custom-built PC: Windows 11. Ryzen 7 7700X, RTX3080, 32G DDR5 RAM. 2 x m.2.
        RIG 2: Laptop Dell G15: Windows 11, Intel i9 12th Gen. RTX3070ti, 16G RAM (DDR5), 2 x NVME M.2 SSD.
        RIG 3: Apple Laptop: rMBP i7, 8gig RAM 256 SSD, HD, OS X 10.12.12

        • Marci
          Marci last edited by

          OK - 2 things y'all need to be aware of when using these tutorials...

          1: natural limitations of SimpleOpenNI
The SimpleOpenNI library used in Processing is one of a few frameworks for interfacing with a Kinect. It wraps up the old way of doing things, where one had to manually install OpenNI v1 and NiTE etc. to get skeleton / limb and user tracking, thus simplifying the process. The Kinect itself doesn’t do any skeleton or limb identification or tracking; that all happens in the middleware.
If on a USB3 host (a Retina MacBook, for instance), running anything other than the Depth camera at the same time as skeleton tracking may randomly throw an iso_callback() error and/or trigger 'Isochronous transfer error' log messages. This is inherent to SimpleOpenNI, can’t be avoided, and will render everything unstable: it can go at any time, whenever it feels like it. It will either outright bomb the Processing sketch (in the case of the iso_callback() error), or cause _everything_ to lock up (in the case of the Isochronous transfer error) until the sketch eventually bombs (which occasionally you have to force by simply pulling the USB cable).
          The only way to get round this is to use a USB2 host (older Macbook Pro, MacMini), or just chance your luck. (I’ve documented this over on the GitHub Repo, yonder: https://github.com/PatchworkBoy/isadora-kinect-tutorial-files/issues/1)
I’ve noted previously on here that skeletons should only be implemented within a sketch when absolutely necessary / desired, as they are the only feature unique to the SimpleOpenNI libraries and they introduce a lot of CPU weight. If not using skeletons, turn to the lighter-weight OpenKinect libraries etc. for simple camera feeds, depth point tracking and blob tracking.
From the Isadora end of things, skeletons are the easiest way of handling Kinect data in an obvious way... but from the Kinect middleware point of view, they produce the most unstable results. That's purely a result of SimpleOpenNI being out of development: hence no Kinect v2 support, and no fix for this particular issue, because USB3 hadn't been released when development stopped (the PrimeSense technology & software rights were snapped up by Apple, then passed to Occipital, and are now part of the Structure.IO SDK). OpenNI v1 and v2 are now in a complete code freeze with no further development. Ultimately, that means this all has rather a limited lifespan - sorry!
Also: Mirror mode only affects the RGB / IR output. It has no impact on depth or user output.
          2: natural limitations of the Kinect v1 hardware
          You can either:
          - Start in RGB mode, and switch between that and User & Depth mode.
          OR
          - Start in IR mode and switch between that and User & Depth mode.
Once the camera has been initialised in either RGB or IR mode, it basically can’t be switched to the other mode!
          To change from RGB to IR mode, the sketch must be stopped and restarted.
          To change from IR to RGB mode, the sketch must be stopped and restarted.
          You can’t:
          - Start in RGB mode and switch between that and IR mode
          - Start in IR mode and switch between that and RGB mode
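The start-mode rule above can be sketched as a simple state check. This is an illustrative model, not code from the tutorial files; the names `CameraMode` and `canSwitchTo` are made up for the example:

```java
// Models the Kinect v1 restriction described above: Depth/User/Ghost are
// always reachable, but RGB and IR are mutually exclusive - you can only
// return to whichever of the two the camera was initialised with.
public class KinectModeRule {
    enum CameraMode { RGB, IR, DEPTH, USER, GHOST }

    // startMode is the mode the sketch was initialised in (RGB or IR).
    static boolean canSwitchTo(CameraMode startMode, CameraMode target) {
        // Depth, User and Ghost views are fine from either start mode...
        if (target != CameraMode.RGB && target != CameraMode.IR) return true;
        // ...but switching to the *other* colour mode needs a sketch restart.
        return target == startMode;
    }

    public static void main(String[] args) {
        System.out.println(canSwitchTo(CameraMode.RGB, CameraMode.DEPTH)); // true
        System.out.println(canSwitchTo(CameraMode.RGB, CameraMode.IR));    // false: restart required
    }
}
```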
Bearing all of that in mind... the GitHub repo now has an updated Processing sketch and Izzy file (for Mac) with stream switching enabled via OSC, plus a few other bits.
Please read the warnings on the GitHub page, the OSC MultiTransmit actor notes in the Izzy file, and the comments in the Processing sketch code.
          https://github.com/PatchworkBoy/isadora-kinect-tutorial-files

          • bruper
            bruper last edited by

            this is great, thanks

            17"MBP 2.93GHZ Core2Duo mid 2009 - OSX10.11.6 - 8GB, 1TBCrucial_SSD, izzy 3.0.7

            • Skulpture
              Skulpture Izzy Guru last edited by

Thanks for the detailed reply @Marci. We are semi-aware that this method has a lifespan, but we get a lot of questions on the forum, inbox messages and emails asking how to get the Kinect sensor working with Isadora - I must get two emails a week to my personal address alone - so we thought it was best to come up with an 'official' method. I must admit I was not aware of the iso_callback() error. I have had the occasional lock-up - nothing too bad, but I often wondered why, and this may be it. Like you've said, the lifespan of this method is limited with the takeover of PrimeSense by Apple etc., which is a shame of course. Similarly we get a lot of questions about the Kinect v2 and why it won't work. This is a tough nut to crack really... but thanks for your help and notes on GitHub. Cheers.

              • Marci
                Marci last edited by

It’s the same discussion as we had previously. The first question should be: skeletons and limbs, or blobs? What tool do I specifically need...?

If blobs, the OpenKinect framework, which is still maintained and supports the Kinect ONE & USB3 fully iirc. If skeletons, the OpenNI framework, as there's no other choice. Skeletons are quicker and easier to get an idea going with, but the same can usually be achieved with blobs as well.
                If you must use skeletons and it’s going to be mission critical to a show or installation, use an older MacMini / MBPro / Laptop that only has USB2 support dedicated to handling the Kinect side of things, and fire the OSC over the network to your ‘main’ system. Avoids all the risk.
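Firing the tracking data over the network is straightforward because OSC messages are simple to encode by hand over UDP. A minimal stdlib-only sketch of what such a packet looks like on the wire, per the OSC 1.0 spec; the address `/skeleton/head` and the values are hypothetical examples, not the actual addresses the repo's sketch uses:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Minimal OSC 1.0 message encoder - illustrates what the dedicated Kinect
// machine would send over UDP to the 'main' Isadora system.
public class OscEncode {
    // NUL-terminate and pad a string to a multiple of 4 bytes (OSC spec).
    static byte[] padded(String s) {
        byte[] raw = (s + "\0").getBytes(StandardCharsets.US_ASCII);
        byte[] out = new byte[(raw.length + 3) & ~3];
        System.arraycopy(raw, 0, out, 0, raw.length);
        return out;
    }

    // Build a message: padded address, padded type-tag string, then
    // big-endian float arguments (ByteBuffer is big-endian by default).
    static byte[] message(String address, float... args) {
        StringBuilder tags = new StringBuilder(",");
        for (int i = 0; i < args.length; i++) tags.append('f');
        byte[] a = padded(address), t = padded(tags.toString());
        ByteBuffer buf = ByteBuffer.allocate(a.length + t.length + 4 * args.length);
        buf.put(a).put(t);
        for (float f : args) buf.putFloat(f);
        return buf.array();
    }

    public static void main(String[] args) {
        byte[] msg = message("/skeleton/head", 0.5f, 0.25f);
        System.out.println(msg.length); // packet size is always a multiple of 4
        // To actually send it: wrap msg in a DatagramPacket addressed to the
        // main machine's IP and Isadora's OSC input port, via DatagramSocket.
    }
}
```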

                • gapworks
                  gapworks last edited by

@Marci a ghost image would be simply great!! I mainly need the shape of the performer(s) - I'm not tracking a skeleton or OSC data. Just a simple shape, trying to get close to infrared-camera quality.

                  Running MBP2017 / Ventura Osx 13.6.7 / 16 GB 2133 MHz LPDDR3 / Intel HD Graphics 630 1536 MB / Latest Isadora Version / www.gapworks.at / located in Vienna Austria

                  • Marci
                    Marci last edited by

                    Ghost support added... https://github.com/PatchworkBoy/isadora-kinect-tutorial-files/blob/master/Isadora_Kinect_Tracking_Mac/Isadora_Kinect_Tracking_Mac.pde

Run the sketch, let it get your skeleton, hit the 5 key on the keyboard to switch to ghost view, and the s key to disable the skeleton.
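In the sketch, those bindings boil down to a small dispatch in Processing's keyPressed() handler. This is a hedged reconstruction of the pattern, not the repo's actual code; the field names are hypothetical:

```java
// Illustrative model of the key handling described above: '5' switches the
// output to ghost view, 's' toggles the skeleton overlay. In a real
// Processing sketch this logic would live inside void keyPressed().
public class KeyBindings {
    static int cameraMode = 4;          // e.g. starts in the user/depth view
    static boolean drawSkeleton = true; // skeleton overlay on by default

    static void keyPressed(char key) {
        if (key == '5') {
            cameraMode = 5;             // kCameraImage_Ghost in the sketch
        } else if (key == 's') {
            drawSkeleton = !drawSkeleton; // toggle skeleton drawing
        }
    }

    public static void main(String[] args) {
        keyPressed('5');
        keyPressed('s');
        System.out.println(cameraMode + " " + drawSkeleton); // 5 false
    }
}
```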

                    • Marci
                      Marci last edited by

                      To change ghost color, in the source code, search for...

                           // set Ghost color here
...and change the color(255,255,255) values: (R,G,B), where each value can be between 0 and 255.
                      Add some blur via Isadora if needed.
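For reference, Processing's color(r, g, b) call packs the three channels into a single ARGB int with full alpha. A plain-Java equivalent of what that call produces (the helper name is ours, mimicking Processing's built-in):

```java
// Plain-Java equivalent of Processing's color(r, g, b): the three 0-255
// channels packed into one ARGB int, with the alpha byte set to opaque.
public class GhostColor {
    static int color(int r, int g, int b) {
        return (0xFF << 24) | (r << 16) | (g << 8) | b;
    }

    public static void main(String[] args) {
        int white = color(255, 255, 255); // the default ghost colour
        int red   = color(255, 0, 0);     // e.g. a red ghost instead
        System.out.printf("%08X %08X%n", white, red); // FFFFFFFF FFFF0000
    }
}
```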

                      • DusX
                        DusX Tech Staff last edited by

                        @Marci What do you think would be involved in porting this over to pc?

                        Troikatronix Technical Support

                        • New Support Ticket Link: https://support.troikatronix.com/support/tickets/new
                        • My Add-ons: https://troikatronix.com/add-ons/?u=dusx
                        • Profession Services: https://support.troikatronix.com/support/solutions/articles/13000109444-professional-services

                        Running: Win 11 64bit, i7, M.2 PCIe SSD's, 32gb DDR4, nVidia GTX 4070 | located in Ontario Canada.

                        • Marci
                          Marci last edited by

                          Errrr.... Good question. On PC iirc we're reliant on the proper MS Kinect SDK aren't we? Thought that had nuked out the opportunity to use anything but the proper XBox for Windows models which would preclude me from testing...? Will need to read up on it. I'll look into it...!

                          • mark
                            mark last edited by

                            First, I've been so busy working on Isadora v2.2 that I haven't been looking at the forum much in the last weeks. I was excited to see all the energy here among the community for this Kinect solution. Thanks to everyone for contributing, and especially @Marci for updates and setting up the GitHub.

I wanted to note to @Marci that I tested the Windows version using a Kinect 1473. So you definitely are not _forced_ to use Xbox for Windows models.
For a recent workshop, I did a quick and dirty implementation of multiple skeletons. Once v2.2 is out, I'll add that to the repo so everyone can have it.
                            Best Wishes,
                            Mark

                            Media Artist & Creator of Isadora
                            Macintosh SE-30, 32 Mb RAM, MacOS 7.6, Dual Floppy Drives

                            • Marci
                              Marci last edited by

                              Ah cool - in which case I’ll look to reproduce for win. Never really bothered hooking it up to a PC as it was always simpler to get things going on OSX.

                              • Marci
                                Marci last edited by

                                Just skimmed through the windows sketch - yeah it'll be no bother to port. Will get it done this week.

                                • bruper
                                  bruper last edited by

Something I cannot find is how to change the Processing sketch to start with the camera in RGB mode or IR mode.

I tried with no success to change it in the OSC MultiTransmit of the "Isadora Skeleton Test Mac" supplied at
https://github.com/PatchworkBoy/isadora-kinect-tutorial-files

with both the Processing sketch from 9 days ago and today's one.

I searched the Processing sketch for something I could change, but no luck... It starts on kCameraImage_Depth = 3.

I would like to try the other kCameraImage modes.

Thanks for enlightening me.

                                  • mark
                                    mark last edited by

                                    Dear @bruper,

                                    Look at this part of the code:
                                    // --------------------------------------------------------------------------------
                                    //  CAMERA IMAGE SENT VIA SYPHON
                                    // --------------------------------------------------------------------------------
                                    int kCameraImage_RGB = 1;                // rgb camera image
                                    int kCameraImage_IR = 2;                 // infra red camera image
                                    int kCameraImage_Depth = 3;              // depth without colored bodies of tracked bodies
                                    int kCameraImage_User = 4;               // depth image with colored bodies of tracked bodies
                                    int kCameraImage_Ghost = 5;
int kCameraImageMode = kCameraImage_IR; // << Set this value to one of the kCameraImage constants above
                                                                             // for purposes of switching via OSC, we need to launch with 
                                                                             // EITHER kCameraImage_RGB, or kCameraImage_IR
                                    You should be able to set it to kCameraImage_RGB to show the RGB channel.
                                    Best,
                                    Mark

                                    • bruper
                                      bruper last edited by

                                      @mark

                                      thanks, yes how silly of me.. totally overlooked...

                                      • gapworks
                                        gapworks last edited by

@Marci thanks for the fast reply! I will give it a try on Wednesday when I return from Venice, as I don't travel with my Kinect!

                                        • Marci
                                          Marci last edited by

"tried with no success to change it in the OSC multitrasmit of the "Isadora Skeleton Test Mac" supplied in https://github.com/PatchworkBoy/isadora-kinect-tutorial-files and the Processing sketch of 9 days ago and the today one.”

                                          Like I said, if you start in RGB you can’t switch to IR. If you start in IR you can’t switch to RGB. Limitation of the Kinect hardware, not Processing. Nothing anyone can do about it regardless of what software you’re working in.
                                          Both can’t be activated and switched between in a Processing sketch. You must choose one to work with. You can have (IR _OR_ RGB) & Depth & User & Ghost. You can’t have IR & RGB & Depth & User & Ghost.

                                          • Marci
                                            Marci last edited by

The mistake everyone is making here is passing the camera images to Isadora, full stop.

                                            If you’re creating and affecting visuals in Isadora, pass OSC data to Isadora, and do everything visual in Isadora.
If you’re creating visual effects in Processing, just use Processing, and only pass the Syphon feed to Isadora to integrate it into an Isadora scene. Pass minimal OSC data as needed to control anything additional you may want to control in Isadora, if anything at all.
                                            The majority of folks dabbling in Kinect at the moment here seem to be wanting to produce visual effects that are done wholly in processing (i.e.: stuff from openprocessing.org, but substitute mouse for hand/kinect).
                                            Isadora exists so people don’t have to learn code to create visuals. When using Processing, you’re just using Isadora as a camera / media player switcher / sequencer, and hardcoding all your visuals and interaction within Processing.
                                            You have to abstract how you think about it. Work out what you want to achieve and WHERE to achieve it...
