    Motion Tracking through Isadora

    How To... ?
    • chiligoldman
      chiligoldman last edited by

      Hi Marci.

      Thank you for sharing your tutorial with code examples. It's so informative and is something I've been trying to work out for a while. I'm having some stability issues and sometimes get errors running Processing that occasionally result in a crash. I wonder if it could have to do with versions.
      I'm curious as to which Kinect sensor you are using. Do you think that could have anything to do with it? Mine is the first generation.
      Thanks again! I appreciate it.
      • Marci
        Marci last edited by

        Me? Kinect for Xbox 360... model 1417 (x2). The only issue with these is when running over USB3 and using multiple feeds, as noted above.

        rMBP 11,3 (mOS 10.13) / rMBP 11,4 (mOS 10.14) / 3x Kinect + Leap / TH2Go
        Warning: autistic - may come across rather blunt and lacking in humour!

        • gapworks
          gapworks last edited by

          @Marci

          Finally got it working, but I can't get it into Isadora :( Processing does not appear in the Syphon receiver.

          Running MBP2017 / Ventura Osx 13.6.7 / 16 GB 2133 MHz LPDDR3 / Intel HD Graphics 630 1536 MB / Latest Isadora Version / www.gapworks.at / located in Vienna Austria

          • gapworks
            gapworks last edited by

            @skulpture

            Finally getting to motion tracking / Kinect etc. I looked into your tutorials today. I even downloaded Synapse, but all I get is a black preview image. Could it be that it is not working in "Yosemite"?
            Or is it the Max patch that is missing (also in your tutorial)?
            Thanks for any further information.
            p.

            Running MBP2017 / Ventura Osx 13.6.7 / 16 GB 2133 MHz LPDDR3 / Intel HD Graphics 630 1536 MB / Latest Isadora Version / www.gapworks.at / located in Vienna Austria

            • Skulpture
              Skulpture Izzy Guru last edited by

              Hi. Yes, this is an old tutorial, I'm afraid. I will have another look into it for you as soon as I can.

              Graham Thorne | www.grahamthorne.co.uk
              RIG 1: Custom-built PC: Windows 11. Ryzen 7 7700X, RTX3080, 32G DDR5 RAM. 2 x m.2.
              RIG 2: Laptop Dell G15: Windows 11, Intel i9 12th Gen. RTX3070ti, 16G RAM (DDR5), 2 x NVME M.2 SSD.
              RIG 3: Apple Laptop: rMBP i7, 8gig RAM 256 SSD, HD, OS X 10.12.12

              • Marci
                Marci last edited by

                "finally got it working but i can't get it into isadora :( Processing does not appear in the syphon receiver."

                Hit CMD-SHIFT-O in Processing and it should give you the Examples browser... Expand Contributed Libraries, find the Syphon library's examples, run them, and check they show up in Isadora.
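
                If the examples work, a quick way to sanity-check the whole Processing → Syphon → Isadora chain is a tiny test sketch of your own. This is only a rough illustration (not from the tutorial above): it assumes the standard Syphon for Processing library (codeanticode.syphon) is installed, and the server name "Processing Test" is just an arbitrary label to look for in Isadora's Syphon receiver.

                  import codeanticode.syphon.*;   // Syphon for Processing (install via Sketch > Import Library)

                  SyphonServer server;

                  void setup() {
                    size(640, 480, P3D);          // Syphon needs an OpenGL renderer (P2D or P3D)
                    server = new SyphonServer(this, "Processing Test");
                  }

                  void draw() {
                    background(0);
                    ellipse(mouseX, mouseY, 60, 60);  // draw something so the feed is obviously live
                    server.sendScreen();              // publish the current frame to Syphon
                  }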

                rMBP 11,3 (mOS 10.13) / rMBP 11,4 (mOS 10.14) / 3x Kinect + Leap / TH2Go
                Warning: autistic - may come across rather blunt and lacking in humour!

                • gapworks
                  gapworks last edited by

                  @marci Yes, the examples do show up! See attached screenshot.

                  best

                  084b9c-screen-shot-2015-12-14-at-16.52.01.png

                  Running MBP2017 / Ventura Osx 13.6.7 / 16 GB 2133 MHz LPDDR3 / Intel HD Graphics 630 1536 MB / Latest Isadora Version / www.gapworks.at / located in Vienna Austria

                  • Skulpture
                    Skulpture Izzy Guru last edited by

                    @gapworks

                    Worked fine for me. I did have to do the 'hands up' pose and then it sprang to life.

                    55ab83-screen-shot-2015-12-15-at-09.10.31.png

                    Graham Thorne | www.grahamthorne.co.uk
                    RIG 1: Custom-built PC: Windows 11. Ryzen 7 7700X, RTX3080, 32G DDR5 RAM. 2 x m.2.
                    RIG 2: Laptop Dell G15: Windows 11, Intel i9 12th Gen. RTX3070ti, 16G RAM (DDR5), 2 x NVME M.2 SSD.
                    RIG 3: Apple Laptop: rMBP i7, 8gig RAM 256 SSD, HD, OS X 10.12.12

                    • gapworks
                      gapworks last edited by

                      @skulpture Yes, after moving (more like jumping) in front of the Kinect it woke up after all. I even managed to get the skeleton working, but when I went back to my laptop (meaning closer to the sensor) it really freaked out :( so I might end up using NI mate...

                      Running MBP2017 / Ventura Osx 13.6.7 / 16 GB 2133 MHz LPDDR3 / Intel HD Graphics 630 1536 MB / Latest Isadora Version / www.gapworks.at / located in Vienna Austria

                      • Marci
                        Marci last edited by

                        Anything based on SimpleOpenNI (including NI mate) takes an age to initialise the skeleton properly unless it can see your full height (feet to head) in shot. When you get too close, the skeleton's legs frequently start to go whappy: the skeleton doesn't like it when it can't work out where a limb is and just makes something up, which often freaks out whatever the Kinect is driving... That's one of many reasons I avoid using the skeleton and prefer depth blobs. Otherwise, you have to add your own checking routines between your middleware and output to remove limb markers, or disable whatever a limb is driving when its coordinates exceed the screen / stage / world boundaries or simply vanish.
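
                        For what it's worth, a checking routine along those lines can be quite small. The sketch below is my own rough illustration (not Marci's code): it assumes the SimpleOpenNI Processing library, sendJointToIsadora() is a hypothetical placeholder for however you forward data (OSC etc.), and the depth range is an arbitrary example of a stage boundary.

                          import SimpleOpenNI.*;

                          SimpleOpenNI context;

                          void setup() {
                            size(640, 480);
                            context = new SimpleOpenNI(this);
                            context.enableDepth();
                            context.enableUser();                       // enable skeleton tracking
                          }

                          void draw() {
                            context.update();
                            for (int userId : context.getUsers()) {
                              if (!context.isTrackingSkeleton(userId)) continue;
                              PVector hand = new PVector();
                              // confidence falls towards 0 when the sensor is guessing where the joint is
                              float conf = context.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_LEFT_HAND, hand);
                              boolean inRange = hand.z > 500 && hand.z < 4000;   // millimetres; arbitrary stage bounds
                              if (conf > 0.5 && inRange) {
                                sendJointToIsadora("/leftHand", hand);  // only drive Isadora with data we trust
                              }
                              // otherwise: drop the marker / freeze whatever this joint was driving
                            }
                          }

                          void sendJointToIsadora(String addr, PVector p) {
                            // placeholder: forward via oscP5 or your middleware of choice
                          }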

                        rMBP 11,3 (mOS 10.13) / rMBP 11,4 (mOS 10.14) / 3x Kinect + Leap / TH2Go
                        Warning: autistic - may come across rather blunt and lacking in humour!
