    Tracking two Dancers

    How To... ?
    • Skulpture (Izzy Guru) last edited by

      I've mentioned it on the NI Mate forum and had a reply:

      It would be super cool if the ghost image for each user could be changed, rather than all people being green.

      A small preferences window/pop-up would be ideal.

      Person 1 = green

      Person 2 = red

      Person 3 = blue

      Person 4 = yellow

      It would be great for use with Isadora and colour tracking/keying.

      Janne October 11

      skulpture: That's actually a pretty cool idea... we'll have to see what we come up with for the next version :)

      Graham Thorne | www.grahamthorne.co.uk
      RIG 1: Custom-built PC: Windows 11. Ryzen 7 7700X, RTX3080, 32G DDR5 RAM. 2 x m.2.
      RIG 2: Laptop Dell G15: Windows 11, Intel i9 12th Gen. RTX3070ti, 16G RAM (DDR5), 2 x NVME M.2 SSD.
      RIG 3: Apple Laptop: rMBP i7, 8gig RAM 256 SSD, HD, OS X 10.12.12

      • deliriodelux last edited by

        Hi

        It is possible to send both ghosts through Syphon (up to 6): go to NI Mate's preferences and activate "all users" in the ghost tab. We are using the torso value to track user position in space. There is also a problem with how NI Mate assigns user IDs to OSC: if somebody disconnects, then depending on how long they take to reappear, NI Mate will assign them a new ID. We are making an Izzy patch to assign user IDs depending on where people are in space rather than the sequence in which NI Mate sees them. If somebody needs it, I can post it here.
        As far as I know there is no CI or QC chromakey actor.
        cheers
        a

        10.9.4. 2 x 2.26 GHz Xeon. Radeon HD 7950.32 GB DDR3.com.ar

        • vrsck (Beta Silver) last edited by

          hey DELIRIODELUX: post the IZZY patch file, please!

          :)

          • deliriodelux last edited by

            Hey vrsck,

            sorry for the delay, very busy week.
            But here it is.
            We have divided the space into four parts (you can extend it to up to 6), and the patch will report and publish activity and values (not depending on user ID) for 4 users. Attached are the patch and the OSC stream setup.
            Hope it helps; any questions, feel free to ask.
            best
            a

            9f8714-archive.zip
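The zone-splitting idea above (assign identity by where a torso sits in space, not by NI Mate's unstable user IDs) can be sketched in a few lines of Python. The real logic lives in the attached Isadora patch; the zone count, axis choice, and coordinate range here are illustrative assumptions.

```python
# Sketch of zone-based user assignment: map a torso x-position to a
# stable zone index instead of trusting the tracker's user IDs.
# NUM_ZONES and the x-range are assumptions, not values from the patch.

NUM_ZONES = 4              # the patch divides the space into four parts
X_MIN, X_MAX = -2.0, 2.0   # assumed horizontal tracking range, in metres

def zone_for_torso(x):
    """Map a torso x-position to a zone index in 0..NUM_ZONES-1."""
    t = (x - X_MIN) / (X_MAX - X_MIN)      # normalise into [0, 1]
    t = min(max(t, 0.0), 1.0 - 1e-9)       # clamp people at the edges
    return int(t * NUM_ZONES)

# Two dancers at x = -1.5 and x = 0.5 always land in zones 0 and 2,
# no matter which user ID the tracker happens to hand them.
```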


            • zanetti last edited by

              Hi

              I am new here and trying to do a similar installation to the one mentioned at the start of this thread.
              Is there any tutorial for the IR alternative to get a mask?
              My idea is very simple: I basically have two moving image layers, and I want the people in front of the wall where this is projected to provoke a cutout in one of the layers, so you can see the one below through their shapes.
              So I think I wouldn't need the Kinect solution, would I? Isn't the problem with the Kinect that it really tracks people and cannot handle more than 2? Or can the Kinect also just send a plain mask with its IR feed?
              Best wishes, and looking forward to your answers.
              Leo
              • Skulpture (Izzy Guru) last edited by

                With software called NI-Mate you can track up to four people.

                There are no specific tutorials, as it's such a specific area with many factors to consider.
                Start by tracking one person and then build it up.


                • zanetti last edited by

                  Hi Skulpture,

                  I don't really need to track people. I just want their "shapes" as an alpha mask, with which I could then apply my effect.
                  So I just wanted to ask if it is possible this way:
                  having an infrared camera record the room of the installation from behind the viewers, capturing a live feed with the shapes of the people in front of the projection, and then using that as a mask in Isadora.
                  Thanks a lot
                  • Skulpture (Izzy Guru) last edited by

                    Hi,

                    I understand. NI-Mate could still be your friend here. It tracks people, grabs their shape, and you can then send it via Syphon (on Mac) into Isadora - this is then your mask.
                    This is what I did here:
                    http://vjskulpture.wordpress.com/2013/04/02/isadora-live-feed-video-mask-how-to/
                    You can get similar results using the Luminance Key actor in Isadora, but the problem is that once people move towards or away from the camera the luminance changes and it won't work - they can really only move side to side.
                    Keep at it - you will find the best solution I'm sure.
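To make the luminance-key limitation concrete, here is a minimal sketch of what the key is doing, in plain NumPy rather than Isadora's actor. The pixel values are illustrative assumptions: the mask is just a fixed brightness threshold, so when a performer's brightness drifts with distance from the light, the same threshold stops separating them from the background.

```python
# Minimal luminance key: True where a greyscale pixel is brighter
# than a fixed threshold. Frame values are illustrative assumptions.
import numpy as np

def luminance_mask(gray_frame, threshold=128):
    """Return a binary mask: True where the pixel exceeds threshold."""
    return gray_frame > threshold

# A bright performer (200) against a dark background (30) keys cleanly...
frame_near = np.array([[30, 30, 200], [30, 200, 200]], dtype=np.uint8)
# ...but the same performer further from the light drops to 100,
# falls below the threshold, and the mask loses them entirely.
frame_far = np.array([[30, 30, 100], [30, 100, 100]], dtype=np.uint8)
```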


                    • crystalhorizon (Beta Platinum) last edited by

                      Using the ghost image from NI Mate is really handy and nice. It works perfectly via Syphon.

                      Alexander Nantschev | http://www.crystalhorizon.at | located in Vienna Austria

                      • crystalhorizon (Beta Platinum) last edited by

                        But I'm not sure if there is a ghost image in the demo of NI Mate.


                        • zanetti last edited by

                          Thanks for that advice. I guess I am going to try the Kinect then.

                          Is there a difference (other than the price) between the Xbox version and the Windows version?
                          User dbini also recommended the NI-Mate solution for smaller rooms.
                          • dbini last edited by

                            The Xbox version of the Kinect needs an adapter to power it and convert the signal to USB; these are pretty cheap on eBay. Otherwise, it's the same unit. The demo of NI Mate is fully functional but only works for 10 minutes and applies a watermark to any output.

                            John Collingswood
                            taikabox.com
                            2019 MBPT 2.6GHZ i7 OSX15.3.2 16GB
                            plus an old iMac and assorted Mac Minis for installations

                            • Fred last edited by

                              Personally I would go for something with higher resolution than the Kinect. IR tracking with a good IR camera is fast and reliable. There are a few steps to go through to get it all right, and you need a capture box and an IR-compatible camera (one with no IR-blocking filter and preferably an IR-pass/colour-blocking filter). With IR lights, and IR blockers on your incandescent lights, you can have great control over the contrast. Once you get a good clean image with high contrast you can track the users easily.

                              I would take a look at this software as well:
                              http://www.tsps.cc/
                              You can track outside of Isadora (or on another computer) on a different thread and leave your Isadora patch to do your visuals.
                              Isadora's tracking implementation is quite good, but Isadora runs on a single thread and tracking can slow things down if you want high resolution. Kinects have a pretty low range and low resolution, and they work on IR anyway, so for reliability you need the same IR precautions as standard IR tracking.
                              Kinects are also a pain to get reliable extensions for; although some active USB extensions work OK, they are not reliable enough for a show, so instead I use a Gefen USB-over-ethernet extender that costs around 250.
                              I track IR with an HD security camera and a non-standard lens. The camera has full manual control, and I bought an IR-pass filter that slips on over the lens. It is 1920x1080 and the lens is wide; I get good results and great silhouettes when the IR lighting is arranged properly. The connection to the computer is HD-SDI, so it is easy to do a long run, and the camera can be switched between presets over RS485. I use a cheap 150 capture box from Blackmagic to get the image in. The latency is very low and the camera is reliable.
                              Just my two cents on IR and the Kinect.
                              Fred

                              http://www.fredrodrigues.net/
                              https://github.com/fred-dev
                              OSX 13.6.4 (22G513) MBP 2019 16" 2.3 GHz 8-Core i9, Radeon Pro 5500M 8 GB, 32g RAM
                              Windows 10 7700K, GTX 1080ti, 32g RAM, 2tb raided SSD

                                • crystalhorizon (Beta Platinum) last edited by

                                 @Fred that's more than two cents! A first look at TSPS is great; I have been thinking about that kind of software these last days. I have a Sony camcorder with a Nightshot function; I think Mark used one like this with IR lights at a back wall (filters for normal theatre lights are mentioned in a forum post, maybe on the old forum). The IR filter in front of the lens is also something I have to try out. Getting data from that IR camera into Isadora was the missing link in my brain: how to get nearly the same OSC data as NI-Mate. I'm not interested in that many parameters, just the basic stuff. But I think what is tricky is the Z-axis, because the Kinect has "two eyes" and sees depth, or am I wrong here?
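For the "just basic stuff" side of this, pulling a user number and joint name out of a skeleton tracker's OSC stream is mostly string handling. A minimal sketch, with the caveat that the "/user_N/joint x y z" address layout below is an assumption for illustration; check NI Mate's own OSC documentation for the real message format.

```python
# Sketch of extracting the basics from a tracker's OSC message.
# The "/user_N/joint" address layout is an assumed example format,
# NOT NI Mate's documented one.

def parse_joint_message(address, args):
    """Split an address like '/user_2/torso' plus (x, y, z) args
    into a (user, joint, x, y, z) tuple."""
    parts = address.strip("/").split("/")
    user = int(parts[0].split("_")[1])   # "user_2" -> 2
    joint = parts[1]                     # "torso"
    x, y, z = args
    return user, joint, x, y, z
```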


                                  • eight last edited by

                                   @Fred, a thing or two not to forget.

                                   Kinects will give you an (albeit noisy) mesh, not just a silhouette, which can then be used for scene flow (similar to optical flow, but in 3D).
                                   The new Kinect v2 has greatly reduced the noise; in my observations it is barely visible.

                                  Analysis: http://post.scriptum.ru | Synthesis: http://onewaytheater.us
                                  Twitter: https://twitter.com/eight_io | Flickr: http://www.flickr.com/photos/eight_io/
                                   Github: https://github.com/eighteight | MulchCam: https://mulchcam.com
                                  MulchTune: https://itunes.apple.com/us/app/mulch-tune/id1070973465 | Augmented Theatre: https://augmentedtheatre.com

                                    • Fred last edited by

                                     Kinect 2 is much, much better. The biggest problem is that if you want to extend the cable you need a $1500 USB3-over-optical-fibre system, which makes it very expensive. Also, I have used it on a Mac, but it is way better on a PC. Even then, you are working with a limited depth range and no adjustable lens to correct the perspective if you need to put the camera in a particular place.

                                     I think the Kinects are pretty amazing in theory, but using one in a touring show comes with a lot of hassle.
                                     @feinsinn Nightshot is good, but as far as I know it just removes the IR blocker and does not add a visible-light-block/IR-pass filter.
                                     @eight what exactly do you mean by mesh? Usually this refers to an array of points describing a 3D wireframe. It is great if you can get it, but as far as I can tell there is no way to get this into Isadora. Maybe via OSC, but I have a feeling that sending that much data every frame would choke Isadora; it is also quite a pain to deal with arrays of 3D points in Izzy, and after that Isadora cannot do much with the information.
                                     https://www.google.nl/search?q=openGL+mesh&safe=off&espv=2&biw=1745&bih=792&tbm=isch&tbo=u&source=univ&sa=X&ei=M5kaVKH2M4rQ7AaYqoGoDQ&ved=0CEkQsAQ


                                      • eight last edited by

                                       I am using a $150 optical extender (50 m) for Kinect v2 from Amazon.

                                       I would not send the mesh to Isadora (I would not know what to do with it there); instead, the mesh, processed into an image and sent via Syphon, is what I used to get something like the attachment.
                                       But the main thing I hope to get is the scene flow, which, given the reduced noise, should be easier to obtain; that's a WIP at the moment.
                                       --8

                                      f783ad-10646838_715761375168217_8007643821009547844_n.jpg


                                        • Fred last edited by

                                         OK, that is the point cloud.

                                         Can you send a link to the $150 extender? I tested some cheap ones and they were not fast enough for reliable performance.


                                          • eight last edited by

                                           This is the one I am using: http://www.amazon.com/ADNACO-UF1-50-Active-Optical-Cable-length/dp/B00KPG41Y0/ref=pd_rhf_se_p_img_9 , although Amazon is out of them; I think I saw them somewhere else.

                                           It took me a couple of hours to figure out how to set it up, including a call to the tech support in Canada, which turned out to be very good; they may also suggest where to buy it.
                                           The sequence of connecting things together and powering them up is critical:
                                           1. Connect the fibre end to the computer.
                                           2. Connect the other fibre end to the accompanying adapter.
                                           3. Power up the adapter from step 2.
                                           4. Connect the Kinect to the adapter from step 2.
                                           5. Power up the Kinect.
                                           6. Launch the program.
                                           7. Bingo.
                                          --8


                                            • zanetti last edited by

                                             Hi guys, just saw the progress of this thread. I work with the Kinect v1 at the moment. I don't know how many metres of extension you need, but I am using a very cheap 15€ Delock 10 m active USB extension and it works perfectly.

                                             I live in Berlin and got it at Cyberport, if anyone is interested.