    Ghost hunting, pornography and interactive art: the weird afterlife of Xbox Kinect

    Hardware
    • mark_m

      Interesting article in today's Guardian (a UK, vaguely left-wing, serious newspaper; free to read) about the lasting effect of the Kinect. It also contains links to things I knew nothing about, like the Stereolabs cameras. Worth a couple of minutes of your time:


      https://www.theguardian.com/ga...

      Intel NUC8i7HVK Hades Canyon VR Gaming NUC, i7-8809G w/ Radeon RX Vega M GH 4GB Graphics, 32GB RAM, 2 x NVMe SSD
      Gigabyte Aero 15 OLED XD. Intel Core i7-11800H, NVidia RTX3070, 32GB RAM 2 x NVMe SSD
      PC Specialist Desktop: i9-14900K, RTX4070Ti, 64GB RAM, Win11Pro
      www.natalieinsideout.com

      • gavspav Beta Silver @mark_m

        Is someone going to make a plug-in for buttplug.io?

        http://www.digitalfunfair.co.uk I'm using M1 MBP 14" mostly but sometimes use older Mac & Windows machines.

        • Skulpture Izzy Guru @gavspav

          @gavspav said:

          Is someone going to make a plug-in for buttplug.io?

           Python can probably do that! haha

          Graham Thorne | www.grahamthorne.co.uk
          RIG 1: Custom-built PC: Windows 11. Ryzen 7 7700X, RTX3080, 32G DDR5 RAM. 2 x m.2.
          RIG 2: Laptop Dell G15: Windows 11, Intel i9 12th Gen. RTX3070ti, 16G RAM (DDR5), 2 x NVME M.2 SSD.
          RIG 3: Apple Laptop: rMBP i7, 8gig RAM 256 SSD, HD, OS X 10.12.12

          • Reload2024 @mark_m

            @mark_m I have a Stereolabs camera (ZED 2) and they're great, since they're the only ones with a 30+ foot range for skeletal tracking; all the rest are limited to small rooms. I also have over a dozen Kinects that I use in my business, and they're super annoying because ~30% of them stop working whenever Windows does a forced update and the drivers need to be re-installed, which isn't always a smooth process. But I love the ZED 2s: they're great if you have lots of light available, since they filter out IR.

            • Armando Beta Gold @mark_m

              @mark_m

              Thanks for the article, Mark. Well, I loved the Kinect at first (infrared grid) sight. I bought several over the years, and I still believe in it. I did some installations with it that toured quite a bit. Now, what is next for skeleton tracking? I've been following these developments for years, and here is a summary of what I know.

              1) With AI, it is easy to predict that all monocular RGB cameras will become Kinect-like skeleton-tracking devices.

              We've all seen Mark's FB posts experimenting with that in Python code.

              Google MediaPipe, OpenCV, YOLO, MoveNet and other frameworks already provide code for pose estimation with monocular RGB vision (a minimal sketch is at the end of this post).

              2) On the other hand, since some of these models handle hand and finger tracking, I think that Leap Motion will probably die; AI and computing power will probably replace it. We've already seen hand tracking used for menu and game interactions in the Apple Vision Pro and the Meta Quest 3 MR headset (yes, as of this week even Meta has swapped the term VR for MR) (MR = mixed reality).

              More to come !!!

              Kinect is dead? Long live its (AI) successors !!!
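
              As a rough illustration of point 1), here is a minimal Python sketch running the MediaPipe pose solution on an ordinary webcam. The camera index and the confidence thresholds are placeholder assumptions, not a recommendation.

```python
# Minimal sketch: a plain monocular RGB webcam becomes a skeleton tracker
# via MediaPipe's pose solution. Assumes `pip install mediapipe opencv-python`;
# camera index 0 and the 0.5 confidence thresholds are arbitrary assumptions.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
cap = cv2.VideoCapture(0)  # any ordinary RGB webcam

with mp_pose.Pose(min_detection_confidence=0.5,
                  min_tracking_confidence=0.5) as pose:
    while cap.isOpened():          # stop with Ctrl-C
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV delivers BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # 33 normalised landmarks (x, y in [0, 1], z relative to the hips).
            nose = results.pose_landmarks.landmark[mp_pose.PoseLandmark.NOSE]
            print(f"nose: x={nose.x:.3f} y={nose.y:.3f} z={nose.z:.3f}")

cap.release()
```

              This just prints the normalised nose position per frame; in practice you would map all 33 landmarks to whatever your patch needs.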

              Armando Menicacci
              www.studiosit.ca
              MacBook Pro 16-inch, 2021 Apple M1 Max, RAM 64 GB, 4TB SSD, Mac OS Sonoma 14.4.1 (23E224)

              • bonemap Izzy Guru @Armando

                Hi @armando,

                I have also been following these developments for years and have MediaPipe running in Isadora through the Pythoner plugin. I have hand-tracking, face-tracking and pose-tracking variations as separate Pythoner patches. There has been a fair bit of upkeep to these patches, upgrading them with new versions of MediaPipe, Pythoner and Isadora, which has meant reinvesting in the integration with Isadora over time.
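
                For anyone who has not tried this, here is a rough standalone sketch of the kind of thing involved: MediaPipe hand tracking sending fingertip positions to Isadora over OSC. It is not the Pythoner patches described above, just an external-script equivalent, and the OSC port and address pattern are assumptions that need to match whatever Isadora is set to listen on.

```python
# Rough standalone sketch (not the Pythoner patches above): MediaPipe hand
# tracking sending index-fingertip positions to Isadora via OSC.
# Assumes `pip install mediapipe opencv-python python-osc`; port 1234 and the
# OSC address pattern are assumptions - match them to Isadora's OSC settings.
import cv2
import mediapipe as mp
from pythonosc.udp_client import SimpleUDPClient

osc = SimpleUDPClient("127.0.0.1", 1234)   # Isadora machine / listening port
mp_hands = mp.solutions.hands
cap = cv2.VideoCapture(0)

with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
    while cap.isOpened():                  # stop with Ctrl-C
        ok, frame = cap.read()
        if not ok:
            break
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for i, hand in enumerate(results.multi_hand_landmarks):
                tip = hand.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
                # Normalised 0-1 coordinates; scale inside Isadora as needed.
                osc.send_message(f"/hand/{i}/index_tip", [tip.x, tip.y])

cap.release()
```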

                The BIG QUESTION for me with flat RGB video and these new AI and ML approaches is that they do not allow me to track performers in a theatrical setting - that is, in lighting conditions that are not optimal for capturing the body as an RGB image, for example in darkness. It remains critical that body tracking for performance is agnostic to reflected light, i.e. works in darkness or with a variety of lighting and projection sources. AI and ML tracking has not proved itself in theatrical performance because it requires the tracking subject to be clearly represented in a video stream.

                Structured-light devices - like the Kinect and its OpenNI variations - are still important precisely because they operate without a visible light source illuminating the tracking subject.

                But please, if there is an AI or ML solution that works in darkness without visible light, I would love to know about it!

                Best wishes

                Russell

                http://bonemap.com | Australia
                Izzy STD 4.2 | USB 3.6 | + Beta
                MBP 16” 2019 2.4 GHz Intel i9 64GB AMD Radeon Pro 5500 8 GB 4TB SSD | 14.5 Sonoma
                Mac Studio 2023 M2 Ultra 128GB | OSX 15.3 Sequoia
                A range of deployable older Macs

                • Skulpture Izzy Guru @bonemap

                  @bonemap I think the best solution right now is https://www.move.ai/ BUT disguise has got its hands on it. So the price is high... unfortunately. 

                  Graham Thorne | www.grahamthorne.co.uk
                  RIG 1: Custom-built PC: Windows 11. Ryzen 7 7700X, RTX3080, 32G DDR5 RAM. 2 x m.2.
                  RIG 2: Laptop Dell G15: Windows 11, Intel i9 12th Gen. RTX3070ti, 16G RAM (DDR5), 2 x NVME M.2 SSD.
                  RIG 3: Apple Laptop: rMBP i7, 8gig RAM 256 SSD, HD, OS X 10.12.12

                  • Fred @bonemap

                    @bonemap A lot - if not most - of the ML models are fine with black-and-white images, i.e. IR video (as used by the ToF and structured-light cameras you mention). However, you will need to provide the light source and a camera stream that can see the IR-lit subjects - no pixels means no information. The Kinect etc. have their own light source, which of course you can use and feed into these models (i.e. a Kinect IR video stream); this may be a better approach than feeding the images into OpenNI, which is essentially a dead, outdated hack (see the sketch below).


                    Not sure what ML model you are using, but there are many, and this is a good one: https://github.com/MVIG-SJTU/A...
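
                    To make that concrete, here is a minimal sketch of the idea: a single-channel IR frame (standing in for a Kinect IR stream) is duplicated into three channels and handed to a monocular pose model. The file name "ir_frame.png" and the choice of MediaPipe are placeholders for whatever IR source and model you actually use.

```python
# Sketch of the point above: a single-channel IR frame, standing in for a
# Kinect IR stream, expanded to three channels and run through a monocular
# pose model. "ir_frame.png" and MediaPipe are placeholder assumptions.
import cv2
import mediapipe as mp

ir = cv2.imread("ir_frame.png", cv2.IMREAD_GRAYSCALE)  # 1-channel IR image
rgb = cv2.cvtColor(ir, cv2.COLOR_GRAY2RGB)             # duplicate into 3 channels

with mp.solutions.pose.Pose(static_image_mode=True) as pose:
    results = pose.process(rgb)
    if results.pose_landmarks:
        print("skeleton found:", len(results.pose_landmarks.landmark), "landmarks")
    else:
        print("no skeleton found - the subject may not be lit well enough in IR")
```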

                    Fred

                    http://www.fredrodrigues.net/
                    https://github.com/fred-dev
                    OSX 13.6.4 (22G513) MBP 2019 16" 2.3 GHz 8-Core i9, Radeon Pro 5500M 8 GB, 32g RAM
                    Windows 10 7700K, GTX 1080ti, 32g RAM, 2tb raided SSD

                    • Armando Beta Gold @bonemap

                      @bonemap Well, we don't need a special solution for low-light conditions. I did that for years: take any camera that can see near-IR light, flood the stage with infrared (which doesn't change), put a visible-light filter in front of the camera, et voilà. I never had problems with it 90% of the time.

                      Armando Menicacci
                      www.studiosit.ca
                      MacBook Pro 16-inch, 2021 Apple M1 Max, RAM 64 GB, 4TB SSD, Mac OS Sonoma 14.4.1 (23E224)

                      • Armando Beta Gold @Armando

                        If you use tungsten PARs with red, green and blue filters in front, even at low intensity they emit a lot of IR light. That IR level shouldn't change, and you can change all the other lights IF you filter the visible light out of your IR camera.

                        Armando Menicacci
                        www.studiosit.ca
                        MacBook Pro 16-inch, 2021 Apple M1 Max, RAM 64 GB, 4TB SSD, Mac OS Sonoma 14.4.1 (23E224)
