    [LOGGED] Keying Head & Shoulders like in zoom & skype

    Feature Requests
    Tags: background, virtual, virtual theatre, zoom, skype
    • liannemua last edited by Woland

      Hi, curious if this exists in Isadora, or if not, it would be great for live virtual theater. The ability to recognize & key out the background of talking head feeds imported live from Skype - like the virtual backgrounds in Zoom & Skype, but to replace the background with alpha. Thanks.

      • Woland
        Woland Tech Staff @liannemua last edited by

        @liannemua

        If you have your performers in front of a green screen you can use the Chromakey actor, but otherwise, no, this feature does not yet exist in Isadora.
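        For anyone curious what a chroma key does conceptually, here's a rough NumPy sketch of distance-based green-screen keying. (This is not Isadora's actual Chromakey implementation, just a generic illustration; the threshold value is chosen arbitrarily.)

        ```python
        import numpy as np

        def chroma_key(rgb, key=(0, 255, 0), threshold=120.0):
            """Return an RGBA image where pixels close to the key color become transparent.

            rgb: uint8 array of shape (H, W, 3)
            key: the background color to remove (pure green here)
            threshold: Euclidean RGB distance below which a pixel is keyed out
            """
            dist = np.linalg.norm(rgb.astype(np.float32) - np.float32(key), axis=-1)
            alpha = np.where(dist < threshold, 0, 255).astype(np.uint8)
            return np.dstack([rgb, alpha])

        # Tiny demo: a 1x2 image with one pure-green pixel and one skin-tone pixel
        frame = np.array([[[0, 255, 0], [200, 150, 120]]], dtype=np.uint8)
        out = chroma_key(frame)
        print(out[0, 0, 3], out[0, 1, 3])  # green pixel -> 0, skin pixel -> 255
        ```

        Real keyers also compute soft (partial) alpha near the threshold rather than the hard cut shown here.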

        TroikaTronix Technical Support
        New Support Ticket Link: https://support.troikatronix.com/support/tickets/new
        TroikaTronix Support Policy: https://support.troikatronix.com/support/solutions/articles/13000064762
        TroikaTronix Add-Ons Page: https://troikatronix.com/add-ons/

        | Isadora 2.6.1 + 3 | Mac Pro (Late 2013), macOS 10.14.6, 3.5GHz 6-core, 1TB SSD, 64GB RAM, Dual AMD FirePro D700s | Macbook Pro (Retina, 15", Mid 2015), macOS 10.11.4, 2.8GHz Intel Core i7, 16GB RAM, Intel Iris Pro 1536 MB |

        • Skulpture
          Skulpture Izzy Guru @liannemua last edited by

          @liannemua said:

          Hi, curious if this exists in Isadora, or if not, it would be great for live virtual theater. The ability to recognize & key out the background of talking head feeds imported live from Skype - like the virtual backgrounds in Zoom & Skype, but to replace the background with alpha. Thanks.

           Give this a try: https://www.chromacam.me/ 

          Graham Thorne | www.grahamthorne.co.uk
          RIG 1: Windows 11, AMD 7 Ryzen, RTX3070, 16gig RAM. 2 x M.2 SSD. HD. Lenovo Legion 5 gaming laptop.
          RIG 2: Windows 11, Intel i9 12th Gen. RTX3070ti, 16gig RAM (DDR5), 1x M.2 SSD. UHD DELL G15 Gaming laptop.
          RIG 3: Apple rMBP i7, 8gig RAM 256 SSD, HD, OS X 10.12.12

          • Woland
            Woland Tech Staff last edited by Woland

            While we're on the topic, does anyone know of any open-source, cross-platform tools for this? Having open-source code as a starting point would both decrease the difficulty, and increase the likelihood, of being able to incorporate this as a native Isadora feature.


            Also, I've logged this as a feature request.

            • Skulpture
              Skulpture Izzy Guru last edited by

              Also: https://www.xsplit.com/vcam

              • tomthebom
                tomthebom @Woland last edited by

                @Woland said:

                Also, I've logged this as a feature request.

                I couldn't agree more: I find it a "must-have" in Corona times. I am just checking out Skulpture's tip: https://www.chromacam.me/. 30 bucks for the full version and a very slow download are not very promising ;o(

                Izzy 3.2.6 ARM on MBP14'/2021/M1 Pro/ macOS 12.3

                  • Aolis last edited by

                  Just tried out ChromaCam and got it working in just a few moments. May be worth the cost if needed.

                  Media Artist & Teacher
                  Trashcan Late 2012, Mojave / MacBook Pro 2019 5600M, Catalina

                  • Kathmandale
                    Kathmandale @liannemua last edited by

                    @liannemua If you get your remote performers to set their virtual backgrounds to a pure green image, the keying in Isadora works perfectly. Even if their local laptops aren't up to it and they have to tick the 'I have a greenscreen' option in Zoom, you can get really good results that way. It's how we made Airlock, and we're using it as a technique on other projects.

                    To be honest, I actually prefer the results you get with the 'I have a greenscreen' option to the 'head and shoulders recognition' option. Zoom seems to do a pretty good job of keying out imperfect (or imperfectly lit) greenscreens (or green sheets, or blue walls, or whatever your performers can get in front of). If you then set their virtual background to an all-green image, it's really easy to get the settings just right in Isadora. It also means you don't get that thing of hands, arms, hats, props or whole performers disappearing occasionally.
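                    One wrinkle with real (or imperfectly keyed) green backgrounds is green spill tinting the subject's edges. A common despill trick, sketched here in NumPy purely as an illustration (not anything Isadora or Zoom documents doing), is to clamp the green channel so it never exceeds the larger of red and blue:

                    ```python
                    import numpy as np

                    def suppress_green_spill(rgb):
                        """Clamp the green channel to max(R, B) per pixel.

                        Green light bouncing onto a subject's edges shows up as
                        G > max(R, B); limiting G there removes the green tint
                        without affecting neutral or warm-toned pixels.
                        """
                        rgb = rgb.astype(np.float32)
                        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
                        g_clamped = np.minimum(g, np.maximum(r, b))
                        return np.dstack([r, g_clamped, b]).astype(np.uint8)

                    # A greenish spill pixel and a neutral grey pixel
                    px = np.array([[[120, 200, 110], [128, 128, 128]]], dtype=np.uint8)
                    print(suppress_green_spill(px)[0, 0])  # green clamped from 200 to 120
                    ```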

                    2014 MBP Mojave 10.14.6 OS with 16GB, 2.5Ghz i7 quad core, Intel Iris Pro 1536 & Geforce GT 750m 2GB - Izzy 3.0.8
                    Gigabyte Brix Windows 10 with 32GB, i7-6700 quad core, 4GB GeForce GTX 950 - Izzy 3.0.8
                    Based in Manchester, UK.

                    • mark
                      mark @Woland last edited by mark

                      @woland said:

                      While we're on the topic, does anyone know of any open-source, cross-platform tools for this?

                       I did some poking around. The algorithms that remove an arbitrary background all rely on artificial-intelligence systems trained on datasets of people in front of webcams. You can get a sense of the complexity by looking at this Background Matting GitHub project, or at this article where the author implements background removal using Python and TensorFlow (AI) tools.
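                      What those matting models actually estimate is a per-pixel alpha matte; once you have one, putting a subject over a new background is the standard "over" composite. A minimal sketch (my own illustration of the textbook equation, not code from either project):

                      ```python
                      import numpy as np

                      def composite(fg, bg, alpha):
                          """Standard alpha-matte composite: C = alpha * F + (1 - alpha) * B.

                          fg, bg: float arrays in [0, 1], shape (H, W, 3)
                          alpha:  float matte in [0, 1], shape (H, W, 1) -- estimating this
                                  per pixel is the hard part the AI models solve.
                          """
                          return alpha * fg + (1.0 - alpha) * bg

                      fg = np.ones((1, 1, 3))        # white foreground pixel
                      bg = np.zeros((1, 1, 3))       # black background pixel
                      a = np.full((1, 1, 1), 0.25)   # 25% foreground coverage
                      print(composite(fg, bg, a))    # -> 0.25 grey
                      ```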

                      So, what I'm trying to say here is that this is a major project that would require my entire attention. If the program mentioned above works for $30, I'd say that's a reasonable cost given how much work it would be to implement such a feature. I wish we had unlimited programming resources to take this on, but it's not realistic for us at the moment.

                      Best Wishes,
                      Mark

                      Media Artist & Creator of Isadora
                      Macintosh SE-30, 32 Mb RAM, MacOS 7.6, Dual Floppy Drives

                      • mark
                        mark @mark last edited by mark

                        P.S. One further note:

                        This article from our friends at TouchDesigner describes how you can use the Photo app in iOS to remove the background and send the image to a desktop computer. (The example is for Windows, but there is a "For Mac Users" section. They mention using CamTwist, but you should use our free Syphon Virtual Webcam to get Isadora's Syphon output into Zoom.)

                        However, this example uses NDI to capture the iPhone screen so there is going to be a substantial delay.

                        Best Wishes,
                        Mark

                        • liminal_andy
                          liminal_andy @mark last edited by liminal_andy

                          @mark said:

                           I did some poking around. The algorithms to remove an arbitrary background all require training artificial intelligence systems using datasets of people in front of web cams to work. You can get a sense of the complexity by looking at this Background Matting GitHub project (https://github.com/senguptaumd...) or this article (https://elder.dev/posts/open-s...) where this person implements the background removal using Python and Tensorflow (AI) tools.

                          In furtherance of this subject, I did come across an interesting project using TensorFlow's BodyPix, which is a popular framework for this type of work but not super helpful for us Izzy users. If I get some time, I am going to bolt Spout/Syphon onto this and try to make it cross-platform, and maybe speed it up some if I can. I'm imagining you'd output a stage to a shared memory buffer, then select the stage in a console, and it would post an alpha mask to another shared buffer.
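                          The round trip I'm imagining (stage into one shared buffer, mask out through a second) could be sketched in Python with `multiprocessing.shared_memory`. This is purely a shape-of-the-idea sketch: the buffer names and the thresholding stand-in for a real segmentation model are made up.

                          ```python
                          import numpy as np
                          from multiprocessing import shared_memory

                          H, W = 4, 4  # tiny "stage" for illustration

                          # Producer side: write a frame into one shared buffer
                          frame = np.random.randint(0, 256, (H, W, 3), dtype=np.uint8)
                          shm_in = shared_memory.SharedMemory(create=True, size=frame.nbytes, name="izzy_stage")
                          np.ndarray(frame.shape, dtype=frame.dtype, buffer=shm_in.buf)[:] = frame

                          # "Segmentation" side: read the frame, compute a (dummy) mask,
                          # post it to a second shared buffer
                          view = np.ndarray((H, W, 3), dtype=np.uint8, buffer=shm_in.buf)
                          mask = (view.mean(axis=-1) > 127).astype(np.uint8) * 255  # stand-in for a real model
                          shm_out = shared_memory.SharedMemory(create=True, size=mask.nbytes, name="izzy_mask")
                          np.ndarray(mask.shape, dtype=mask.dtype, buffer=shm_out.buf)[:] = mask

                          # Consumer (Isadora-side) would read "izzy_mask" and feed an Alpha Mask actor
                          result = np.ndarray((H, W), dtype=np.uint8, buffer=shm_out.buf).copy()
                          shm_in.close(); shm_in.unlink()
                          shm_out.close(); shm_out.unlink()
                          ```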

                          Would love to discuss this further as I do see it being helpful for online work. I worked a fair amount on soft body projection mapping / masking in pre-covid times, using pixel energy functions as heuristics to accelerate these detection algorithms. I find that for virtual shows, I am pulling out / modding more GLSL shaders than I normally do and thinking critically about compositing, and background segmentation is often a key part of that :)

                          Somehow, it all comes together.

                          Andy Carluccio
                          Zoom Video Communications, Inc.
                          www.liminalet.com

                          [R9 3900X, RTX 2080, 64GB DDR4 3600, Win 10, Izzy 3.0.8]
                          [...also a bunch of hackintoshes...]

                          • liminal_andy
                            liminal_andy @liminal_andy last edited by

                            So the following is purely for fun, in response to @mark's post imagining how this would be done. I did follow up on this over the weekend and got something "working".

                            I heavily modified the project I mentioned earlier by manually rolling it over to the TensorFlow Lite c_api (a real pain!), then porting it to Windows and feeding it the deeplabv3_257_mv_gpu.tflite model. To make it useful to Isadora, I dusted off / updated an OpenCV-to-Spout pipeline in C++ that I used a few years ago for some of my live projection masking programs, so now my prototype can receive an Isadora stage, run it through the model, and output the resulting mask to Spout again for Isadora to use with the Alpha Mask actor.

                            My results:

                            Now obviously, this is insane to actually attempt for production purposes in its current form. I'm getting about 5fps (granted, with no GPU acceleration and running in debug mode). I could slightly improve things by bouncing the original Isadora stage back on its own Spout server, but this is just a proof of concept. In this state, it should be relatively easy to port to Mac/Syphon and add GPU acceleration on compatible systems for higher FPS and/or multiple instances for many performers.
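                            One cheap way to claw back frames per second, regardless of GPU support, is to run the model on a downscaled frame and upscale the mask back to stage resolution afterwards, since segmentation masks tolerate low resolution far better than the image does. A NumPy sketch of that idea (my own illustration, not part of the prototype above):

                            ```python
                            import numpy as np

                            def downscale(img, factor):
                                """Cheap box downscale by an integer factor (run the model on this)."""
                                H, W = img.shape[:2]
                                return img[:H - H % factor, :W - W % factor].reshape(
                                    H // factor, factor, W // factor, factor, -1).mean(axis=(1, 3))

                            def upscale_mask(mask, factor):
                                """Nearest-neighbour upscale of a low-res mask back to stage size."""
                                return mask.repeat(factor, axis=0).repeat(factor, axis=1)

                            small = np.zeros((4, 4), dtype=np.uint8)
                            small[1:3, 1:3] = 255            # pretend this came out of the model
                            full = upscale_mask(small, 4)    # back to 16x16 for the Alpha Mask actor
                            print(full.shape)  # (16, 16)
                            ```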


                            Again, just a fun weekend project but I found it very educational. 
