Alpha Mask with Live Video

How To... ?
• cbr372

  OK, so I am fairly new to Izzy, but so far I am loving it. I have an idea that I have been trying to accomplish, and I feel that I am close but just not getting the results I want.

  What I am trying to do is use live video of a person (a dancer) as an alpha mask, so that the silhouette of the dancer is translucent and the background is opaque (black, preferably). Then I want to play content through the mask, so that video content plays in the form of the live dancer.
  I have played with the Threshold and Alpha Mask actors, but because the light levels are uneven and the background is not a solid one-color wall, I am unable to get a clear form of the dancer.
  So is there a way to do what I am thinking about? Or does it need to be something more involved, like tracking with a Kinect?
  Thanks guys!
  -Drew
• Michel (Izzy Guru)

  I am writing this from my phone, so I can't give you a finished solution, but try using the Desaturate actor and the Contrast Adjust actor; maybe that helps you get a consistent mask. And yes, it would be very easy with a Kinect. Best, Michel

  Michel Weber | www.filmprojekt.ch | rMBP (2019) i9, 16gig, AMD 5500M 8 GB, OS X 10.15 | located in Winterthur Switzerland.

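Outside Isadora, Michel's desaturate-then-threshold idea can be sketched in a few lines of Python + OpenCV. This is only a rough illustration of the luma-key concept, not the Isadora actors themselves; the camera index 0 and the file name "content.mov" are placeholders, not anything from this thread.

    # Rough luma-key sketch (Python + OpenCV). Camera index 0 and
    # "content.mov" are placeholders.
    import cv2

    cam = cv2.VideoCapture(0)                  # live feed of the dancer
    content = cv2.VideoCapture("content.mov")  # footage to show inside the silhouette

    while True:
        ok_cam, frame = cam.read()
        ok_con, clip = content.read()
        if not ok_con:                         # loop the content clip when it ends
            content.set(cv2.CAP_PROP_POS_FRAMES, 0)
            continue
        if not ok_cam:
            break

        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)            # "desaturate"
        boosted = cv2.convertScaleAbs(gray, alpha=2.0, beta=-60)  # crude contrast boost
        # THRESH_BINARY assumes the dancer is lit brighter than the background;
        # use THRESH_BINARY_INV if it is the other way around.
        _, mask = cv2.threshold(boosted, 128, 255, cv2.THRESH_BINARY)

        clip = cv2.resize(clip, (frame.shape[1], frame.shape[0]))
        out = cv2.bitwise_and(clip, clip, mask=mask)  # content only where the mask is white
        cv2.imshow("masked content", out)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cam.release()
    content.release()
    cv2.destroyAllWindows()

Like the Isadora actor chain, this kind of luminance key depends on even lighting; the Kinect and IR suggestions later in the thread avoid that dependency.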
• JJHP3

  Well, I'm no expert, but little is "easy" with a Kinect and live video feeds, in my experience. Doable, but I had to use two MacBook Pro computers, Syphon, etc. I took the Kinect image into Isadora and processed it using color keying to combine the Kinect's "blobby" take on the human body with other footage. Go for it!

  John

  MBP'21- M1-Max; Sonoma 14.7; Isadora: 4.0.7 (ARM) ---- artwork: terragizmo.net

• cbr372

  I tried the Contrast Adjust and Desaturate actors and it helped a bit, but since the chain of actors still revolves around luminance, I am unable to key out other bright spots. I am currently attempting it in my apartment, right next to the window, so I imagine that once I'm in an environment with more controlled color and luminance, that will help. I'm essentially looking for the magic lasso tool from Photoshop, but for live video.

  http://notabenevisual.com/?p=443
  The link above is to a video of the concept I am trying to achieve. My thinking is that they somehow keyed out the bodies, made a mask from that, and projected the silhouettes directly onto the wall.
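The "magic lasso for live video" cbr372 is after is closest, outside Isadora, to background subtraction: the computer learns what the empty room looks like and keys out anything that was not there before. Nobody in the thread uses this; it is only a sketch of that alternative technique in Python + OpenCV, assuming a fixed camera on index 0.

    # Background-subtraction sketch (Python + OpenCV). Assumes a camera that
    # does not move, so the subtractor can learn the static room.
    import cv2

    cam = cv2.VideoCapture(0)
    subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                                    detectShadows=False)

    while True:
        ok, frame = cam.read()
        if not ok:
            break
        fg = subtractor.apply(frame)   # 0 = learned background, 255 = moving body
        fg = cv2.medianBlur(fg, 5)     # knock out speckle noise
        silhouette = cv2.bitwise_and(frame, frame, mask=fg)
        cv2.imshow("foreground mask", fg)
        cv2.imshow("silhouette", silhouette)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cam.release()
    cv2.destroyAllWindows()

Because the key is based on change against a learned background rather than on brightness, windows and hot spots in the room stop mattering as long as they stay still.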
• vanakaru

  I would go for IR-lighting the screen and an IR-capable camera, which will see the moving body as a shadow. You may need to filter the IR from the projector.

  MBP 4.1 & MBP (Retina, Mid 2012) MBP Retina 2017

• stj

  This thread has a lot of information that might help you:

  http://troikatronix.com/community/#/discussion/265/tracking-two-dancers

• Michel (Izzy Guru)

  THIS YouTube video that Mark posted about the technology used in 16 Revolutions can help you as well.

  Best,
  Michel

  Michel Weber | www.filmprojekt.ch | rMBP (2019) i9, 16gig, AMD 5500M 8 GB, OS X 10.15 | located in Winterthur Switzerland.

• kevinjesuino

  Hi there,

  I'm in the same boat, except I am using a Kinect + Isadora.
  I'm trying to create a similar effect to this one created by Graham:
  http://www.youtube.com/watch?v=Spd77d6yZ-s&list=UUhr7aQm3W7xQmn4YblveryA&index=21
  Could someone give me a lead, please?
• Skulpture (Izzy Guru)

  I never saved that patch; it was a quick demo I did for some ex-students, but it is something along these lines (open it in a new tab, it's quite big): 0c5c98-screen-shot-2013-03-13-at-08.54.37.png

  Graham Thorne | www.grahamthorne.co.uk
  RIG 1: Custom-built PC: Windows 11. Ryzen 7 7700X, RTX3080, 32G DDR5 RAM. 2 x m.2.
  RIG 2: Laptop Dell G15: Windows 11, Intel i9 12th Gen. RTX3070ti, 16G RAM (DDR5), 2 x NVME M.2 SSD.
  RIG 3: Apple Laptop: rMBP i7, 8gig RAM 256 SSD, HD, OS X 10.12.12

• eight

  @Skulpture I noticed in your video that the mask has a nicely defined, smooth edge, compared to what the raw data from the Kinect will give you. Do the Gaussian Blur and Motion Blur actors do that, or is it NI mate? What happens without the blur filters?

  Thanks.
  --8

  Analysis: http://post.scriptum.ru | Synthesis: http://onewaytheater.us
  Twitter: https://twitter.com/eight_io | Flickr: http://www.flickr.com/photos/eight_io/
  Github: https://github.com/eighteight | MulchCam: https://mulchcam.com
  MulchTune: https://itunes.apple.com/us/app/mulch-tune/id1070973465 | Augmented Theatre: https://augmentedtheatre.com

• Skulpture (Izzy Guru)

  Yeah, it nearly always needs some blur.

  I used to knock old CCTV cameras slightly out of focus to get similar results, rather than adding blur in the software. But with ever-faster machines I started adding it in the software. Normally a Gaussian Blur set to 1 or 2 works nicely.

  Graham Thorne | www.grahamthorne.co.uk
  RIG 1: Custom-built PC: Windows 11. Ryzen 7 7700X, RTX3080, 32G DDR5 RAM. 2 x m.2.
  RIG 2: Laptop Dell G15: Windows 11, Intel i9 12th Gen. RTX3070ti, 16G RAM (DDR5), 2 x NVME M.2 SSD.
  RIG 3: Apple Laptop: rMBP i7, 8gig RAM 256 SSD, HD, OS X 10.12.12

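For anyone doing the same step outside Isadora, the "blur the hard mask" idea Graham describes looks roughly like this in Python + OpenCV. The raw_mask name is a placeholder for the hard-edged body mask coming off the Kinect/threshold stage, not anything from his patch.

    # Feathering a hard-edged mask (Python + OpenCV). `raw_mask` is a
    # placeholder for an 8-bit, single-channel body mask.
    import cv2
    import numpy as np

    def soften_mask(raw_mask: np.ndarray, radius: int = 7) -> np.ndarray:
        """Blur the jagged edge of a binary mask so composites look smooth."""
        k = radius if radius % 2 == 1 else radius + 1   # kernel size must be odd
        return cv2.GaussianBlur(raw_mask, (k, k), 0)

    # Usage sketch: feathered = soften_mask(raw_mask)
    # out = (content * (feathered[..., None] / 255.0)).astype(np.uint8)

A small radius here plays the same role as the low Gaussian Blur settings Graham mentions; without it the composite shows the stair-stepped edge eight asked about.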
• David

  Hi guys,
  Thanks to @Skulpture I was able to create this new interactive piece: Indicible Camouflage.
  I'm presenting it next week at a jazz event, and I realise that in NI mate only the active user's alpha is active.
  Is there a way to capture all the alphas, no matter how many people stand in front of the device?

  Thank you all for your great knowledge!

  David

• Skulpture (Izzy Guru)

  Glad I've helped you, David. Be sure to take some pictures and I will put them on my blog.

  :-)

  Graham Thorne | www.grahamthorne.co.uk
  RIG 1: Custom-built PC: Windows 11. Ryzen 7 7700X, RTX3080, 32G DDR5 RAM. 2 x m.2.
  RIG 2: Laptop Dell G15: Windows 11, Intel i9 12th Gen. RTX3070ti, 16G RAM (DDR5), 2 x NVME M.2 SSD.
  RIG 3: Apple Laptop: rMBP i7, 8gig RAM 256 SSD, HD, OS X 10.12.12

• David

  Thanks @Skulpture, I will do so!!

  :-)
• bruno

  Hello Skulpture, I tried your patch, but instead of the Syphon actors I used the Kinect actor and the Kinect. My goal is to project video onto a dancer. Do you think that's possible with the Kinect? Where can I find the QC Syphon actors? I'm sending you a picture of my patch; just have a look at the right side.

  Thank you for all your work.
  Best wishes,
  Bruno

  b87947-kinnect.tiff

• Skulpture (Izzy Guru)

  @bruno

  It looks like you will need a Threshold actor; this will boost the white and make the black/grey more black (if that makes sense?).
  Put it directly after the output of the Kinect actor.

  Graham Thorne | www.grahamthorne.co.uk
  RIG 1: Custom-built PC: Windows 11. Ryzen 7 7700X, RTX3080, 32G DDR5 RAM. 2 x m.2.
  RIG 2: Laptop Dell G15: Windows 11, Intel i9 12th Gen. RTX3070ti, 16G RAM (DDR5), 2 x NVME M.2 SSD.
  RIG 3: Apple Laptop: rMBP i7, 8gig RAM 256 SSD, HD, OS X 10.12.12

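As a rough equivalent outside Isadora, "a threshold straight after the Kinect output" can be sketched like this in Python + OpenCV. The depth_8bit name is a placeholder for the Kinect image already scaled to 8-bit grayscale; how it is acquired (NI mate, Syphon, etc.) is left out.

    # Threshold right after the Kinect output (Python + OpenCV sketch).
    # `depth_8bit` is a placeholder: the Kinect image as 8-bit grayscale,
    # with the body appearing brighter than the background.
    import cv2
    import numpy as np

    def kinect_body_mask(depth_8bit: np.ndarray, cutoff: int = 60) -> np.ndarray:
        """Push the body to pure white and crush the grey background to black."""
        _, mask = cv2.threshold(depth_8bit, cutoff, 255, cv2.THRESH_BINARY)
        return mask

    # The resulting mask can then gate the content: white = dancer = show video,
    # black = everything else = stay dark.

The cutoff value plays the same role as the Threshold actor's setting and usually needs tuning per space.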
• cocoberdu

  Hello everybody.

  I have some trouble using the Kinect with my new MacBook Pro.
  I have an NI mate licence, but it crashes at launch, just like Synapse...
  I plugged the Kinect into a USB 3 port on my Mac, and I think that's the problem,
  but I don't have any USB 2 ports...
  Do you know a solution for that?
  Thanks a lot.

  MBP i5, 16 GB, OS X 10.9.4, Isadora 1.5.3f28
  website: www.desequilibres.com

• Fred

  I use a Kinect on a new MacBook with only USB 3 ports, and I have no problems.

  http://www.fredrodrigues.net/
  https://github.com/fred-dev
  OSX 13.6.4 (22G513) MBP 2019 16" 2.3 GHz 8-Core i9, Radeon Pro 5500M 8 GB, 32g RAM
  Windows 10 7700K, GTX 1080ti, 32g RAM, 2tb raided SSD

• Armando (Beta Gold)

  Me too, no problems.

  Armando Menicacci
  www.studiosit.ca
  MacBook Pro 16-inch, 2021 Apple M1 Max, RAM 64 GB, 4TB SSD, Mac OS Sonoma 14.4.1 (23E224)

• Skulpture (Izzy Guru)

  It won't be the USB ports. It will be software/driver related.

  Graham Thorne | www.grahamthorne.co.uk
  RIG 1: Custom-built PC: Windows 11. Ryzen 7 7700X, RTX3080, 32G DDR5 RAM. 2 x m.2.
  RIG 2: Laptop Dell G15: Windows 11, Intel i9 12th Gen. RTX3070ti, 16G RAM (DDR5), 2 x NVME M.2 SSD.
  RIG 3: Apple Laptop: rMBP i7, 8gig RAM 256 SSD, HD, OS X 10.12.12
