    How to make the content of a mask follow the mask

    How To... ?
    • gapworks
      gapworks last edited by

      I have a fairly simple setup: a ghost image from Processing / Kinect is the mask, and a Text Draw actor scrolls the "karaoke"-style text vertically. The dancer was in a fairly static position at the beginning, but now moves quite a bit on stage.

      How can I make the text follow the x/y of the Syphon mask? Do I need to do this in Processing, or is there a way to do it in Isadora? I would prefer Isadora, because I'm new to Processing.
      Thanks for your help.

      Running MBP2017 / Ventura Osx 13.6.7 / 16 GB 2133 MHz LPDDR3 / Intel HD Graphics 630 1536 MB / Latest Isadora Version / www.gapworks.at / located in Vienna Austria

      • Fred
        Fred last edited by

        You can use eyes++ to get the position of the mask in Isadora; however, it may well be better to do this in Processing with OpenCV, which will give you quite a bit of information about the position and shape of the mask.

        http://www.fredrodrigues.net/
        https://github.com/fred-dev
        OSX 13.6.4 (22G513) MBP 2019 16" 2.3 GHz 8-Core i9, Radeon Pro 5500M 8 GB, 32g RAM
        Windows 10 7700K, GTX 1080ti, 32g RAM, 2tb raided SSD

        • gapworks
          gapworks last edited by

          @Fred

          How would I connect the Syphon receiver with eyes++? So far I have only used it with a webcam. Not being a programmer, I think I need a little more detailed information :(
          Sample patches are the easiest way to build understanding and knowledge of Isadora; a more detailed description would help too.
          Best

          • Fred
            Fred last edited by

            You will need to use the GPU to CPU video converter; that is one thing about this method that is not as good as doing it in Processing. It will be more efficient to run the tracking in Processing and send the values via OSC to Isadora, because the GPU to CPU conversion in Isadora is pretty taxing on your computer.

            There is a pretty simple Processing example called "contours" that will do what you need; a minimal sketch along those lines follows.
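            Something along these lines is the idea - a minimal sketch, assuming Greg Borenstein's "OpenCV for Processing" library and the built-in video library are installed; the webcam source and the threshold value are illustrative stand-ins for your ghost image:

            import gab.opencv.*;
            import processing.video.*;

            Capture video;
            OpenCV opencv;

            void setup() {
              size(640, 480);
              video = new Capture(this, 640, 480);
              opencv = new OpenCV(this, 640, 480);
              video.start();
            }

            void draw() {
              if (video.available()) video.read();
              opencv.loadImage(video);
              opencv.gray();           // contours are found on a grayscale/binary image
              opencv.threshold(70);    // tune this for your ghost image
              image(video, 0, 0);
              noFill();
              stroke(0, 255, 0);
              for (Contour c : opencv.findContours()) {
                c.draw();              // outlines every shape found in the frame
              }
            }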

            • gapworks
              gapworks last edited by

              @Fred ...within my Processing 2 examples there is no "contours" :( Nor can I find it via Add Library.

              • Fred
                Fred last edited by

                You need to add the OpenCV library to Processing, and then you will find a contour example (or something close to that name; I am not next to it now). It is a bit of a jump from Isadora, but there is a lot of good help on the Processing forums, and tutorials to get you going. Oh, and I am using Processing's latest version, 3.

                • gapworks
                  gapworks last edited by

                  @Fred Thanks again for your support, but I'll stick with Processing 2, as all the Kinect/Isadora support is based on Processing 2, as far as I understand.

                  I'll give it a try tomorrow...

                  • Fred
                    Fred last edited by

                    There is OpenCV for Processing 2; I was just looking at 3, as that is what I have. It will be pretty much the same.

                    • gapworks
                      gapworks last edited by

                      @Fred I managed to install it, but this sketch simply finds the contours of an image. It won't move my text or follow the ghost image of my Kinect. Maybe I explained it badly.

                      I attached a pic and a short movie for better understanding. eyes++ sounds OK too, but I have no clue where to start. The Syphon receiver provides me with a single video out; what comes after that? :( Or can I define a "ghost image" like a camera? I have a week to go, but if worst comes to worst I'll shift the text with a slider, although that's a fake then.

                      6be337-screen-shot-2016-02-25-at-10.33.35.png 42aeab-text2.zip

                      • crystalhorizon
                        crystalhorizon Beta Platinum last edited by

                        NI mate sends OSC and a ghost image.

                        Alexander Nantschev | http://www.crystalhorizon.at | located in Vienna Austria

                        • gapworks
                          gapworks last edited by

                          @crystalhorizon I'm trying to avoid NI mate, as it has not proven to be very stable on my MBP. And I like the possibilities of Processing. Unfortunately I'm a newbie to this program. Yet ;)

                          • Fred
                            Fred last edited by

                            I know what you want to do; there is a little more work to do than just opening the example. What happens is that the ContourFinder iterates through all the contours (the shapes found in a video frame). At the moment the example just gets all the points of each contour and draws them. However, for each contour you can also call contour.getBoundingBox(); this returns the smallest rectangle that fits around the shape of the contour. From this rectangle you can calculate the centre of the rectangle, and hence the centre of the contour, and hence the centre of the body (well, this will change depending on the positions of the limbs).

                            That should get you pretty close to what you need. You will probably have to add some smoothing: when the body does not move but, say, the arms are outstretched, the centre position will still change, because it is derived from the bounding box of the whole shape. See the sketch below.
                            You can also do some more precise calculation based on the spread of the points that make up the contour, if you are feeling tricky.
                            There is another great free, open-source option for doing this instead of Processing: http://www.tsps.cc/ - this is a fantastic product and, in my opinion, better than the paid NI mate: more options, more intelligent, and of course open source if you wish to make changes.
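                            Coming back to the Processing route, here is a hedged sketch of the bounding-box idea, building on the contours example above. The largest-blob assumption, the smoothing factor and the channel numbers are my choices rather than part of any example; port 1234 is Isadora's default OSC input port:

                            import gab.opencv.*;
                            import processing.video.*;
                            import oscP5.*;
                            import netP5.*;
                            import java.awt.Rectangle;

                            Capture video;
                            OpenCV opencv;
                            OscP5 osc;
                            NetAddress isadora;
                            float smoothX, smoothY;   // smoothed centre, normalised 0..1

                            void setup() {
                              size(640, 480);
                              video = new Capture(this, 640, 480);
                              opencv = new OpenCV(this, 640, 480);
                              video.start();
                              osc = new OscP5(this, 9000);                  // local listening port, arbitrary
                              isadora = new NetAddress("127.0.0.1", 1234);  // Isadora's default OSC input port
                            }

                            void draw() {
                              if (video.available()) video.read();
                              opencv.loadImage(video);
                              opencv.gray();
                              opencv.threshold(70);
                              image(video, 0, 0);

                              // Assume the dancer is the largest contour in the frame
                              Contour biggest = null;
                              for (Contour c : opencv.findContours()) {
                                if (biggest == null || c.area() > biggest.area()) biggest = c;
                              }
                              if (biggest != null) {
                                Rectangle box = biggest.getBoundingBox();
                                float cx = (box.x + box.width * 0.5f) / width;    // centre of the bounding box,
                                float cy = (box.y + box.height * 0.5f) / height;  // normalised to 0..1
                                smoothX = lerp(smoothX, cx, 0.1f);  // simple smoothing against limb movement
                                smoothY = lerp(smoothY, cy, 0.1f);

                                // One value per Isadora channel (/isadora/1 -> channel 1, etc.)
                                OscMessage mx = new OscMessage("/isadora/1");
                                mx.add(smoothX);
                                osc.send(mx, isadora);
                                OscMessage my = new OscMessage("/isadora/2");
                                my.add(smoothY);
                                osc.send(my, isadora);

                                noFill(); stroke(255, 0, 0);
                                rect(box.x, box.y, box.width, box.height);
                              }
                            }

                            In Isadora, two OSC Listener actors set to channels 1 and 2 would then pick up the smoothed centre and can drive the position of the text.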

                            • dbini
                              dbini last edited by

                              @gapworks NI mate v1 isn't coping well with the new USB structure in OSX 10.11 El Capitan, and v2 has a problem with Kinect sensors plugged into USB3 ports. Delicode are working on the latter problem; I assume the former isn't going to be solved, as v1 is no longer supported. Good luck with Processing.

                              John Collingswood
                              taikabox.com
                              2019 MBPT 2.6GHZ i7 OSX15.3.2 16GB
                              plus an old iMac and assorted Mac Minis for installations

                              • Marci
                                Marci last edited by

                                @gapworks The Kinect/Processing tutorial sketch (I assume that's what you're using) sends the torso position over OSC by default... use that to control your text's X/Y position.

                                From the default Izzy file within the download, add two OSC Listener actors, listening to channels 7 (x) and 8 (y). Link the value outputs of these to the x/y position of your Text Draw actor. Use Calc actors in between to apply any required offsets or scaling to the values. The torso z position (distance from sensor) should be on OSC channel 9 if needed. A sketch of the sending side follows.
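                                For orientation, the sending side of such a sketch would presumably look something like this - a hypothetical fragment, with only the channel numbers 7-9 taken from the post above, and the osc / isadora objects set up as in the earlier sketch:

                                void sendTorso(float x, float y, float z) {
                                  // Torso x/y/z land on Isadora channels 7, 8 and 9 via
                                  // Isadora's default /isadora/<channel> address space.
                                  OscMessage mx = new OscMessage("/isadora/7");
                                  mx.add(x);
                                  osc.send(mx, isadora);
                                  OscMessage my = new OscMessage("/isadora/8");
                                  my.add(y);
                                  osc.send(my, isadora);
                                  OscMessage mz = new OscMessage("/isadora/9");
                                  mz.add(z);
                                  osc.send(mz, isadora);
                                }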

                                rMBP 11,3 (mOS 10.13) / rMBP 11,4 (mOS 10.14) / 3x Kinect + Leap / TH2Go
                                Warning: autistic - may come across rather blunt and lacking in humour!

                                • Marci
                                  Marci last edited by

                                  If you remove the following two lines (365 & 367) from the Processing sketch:

                                  canvas.stroke(userClr[ (userList[i] - 1) % userClr.length ]);
                                  drawSkeleton(userList[i]);

                                  ...then you can run with skeletons enabled without actually drawing them on screen, which will initialise the sending of the OSC data.
                                  If the centre of gravity still shows up (that should say WHEN, because it will), then delete lines 375 to 390 as well.
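                                  In context, the edited section might look like this; the loop structure is assumed from the SimpleOpenNI user-tracking examples, and only the two removed calls and the line numbers come from the post above:

                                  int[] userList = context.getUsers();
                                  for (int i = 0; i < userList.length; i++) {
                                    // canvas.stroke(userClr[ (userList[i] - 1) % userClr.length ]);  // line 365, removed
                                    if (context.isTrackingSkeleton(userList[i])) {
                                      // drawSkeleton(userList[i]);                                   // line 367, removed
                                    }
                                    // The centre-of-gravity drawing (lines 375 to 390) can be deleted the same way.
                                  }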

                                  • Marci
                                    Marci last edited by

                                    @dbini - I sent them links to the published fixes for the whole USB3 issue probably over a year ago (possibly over two years ago, now that I think about it), along with links to fixes for the Kinect motor not working. I was told then that they wouldn't fix the issue. These issues have been around since NI mate v1 and are all down to the version of libusb that they're utilising - the same issue as with the SimpleOpenNI Processing Kinect modules. I'm presuming they don't have the ability to update the version or implementation of libusb that's included within the OpenNI v1 libs that are necessary for the Kinect v1 skeleton support, which means it probably won't ever get fixed in v2 and they'll end up dropping Kinect v1 support. The specific problems are within Delicode_NI_Mate/Contents/Frameworks/libOpenNI.dylib, libusb-1.0.0.dylib and stable_libusb-1.0.0.dylib.

                                    S'why I ended up abandoning several hundred pounds' worth of investment in their software (NI mate & Z-Vector) - they weren't prepared to fix known bugs, or even add an announcement to warn future purchasers of the unavoidable compatibility issues - a practice they continued when they released v2 with no announcement re: USB2 Kinect & USB3 ports. S'why I moved over to Processing for Kinect stuff in the first place. *shrug* - Joys of commercial software relying on out-of-development open-source software.
                                    TL;DR: ANY & ALL software relying on OpenNI 1 to provide Kinect rev1 support will have issues with USB3 ports. That means any & all software which provides skeleton output from a rev1 Kinect.

                                    • dbini
                                      dbini last edited by

                                      @Marci - this kind of stuff is all beyond my capabilities. I read your posts and am constantly impressed by your level of detail. Thanks for your contributions to the Isadora community.

                                      I just want a toolkit that works, and I don't mind paying a bit for something that's going to be plug-and-play and solve my problems. I do object to buying a license for something that's going to be useless in one or two years. Here's what Delicode said in reply to my questions:
                                      "OS X El Capitan is still proving to be a huge problem due to the operating system USB system having been changed. We have a few ideas on how to fix this, but this will take a while. For now using Kinect for XBox 360 on El Capitan is not recommended, and staying in Yosemite is a safer bet."
                                      Fortunately I just got my MYO working nicely, so I am going to focus on that for a while and hope to find a simple solution for Kinect sometime in the future.

                                      • Marci
                                        Marci last edited by

                                        If you want any form of longevity from Kinect, bin the rev1 (or restrict it strictly to uses without skeleton/user detection, and use Freenect-based solutions rather than OpenNI 1) and get a KinectONE instead... OpenNI 2 is maintained and active and has no issues.

                                        • gapworks
                                          gapworks last edited by

                                          @Marci I only have the KinectONE. Two of them! And please do keep in mind that you are talking to a designer and photographer who is trying his best to learn some programming languages - Processing at the moment. So I failed after "rev1"..... :)

                                          best

                                          • Marci
                                            Marci last edited by

                                            (That was for @dbini)
