    Lowest latency method for tracking a projection surface

    How To... ?
    Tags: tracking, izzymap, infrared, mapping, motion tracking
    • thatmattrogers last edited by Woland

      What I'm trying to do is have a performer cross the stage holding a card (think of the women holding up cards at boxing matches), and then track the card for projection.
      I was wondering if placing IR LEDs on the corners would be enough to track it for mapping.

      I'm also concerned about the latency inherent in using a camera to track the corners; is there a better way to do it?

      Thanks.


      I'm running Isadora 3 on Win10
      Machine 1: AMD 1700X 8-core CPU, 32GB RAM, AMD Radeon RX 580 GPU
      Machine 2: AMD 5900HX 8-core APU, 16GB RAM, integrated GPU

      • dbini
        dbini @thatmattrogers last edited by

        @thatmattrogers

        I should imagine that's probably the best way to do it. Are you using Mac or Windows? I talked to Frieder Weiss about this a few years ago. He has a thing about latency and would use a Windows machine with a PCI capture card (https://frieder-weiss.de/eyeco...). I use Mac and have found that the low-light b/w ELP USB cameras are pretty quick (http://www.elpcctv.com/), with good image quality.
        Projectors tend to add a bit of latency into the system as well, and some are better than others: in a recent project I used a Sony and a BenQ mapped together, and the BenQ was visibly slower to respond.
        How do you plan to differentiate between corners? You can get different-wavelength IR LEDs, but wouldn't that need more than one camera, with a different filter on each...?
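        On differentiating the corners: if the card stays roughly upright, one common cheat is not to identify each LED individually but to sort the four detected blobs geometrically on every frame. A minimal sketch in Python/NumPy (the blob positions are assumed to come from whatever tracker you use; this ordering breaks down once the card rotates far past level, which is where the multi-wavelength idea would come in):

```python
import numpy as np

def order_corners(pts):
    """Order four 2D points as top-left, top-right, bottom-right, bottom-left.

    For a roughly axis-aligned quadrilateral, the coordinate sum x + y is
    smallest at the top-left and largest at the bottom-right, while the
    difference y - x picks out the top-right (min) and bottom-left (max).
    """
    pts = np.asarray(pts, dtype=float)
    s = pts.sum(axis=1)              # x + y
    d = np.diff(pts, axis=1)[:, 0]   # y - x
    return np.array([pts[np.argmin(s)],   # top-left
                     pts[np.argmin(d)],   # top-right
                     pts[np.argmax(s)],   # bottom-right
                     pts[np.argmax(d)]])  # bottom-left
```

        With a consistent corner order every frame, the quad fed to the mapping stays stable instead of flipping as blobs swap detection order.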

        John Collingswood
        taikabox.com
        2019 MBPT 2.6GHZ i7 OSX15.3.2 16GB
        plus an old iMac and assorted Mac Minis for installations

        • Fred
          Fred @thatmattrogers last edited by

          @thatmattrogers Absolute speed and accuracy will come from using a mocap system like OptiTrack with active markers synced to the cameras; there is a hardware kit you can use for this. If you need single-camera tracking and you want to project on the card, GigE Vision cameras will give the lowest latency. The next option would be a capture card; Magewell make some of the lowest-latency ones, and then you just need a low-latency camera. A lot of consumer cameras add a frame or two on the output, but look at the Marshall SDI cameras. Feeding a reference signal to both the camera and the capture card will also bring latency down, as no frame sync is then needed.

          Last, for tracking accuracy you would want a transformation matrix for the tracked object (a set of numbers that properly describes its position and rotation). I think Isadora shaders are not yet equipped to deal with this, but they are getting very close. That would let you apply a matrix transformation so the rotation and position of the tracked object are taken care of; then you need a second matrix to translate between the tracked space and the projection space.
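          To illustrate what that matrix pipeline looks like outside Isadora (hypothetical numbers, Python/NumPy): a 4×4 homogeneous matrix encodes rotation and position together, and multiplying two of them chains "object pose in tracker space" with "tracker space to projection space".

```python
import numpy as np

def make_transform(rotation_z_deg, tx, ty, tz):
    """Build a 4x4 homogeneous transform: rotation about Z, then translation."""
    a = np.radians(rotation_z_deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0, tx],
                     [s,  c, 0, ty],
                     [0,  0, 1, tz],
                     [0,  0, 0, 1]], dtype=float)

def apply_transform(M, point3):
    """Apply a 4x4 transform to a 3D point via homogeneous coordinates."""
    p = np.append(np.asarray(point3, dtype=float), 1.0)
    q = M @ p
    return q[:3] / q[3]

# Composing the two stages the post describes (example values):
# object_pose = make_transform(...)   # card pose in tracked space
# space_map   = make_transform(...)   # tracked space -> projection space
# corner_out  = apply_transform(space_map @ object_pose, corner_in)
```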

          Another good option, and a little cheaper, would be a Vive tracker with an HTC Vive system. This is cheaper than mocap and can give a reliable trackable area of about 6m × 4m. There would be a bit of work to get something like OSC out of it, but it is very fast and gives very good results.
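          On getting "something like OSC out": OSC is a simple enough wire format that a bridge script can hand-build packets and push them over UDP to Isadora's OSC listening port. A sketch using only the Python standard library (the address /tracker/pose, the pose values, and the port are made-up examples):

```python
import socket
import struct

def osc_message(address, *floats):
    """Encode a minimal OSC message with float32 arguments.

    OSC strings are null-terminated and padded to a 4-byte boundary;
    float arguments are big-endian 32-bit.
    """
    def pad(b):
        return b + b'\x00' * (4 - len(b) % 4)
    msg = pad(address.encode('ascii'))
    msg += pad(b',' + b'f' * len(floats))  # type-tag string, e.g. ",ffffff"
    for f in floats:
        msg += struct.pack('>f', f)
    return msg

# Hypothetical usage inside a tracker-reading loop:
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(osc_message('/tracker/pose', 1.2, 0.4, 2.0, 0.0, 90.0, 0.0),
#             ('127.0.0.1', 1234))  # host/port of the machine running Isadora
```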

          All in all, Isadora is not really equipped for tracked reprojection: to get accuracy you need GL transforms via a 4×4 matrix for the tracked object (Isadora does not handle matrix transformations), and you need a second matrix describing the transformation between the projector's output and the real-world 3D space.

          You can try a camera and 2D planar tracking, but Isadora does not understand distance, and calculating scale and rotational offsets from four tracking points is tough when the points are only recognised in 2D space (position coordinates you can get from Eyes or Eyes++). If you don't mind a slightly messy and slow output, you might get something usable in some circumstances.
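          For what that 2D calculation actually involves: with four tracked corners, the full planar fix is a homography (a plane-to-plane perspective map) rather than separate scale and rotation offsets. A sketch of the standard direct-linear-transform estimate in Python/NumPy (not something Isadora exposes; this would live in an external script feeding the mapping):

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 homography mapping four src points onto four dst
    points (direct linear transform; the points must not be collinear)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography entries form the null vector of this 8x9 system.
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_point(H, x, y):
    """Map one 2D point through the homography."""
    p = H @ np.array([x, y, 1.0])
    return p[:2] / p[2]
```

          Feeding the content's corner coordinates as src and the tracked corner positions as dst gives a warp for the whole image, which is exactly what a quad in IzzyMap approximates when you drag its corners.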

          The fastest cheat I can imagine is making your card from a material that is highly reflective to infrared light and tracking it with an IR camera, then using the live feed both as a mask and with Eyes++. You will find that an offset that lines up in one position will drift as the card moves around, but soft edges, a smaller projection, and careful positioning of cameras and projectors may get you close enough.

          http://www.fredrodrigues.net/
          https://github.com/fred-dev
          OSX 13.6.4 (22G513) MBP 2019 16" 2.3 GHz 8-Core i9, Radeon Pro 5500M 8 GB, 32g RAM
          Windows 10 7700K, GTX 1080ti, 32g RAM, 2tb raided SSD

          • GertjanB
            GertjanB Beta Platinum last edited by

            I had a similar idea for a show once, but they didn't have enough time to set everything up while touring (only 4 hours to build everything), so we went for a big tablet (an iPad Pro) and an app that turns the tablet into a second screen.

            www.gertjanbiasino.be

            • mark_m
              mark_m @thatmattrogers last edited by

              @thatmattrogers

              I remember the first time I (knowingly) saw Isadora in action was in a piece that @mark was showing at a Digital Dance festival in the UK in 2014, where images were projected onto pieces of paper that moved along a clothes line. I remember being very impressed by how accurate it was. I don't know how he did that, but maybe he'll chime in here.

              Otherwise I think that @GertjanB's solution is perfect! I would be curious to know which app you used: I use Duet Display a lot, but the iPad has to be physically connected to the computer. This one, Luna Display, looks promising...

              Intel NUC8i7HVK Hades Canyon VR Gaming NUC, i7-8809G w/ Radeon RX Vega M GH 4GB Graphics, 32GB RAM, 2 x NVMe SSD
              Gigabyte Aero 15 OLED XD. Intel Core i7-11800H, NVidia RTX3070, 32GB RAM 2 x NVMe SSD
              PC Specialist Desktop: i9-14900K, RTX4070Ti, 64GB RAM, Win11Pro
              www.natalieinsideout.com

              • GertjanB
                GertjanB Beta Platinum @mark_m last edited by

                @mark_m


                I think I used Air Display (then version 2). Luna Display looks very promising indeed.

                www.gertjanbiasino.be

                • thatmattrogers last edited by

                  Thanks everyone, you've given me some paths to start investigating.

                  I'll either come back with a bunch more questions or I'll let you know what I settle on.


                  I'm running Isadora 3 on Win10
                  Machine 1: AMD 1700X 8-core CPU, 32GB RAM, AMD Radeon RX 580 GPU
                  Machine 2: AMD 5900HX 8-core APU, 16GB RAM, integrated GPU
