assurance-tunnel

Help - motion tracking for moving surfaces



  • hi folks

    I'm working with an artist on a piece these days - it's the first time I've really worked with the new mapping features, and they are really great and work perfectly. We map different surfaces like triangles and circles (1m x 1m) and so on. The idea (for the next rehearsals in May 2016) is to have a motion tracking system for three of these 7 surfaces, so that the actors can pick them up and move with them while the mapping follows them on every axis. How would you guys realise this - with IR LEDs on the corners of the surfaces plus an IR camera and IR light, with a Kinect, or is there a better system? Do I need another piece of software in front of Izzy? Would this be very difficult, especially because the surfaces will cross each other? I have never worked with motion tracking before...

    thanks a lot for any help!
    ciao bodo


  • Tech Staff

    Hi,

    Quite a big task, and arguably one of the hardest things to achieve. It is possible - but with limitations and adaptations. There is only one artist I know of who has achieved this:
    Marco Tempest
    http://www.marcotempest.com/
    One of the many ways he has done this is using the method you suggest in your post. 
    I would start with a basic rectangle, IR LEDs, and a webcam with an IR filter, and use the publishing feature of the mapper, or even the 3D Quad Distort actor at first, to get the hardware and method working. Then work towards other shapes and multiple instances. 
    Kinects are good at working out depth and people, not so much at tracking exact shapes. I hope this helps for now. 


  • Hi Skulpture

    Thanks for your thoughts - but how do I get the signal from the camera into data that I can use in Isadora? I remember that (10 years ago) Mark used EyesWeb in front of Isadora for this - is it possible with Isadora alone these days?

    ciao Bodo


  • Tech Staff

    I have asked myself about this task a lot...


  • Tech Staff

    Personally I would use two computers: a Mac or PC with Isadora, Max/MSP or similar doing the tracking, and then another computer doing the mapping (Isadora, probably).

    The first computer would need to do the tracking using IR and then send the X, Y and Z (depth) coordinates of each corner over OSC. 
    OBJECT ONE (rectangle) > Corner One X Y Z > Corner Two X Y Z, etc. 
    OBJECT TWO (square) > Corner One X Y Z > Corner Two X Y Z, etc. 
    The only thing I can't work out is how to distinguish between Object One and Object Two, because all the camera will see is a load of IR lights... know what I mean?
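    The per-corner layout above could be sketched like this. This is a hypothetical Python sketch: the address scheme and helper name are my own invention, and an OSC library (e.g. python-osc) would handle the actual network sending.

```python
# Hypothetical sketch: pack per-corner XYZ data into OSC-style
# (address, arguments) pairs, one message per tracked object.
# The /object/<n>/corners address scheme is an assumption, not a standard.

def corners_to_osc(object_id, corners):
    """corners: list of (x, y, z) tuples, one per tracked corner."""
    args = []
    for x, y, z in corners:
        args.extend([float(x), float(y), float(z)])
    return ("/object/%d/corners" % object_id, args)

# Object One (rectangle): four corners, flattened to 12 floats.
rect = [(0.1, 0.2, 1.5), (0.9, 0.2, 1.5), (0.9, 0.8, 1.5), (0.1, 0.8, 1.5)]
address, args = corners_to_osc(1, rect)
print(address)    # /object/1/corners
print(len(args))  # 12
```

    A real sender would pass `address` and `args` to an OSC client pointed at the mapping machine's IP and port.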


  • Thanks a lot! How would you track in Izzy? Can the Eyes actor handle those IR LEDs?



  • Isadora may be able to do moving projection control, but it will probably not be accurate. One problem you will encounter is that Isadora's 3D control is quite limited (you are moving these objects in 3D space, and you should move the projection in a virtual 3D space as well to match properly). The biggest problem is that, to avoid gimbal lock when tracking rotation, you will need to describe the rotations with a matrix or with quaternions, neither of which can be used in Isadora. Any good tracking system will recognise the object and give you proper rotations (you will have a lot of trouble just using corner points and trying to convey rotation to your projection).

    You can achieve this in a few ways. One would be to use an IR-reflective coating to paint a marker only visible to the IR camera and use a tracking system with orientation to track the object (something like ARToolKit or some other marker recognition). Other options include some kind of rigid-body motion tracking system such as OptiTrack Arena. These solutions will easily tell the difference between any objects.
    Also, whatever tracking and projection system you use will need to be fast if you want it to match - very fast. This is possible, but it requires some hardcore gear and skills.
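    As a side note on the rotation problem: here is a minimal pure-Python sketch of the quaternion representation mentioned above, just to illustrate why tracking systems prefer it to Euler angles (composing and applying quaternions never hits gimbal lock). This is not Isadora code, and Isadora cannot consume these values directly, as noted above.

```python
# Minimal quaternion sketch (pure Python): build a rotation quaternion,
# compose quaternions, and rotate a vector with q * v * conj(q).
import math

def quat_from_axis_angle(axis, angle):
    """Unit quaternion (w, x, y, z) for a rotation about a unit axis."""
    s = math.sin(angle / 2.0)
    return (math.cos(angle / 2.0), axis[0] * s, axis[1] * s, axis[2] * s)

def quat_mul(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate(q, v):
    """Rotate vector v by quaternion q via q * v * q_conjugate."""
    qv = (0.0, v[0], v[1], v[2])
    qc = (q[0], -q[1], -q[2], -q[3])
    _, x, y, z = quat_mul(quat_mul(q, qv), qc)
    return (x, y, z)

# 90 degrees about the z axis takes the x axis to the y axis.
q = quat_from_axis_angle((0, 0, 1), math.pi / 2)
print(rotate(q, (1, 0, 0)))  # ~(0.0, 1.0, 0.0)
```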


  • @Fred - thank you for all that detailed information! To let you know: rotations are not really necessary; there are just a 2D triangle, circle and rectangle, which will be moved by an actor mainly from one side of the stage to the other, maybe a little in height and depth. Do you think this makes it easier, and makes it possible to do it just with Izzy?



  • It is up to you and your clients how clean they want the tracking to be. In the best theoretical scenario, everything else on the stage is very well lit, so if the tracking is off it is burned out by the lights, and the actor never rotates the object at all, on any axis (keeping it parallel to the projector at all times and never turning the shapes). It also depends on what you are projecting on them and how much you care if it is off. Marco Tempest's stuff is super slick and does not allow for this kind of inaccuracy (well, it does, but with a much lower margin of error than Izzy can manage).

    If you can calibrate your projector and camera (a difficult task, but it means you know how each pixel in the camera relates to each pixel in the projector; doing it properly also means compensating for lens distortion), then I would cover the boards in a material that is highly IR reflective and make sure that nothing else in the view of the camera reflects IR. Then you have a mask and a blob to track. It will be slow: there will be a lag, so when you move the item the video will follow several frames later. If that is OK, then you are on your way and you can, more or less, use Izzy. Getting your board to be the only IR-reflective source on the stage will be a difficult task and a negotiation with costume and set. You would be surprised what reflects IR light, and you will need a laptop, an IR light and a small IR camera to test all the materials that will be on stage in that scene.
    In the end it depends whether you are willing to settle for a merely OK solution; the illusion works best when all the little details are taken care of, and that requires other hardware and software.
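    The mask-and-blob idea above, reduced to a toy sketch: threshold an "IR camera" frame so only the retro-reflective board survives, then take the centroid of the bright pixels as the tracked position. This is pure Python with invented pixel values; a real pipeline would run this per camera frame (e.g. with OpenCV).

```python
# Toy blob tracker: threshold a brightness grid, return the centroid
# of the surviving pixels, or None if nothing is bright enough.

def track_blob(frame, threshold):
    """frame: 2D list of brightness values. Returns centroid (x, y) or None."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value >= threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# A 4x4 "frame" with one bright (IR-reflective) patch in the middle.
frame = [
    [10,  12,  11,  10],
    [11, 250, 251,  12],
    [10, 252, 249,  11],
    [12,  10,  11,  10],
]
print(track_blob(frame, 200))  # (1.5, 1.5)
```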


  • Thank you so much, Fred, for taking the time to give this detailed information. I will do a test with Izzy, and maybe we will have to hire an expert...



  • Hi Bodo,

    I have a similar project - how is yours going?
    Thanks for the feedback
    Claudio


  • To keep the IR markers of the different shapes from getting mixed up, you may want to use LEDs of different IR wavelengths (e.g. 850 nm and 750 nm) with matching filters and separate cameras. Another possible help could be a GPS or even a Wii remote attached to each shape; this would give you an additional clue about a specific object's location in space.

    I have not done any of this - so it is just a thought.


  • OpenCV can handle 2D primitive tracking from any MJPEG (JPEG) source... there are Python examples out there: http://stackoverflow.com/questions/11424002/how-to-detect-simple-geometric-shapes-using-opencv There might also be sketches on the Processing forums covering it with the Java OpenCV libs that could cast a mask into Izzy via Syphon... I will have a rummage and see what I can find. As per @Fred: calibration of camera to Kinect to projector will be crucial for quality results, and it will need recalibration for every rig. RGBD Toolkit / DepthKit provides this for the Kinect and Kinect ONE respectively... see http://www.depthkit.tv/rgbdtoolkit/
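    For illustration, the classification step of the linked OpenCV approach (find a contour, simplify it to a polygon, classify by vertex count) might look like this pure-Python stand-in. The vertex-count thresholds and labels are assumptions; in real OpenCV code the corner lists would come from cv2.findContours and cv2.approxPolyDP.

```python
# Stand-in for the shape-classification step: given an already
# simplified polygon (list of corner points), guess the shape from
# its vertex count. Thresholds here are illustrative choices.

def classify_shape(vertices):
    n = len(vertices)
    if n == 3:
        return "triangle"
    if n == 4:
        return "rectangle"
    if n > 8:
        return "circle"  # many short segments approximate a circle
    return "polygon"

print(classify_shape([(0, 0), (4, 0), (2, 3)]))          # triangle
print(classify_shape([(0, 0), (4, 0), (4, 2), (0, 2)]))  # rectangle
```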



  • Hi clmuller,

    Because of budget we decided to map the stuff manually (I realised that a bunch of fast and expensive hardware, and of course a lot of skill, is necessary to realise a well-working tracking system), so no motion tracking right now. But I would be happy to hear any news if you find a good and inexpensive way - we are going to start the project next week...

    ciao Bodo


  • Tech Staff

    Mark mentioned at his IR tracking workshop at the Werkstatt in Berlin that he would love something like EyesWeb for the Mac, and that he will probably write it himself. That would be really great.



  • I did this quite a lot a few years ago. I used IR LEDs in the middle of the surface for the version where things stayed on their own side of the stage (no risk of accidental swapping) and the geometry was not important (no clean mapping, just roughing it in).

    Having a good camera and well-prepared LEDs was key. I had to upgrade from the hacked PS3 camera from Peau Productions to a CCD camera, as recommended on Frieder Weiss's homepage (http://frieder-weiss.de/eyecon/infrared.html), to get enough resolution for a smooth track across the width of a whole stage.
    I also fitted the LEDs with some tubing to concentrate the emitted light, which stabilized the track a lot. I posted images of how that was made in [this post](http://forum.troikatronix.com/cgi-bin/forum/gforum.cgi?post=8697;search_string=fubbi%20IR%20led;guest=22417115&t=search_engine#8697), but they don't load for me; I hope they load for you.
    I also did a show where we had a Processing application looking for corner LEDs and warping the image between the points. Now you can probably do the same by publishing inputs for your Isadora map and tracking the individual LEDs. 
    My two cents on keeping multiple LEDs identified would be to blink them at different frequencies and count the blinks, either in Isadora or in Processing.
    F
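    fubbi's blink-counting idea could be sketched like this: each LED blinks at its own rate, so counting on/off transitions over a fixed window of camera frames tells you which LED you are looking at. This is a hypothetical Python sketch; the frame data and blink table are invented.

```python
# Identify LEDs by blink count: over the same fixed window of frames,
# each LED turns on a different number of times.

def count_blinks(frames):
    """frames: sequence of booleans (LED visible in each camera frame).
    Counts off-to-on transitions, i.e. the number of 'on' periods."""
    prev = False
    blinks = 0
    for f in frames:
        if f and not prev:
            blinks += 1
        prev = f
    return blinks

def identify_led(frames, blink_table):
    """blink_table maps expected blink counts per window to LED names."""
    return blink_table.get(count_blinks(frames), "unknown")

# Over the same 12-frame window, one LED blinks twice, the other three times.
table = {2: "rectangle LED", 3: "triangle LED"}
led_a = [True, True, False, False, True, True,
         False, False, False, False, False, False]
led_b = [True, False, True, False, True, False,
         False, False, False, False, False, False]
print(identify_led(led_a, table))  # rectangle LED
print(identify_led(led_b, table))  # triangle LED
```

    In practice the boolean streams would come from checking, per camera frame, whether a blob is present near each LED's last known position.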

  • Tech Staff

    @fubbi RE: "My two cents on keeping multiple LEDs identified would be to blink them at different frequencies and count the blinks."

    This is how the [blacktrax](http://blacktrax.cast-soft.com/) system works in conjunction with the d3 media servers.
