Lowest latency method for tracking a projection surface
-
What I'm trying to do is have a performer cross the stage holding a card (think of the women holding up cards at boxing matches) and then track the card for projection.
I was wondering if placing IR LEDs on the corners would be enough to track the mapping.
I'm also concerned about the latency inherent in using a camera to track the corners; is there a better way to do it?
Thanks. -
I should imagine that's probably the best way to do it. Are you using Mac or Windows? I talked to Frieder Weiss about this a few years ago. He has a thing about latency and would use a Windows machine with a PCI capture card. (https://frieder-weiss.de/eyeco...) I use Mac and have found that the low-light b/w ELP USB cameras are pretty quick (http://www.elpcctv.com/) and have good image quality.
Projectors tend to add a bit of latency to the system as well, and some are better than others. In a recent project I used a Sony and a BenQ mapped together, but the BenQ was visibly slower to respond.
How do you plan to differentiate between corners? You can get different-wavelength IR LEDs, but wouldn't that need more than one camera, with a different filter on each...? -
@thatmattrogers Absolute speed and accuracy will come from using a mocap system like OptiTrack with active markers synced to the cameras; there is a hardware kit you can use for this. If you need single-camera tracking and you want to project on the card, GigE Vision cameras will give the lowest latency. The next option would be a capture card (Magewell make some of the lowest-latency ones) paired with a low-latency camera; a lot of consumer cameras add a frame or two on the output, so look at the Marshall SDI cameras. Using a reference signal to genlock the camera and capture card will also bring down latency, since frame sync is no longer needed.
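To see why each of these choices matters, it helps to add the stages up. This is a toy latency budget for a 60 fps pipeline; every figure here is an illustrative assumption, not a measurement of any particular hardware:

```python
# Toy latency budget for a camera -> software -> projector pipeline.
# All per-stage figures are illustrative assumptions, not measured values.
FPS = 60
FRAME_MS = 1000 / FPS  # ~16.7 ms per frame at 60 fps

stages_ms = {
    "camera exposure + readout": 1 * FRAME_MS,    # assume one frame
    "capture card / USB transfer": 1 * FRAME_MS,  # low-latency card: ~1 frame
    "tracking + rendering": 1 * FRAME_MS,         # one frame of processing
    "projector input lag": 2 * FRAME_MS,          # varies widely by model
}

total_ms = sum(stages_ms.values())
for name, ms in stages_ms.items():
    print(f"{name}: {ms:.1f} ms")
print(f"total: {total_ms:.1f} ms ({total_ms / FRAME_MS:.0f} frames)")
```

Even with every stage at its best case, the total lands around five frames; a slow projector or a consumer camera that buffers internally pushes it well past that, which is why the card visibly lags the projection.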
Last, for tracking accuracy you would want a transformation matrix for the tracked object (a set of numbers that fully describes its position and rotation). I think Isadora shaders are not yet equipped to deal with this, but they are getting very close. This would allow you to apply a matrix transformation so the rotation and position of the tracked object are taken care of; then you need a second matrix to translate between the tracked space and the projection space.
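As a minimal sketch of what "two matrices" means in practice: compose a (hypothetical) tracked-space-to-projector calibration matrix with the card's pose matrix, then push the card's corner points through the result each frame. For brevity this only rotates about the vertical axis; a real tracker pose carries full 3-axis rotation:

```python
import math

def mat_mul(a, b):
    """Multiply two 4x4 matrices (lists of rows)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def pose_matrix(x, y, z, yaw_deg):
    """4x4 transform: rotation about the vertical axis plus translation.
    A real tracker pose would have full 3-axis rotation; this is a sketch."""
    c, s = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    return [[c,   0.0, s,   x],
            [0.0, 1.0, 0.0, y],
            [-s,  0.0, c,   z],
            [0.0, 0.0, 0.0, 1.0]]

def apply(m, p):
    """Apply a 4x4 matrix to a 3D point (implicit w = 1)."""
    v = [p[0], p[1], p[2], 1.0]
    out = [sum(m[i][k] * v[k] for k in range(4)) for i in range(4)]
    return out[0], out[1], out[2]

# Hypothetical calibration: tracked (mocap) space -> projector space.
tracked_to_projector = pose_matrix(0.5, 0.0, 0.0, 0.0)
# Hypothetical pose of the card in tracked space.
card_pose = pose_matrix(1.0, 1.5, 2.0, 90.0)

# Compose once per frame, then transform each corner of the card.
full = mat_mul(tracked_to_projector, card_pose)
corner_local = (0.25, 0.15, 0.0)  # a card corner in the card's own frame
print(apply(full, corner_local))
```

The point of composing the matrices first is that position, rotation, and the calibration offset are all handled in one multiply per corner, rather than as separate 2D fudge factors.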
Another good option, and a little cheaper, would be a Vive Tracker with an HTC Vive system. This is cheaper than mocap and can give a reliable trackable area of around 6 m x 4 m. There is a bit of work to do to get something like OSC out of it, but it is very fast and gives very good results.
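The "bit of work" is usually a small bridge app that reads tracker poses and sends them on as OSC. As a sketch of just the OSC wire format, here is a pose packed and sent using only the standard library; the address `/tracker/pose`, the port, and the pose values are all assumptions, and a real bridge would read the pose from the Vive runtime rather than hard-code it:

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *floats: float) -> bytes:
    """Build a minimal OSC message with float32 arguments."""
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))
    for f in floats:
        msg += struct.pack(">f", f)  # OSC floats are big-endian float32
    return msg

# Hypothetical pose: position in metres plus yaw/pitch/roll in degrees.
packet = osc_message("/tracker/pose", 1.0, 1.5, 2.0, 90.0, 0.0, 0.0)

# Fire it at an OSC listener (e.g. Isadora) over UDP.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 1234))
```

On the receiving end you would listen on the same port and map the six floats onto your projection parameters.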
All in all, Isadora is not really equipped for tracked reprojection. To get accuracy you need to use GL transforms via a 4x4 matrix for the tracked object (Isadora does not handle matrix transformations), and you need a second matrix describing the transformation between the projector's output and real-world 3D space.
You can try a camera and 2D planar tracking, but Isadora does not understand distance, and attempting to calculate scale and rotational offsets from four tracking points is tough when the points are only known in 2D space (you can get the position coordinates from Eyes or Eyes++). If you don't mind a slightly messy and slow output, you might get something usable in some circumstances.
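To make the 2D-only estimate concrete, here is a sketch of recovering scale and in-plane rotation from four corner points of the kind a blob tracker might give you. The corner order and reference card are assumptions, and the caveat in the comment is the whole problem: the moment the card tilts toward or away from the camera, both estimates go wrong because there is no depth information:

```python
import math

# Hypothetical corner order: top-left, top-right, bottom-right, bottom-left,
# e.g. blob centroids from a 2D tracker.
REF = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)]  # card at rest

def centroid(pts):
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))

def scale_and_rotation(pts, ref=REF):
    """Estimate uniform scale and in-plane rotation against a reference.
    Only valid while the card stays roughly parallel to the camera plane;
    perspective tilt corrupts both estimates, since there is no depth."""
    def span(ps):  # mean distance of the corners from their centroid
        cx, cy = centroid(ps)
        return sum(math.hypot(x - cx, y - cy) for x, y in ps) / len(ps)
    scale = span(pts) / span(ref)
    # Rotation from the direction of the top edge (corner 0 -> corner 1).
    ang = math.atan2(pts[1][1] - pts[0][1], pts[1][0] - pts[0][0])
    ref_ang = math.atan2(ref[1][1] - ref[0][1], ref[1][0] - ref[0][0])
    return scale, math.degrees(ang - ref_ang)

# Simulated input: card moved, doubled in size, and rotated 30 degrees.
rot = math.radians(30.0)
moved = [(10 + 2 * (math.cos(rot) * x - math.sin(rot) * y),
          5 + 2 * (math.sin(rot) * x + math.cos(rot) * y)) for x, y in REF]
print(scale_and_rotation(moved))
```

With clean, correctly ordered points this recovers the transform exactly; with real blob data the corners jitter and occasionally swap order, which is where the "slightly messy" output comes from.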
The fastest cheat I can imagine is having your card made of a material that is highly reflective to infrared light and tracking it with an IR camera, then using the live feed both as a mask and as input to Eyes++. You will find that an offset calibrated for one position will drift as the performer moves around, but soft edges, a smaller projection, and careful positioning of cameras and projectors may get you close enough.
-
I had a similar idea for a show once, but they didn't have enough time to set everything up while touring (only 4 h to build everything), so we went for a big tablet (an iPad Pro) and an app that turns the tablet into a second screen.
-
I remember the first time I (knowingly) saw Isadora in action was in a piece that @mark was showing at a Digital Dance festival in the UK in 2014, where images were projected onto pieces of paper that moved along a clothes line. I remember being very impressed by how accurate it was. I don't know how he did that, but maybe he'll chime in here.
Otherwise I think @GertjanB's solution is perfect! I would be curious to know which app you used: I use Duet Display a lot, but the iPad has to be physically connected to the computer. This one, Luna Display, looks promising... -
I think I used Air Display (then at version 2). Luna Display looks very promising indeed. -
Thanks everyone, you've given me some paths to start investigating.
I'll either come back with a bunch more questions or I'll let you know what I settle on.