Leap Motion Video stream
-
Is there a chance to get the Leap Motion video streams read by Isadora? Currently the device shows up as a video capture device in Isadora (and other apps), but Isadora (or any other app) is unable to display the streams. It would be a helpful thing: the cameras run at up to 200 fps, so it's a ready-made (dual) infrared camera. The super-wide lenses would allow placement in new spots in proximity to performers, or on performers, as I'm looking to do for my thesis piece.
The device has two cameras and sends both streams simultaneously. They are high frame rate and an odd resolution. There is a utility that merges the two feeds and corrects for distortion, but currently the only way I can work with the feed is via the Visualizer app that comes with the Leap Motion, which I live screen-capture and feed into Isadora, with a slight delay and probably a lower frame rate. -
+1
-
Dear @LPMode,
Have you written Leap Motion to ask for help on this? If you're seeing the device in Isadora, it probably means they've created a Virtual Web Cam to allow applications to grab the video. If it doesn't work in Isadora or any other app, then it would seem that this software is buggy and needs to be fixed.
I would send them a support request and see if you can get any answers on this first. Let us know what you find out and we'll see what we can do.
Best,
Mark -
-
Check this out also:
http://syphoner.sigma6.ch/ -
Thanks all. I'll try to compile an app using openFrameworks, as there are examples for requesting the stream as well as for a Spout server, and I'll keep you posted. The stream the Leap Motion sensor sends is non-standard and requires some processing on the host end.
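For anyone curious what that host-side processing amounts to: the raw buffers are 8-bit grayscale and just need to be unpacked into a normal image. Here is a rough sketch of that step using the Leap SDK v2 Java bindings inside Processing (not the openFrameworks version; the C++ calls are named almost the same). setPolicy(POLICY_IMAGES) and the Image width()/height()/data() accessors are from the SDK docs; the rest is untested glue, so treat it as a starting point only:

```
// Sketch only: pulls one raw Leap camera frame and converts the 8-bit
// grayscale buffer into a PImage that Processing can draw (or hand to
// Spout/Syphon). Assumes the Leap Motion SDK v2 Java bindings are on the
// sketch's code path.
import com.leapmotion.leap.Controller;
import com.leapmotion.leap.Frame;
import com.leapmotion.leap.ImageList;

Controller leap;
PImage cam;

void setup() {
  size(640, 480);
  leap = new Controller();
  // Raw camera images must be requested explicitly.
  leap.setPolicy(Controller.PolicyFlag.POLICY_IMAGES);
}

void draw() {
  Frame frame = leap.frame();
  ImageList images = frame.images();
  if (images.count() > 0) {
    com.leapmotion.leap.Image raw = images.get(0); // 0 = left camera, 1 = right
    int w = raw.width();
    int h = raw.height();
    byte[] data = raw.data();                      // 8-bit grayscale, row-major
    if (cam == null || cam.width != w || cam.height != h) {
      cam = createImage(w, h, RGB);
    }
    cam.loadPixels();
    for (int i = 0; i < w * h; i++) {
      int v = data[i] & 0xFF;                      // unsigned byte -> 0..255
      cam.pixels[i] = color(v);
    }
    cam.updatePixels();
    image(cam, 0, 0, width, height);
  }
}
```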
-
Having Syphon/Spout compatibility would be the best option, but you can in fact get this working almost out of the box.
1) Buy a Leap Motion (of course).
2) Install the latest driver and app. Restart.
3) Go to the Leap Motion icon in the menu bar and launch Visualizer.
4) As Skulpture said, launch Syphoner and decide which of the 2 images you want to see in Isadora (as you probably know, the Leap Motion delivers 2 different images because it has 2 cameras).
Et voila. -
The problem with screen capture, as I mentioned, is that it adds some delay and one probably loses the high-frame-rate option. I'm on Windows (hence Spout), so the capture may not work as well as Syphoner, since it's locked to the screen refresh rate.
-
I have been working with a custom Leap Motion app which unpacks the hand data as well as the stereo camera signals. My friend assured me the SDK is very well written and easy to use... if you know how to program! (That's why I am collaborating with him!)
In our app we can turn the camera syphons and the individual hand data streams on and off, to save processing if you don't need them all. But out of the box, the Leap Motion's camera streams are not served to Apple's QTKit or AVFoundation, so Isadora will not yet be able to use them natively like that.
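For what it's worth, the camera-syphon part of that can be sketched in a few lines of Processing with the codeanticode Syphon library, building on the raw-image unpacking shown earlier in the thread. SyphonServer and sendImage() come from that library's bundled examples; the server name is just an example, and the whole thing is an untested outline:

```
// Sketch only (macOS): publish a Leap camera image as a Syphon source that
// Isadora (or any Syphon client) can pick up. Needs an OpenGL renderer.
import codeanticode.syphon.SyphonServer;

SyphonServer server;
PGraphics canvas;
PImage cam;   // would be filled from the raw Leap buffer, as in the earlier sketch

void setup() {
  size(640, 240, P3D);                     // Syphon requires P2D/P3D
  canvas = createGraphics(640, 240, P3D);
  server = new SyphonServer(this, "Leap Left Camera");  // example server name
}

void draw() {
  canvas.beginDraw();
  canvas.background(0);
  if (cam != null) canvas.image(cam, 0, 0, canvas.width, canvas.height);
  canvas.endDraw();
  image(canvas, 0, 0);
  server.sendImage(canvas);                // publish this frame to Syphon clients
}
```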
To see the hands as 3D objects, we wrote another app to send the two streams to a 3D projector or TV as a top/bottom or side-by-side pair, and the projector then recombines them into a 3D video image.
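The stereo packing itself is the easy part once both camera images are unpacked; roughly like this (leftCam and rightCam are hypothetical PImages built from the two raw buffers):

```
// Sketch only: pack the two Leap camera images into one side-by-side frame
// for a 3D display. leftCam / rightCam are hypothetical PImages built from
// the left and right raw buffers.
PImage leftCam, rightCam;

void setup() {
  size(1280, 480);
}

void draw() {
  background(0);
  if (leftCam != null && rightCam != null) {
    image(leftCam, 0, 0, width / 2, height);          // left eye, left half
    image(rightCam, width / 2, 0, width / 2, height); // right eye, right half
  }
}
```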
Our tests so far, by the way, show that the range of the cameras is not very far; they really are only optimised for the hands. Even using the camera looking down instead of looking up affects the accuracy, since the hand-tracking algorithms are not set up to see hands from that orientation.
It's definitely an exciting tool, however.
The joys of research!! -
I made a sketch in Processing that works (almost). I had some help with troubleshooting because an Image class is used in both the Leap Motion and Spout libraries, so Processing was getting confused about which one to use.
There is still some problem, however, where the sketch runs fine in the editor but the exported executable does not. I will see if I can figure it out.
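In case it helps anyone hitting the same clash: the usual Java fix is to stop wildcard-importing at least one of the two libraries and refer to the ambiguous class by its fully qualified name. Something along these lines (the Leap class is com.leapmotion.leap.Image; the Spout calls follow the library's bundled sender example, but double-check the exact names against your jars, so treat this as the pattern rather than the exact fix):

```
// Pattern only: import explicit classes instead of wildcard-importing both
// libraries, and fully qualify whichever class name collides.
import spout.Spout;                        // Spout for Processing sender
import com.leapmotion.leap.Controller;     // Leap classes imported one by one
import com.leapmotion.leap.ImageList;

Controller leap;
Spout sender;

void setup() {
  size(640, 240, P3D);                     // Spout needs an OpenGL renderer
  leap = new Controller();
  leap.setPolicy(Controller.PolicyFlag.POLICY_IMAGES);
  sender = new Spout(this);
  sender.createSender("Leap Camera");      // example sender name
}

void draw() {
  background(0);
  ImageList images = leap.frame().images();
  if (images.count() > 0) {
    // Fully qualified, so there is no ambiguity with any other Image class.
    com.leapmotion.leap.Image raw = images.get(0);
    // ...convert raw.data() to a PImage and draw it, as in the earlier sketch...
  }
  sender.sendTexture();                    // sends the current draw surface
}
```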