How to have a person move an object that is generated/projected by Isadora
-
DusX's 2d Collision detection routine should give some hints on how to vaguely achieve this... would need to sit down and think it through thoroughly but am sure it'd be doable via some NI-Mate / OSC action... you'd just need to calibrate your space to your projection and ensure it stays fixed.
http://dusxproductions.com/blog/simple-collision-detection-with-javascript/ -
So, the XY input (args[0] & args[1]) of the 'point' would be the OSC hand position from NI-Mate... then set the upper-left and bottom-right corners of an interactive square.
You'd need a JS actor per interactive square. Use the true or false output state to trigger your... er... hover state.

[EDIT] Hmmm... now I want to recreate the Nine Inch Nails Echoplex projected drum sequencer using Kinect & NI-Mate and Izzy. Dammit. https://vimeo.com/61602944 -
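The per-square JS actor described above could be sketched roughly like this, in the style of an Isadora Javascript actor (inputs arrive via `arguments`). The input ordering and corner inputs here are an assumption for illustration, not DusX's exact routine:

```javascript
// Hypothetical per-rectangle hit test for an Isadora Javascript actor.
// args 0-1: hand XY from NI-Mate via OSC; args 2-5: rectangle corners
// (upper-left, then lower-right). Assumes Y increases downward.
function main() {
  var handX  = arguments[0];  // e.g. /Left_Hand X
  var handY  = arguments[1];  // e.g. /Left_Hand Y
  var left   = arguments[2];  // upper-left X of the interactive square
  var top    = arguments[3];  // upper-left Y
  var right  = arguments[4];  // lower-right X
  var bottom = arguments[5];  // lower-right Y

  // Point-in-rectangle test: inside when between both pairs of edges.
  var inside = (handX >= left && handX <= right &&
                handY >= top  && handY <= bottom);
  return inside ? 1 : 0;      // drive the hover state from this output
}
```

One actor per square, as noted above; the 1/0 output is what you'd wire to whatever the hover state triggers.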
Also shouldn't be too difficult to read Z, if memory serves (does NI-Mate return XYZ for a hand, or just XY? Can't recall), and then create an invisible interaction 'plane' at a static Z point so that you can work in front of or behind it.
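Assuming NI-Mate does send Z (confirmed later in the thread), the invisible interaction plane is just one extra comparison. A minimal sketch; the threshold value and the sensor orientation are illustrative, so calibrate for your own space:

```javascript
// Hypothetical static Z-plane gate. Assumes the Kinect reports depth as
// distance from the sensor, so a SMALLER Z means the hand has reached
// forward, past the invisible plane.
var Z_PLANE = 1.2; // metres from the sensor -- purely illustrative value

function behindPlane(handZ) {
  // true = the hand has broken the plane, so 2D hit tests should count
  return handZ <= Z_PLANE;
}
```

You'd AND this with the 2D point-in-rectangle result, so a hand waving in front of (or behind) the plane never triggers anything.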
Might get the Leap Motion out tomorrow and have a play small scale. -
If you have a timer running on the scene you could also measure velocity of the movement at the time of interaction with 2x z planes... when the first z plane is broken the timer starts, when the second is broken the timer ends.
Subtract start time from end time. Use the distance formula to measure the distance between the start and end points: http://www.mathwarehouse.com/algebra/distance_formula/index.php (although in an installation scenario, these would likely be a fixed constant). Divide distance by the time difference: the higher the result, the faster the velocity. Use this to determine the severity of whatever movement you apply to the interacted-with rectangle. Javascript opens up a h00j world of fun... and maths. -
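The two-plane velocity estimate above reduces to a couple of lines of JS. A sketch, with hypothetical function names; the timer values would come from the scene timer when each Z plane is broken, and in an installation the plane spacing would be the fixed constant mentioned above:

```javascript
// Velocity from two broken Z planes: distance between the planes divided
// by the time elapsed between breaking the first and the second.
function velocityBetweenPlanes(startTime, endTime, planeDistance) {
  var dt = endTime - startTime;  // seconds between plane 1 and plane 2
  if (dt <= 0) return 0;         // guard against bad/identical timer reads
  return planeDistance / dt;     // units per second: higher = faster hit
}

// The 2D distance formula, for when start/end points aren't fixed:
function distance(x1, y1, x2, y2) {
  var dx = x2 - x1, dy = y2 - y1;
  return Math.sqrt(dx * dx + dy * dy);
}
```

The resulting velocity is what you'd scale the "severity" of the rectangle's reaction by.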
[NB: ALL THEORETICAL!]
-
Hello Monty! Good and busy summer. Hope yours was also! Thank you everyone for your feedback, and help. I am going to be utilizing this community. And hope someday I can offer help.
I've got some work to do.

vanakaru - good question, and it is making me think more about this idea, which is what it is at this point: an idea. It may prove beyond my current capabilities in Isadora, or not doable in Isadora at all, but here goes. When tracking a person using a skeleton overlay, and a point on the skeleton (say the hand) moves, then an object that has been connected to it moves = continuous connection. I don't want the object to move for every movement. Only when someone walks over to the objects being projected and "pushes" them = static, moved when pushed.

So, rethinking my layout, the grid design wouldn't necessarily cover the wall; it may be in the downstage-left area only, so the other side of the stage could be empty for the dancer to dance in. Then the dancer would walk over and start dancing by the grid. When their hand "touched" one of the projected objects, it would be moved. Hope that helps, and thanks again all for helping. -
My experience with Kinect is somewhat limited, mostly 'cos the system is too unreliable. I hate to lose connection for whatever reason/interference during a show.
The most trustworthy solutions have been where the performer has an IR LED attached to the hand and a camera with an IR filter (sees only IR light waves) doing the tracking.

Kinect is great and fun when it works, but look at the official games. Players lose connection all the time. The performance could be designed to make this chaos part of the show, but for minimalistic stuff... I don't know. -
Agree with single IR light point tracking - infinitely more reliable, mainly due to simplicity - s'how I do head tracking for simracing (rFactor 2).
That said, attached is a quick, vague (_very_ rough and nowhere near fully functional) PoC for collision detection of Left_Hand as an OSC controller from NI-Mate v1 & Kinect360 (represented by a red dot) against a green rectangle, using a fixed presumed Z plane. No velocity tracking etc., although technically NI-Mate sends velocity as a value scaling DOWN from 127 the faster it goes on /Hand/Left/V, if one cares to tap & utilise it.

When you're in the zone the rectangle will move with your hand. When you're not, it won't. Based on @DusX's 2D Collision Detection as mentioned above. -
Would probably have been simpler just to have the rectangle change color when you're in the zone and back again when you're not, but I've only just thought about that... meh
-
Very cool Marci... I will check this out. Of course, IR LEDs.
-
I definitely need to catch up on this thread!
It's something I've been wanting to achieve for over 10 years inside Isadora. -
The trick would be to utilise bits of Isadora I've never really delved into... 3D Projector etc.
It's deciding that Z plane that makes it usable. Without it, the triggering is too vague and unpredictable, but as soon as you say 'I no longer want to look just at X and Y, but Z too' it tightens things up and becomes infinitely more useful. The NIN video above is actually incredibly simple to pull off in Isadora.

There would need to be a more refined loop to accomplish what I was going for in the patch - that is, to be able to grab, move and let go of something, akin to left-click-dragging a window. The position needs feeding back within the JS rather than using actors to smooth it all out... I was a bit ambitious for a quick 2-minute job. Changing color as you pass over, however, should be simple; therefore, having each "rectangle" as a paused movie with a still frame at the front for the "steady state", which then plays as you "hover", should be equally simple to achieve. Disconnect the horz and vert position inputs from the top shape actor to stop it dragging (badly). Output 10 on the upper-right JS sends a 1 state when you're hovering over the rectangle. As you pass over it, it'll go from 0 to 1 and back to 0 again as a hard switch.

NB - also used @DusX's umbilical cord / data multicore JS in there too... check it on his blog to get to grips with what's happening between the two javascript actors.

I've said it before and I'll say it again - this is where some semblance of a DOM is desired in Isadora: a transparent web-browser actor overlaid on the stage with the ability to scale resolution (i.e.: render a browser window of 1024x768 on a 640x480 stage). It would make it incredibly easy to utilise existing user-friendly frameworks (e.g.: jQuery) to get some very easy interaction effects, and also animation libraries... and there are PILES of collision detection libraries etc. out there in web land. If such an actor existed, you'd then have WebGL to throw into the mix, and all the power of the browser and the knowledge of web developers & LOTS of tutorials everywhere. 
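The grab / move / let-go loop described above, with the position fed back inside the JS rather than smoothed by extra actors, could be sketched like this. Everything here (names, the hover test, the release condition) is illustrative, not the actual patch:

```javascript
// Hypothetical grab-and-drag loop: state lives inside the JS between calls.
var grabbed = false;
var rectX = 0, rectY = 0, rectW = 20, rectH = 20; // illustrative rectangle
var offsetX = 0, offsetY = 0;

function update(handX, handY, overZPlane) {
  // Hover = past the Z plane AND inside the rectangle's current bounds.
  var hovering = overZPlane &&
      handX >= rectX && handX <= rectX + rectW &&
      handY >= rectY && handY <= rectY + rectH;

  if (hovering && !grabbed) {          // hand enters: latch the grab
    grabbed = true;
    offsetX = handX - rectX;           // remember where we grabbed it
    offsetY = handY - rectY;
  } else if (!overZPlane && grabbed) { // hand pulls back past the plane: let go
    grabbed = false;
  }

  if (grabbed) {                       // drag: rectangle follows the hand
    rectX = handX - offsetX;
    rectY = handY - offsetY;
  }
  return [rectX, rectY, grabbed ? 1 : 0]; // position + grab/hover state
}
```

Because the rectangle's position is updated from its own previous value, the grab survives the hand moving around, and releasing is an explicit pull back through the Z plane rather than a hard 0-to-1-to-0 switch.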
*I dreamed a dream...* -
Pretty darn impressive Marci! This all makes sense, more or less. Need to get into it to see how it works. The Z plane does appear crucial, unless, I would imagine, the dancer was say flat against the wall, or being shot from above and using their feet to kick an object.
I'm using Processing right now and have never tried NI-Mate. As I said, I'm new to this. I downloaded a free version of NI-Mate 2 but it doesn't send/receive OSC. For Basic or Pro they are now all subscription-based?!! Curious for tips on NI-Mate.

Also looking into building my little IR system to go onto my dancer's hand. Figure one on the front of the hand and one on the back should work. Any suggestions re: NI-Mate? Looks like batteries and an IR LED, and I have my light for IR detection on the Kinect. -
Slow reply but in relation to (@Marci )
"Also shouldn't be too difficult to read Z if memory serves (**does NI-Mate return XYZ for a hand, or just XY**? Can't recall) also, and then create an invisible interaction 'plane' at a static Z point so that you can work in front or behind. Might get the Leap Motion out tomorrow and have a play small scale."

NI-Mate sends XYZ, yes. -
@joejdrums - For NI-Mate you want v1 ideally, which has a lifetime license but basically will receive no further updates. I despise subscription licensing... sadly I think Delicode have just ruled themselves out of the market at our level, really. I'd have to have a ratch to see if there are any more user-friendly offerings out there, otherwise yer stuck with continuing with Processing. If memory serves, I think simpleKinect on GitHub was the go-to for this...
@Skulpture - yep I discovered that in the end! Ta! -
Downloaded simpleKinect and am going to look into it. Not able to find NI-Mate v1. The project has potentially changed a bit. In meeting with the choreographer, this was shown to me as the latest idea. It seems canned? Possibly using tracking, but curious about all of your opinions, as it is the evolution of my initial question. Trying to learn what Izzy is capable of before going too far down rabbit holes.
Look at the 2:20 mark specifically of this [video](https://www.youtube.com/watch?v=-wVq41Bi2yE) -
Nice effect. Doubt it's canned if AEF were involved...
I'll have a ponder. At the moment my mind's heading off to MagicMusicVisuals if I'm honest, converting the OSC to MIDI to use as a deformer on an interactive GLSL shader patch. Finding / authoring a shader patch to give the visual effect you want, however, would be the domain of... well... a GLSL shader authorer type d00d. Beyond me... I just swipe ones freely available online. (Is Izzy capable of this yet, or do we have a timeline for it @Skulpture?) -
Marci... you have sufficiently lost me. I do not know what you are talking about, sorry. "GLSL shader authorer type d00d." I have heard of OpenGL, and understand basically what a shader is after looking it up. Wonder if Max/MSP/Jitter might be useful here. How does Izzy interact with Max?
-
Just thought I would wade in with a historical reference that might just trigger some other lines of thought (admitting I need to go back thoroughly through the thread).
In the early 1990s I used an Amiga-based system called Mandala. It was based on an 8-bit video input card, 'Live'. The Mandala software achieved very fast and reliable collision detection by allowing one to assign certain of the bit planes to the incoming video signal and others to graphics in one's environment. When different bitplanes collided in the same pixel space, a range of functions could be triggered. It included gravity and attachment to left and right X coordinates. Surprisingly useful: a performer could grab looping anims, move them around, then throw them off with a flick of the wrist. There are many videos of performances and installations I did back in the day on my site.

One of the first things I did when I came to Isadora in 2009 was try, unsuccessfully, to recreate facsimiles of some of that work of mine. I use Kinect, Leap, Infusion's I-CubeX and a bunch of other sensing devices. Like vanakaru, I am not overly fond of Kinect and find video camera solutions more reliable out of the studio. I think there would be a strong place for a collision detection actor as described by Marci. I too dream a dream! -
I agree. What has always baffled me is this:
There are ways of using javascript and other advanced calculations to detect two flat edges, and when they hit each other they bounce off each other. This can be done using sprites and envelope generators and inside-range floats, etc.

But with a Kinect the edges are of course not just flat; they move and change quite rapidly - as well as having many edges. No idea where to even start with that!