[ANSWERED] Track rotation
-
There will be some maths you can apply to a bunch of actors to achieve this. A Value Delay Line into a Comparator is a good way to compare a value with what it was a few frames ago, so a few of these can be used to set up a logic system that tracks rotation. @bonemap has worked with rotation of 3D models in Isadora; he might be able to help out.
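In code, the idea looks something like this - a minimal Python sketch of the delay-into-comparator logic, where the delay length and deadband are arbitrary values to tune, not Isadora defaults:

```python
# Sketch of the Value Delay Line -> Comparator idea: compare a tracked
# value (e.g. a blob's x position) against what it was a few frames ago
# and report the direction of change.
from collections import deque

DELAY_FRAMES = 5      # how far back to compare (tune to taste)
DEADBAND = 0.02       # ignore jitter smaller than this

history = deque(maxlen=DELAY_FRAMES)

def direction(value: float) -> int:
    """Return +1, -1 or 0 for the change over the delay window."""
    if len(history) == history.maxlen:
        delta = value - history[0]   # history[0] is the oldest sample
        history.append(value)
        if delta > DEADBAND:
            return 1
        if delta < -DEADBAND:
            return -1
        return 0
    history.append(value)            # still filling the delay line
    return 0
```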
-
@gaspar said:
a "simple" way to track the rotation of an object
Hi,
I have a prototype wireless motion tracker that uses IMU sensors, worn by a dancer or integrated into a set or costume, to provide rotation data as an OSC stream. But do you mean from a video source, or as live interaction? One video-based method is tracking with a depth camera.
Best Wishes
Russell
-
Personally, I'd say the same for spinning movement. Use a wireless sensor attached to a spot that moves with the joint. If, for example, you want whole-body rotation, attach it to the dancer's back. A gyroscope breakout and a NodeMCU can go a long way. Sending the data out as OSC lets you quickly build a working prototype in Isadora with basically zero issues, since the sensor is physically attached to the dancer's body; if you calibrate it correctly, it will read the dancer's rotation in degrees.
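On the receiving side, here is a rough Python sketch of what you would do with that stream - note that the /gyro address and the fixed 50 Hz send rate are my assumptions for illustration; your firmware defines the real ones:

```python
# Hypothetical receiver for the NodeMCU + gyro idea: integrate the yaw
# rate (degrees per second) into an absolute heading. Assumes the sensor
# sends on "/gyro" at a steady 50 Hz; adjust both to match your firmware.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

SEND_RATE_HZ = 50.0   # assumed sensor output rate
heading = 0.0         # integrated yaw in degrees

def on_gyro(address, z_rate_dps):
    """Accumulate the gyro's yaw rate into a 0-360 degree heading."""
    global heading
    heading = (heading + z_rate_dps / SEND_RATE_HZ) % 360.0
    print(f"heading: {heading:6.1f} deg")

dispatcher = Dispatcher()
dispatcher.map("/gyro", on_gyro)
BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher).serve_forever()
```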
Regarding body tracking with a depth camera: let me see whether I can get spinning-motion detection working with the Kinect I have lying around the office, since I honestly don't know whether a spinning motion will upset the tracking plugin or just return a lot of noise.
-
@juriaan said:
Use a wireless sensor attached to a spot that moves with the joint. If, for example, you want whole-body rotation, attach it to the dancer's back. A gyroscope breakout and a NodeMCU can go a long way. Sending the data out as OSC lets you quickly build a working prototype in Isadora with basically zero issues, since the sensor is physically attached to the dancer's body; if you calibrate it correctly, it will read the dancer's rotation in degrees.
You can use a smartphone sewn into a costume (snug pocket on their back, sew into a leg/arm warmer/etc.) + TouchOSC to get x, y, z accelerometer data as OSC over a wireless network. You'll still need math and logic to figure out which way they're spinning, but it's a start that gets you the data about their movement with extremely accessible (and theoretically very reliable) hardware and software that's cheap(ish), quick, dirty, unaffected by light levels, wireless, and doesn't involve any sort of camera calibration.
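As a starting point for that math and logic, a minimal Python receiver might look like this - classic TouchOSC sends three floats on /accxyz when its accelerometer option is enabled (check your version and port settings):

```python
# Minimal sketch: receive TouchOSC's accelerometer stream and smooth it.
# Working out spin direction from these values is the hard part mentioned
# above; this only gets the data flowing.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

SMOOTH = 0.9          # exponential smoothing factor (0 = raw, 1 = frozen)
state = {"x": 0.0, "y": 0.0, "z": 0.0}

def on_acc(address, x, y, z):
    """Smooth and log the three accelerometer axes."""
    for axis, value in zip("xyz", (x, y, z)):
        state[axis] = SMOOTH * state[axis] + (1.0 - SMOOTH) * value
    print("acc: {x:+.2f} {y:+.2f} {z:+.2f}".format(**state))

dispatcher = Dispatcher()
dispatcher.map("/accxyz", on_acc)
BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher).serve_forever()
```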
-
Hi,
Here is a link to the DIY instructions for a wearable sensor that we make. It gets built into gloves, costumes and mobile props during live performances and installations. The components are dirt cheap, so it is viable to build them in multiples.
https://reprage.com/post/DanceDanceSense
The sensors are similar to those found in mobile/handheld devices, so they are good for localised rotations, i.e. when the subject moves the body parts fitted with the sensor, but they do not provide tracking for travel through space.
We are in the process of developing a new version of this with an upgraded design that incorporates an IMU sensor.
Best wishes
Russell
-
Hi and thank you everyone.
I do have an adapted Arduino novecento (made by HANGAR.ORG) with a WiFly module and an accelerometer attached, which I used back in 2013. I've also used smartphones with the very simple andOSC app (love it). But this time I was looking to do it without any sensor other than the camera. I once saw TRINITY, by Oscar Sol. AFAIK he only used one overhead camera and a projector (plus strong infrared wash lights and an IR-reflecting costume on the dancer). He did the magic with Max/MSP (the music was made by someone else with other systems). I could not find any video where the rotation analysis is evident, but I remember feeling that it was there (maybe I'm wrong). Anyway, here's the link.
@dbini Yes, differential analysis is for sure the way to go (thanks for the Value Delay tip); the thing is, which parameter could I feed it?
Maybe you're all right, and I should dust off my old Arduino and just add it to the data stream. I was just hoping it could be done without one.
-
Heiiii Russell, thanks for the link! What are your thoughts on improving it? My students and I would happily join in to help... Best wishes, Tom
-
If you get the lighting right, and your dancer is indeed rotating about the vertical axis of the image produced by the overhead camera, you can split the video feed into two halves and use Eyes++ to blob-track the movement in each half. If the left half produces movement that travels down the screen whilst the right half produces upward movement, you can safely say that the dancer is rotating anticlockwise.
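The comparison itself is simple once Eyes++ gives you the two vertical positions - sketched here in Python, assuming image-style coordinates where y grows down the screen:

```python
# Classify rotation from the vertical blob motion in each half-frame:
# left half moving down while right half moves up = anticlockwise,
# and vice versa. Assumes y increases down the screen.
prev = {"left": None, "right": None}

def spin(left_y: float, right_y: float) -> str:
    """Compare this frame's blob heights with the previous frame's."""
    result = "none"
    if prev["left"] is not None:
        dl = left_y - prev["left"]    # positive = moving down screen
        dr = right_y - prev["right"]
        if dl > 0 > dr:
            result = "anticlockwise"
        elif dl < 0 < dr:
            result = "clockwise"
    prev["left"], prev["right"] = left_y, right_y
    return result
```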
-
@tomthebom said:
What are your thoughts on improving it?
Hi Tom,
I intend to replace the accelerometer with an IMU that includes an accelerometer, gyro and magnetometer, which I anticipate will provide a greater level of information and accuracy about the body movement.
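For the curious, the gain is roughly this: the gyro gives smooth short-term rotation but drifts, the magnetometer gives an absolute but noisy heading, and blending the two keeps both strengths. A generic complementary filter, sketched in Python purely for illustration (not our actual firmware):

```python
ALPHA = 0.98   # trust in the integrated gyro vs. the magnetometer

def fuse(heading: float, gyro_z_dps: float, mag_heading: float,
         dt: float) -> float:
    """One step of a complementary filter for yaw, in degrees."""
    # short-term: integrate the gyro's yaw rate
    heading = (heading + gyro_z_dps * dt) % 360.0
    # long-term: pull gently toward the magnetometer's absolute heading,
    # using the shortest signed angular error to handle the 0/360 wrap
    error = ((mag_heading - heading + 180.0) % 360.0) - 180.0
    return (heading + (1.0 - ALPHA) * error) % 360.0
```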
I have all the components now and will start to build them soon.
Best Wishes
Russell
-
@dbini Thanks a lot for that idea. I would need to split the window dynamically, since the dancer might turn anywhere on stage. Maybe I could first find the overall blob centre and use it to place the split, then have several Eyes++ instances running on different virtual stages. I'll have to try it out some day. Unfortunately I'm not very constant in my dedication to digging into Izzy.
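Sketched in Python just to pin the idea down (everything here is hypothetical, with stage x normalised to 0-1):

```python
# One Eyes++ instance finds the whole-body blob centre; its x position
# places the dividing line for the two half-crops that the left/right
# trackers watch. Widths and coordinates are illustrative.

def split_crops(body_cx: float, body_width: float):
    """Return (x_min, x_max) for the left and right crops, centred
    on the body rather than on the middle of the stage."""
    half = body_width / 2.0
    left = (max(0.0, body_cx - half), body_cx)
    right = (body_cx, min(1.0, body_cx + half))
    return left, right

# e.g. a dancer centred at x = 0.7 and roughly 0.3 wide:
print(split_crops(0.7, 0.3))   # ((0.55, 0.7), (0.7, 0.85))
```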