Wearable inertial motion sensors
-
Hi,
the documentation says you can use a Kinect as one of the options:
https://github.com/YCAMInterlab/RAMDanceToolkit/wiki/Structure-of-RAMDanceToolkit
Best -
@stj Right, one can use any OSC source, including Kinect. However, inertial motion sensors are superior to Kinect in dance applications (range, stability, possibly frame rate). For example, inertial motion sensors can be used reliably up to 100 meters away from the server, they don't get confused by complex movements the dancer makes, and there is no theoretical limit on the number of dancers.
--8 -
This is awesome stuff, but I'm not intelligent enough to get it to work. I've been trying to find out how to get OSC into the RAM Dance Toolkit app, but so far I've failed. It's all so new that I can't even find out where to look for answers.
-
@dbini Please provide more details on your setup. I am going to play with it tomorrow, and I might be able to shed some light if I know what you are doing.
--8 -
RAM Dance Toolkit is absolutely amazing!!!
-
Hi @eight - I'm using NI Mate to generate the OSC data, but can't seem to find a way of getting the OSC into RAM Dance Toolkit. I've tried the RAM-OSCServer, but can't get my head around what I'm supposed to do with it, apart from changing the IP and port address - it seems that it's used for sending OSC rather than routing it into the Toolkit.
Bruno - it's great playing with the recorded Actors - have you managed to get any live data in?
Cheers, John -
@dbini RAM-OSCServer is not going to route OSC; it is simply an OSC playback application (you can find the recordings in one of the folders in the application directory, for example /Applications/RAM-OSCServer_mac-v1_0_0/MotionData-OSCServer/Yoko+Cyril/Ando.21-17_53_53.xml). For a live feed you could in principle use NI-Mate (in which case RAM-OSCServer is out of the picture), sending OSC directly to the RAMDance Toolkit app; however, NI-Mate would need to be taught the proper format. I don't have NI-Mate, so I cannot comment on whether it can be set up with the format that the RAMDance Toolkit app understands (if not, I guess a separate OSC routing app such as OSCulator could be used for that). In a few days I am going to tackle this with my own NI-Mate-like application and should be able to figure out the format -- I intend to look at the RAMDance Toolkit's source code.
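To make the format question concrete, here is a rough openFrameworks/ofxOsc sketch of what sending a single frame to the Toolkit might look like. The /ram/skeleton address, the port, and the per-joint argument layout are placeholders until I have checked the wiki's OSC spec and the source, so treat this as a sketch rather than a reference.

```cpp
// Hypothetical sketch: send one skeleton frame to RAM Dance Toolkit via ofxOsc.
// The address "/ram/skeleton", the port 10000, and the per-joint argument
// layout are assumptions -- verify against the wiki's OSC spec and the source.
#include "ofMain.h"
#include "ofxOsc.h"

class ofApp : public ofBaseApp {
public:
    ofxOscSender sender;

    void setup() {
        sender.setup("127.0.0.1", 10000);   // host/port the Toolkit listens on (assumed)
    }

    void update() {
        ofxOscMessage m;
        m.setAddress("/ram/skeleton");      // assumed address
        m.addStringArg("MyActor");          // actor name shown in the Toolkit
        m.addIntArg(1);                     // number of joints in this frame
        // one joint: name, position, then whatever orientation args the spec wants
        m.addStringArg("HEAD");
        m.addFloatArg(0.0f);                // x -- units are another thing to check
        m.addFloatArg(170.0f);              // y
        m.addFloatArg(0.0f);                // z
        sender.sendMessage(m);
    }
};

int main() {
    ofSetupOpenGL(320, 240, OF_WINDOW);
    ofRunApp(new ofApp());
}
```

If the Toolkit really does pick up an actor as soon as a well-formed message arrives, a loop like update() running at 30-60 fps should be enough to see something move.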
--8 -
Thanks @eight - awesome - that's what I suspected. I'll have a go at linking it through OSCulator, or even through Isadora??
My main problem is that there are no options for setting up the live feed in the Toolkit - only for loading prerecorded Actors. I'm hoping that if I manage to get an OSC stream formatted properly (there are details in the wiki somewhere) then it might just magically appear in the Toolkit.
j -
.... I was considering having a look at OpenNI - I think that might be an answer - but it's looking too complex for my little brain. I'm an artist, not a coder. That's why I use Izzy. The visual nature of patching in Izzy makes total sense to me. Using Terminal to install libraries is beyond me.
-
@dbini: RAMDance Toolkit contains an OpenniOSC app which, when compiled, should work out of the box (that is, it gets the data from the Kinect and passes it to RAM Dance directly and in the correct format). I am going to look into whether I can compile it as a standalone, so that no library install is required. I am assuming you are on Mac.
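In the meantime, if you want to see what NI Mate (or any other source) is actually sending before it ever reaches the Toolkit, a tiny ofxOsc listener like the sketch below will print every incoming address. The port 7000 is only an example; use whatever port your sender is pointed at.

```cpp
// Minimal OSC monitor: listens on one port and logs every message address.
// Useful for checking what NI Mate / OSCulator actually outputs.
#include "ofMain.h"
#include "ofxOsc.h"

class ofApp : public ofBaseApp {
public:
    ofxOscReceiver receiver;

    void setup() {
        receiver.setup(7000);               // listening port (example value)
    }

    void update() {
        while (receiver.hasWaitingMessages()) {
            ofxOscMessage m;
            receiver.getNextMessage(&m);    // newer openFrameworks: getNextMessage(m)
            ofLogNotice() << m.getAddress() << " (" << m.getNumArgs() << " args)";
        }
    }
};

int main() {
    ofSetupOpenGL(320, 240, OF_WINDOW);
    ofRunApp(new ofApp());
}
```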
--8 -
That would be nice. I've also been in touch with those nice guys at NI Mate to see if they'll consider writing Toolkit support into their application alongside the existing support for stuff like Blender and Animata.
-
@dbini Attached is an OpenNIOSC app with all the libraries included. It runs, but I cannot test it further than that, since my Kinect went for a walk with friends.
Please try to run it and report what you see. Thanks.
--8
Edit: removed the attachment (it did not see the Kinect). -
Thanks @eight. When I run it I get a blank grey window. I would suspect that it's not working on my machine.
-
@dbini: That's what I am getting when there is no Kinect connected to the computer, and Xcode is telling me that the device is not found. We'll have to wait for my device to come home to test it further.
--8 -
John, I was not looking at the post for a few days. I have not been doing anything more with RAM, waiting to see if a friend who could help would pass by (from 100 km away) on Thursday evening. The only thing I'm able to get is the multiple data streams from RAM OSC into Izzy...
-
@eight I also could only get a grey screen from your file. Anyway, I suppose you need to create a new dance actor in RAM to be able to link it to the incoming OSC control, correct?
-
We compiled the empty example, but it gave us errors in the LIBRAM.a compilation.
We tried to compile LIBRAM itself, but it continued with errors... I wrote to them asking for help... -
After some trial + error I managed to get the examples to compile. Don't forget to patch ofxUI + check out the submodules as detailed at the bottom of the 'how to setup' page.
This is a really nice tool + it would be great to add a Syphon outlet to it, if there's an OF guru out there who's up to coding it? -
@bruper -- I got my Kinect back and see that the app I compiled does not see it, so I am removing that attachment. At the moment I am modifying my own application to send the Kinect skeleton to RAM Dance via OSC. I can see that the actor appears in RAM Dance Toolkit as soon as the signal is received - no need to create new actors. I need to flush out some stability bugs before I can post this app here, though.
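For anyone curious, the shape of the bridge is roughly the sketch below: take whatever joints the tracker reports each frame and pack them into one message. The Joint struct, the joint names, and the /ram/skeleton layout are stand-ins for the real Kinect data and for whatever the wiki and the source actually specify.

```cpp
// Rough sketch of a tracker-to-RAM bridge: pack all tracked joints into one
// OSC message per frame. Joint, the joint names, and the "/ram/skeleton"
// layout are placeholders, not the real Kinect data or the confirmed spec.
#include "ofMain.h"
#include "ofxOsc.h"
#include <string>
#include <vector>

struct Joint {                       // stand-in for a joint from your tracker
    std::string name;
    ofVec3f position;
};

class ofApp : public ofBaseApp {
public:
    ofxOscSender sender;
    std::vector<Joint> joints;       // would be refreshed from the Kinect each frame

    void setup() {
        sender.setup("127.0.0.1", 10000);             // Toolkit host/port (assumed)
        joints = { {"HEAD", ofVec3f(0, 170, 0)},      // dummy data for illustration
                   {"NECK", ofVec3f(0, 150, 0)} };
    }

    void update() {
        ofxOscMessage m;
        m.setAddress("/ram/skeleton");                // assumed address
        m.addStringArg("KinectActor");                // actor name shown in the Toolkit
        m.addIntArg((int)joints.size());
        for (const Joint& j : joints) {
            m.addStringArg(j.name);
            m.addFloatArg(j.position.x);
            m.addFloatArg(j.position.y);
            m.addFloatArg(j.position.z);
            // ...plus whatever orientation arguments the spec requires
        }
        sender.sendMessage(m);
    }
};

int main() {
    ofSetupOpenGL(320, 240, OF_WINDOW);
    ofRunApp(new ofApp());
}
```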
--8 -
@eight this is great news!! Looking forward to it.