Kinect + Isadora Tutorials Now Available
-
Ah cool - in which case I’ll look to reproduce for win. Never really bothered hooking it up to a PC as it was always simpler to get things going on OSX.
-
Just skimmed through the windows sketch - yeah it'll be no bother to port. Will get it done this week.
-
Something I cannot find is the way to change the Processing sketch to start with the camera in RGB mode or IR mode.
I tried with no success to change it in the OSC MultiTransmit of the "Isadora Skeleton Test Mac" supplied in
https://github.com/PatchworkBoy/isadora-kinect-tutorial-files
and in the Processing sketch from 9 days ago as well as today's one. I searched the Processing sketch for something I might be able to change, but no luck... It starts on kCameraImage_Depth = 3.
I would like to try the other kCameraImage modes.
-
Dear @bruper,
Look at this part of the code:

```java
// --------------------------------------------------------------------------------
// CAMERA IMAGE SENT VIA SYPHON
// --------------------------------------------------------------------------------
int kCameraImage_RGB = 1;   // rgb camera image
int kCameraImage_IR = 2;    // infra red camera image
int kCameraImage_Depth = 3; // depth without colored bodies of tracked bodies
int kCameraImage_User = 4;  // depth image with colored bodies of tracked bodies
int kCameraImage_Ghost = 5;

int kCameraImageMode = kCameraImage_IR; // << Set this value to one of the kCameraImage constants above

// for purposes of switching via OSC, we need to launch with
// EITHER kCameraImage_RGB, or kCameraImage_IR
```

You should be able to set it to kCameraImage_RGB to show the RGB channel.
Best,
Mark -
thanks, yes how silly of me.. totally overlooked...
-
@Marci thanks for the fast reply! I will give it a try on Wednesday when I return from Venice, as I don't travel with my Kinect!
-
"tried with no success to change it in the OSC multitrasmit of the "Isadora Skeleton Test Mac" supplied in https://github.com/PatchworkBoy/isadora-kinect-tutorial-files and the Processing sketch of 9 days ago and the today one."
Like I said, if you start in RGB you can't switch to IR. If you start in IR you can't switch to RGB. Limitation of the Kinect hardware, not Processing. Nothing anyone can do about it regardless of what software you're working in.

Both can't be activated and switched between in a Processing sketch. You must choose one to work with. You can have (IR _OR_ RGB) & Depth & User & Ghost. You can't have IR & RGB & Depth & User & Ghost. -
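The launch-mode constraint described above can be sketched as a small helper. This is an illustration, not code from the tutorial sketch: the class and method names are hypothetical, and only the constant values mirror the sketch's kCameraImage constants.

```java
// Illustrative sketch of the Kinect launch-mode rule: whichever of RGB or IR
// the sketch launches with is fixed for the session; Depth, User and Ghost
// can always be switched to at runtime.
public class CameraModeGuard {
    static final int RGB = 1, IR = 2, DEPTH = 3, USER = 4, GHOST = 5;

    // launchMode is assumed to be RGB or IR; returns true if the requested
    // mode can be switched to without relaunching the sketch.
    static boolean canSwitchTo(int launchMode, int requested) {
        if (requested == RGB || requested == IR) {
            return requested == launchMode; // the other colour source is unavailable
        }
        return true; // Depth, User and Ghost are always derivable
    }

    public static void main(String[] args) {
        System.out.println(canSwitchTo(IR, RGB));   // false: launched in IR
        System.out.println(canSwitchTo(IR, DEPTH)); // true
    }
}
```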
The confusion everyone is making here comes from passing the camera images to Isadora at all.

If you're creating and affecting visuals in Isadora, pass OSC data to Isadora, and do everything visual in Isadora. If you're creating visual effects in Processing, just use Processing. Then only pass the Syphon feed to Isadora to integrate that into an Isadora scene. Pass the minimal OSC data as needed to control anything additional you may want to control in Isadora, if anything at all.

The majority of folks dabbling in Kinect at the moment here seem to want to produce visual effects that are done wholly in Processing (i.e.: stuff from openprocessing.org, but substituting hand/Kinect for mouse).

Isadora exists so people don't have to learn code to create visuals. When using Processing, you're just using Isadora as a camera / media player switcher / sequencer, and hardcoding all your visuals and interaction within Processing. You have to abstract how you think about it. Work out what you want to achieve and WHERE to achieve it... -
UPDATES
Have brought the Windows Processing Sketch up to date with the Mac Processing Sketch. Both now identical bar the syphon / spout bits (i.e.: Windows uses Spout, Mac uses Syphon).
Also, both sketches now output all detected skeletons via OSC to the address:

/skeleton/[useridnumber]/[limbpart]

I've updated the Mac Isadora file to reflect this in the stream setup FOR THE 1st SKELETON ONLY! I can't do the Windows Isadora file as I don't have a Windows machine handy, I'm afraid (if someone wants to open up the Windows Izzy file, replicate across the OSC MultiTransmit actor and comments from the Mac file, prefix all the OSC streams in stream setup with /skeleton/1, and attach it here, I'll add that to the git repo also).

To receive multiple skeletons in Isadora, users will need to head to stream setup and add the channels for /skeleton/2/ thru to /skeleton/6/, and then duplicate the various Kinect Point actors and set their channels correctly (again, if anyone wants to do that and send me the files, I'll happily add them to the git repo).

https://github.com/PatchworkBoy/isadora-kinect-tutorial-files -
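The per-skeleton OSC address scheme above can be sketched as simple string construction. This is illustrative only: the limb names shown are placeholder assumptions, not necessarily the exact part names the sketch emits.

```java
// Sketch of the /skeleton/[useridnumber]/[limbpart] addressing scheme:
// one OSC address per tracked skeleton per limb part.
public class SkeletonAddress {
    static String address(int userId, String limbPart) {
        return "/skeleton/" + userId + "/" + limbPart;
    }

    public static void main(String[] args) {
        // Isadora's stream setup would then listen on addresses like these
        System.out.println(address(1, "head"));     // /skeleton/1/head
        System.out.println(address(2, "lefthand")); // /skeleton/2/lefthand
        System.out.println(address(6, "rightfoot")); // /skeleton/6/rightfoot
    }
}
```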
Marci -- when I pull the izzy file (Isadora Skeleton Test Mac.izz) from github it tells me that the file can't open because it was written in a later version... I have 2.1 -- what can I do? I have a mac and just got 2.1 in Nov...
-
download the file from github again... sounds like an interrupted upload, so have just reuploaded it.
-
(For reference, it was written in 2.1)
-
I also have this problem with both the Mac and Windows files, and I don't understand it. I will pm you. -
Well I haven't touched the Windows Izzy file, just downloaded and uploaded to GitHub... the Mac file works for me, and I’ve just redownloaded it from github, and that one works for me also.
Will fire it over momentarily... -
files now in hands of the Izagods!
-
@Marci --
Thank you so much for your AMAZING contributions to the Kinect project! -
No probs - all work I'd already done in Sensadora, an open source CC0-licensed Processing sketch specifically for getting all the various Kinect usage cases into Isadora - not finished yet. Single sketch with options for blobs, skeleton, depth thresholds, pointclouds, gestures, 2D & 3D physics engines, & multiple Kinects. Also with added Leap Motion support - the idea being to allow the obvious Minority Report display interaction with a media library of audio, video, stills and live streams, along with generic OSC, serial, & HTTP outputs for controlling PTZ cameras (VISCA / IP / Galileo) and a link to Domoticz home automation for physical control over mains sockets, inputs from contact sensors & other smart devices.

I got sidetracked by homebridge-edomoticz development, but that's launched & support requests are slowing down now, so I should be able to get back onto it in a week or two. S'one of those ideas that just keeps expanding! Joys of a severely autistic son who demands LOTS of sensory input! The playlist_sketch at https://www.dropbox.com/sh/lyxj60kf0k6jke7/AAAF5MszGamDSuiSP4fgH9t5a?dl=0 is the experimental playground for a lot of it.
-
So there is an issue with downloading individual files from GitHub.

If you save a .izz from a link it will not open. Individual files must be downloaded via the 'view raw' option. Otherwise, downloading the ZIP package is also fine.

For example, right-clicking to save file as, from this link:
https://github.com/PatchworkBoy/isadora-kinect-tutorial-files/tree/master/Isadora_Kinect_Tracking_Mac
will deliver an unusable Isadora file. To get the same file working, you need to click the filename, and then 'view raw' -
Ah, yeah. Standard / by design. GitHub is a version manager / online source viewer, not designed for pulling individual files. Download via the provided Download as Zip button top right of the repo frontpage, via the view as raw / open binary link on individual files, or use any git client (https://desktop.github.com) to properly clone the repo... which will then handle incremental updating etc. within the client, along with the ability to roll back to previous versions, switch to alternative branches (eg: master, beta, stable), allow end-users to submit proposed changes back to the repo, issue tracking etc. Any right-click > save as action will just download the JS/HTML src for the file viewer at a guess (open it in Notepad to verify).
-
I'm sending here an updated version of my very old Isadora Kinect Skeleton Part user actor. (For history: I put it in the ni-mate forum a long time ago while I still used their public beta, before the 1.0. They liked it so much that they offered me a licence of version one when it came out.)
It is essentially what Mark has done for the new tutorial files, but you can see in the actor's output the actual name of the part, which I find useful, and there is an additional input I called osc offset. I added it because the Kinect might not be your ONLY OSC input, and if you have fed Izzy with another device (like an iPad with TouchOSC) BEFORE you send the Kinect data in... you're screwed: the Kinect Point actor won't work unless you change your OSC input inside the actor or inside the Stream Setup window. With my actor you just give the osc offset input the last OSC channel number you already have. The actor will automatically shift all three OSC listener channels inside, so you have the right value for the right body part.
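The osc offset idea above boils down to simple channel arithmetic. A minimal sketch, assuming a channel numbering where other devices already occupy channels 1 through the offset; the class and method names are hypothetical, not from the actual user actor.

```java
// Illustrative sketch of the "osc offset" input: each OSC listener channel
// used inside the Kinect Skeleton Part user actor is shifted past the
// channels already consumed by other OSC devices (e.g. an iPad with TouchOSC).
public class OscOffset {
    // base channel as laid out in the actor, plus the number of
    // channels already taken by other devices
    static int shifted(int baseChannel, int offset) {
        return baseChannel + offset;
    }

    public static void main(String[] args) {
        int offset = 12; // assumption: another device already uses channels 1-12
        // the actor's three internal listener channels would then become:
        System.out.println(shifted(1, offset)); // 13
        System.out.println(shifted(2, offset)); // 14
        System.out.println(shifted(3, offset)); // 15
    }
}
```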