[ANSWERED] Immersive installations+Isadora
-
Hello everyone! I wanted to ask about your experience with, and thoughts on, creating immersive/interactive experiences with Isadora like those of these artists, for example:
Adrien M & Claire B https://www.am-cb.net/en
Jen Stark https://cascadeshow.com
What other software or programming languages do you think are involved, and what hardware?
Thank you so much!
Maxi-RIL -
For the augmented reality (AR) work I saw on their website, I believe you need to be able to write applications for a smartphone/tablet, and it looks like they either bought or rented a ton of tablets (which gets expensive quickly).
Other observations:
- Infrared tracking (such as using the prop vacuum to affect a particle system projected from above) was heavily used in the works shown on both websites. It's one of the only ways to track multiple bodies in a space regardless of what you're projecting onto and around those bodies, while also remaining unaffected by both bright and low-light conditions. (A rough sketch of this kind of blob tracking follows this list.)
- Edge-blending of multiple projectors and projection mapping (projecting onto the cube, onto the floor and up the walls, and covering large areas)
- Projectors above commercial grade that are capable of top-down projection (projecting straight down onto the floor is something some commercial-grade projectors aren't designed to do, so they can overheat and you may end up with several different problems)
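To make the infrared tracking idea concrete, here's a minimal sketch (my own, in Python with OpenCV, not something from the works above) of the basic blob-tracking loop: threshold a camera feed where bodies show up bright against a dark background, find the blobs, and read off their positions. Isadora's Eyes++ actor does this kind of blob detection for you; the sketch is just to show what's happening under the hood.

```python
# Minimal multi-blob tracking sketch (assumes OpenCV 4.x: pip install opencv-python).
# Works on any feed where bodies appear bright against a dark background,
# e.g. an IR camera with IR floodlights, unaffected by the projected content.
import cv2

cap = cv2.VideoCapture(0)              # 0 = default camera; swap in a video file path to test
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Threshold so only the brightly lit bodies survive; tune 60 to your lighting
    _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) < 500:   # ignore noise specks
            continue
        x, y, w, h = cv2.boundingRect(c)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) == 27:           # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```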
Sadly, the most important components for large-scale, interactive installations like these are money (for equipment, technicians, space rental, etc.) and time (which is also basically money: during the several months it takes to make large-scale installations work, you still need to pay bills and rent, buy groceries so you can eat, etc.)
This may not be applicable in your case, but for anyone else reading this who might find this advice helpful: You can make some neat interactive installations, or test out the basic concepts for larger installations, with one or two smaller projectors, a laptop (which has an RGB camera, a built-in display, speakers, and a microphone), and (optionally) a depth camera like a Kinect. Learn how to do body-tracking with IR, create interactive content, perform edge-blending & projection mapping on weird surfaces, and refine your ideas/experiment with techniques on a small scale before trying to go huge.
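Since edge-blending comes up a lot in this kind of work: the core of it is just a gamma-corrected cross-fade across the projector overlap, so the summed light stays even across the seam. Here is a rough sketch of that ramp (my own, assuming NumPy; Isadora can do the blending for you, this is only to show the math):

```python
# Sketch of a gamma-corrected edge-blend ramp (assumes NumPy).
# In the overlap zone, projector A fades out while projector B fades in,
# so that the *light* they add together stays constant across the seam.
import numpy as np

def blend_ramps(overlap_px: int, gamma: float = 2.2):
    """Per-pixel multipliers for the two projectors across their shared overlap."""
    t = np.linspace(1.0, 0.0, overlap_px)        # A's share of the light: 1 -> 0
    ramp_a = t ** (1.0 / gamma)                  # pre-compensate for projector gamma
    ramp_b = (1.0 - t) ** (1.0 / gamma)          # B's ramp is the complement
    return ramp_a, ramp_b

a, b = blend_ramps(200)
# Sanity check: in light space (value ** gamma) the two ramps always sum to 1.0
assert np.allclose(a ** 2.2 + b ** 2.2, 1.0)
```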
-
Thank you @Woland for your detailed response!! ...and for the suggestion to start with something smaller.
Big hug,
Maxi RIL
PS: I will surely come back with more questions haha -
@ril We just saw a pretty big show in Montreal. It was a destination for our visit last weekend. There were three galleries, one huge and two smaller. All of them used all four walls and the floor. One of them had motion tracking on the floor and the animation responded to the position of gallery participants. The visual and projection design was gorgeous as was the sound. The animation was a bit rudimentary, but the overall experience was really spectacular. https://oasis.im/en/dreaming-a...
I wondered if they did this all in Isadora or TouchDesigner or...
-
@jtsteph thanks for sharing!
This immersive installation is amazing... It would be great to know and learn more about this procedure (the motion tracking on the floor) and also about the hardware.
It is evident that this is cutting edge, and Isadora should somehow be part of it... also with reactive visuals.
What do you think?
Best,
Maxi RIL
-
@jtsteph said:
@ril We just saw a pretty big show in Montreal. It was a destination for our visit last weekend. There were three galleries, one huge and two smaller. All of them used all four walls and the floor. One of them had motion tracking on the floor and the animation responded to the position of gallery participants. The visual and projection design was gorgeous as was the sound. The animation was a bit rudimentary, but the overall experience was really spectacular. https://oasis.im/en/dreaming-a...
Looking at that the first thing I think of is how weird the content must look as a video or picture file on a computer. It'd be such a strange resolution and shape. Very cool though!
-
@jtsteph said:
I wondered if they did this all in isadora or touch designer or....
You could contact them and ask. They might tell you.
-
@ril Adrien M & Claire B use software they write themselves. They released a version of it called eMotion a long time ago; I guess they continue working with their own tools.
I think you can get really close to making things like this with little or no investment. You can find or make some videos that simulate tracking camera feeds. This can be done in After Effects or Blender (Blender is great because you can build a scene and render out a few different viewpoints to simulate different cameras). You just need a moving image/object/body in white on a black background. That clean separation can be tough to achieve in real circumstances when you do tracking, and that is one of the costs.
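If you'd rather not open After Effects or Blender at all, a few lines of Python/OpenCV can generate the same kind of fake feed. This is just a sketch of the idea (my own, not part of any of these works): white bodies drifting over black, with crossing paths so the tracker has something to struggle with.

```python
# Quick-and-dirty alternative to After Effects/Blender for making a fake tracking feed
# (assumes OpenCV + NumPy; two white "bodies" drift over a black 640x480 frame).
import cv2
import numpy as np

W, H, FPS, SECONDS = 640, 480, 30, 20
out = cv2.VideoWriter("fake_tracking_feed.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"), FPS, (W, H))

for i in range(FPS * SECONDS):
    frame = np.zeros((H, W, 3), dtype=np.uint8)
    t = i / (FPS * SECONDS)
    # Two bodies on crossing paths, so you can test whether your tracker keeps them apart
    x1, y1 = int(W * t), int(H * (0.3 + 0.2 * np.sin(t * 6.28)))
    x2, y2 = int(W * (1 - t)), int(H * 0.6)
    cv2.circle(frame, (x1, y1), 25, (255, 255, 255), -1)
    cv2.circle(frame, (x2, y2), 25, (255, 255, 255), -1)
    out.write(frame)

out.release()
```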
You can then arrange image planes in 3D in Isadora to simulate a real installation. This will let you go quite far in learning how to deal with tracking and make your visuals react, and you will also have your outputs ready to feed to projectors.
Making this kind of simulation is very common; other software provides this sort of WYSIWYG preview by default. Once you have a good simulation it is much easier to get support for a project so you can move on with production. It's not a great idea to work with all the expensive gear from the start anyway.
Once your simulation and outputs are running, you can look at replacing the fake tracking files with real camera input.
Here are some recordings I made with a Kinect for tracking practice. One challenge is seeing whether you can keep bodies properly identified even when their paths cross or pass close to each other. They are also useful for learning to make reactive content from input data.
https://www.dropbox.com/sh/clssqyxk7m0n0b5/AAADxVY0mVuuBFqKxCWGTAr6a?dl=0
There are versions at two resolutions, but the footage comes from the Kinect depth map, so it is natively 640×480. The high-res version is good for checking system capability if you plan to use higher-resolution cameras for smoother detail.
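For what it's worth, the crossing-paths problem boils down to keeping a persistent ID on each blob from frame to frame. Here is a tiny nearest-neighbour sketch of that idea (my own illustration in Python, not how Eyes++ works internally), which you could feed with the centroids from the tracking loop earlier in the thread:

```python
# Very small sketch of keeping persistent IDs on blobs between frames
# (nearest-neighbour matching; tracks that find no nearby blob are simply dropped).
import math

class BlobTracker:
    def __init__(self, max_jump: float = 80.0):
        self.max_jump = max_jump      # max pixels a body can move between frames
        self.tracks = {}              # id -> (x, y)
        self.next_id = 0

    def update(self, centroids):
        """centroids: list of (x, y) for the current frame; returns {id: (x, y)}."""
        assigned = {}
        unmatched = list(centroids)
        for tid, (px, py) in self.tracks.items():
            if not unmatched:
                break
            # Claim the nearest new centroid if it is close enough to be the same body
            nearest = min(unmatched, key=lambda c: math.hypot(c[0] - px, c[1] - py))
            if math.hypot(nearest[0] - px, nearest[1] - py) <= self.max_jump:
                assigned[tid] = nearest
                unmatched.remove(nearest)
        for c in unmatched:           # anything left is a new body entering the space
            assigned[self.next_id] = c
            self.next_id += 1
        self.tracks = assigned
        return assigned
```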
Fred
-
Thanks Fred for all your insights!
Your overhead recordings with the Kinect are tremendously helpful!
Clearly the question of interactivity/reactivity is one of the most difficult points; does it involve other software besides Isadora? The other situation I don't know how to deal with (and what elements would be required) is this: how do you make any wall reactive? Is it through shadow tracking? Is it possible with Kinects in an overhead position?
Several people have already asked me about this type of immersive experience and I don't have answers yet. It is clearly a trend in live spectacle, with growing demand.
I feel quite lost, really. Big hug,
Maxi-RIL -
@ril You can do some pretty good work with Isadora. Start playing with these videos and see if you can make some visuals that react to them using Eyes++. Then comes actually tracking the moving objects you want to track. In my humble opinion, this is best done with an IR camera and IR lighting (ideally with an IR bandpass filter). Kinects are not great to use on stage; use a camera with SDI out, or HDMI out converted to SDI. A camera with interchangeable lenses is best, so you can adapt it to each space as required.
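One more note: if the tracking ends up happening outside Isadora (a Kinect utility, a Python script like the sketches above, etc.), OSC is the usual bridge for getting the data back in. A minimal sketch, assuming the python-osc package and that Isadora is listening for OSC on port 1234 (check and adjust the listening port in Isadora's preferences):

```python
# Send normalised blob positions to Isadora over OSC
# (assumes: pip install python-osc, and Isadora listening for OSC on port 1234).
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 1234)   # same machine; use Isadora host's IP otherwise

def send_blob(blob_id: int, x: float, y: float, width: int, height: int):
    """Send one blob's centroid, normalised to 0.0-1.0, to an OSC address of your choosing."""
    client.send_message(f"/tracking/blob/{blob_id}", [x / width, y / height])

# e.g. inside the earlier tracking loop, for a 640x480 feed:
# send_blob(0, cx, cy, 640, 480)
```

Inside Isadora you would then pick these values up with OSC listener actors and map them onto whatever parameters your visuals need.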
After you have played with some of these videos and made some interactive material (you can edit the videos to create a set of scenes for a better kind of simulation), just imagine what you want to do.
What is the format of the projection - wall, floor, transparent layers? What do you want the visuals to be? What do you want them to react to?
Once you have a concrete scenario, you can start to solve the issues or ask specific questions one by one.
-
Thank you very much, Fred! I agree that step by step is the best way to learn, and small-scale simulation is an excellent way to better understand the sensing and processing involved in creating visuals.
Big hug,
Maxi-RIL