Isadora in action on Windows 10 - some findings
-
Hello all,
I just returned from a performance at Sound Symposium in Newfoundland for which, for the first time, I used Isadora as my video tool. The performance was for a "generative audio/visual group" that I have been a part of for the better part of 15 years. In the past, I have used Resolume, Arkaos, and VDMX. I have been using Isadora for stage stuff and thought to use it for this work as well - the idea being that I could completely customize the workflow to this project and my thinking.
Preamble
Here is a link to a brief documentation of the rehearsal (note: audio and video are not synced in this video):
The group, Faceless Forces of Bigness, has 3-4 musicians using analog electronic instruments set up in a fashion to create generative music. The performers change the generative conditions, which changes the patterns and tonalities of their performance. The concept behind the video is to set up audio-reactive imagery and sequences that respond to each performer's audio and explore the tensions between continuous and discontinuous things (language and nature, digital and analog, words and picture, whole and fractured, etc.).
To accomplish this, Isadora is set up to take 4 live audio inputs using a USB audio interface (I used a Roland Octa-Capture). There are also 4 video tracks. The audio can be routed to modulate various parameters in the video effects of each track: clip time position, scale, x/y position, transparency and hue - all with influence determined by me (the routing idea is sketched just below). There is a toggle to combine the top two tracks using either an (FFGL) ADD50 actor or an FFGLAddAlpha. This is the same for the bottom two tracks - so I have 4 video tracks combining into two projectors:
V1+V2-> Projector 1
V3+V4-> Projector 2
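For anyone curious about the modulation logic, here is a minimal Python sketch of the idea - purely conceptual, outside of Isadora, and with made-up names. Each audio input's level is smoothed and then scaled by a per-mapping "influence" amount before being added to the parameter's base value:

```python
# Conceptual sketch of the audio -> parameter routing (not Isadora code).
# Any of the 4 audio inputs can modulate any video parameter, scaled by
# an "influence" amount that I set per mapping.

def smooth(previous: float, level: float, amount: float = 0.8) -> float:
    """One-pole smoothing so levels don't jitter frame to frame."""
    return previous * amount + level * (1.0 - amount)

class Mapping:
    def __init__(self, audio_input: int, base: float, influence: float):
        self.audio_input = audio_input  # which of the 4 inputs drives this
        self.base = base                # the parameter's resting value
        self.influence = influence      # how strongly the audio pushes it
        self.smoothed = 0.0

    def value(self, levels: list) -> float:
        self.smoothed = smooth(self.smoothed, levels[self.audio_input])
        return self.base + self.smoothed * self.influence

# e.g. audio input 0 modulates track 1's scale around a base of 100%
scale_v1 = Mapping(audio_input=0, base=100.0, influence=40.0)
print(scale_v1.value([0.5, 0.0, 0.0, 0.0]))  # 100 + smoothed(0.5) * 40
```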
I kept the effects as simple as possible. Each track has the possibility of using:
- Transparency
- Inverter
- Shapes (audio reactive - I used these initially to measure levels of each audio input, but they looked so cool, I used them in the performance)
- FFGLPanSpinZoom
- FFGLColorizer
- FFGLMotionBlur
- (FFGL) LoRez
- (FFGL) Slide Glitch
Each of these has a toggle for bypass (a rough sketch of the pattern follows below).
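Since the bypasses turned out to be so important (see the findings below), here is a rough Python sketch of the pattern - hypothetical names, nothing Isadora-specific. Each effect in a track's chain only runs when its bypass toggle is off, so idle effects cost nothing:

```python
# Sketch of a per-track effect chain with bypass toggles (illustrative
# only; in Isadora each bypass is a toggle that routes video around
# the actor).
from dataclasses import dataclass
from typing import Callable

@dataclass
class Effect:
    name: str
    process: Callable[[str], str]  # stand-in for a video-frame transform
    bypassed: bool = True          # everything starts bypassed

def run_chain(frame: str, chain: list) -> str:
    for fx in chain:
        if not fx.bypassed:        # bypassed effects are skipped entirely
            frame = fx.process(frame)
    return frame

chain = [
    Effect("Inverter", lambda f: f + "+invert"),
    Effect("FFGLColorizer", lambda f: f + "+colorize"),
    Effect("FFGLMotionBlur", lambda f: f + "+blur"),
]
chain[1].bypassed = False
print(run_chain("frame", chain))   # -> "frame+colorize"
```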
I ran the show from a 2014 Asus G750JH (32 GB RAM, GTX 780M 4 GB), Windows 10 Pro, Izzy 2.6.1.
Findings
I was initially concerned about being able to work with 4 tracks of video with effects and audio-reactive parameters all running from my (older) laptop. The video footage comes from years of collecting material and, as such, has variable frame rates and pixel dimensions. I did manage to achieve a good solid 60 fps (dipping to 28-29 fps at minimum) through the following choices. Please keep in mind that these choices are likely mostly applicable to Windows:
- footage is all compressed to 640x480 HAP QuickTime. I use an automatic scale of 138 on the projectors to get my 16x9 frame.
- Outputting a stage of 1280x720 over HDMI. I used an Ethernet extender to get the signal to the rear-screen projector - worked like a charm.
- Using QuickTime rather than the Windows native playback. The Windows native playback ate more cycles and was glitchy and unpredictable when looping, using effects, etc. I think the QuickTime implementation is more robust - even on Windows.
- Setting the project target frame rate to 60 fps, and the General Service Tasks to 4x
- Used the third-party FFGL versions of plugins - even the "not recommended" versions from Marc's collected FFGL plugins. I found them to be MUCH faster than the core actors on Windows. I have a Resolume license and pulled in a number of their FFGL plugins to use in Isadora.
- SET UP BYPASSES FOR ALL EFFECTS!! This was a huge one. It is possible to run multiple effects on all channels, but if they are all on at once, it gets choppy.
- Set up a track by track and a global reset. For when things get weird, it's helpful to be able to reset each track or everything.
- I ended up using a lot of "shift" keys on my controllers in order to fit all the functions and parameters for which I wanted control. I have been using the Behringer CMD series up to this point. While I managed well for the performance, I wished for a less fiddly workflow, so I am going to change out controllers for ones with more buttons, knobs etc.
- Using the Data Array actor with text files that contain the MIDI controller values makes it easy to divide the patches into tracks (a minimal parsing sketch follows this list). My main controller has four "tracks" and I used these four tracks as my framework. Each "track" has its own line in the array and contains the controller values for each pot, encoder and button. The very last line contains the values for the "special" buttons and encoders. I indexed the buttons and encoders from the bottom up. I would rearrange this in the future, as the Data Array actor lists from 1 to n, top to bottom. It's a small thing, but it will avoid some serious spaghetti.
- Visible feedback on the controllers (i.e. changing button colors) was helpful. I didn't do a lot of this because of time, but where I did use it, it was super helpful. I will endeavour to add more of this.
- Pots suck for this type of work. If you are using shift keys to access parameters on the same knob, a pot will cause jumps in your parameter values - rotary encoders are the way to go (see the soft-takeover sketch after this list for the usual workaround). I will likely go for the Launchpad to choose bins and a BCR2000 for parameters.
- I did find, when I had a lot going on, that MIDI was not as responsive as I would like. I will need to optimize my patches a bit more and perhaps provide some visible feedback on the controllers (button colors, etc.) for confidence.
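As promised above, here is a rough Python sketch of how I think about the Data Array text files. The file format shown here is hypothetical (one comma-separated line per track, plus a final line for the "special" buttons) - it just illustrates the top-down, 1-to-n indexing that I wish I had used from the start:

```python
# Hypothetical controller-map file: one line per controller "track",
# plus a final line for the "special" buttons/encoders -- mirroring how
# the Data Array actor reads a text file top to bottom, line 1 to n.
controller_map = """\
1,2,3,4,5,6,7,8
9,10,11,12,13,14,15,16
17,18,19,20,21,22,23,24
25,26,27,28,29,30,31,32
40,41,42
"""

lines = controller_map.strip().splitlines()
tracks = [[int(v) for v in line.split(",")] for line in lines[:-1]]
specials = [int(v) for v in lines[-1].split(",")]

# Look up which MIDI CC drives pot 3 on track 2 (both 1-indexed,
# counted top-down).
print(tracks[2 - 1][3 - 1])  # -> 11
print(specials)              # -> [40, 41, 42]
```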
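And on the pot-jump problem mentioned above: the usual workaround, if you are stuck with pots, is "soft takeover" (pickup) logic, where a parameter ignores the pot until the pot passes back through the parameter's current value. A minimal sketch of the idea - my own illustration, not something built into Isadora:

```python
# "Soft takeover": after a shift-layer switch, ignore the pot until it
# catches up with the parameter's stored value, so nothing jumps.

class SoftTakeover:
    def __init__(self, value: int = 64):
        self.value = value        # parameter's current value (0-127)
        self.picked_up = False    # has the pot caught up yet?

    def on_cc(self, cc_value: int) -> int:
        if not self.picked_up and abs(cc_value - self.value) <= 2:
            self.picked_up = True  # pot has reached the stored value
        if self.picked_up:
            self.value = cc_value
        return self.value

param = SoftTakeover(value=100)
for cc in (10, 60, 99, 110):      # turning the pot upward
    print(param.on_cc(cc))        # holds 100 until pickup, then follows
```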
Wrap up
The show was super well received. We packed the house for the show and for the next day's workshop (which I think is a good sign). There was great interest in Isadora as the video tool. Many of the performers have rigs that use Resolume with Live, or VDMX, or Arkaos. The feedback I received was that the video had a strong generative, audio-reactive feel, which many of the attendees at the workshop hoped they could incorporate into their shows. I think that the incredible lack of latency and the ability to route four discrete audio inputs to parameters really add to the feeling of integration. One of our members, John D.S. Adams, who really led the charge to perform at Sound Symposium, won a best-of-festival award for innovative intersection between technology and music!
From my own perspective, working with Isadora this time, I was really excited about the process and the results. I have been aiming for a strong audio-reactive link between the music and picture for some 15 years, and I feel that I finally achieved it with this iteration of our work. The latency was undetectable, and having the ability to route each performer's audio to different parameters added a great deal of visual interest.
It was like painting with moving pictures. Such a pleasure.
I hope that these findings might prove useful to some of the forum members! There are a number of upcoming performance opportunities for us; I will post the development of this work moving forward.
- Justin
-
@jtsteph thank you for taking the time to write such a great review of your project. I have forwarded this to the team.
-
Great to read a detailed account of your project. Congratulations on achieving a high level of success and satisfaction using Isadora. It would be interesting to see more visual documentation of the production. I am very intrigued!
Best wishes
Bonemap
-
Thanks for your feedback @Skulpture and @bonemap !
I recorded the whole output of the show, and we also had a lock-off and did some after-show b-roll with a drone and a DSLR. The guys pulled a 2-channel mix for me, too (the show is performed in quad). I will get a short cut-down together in the coming week or so and post a link here for you. We're applying to some more festivals here in Canada and will be using the documentation for our applications.
I will also have a few specific questions about some math stuff that I will post (on a separate thread) in the coming days as I rebuild things a bit to optimize - specifically about using audio to modulate "around" a parameter (using audio to modulate in both + and - directions). For the show, I mapped the center point of a sine wave to my parameter value (i.e. X Position) and then multiplied the audio against the wave to get a modulation that was both positive and negative. It worked but wasn't great - much too "regular". I think I might need to take a page out of digital audio and work with random noise (i.e. noise shaping). In any event, this is a question for another thread, but I've sketched the rough idea below.
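To make the question concrete, here is a rough Python sketch of the sine-centered approach I used, plus the smoothed-noise variant I am considering - all names and numbers are made up for illustration:

```python
# Bipolar "modulate around a value": the sine approach from the show,
# and the smoothed-noise variant I want to try instead.
import math
import random

def sine_mod(center: float, audio: float, phase: float, depth: float) -> float:
    # What I did: audio scales a sine centered on the parameter value,
    # so it swings both + and - ... but it reads as too "regular".
    return center + audio * math.sin(phase) * depth

class NoiseMod:
    """Audio scales smoothed random noise instead of a sine."""
    def __init__(self, smoothing: float = 0.9):
        self.noise = 0.0
        self.smoothing = smoothing

    def step(self, center: float, audio: float, depth: float) -> float:
        target = random.uniform(-1.0, 1.0)
        self.noise = self.noise * self.smoothing + target * (1 - self.smoothing)
        return center + audio * self.noise * depth

x_position = 0.5  # e.g. a normalized X Position
print(sine_mod(x_position, audio=0.8, phase=1.3, depth=0.2))
mod = NoiseMod()
for _ in range(3):
    print(mod.step(x_position, audio=0.8, depth=0.2))
```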
Best,
_J
-
@jtsteph
I like it. Very much my type of thing.
It's great that you found you could move to Isadora from VJ-specific tools and come out ahead.
Isadora really does allow you to create tools to create art, rather than just put on a show.
I often find there is something in creating a tool, that helps you find the meaning in the content.
Anyway, great work, I look forward to seeing the footage.
-
@DusX You hit the nail on the head, "I often find there is something in creating a tool, that helps you find the meaning in the content." I love Resolume etc for what they do, but they do force you into a "performative framework" that may not be aligned to what you are trying to dive into. With Isadora, I was able to make a tool that suited the collective dynamic environment we have been working on for years...finally. As you put it, it certainly helped me find meaning in the material.
BTW, I just - this evening - picked up a BCR2000 controller (recently discontinued) on Kijiji. I've done a run-through of it, and it internally sorts all the MIDI logic shenanigans that I have been doing through spaghetti shift registers, toggles and gates. This will likely improve my MIDI latency issues.
-
@dusx said:
Isadora really does allow you to create tools to create art, rather than just put on a show. I often find there is something in creating a tool, that helps you find the meaning in the content.
This statement provides a great insight and reflection on the creative process possible with Isadora! It is not always easy to articulate the impact of a particular creative process on personal artistic practice.
best wishes,
bonemap
-
Added a bit to my original post about using the Data Array actor with text files, containing my controller and note values, to map my MIDI controller to parameters. It is a super useful actor that made quick work of mapping the controllers and allowed for flexibility in making changes.