
Audio Features for Isadora: What Do You Want?



  • @bonemap said:

    An eight channel file might be in the format of eight stereo channels and may then require panning to isolate the monaural tracks...

    First of all, let's make it clear that AIFF and WAVE files do not themselves support the notion of "stereo" channels. A mono file has one channel, a stereo file has two channels, a quadraphonic file has four channels, etc. Your example above would end up being expressed as 16 individual channels in those file formats.

    It is true that QuickTime movies support the notion of multiple tracks, and each track can have an arbitrary number of channels. (None of the Windows formats support this idea as far as I know.) My proposal for a movie with eight stereo tracks is that we would view them like the AIFF files: as 16 individual channels. Then you can route anything anywhere you want.

    For example, here are eight mono tracks routed down to stereo. Here the panning would be clear, because you end up with two outputs.


    Or a different routing, where all eight channels are being routed to all eight outputs. What does panning mean in this situation?
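    Both routings above can be pictured as a gain matrix applied to the file's channels. This is a hypothetical sketch (not Isadora's actual audio engine), using NumPy for the matrix math:

```python
import numpy as np

# Hypothetical sketch (not Isadora's engine): treat a file's audio as
# N discrete channels and route it through a gain matrix.
def route(frames: np.ndarray, matrix: np.ndarray) -> np.ndarray:
    """frames: (samples, n_in); matrix: (n_out, n_in) of linear gains."""
    return frames @ matrix.T

# Eight mono channels folded down to stereo: odd channels to the left
# output, even channels to the right, each at equal gain.
downmix = np.zeros((2, 8))
downmix[0, 0::2] = 0.25   # channels 1, 3, 5, 7 -> left
downmix[1, 1::2] = 0.25   # channels 2, 4, 6, 8 -> right

# "All eight channels to all eight outputs" is just the identity matrix.
passthrough = np.eye(8)

frames = np.ones((4, 8))             # four frames, every channel at 1.0
stereo = route(frames, downmix)      # shape (4, 2)
direct = route(frames, passthrough)  # unchanged, shape (4, 8)
```

    In the stereo case a pan control just reweights the two output rows; in the pass-through case there is no obvious pair of rows to reweight, which is exactly the ambiguity raised above.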


    Best Wishes,
    Mark


  • Beta Gold

    I think panning is only useful in a 2-speaker setup. Once you get to multichannel output, directions go in 3D. I still think the modular way is the most flexible. Suppose you have a background running in 5.1 with 6 discrete WAV channels, but you want to pan a live input 360° over it.

    Input blocks, mix/routing/panning blocks and output blocks.


  • Beta Platinum

    @mark said:

    Your example above would end up being expressed as 16 individual channels in those file formats.

    Thanks for correcting that - these posts are not user editable. The intended comment was four stereo pairs becoming an 8-channel file.

    If stereo pairs are going to be irrelevant to the ‘sound player’ then stereo panning is irrelevant too, I would have thought.

    Best wishes

    Russell



  • I would think that panning is not a necessary feature to be included inside actors. Presumably a panning effect can be achieved by patching something together that combines the matrix with separate level controls for each channel anyway. If the way you have approached sound routing doesn't ever mention Left and Right, it doesn't limit your setup to pan-able stereo.

    Ableton Live has a feature where you can assign A or B labels to different tracks and use a crossfader between the two groups; unassigned tracks are unaffected.
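    The A/B crossfader idea is easy to sketch outside of Live. A minimal illustration, assuming an equal-power fade law (the function name and the pass-through rule for unassigned tracks are my own, not Ableton's API):

```python
import math

def crossfade_gain(assignment: str, fader: float) -> float:
    """Equal-power gain for one track given its A/B assignment.
    fader: 0.0 = full A, 1.0 = full B. Unassigned tracks pass through."""
    if assignment == "A":
        return math.cos(fader * math.pi / 2)
    if assignment == "B":
        return math.sin(fader * math.pi / 2)
    return 1.0  # unassigned: unaffected by the crossfader

# At the midpoint both groups sit near 0.7071 (about -3 dB),
# so their combined power stays constant across the fade.
mid_a = crossfade_gain("A", 0.5)
mid_b = crossfade_gain("B", 0.5)
```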



  • @mark panning for more than 2 tracks is pretty irrelevant without some kind of spatial audio engine and an idea of speaker locations. Systems like Spat, which allow for that, understand the locations of the speakers and use something parallel to ray casting to calculate what a rotated multichannel sound would sound like from each speaker in a multi-speaker environment. Without all this extra data, panning is irrelevant. With individual volume controls for each channel, sounds can be rebalanced to suit a speaker setup, re-routed for mismatched channel mappings, or, where the multichannel file carries sub-mixes or headphone feeds, used to create sends and sub-mixes. This is a pretty big step forward; when serious audio work in a spatial environment needs to be done, other tools are needed.
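    The two operations described above (rebalancing levels and re-routing mismatched channel mappings) are simple to sketch. These helper names are hypothetical, just to make the idea concrete:

```python
def rebalance(frame, gains):
    """Apply an independent linear gain to each channel of one frame."""
    return [s * g for s, g in zip(frame, gains)]

def remap(frame, mapping):
    """Re-route channels for a mismatched speaker layout.
    mapping[i] is the source-channel index that feeds output i."""
    return [frame[src] for src in mapping]

frame = [0.5, 0.5, 0.5, 0.5]
balanced = rebalance(frame, [1.0, 1.0, 0.5, 0.5])  # pull the rears down
swapped = remap([1, 2, 3, 4], [1, 0, 3, 2])        # swap L/R within pairs
```

    Neither operation needs speaker positions, which is why per-channel volume and routing are achievable now while true spatial panning is not.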



  • @mark panning a stereo file also needs a -3 dB dip at the central position. Panning multichannel audio needs something more complex, so at least for this first iteration it can be left out. Just left to 1, 3, 5, 7 and vice versa can be enough.



  • @maximortal said:

    panning a stereo file also needs a -3 dB dip at the central position.

     Yes -- the panning uses the -3 dB "equal power" formula. There are actually a few panning formulas, but that one is common.

    Best Wishes,
    Mark
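    For readers unfamiliar with the "equal power" law mentioned above, a minimal sketch: the pan position is mapped onto a quarter circle so that the squared gains always sum to 1, which puts both channels at roughly -3 dB at the center.

```python
import math

def equal_power_pan(pan: float) -> tuple[float, float]:
    """Equal-power ("-3 dB") pan law.
    pan: -1.0 = hard left, +1.0 = hard right.
    Returns (left_gain, right_gain) with L^2 + R^2 == 1."""
    theta = (pan + 1.0) * math.pi / 4.0   # map [-1, 1] -> [0, pi/2]
    return math.cos(theta), math.sin(theta)

left, right = equal_power_pan(0.0)    # centred: both ~0.7071
center_db = 20 * math.log10(left)     # ~ -3.01 dB
```

    This is only one of the several pan laws Mark alludes to; linear (-6 dB center) and -4.5 dB compromise laws also exist.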



  • @bonemap said:

    If stereo pairs are going to be irrelevant to the ‘sound player’ then stereo panning is irrelevant too, I would have thought.

     Well, if you're outputting to a pair of channels, then I would expect panning to work, and it does. 

    It seems like the general consensus is that this is the only situation I should worry about. If you're outputting to more than two channels, I think the pan input will show as "n/a" to indicate it is not applicable.

    Best Wishes,
    Mark


  • Tech Staff

    @bonemap said:

    these posts are not user editable.

     You don't get these two options by clicking on the three dots at the bottom right of your comments?



  • @woland said:

    You don't get these two options by clicking on the three dots at the bottom right of your comments?

     It's because this thread is in Isadora Announcements -- I think this category imposes some limitations on users in terms of editing. We could move the thread to another category and that would probably solve it.

    Best Wishes,
    Mark



  • Audio and Timeline


    I know this would be a FUTURE request, but for me one of the most important features missing in Isadora is the concept of a timeline and events. Audio and Video are to me obvious ways in which to implement this approach in Izzy.

    I would love to be able to synchronize multiple events (triggers of numerous media, controllers, etc) in exact relationship to TIME.  

    If the audio or video had a correlated grid where one could place multiple events, my live performance creations would progress dramatically with less programming time.

    If any of you recall the Macromedia Director program (long gone), its interface was later absorbed into Flash. That timeline-based software is incredibly powerful but does not have the flexibility and programming possibilities that Izzy has. To me, if this were added to Izzy, it would move Izzy into a new category of usability.

    My 2 cents worth :)


  • Tech Staff

    @kdobbe said:

    one of the most important features missing in Isadora is the concept of timeline and events. 

    Isadora is Scene-based, and while it can do linear cueing, it's not timeline-based linear cueing. In turn this allows for greater flexibility and the possibility of non-linear cueing. "Events", though, can be created with Timer actors, Trigger Delay actors, Clock actors, Comparators, etc.

    @kdobbe said:

    I would love to be able to synchronize multiple events (triggers of numerous media, controllers, etc) in exact relationship to TIME.

    You can build your show to run off of time. It's not a graphical timeline interface, but there's the Timecode Comparator (and the aforementioned actors).
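    The idea of time-driven cues without a graphical timeline can be sketched in plain Python. This is not Isadora code, just an illustration of what the Timecode Comparator pattern does: poll a clock and fire every cue whose timestamp has just been passed.

```python
# Hypothetical cue list: (time in seconds, cue name). Not Isadora syntax.
events = [
    (0.0,  "start background video"),
    (12.5, "trigger lighting cue"),
    (30.0, "fade in live input"),
]

def due_events(events, prev_time, now):
    """Return every cue whose timestamp falls in (prev_time, now].
    Comparing against the previous poll time fires each cue exactly once."""
    return [name for t, name in events if prev_time < t <= now]

# One poll of the clock, e.g. advancing from 10.0 s to 13.0 s:
fired = due_events(events, 10.0, 13.0)
```

    Calling `due_events` once per frame with the current clock value reproduces timeline-style behavior while leaving the Scene-based, non-linear structure intact.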