Speed min/max in 2.1 Movie Player: only assignable in the range -2/2?
-
The Movie Player (GPU) Speed parameter seems to be capped at double speed forwards and backwards (-2/2). It won't accept -10/10 like the Classic Movie Player, whether the media is optimized as Performance or Interactive. Is this a bug or a real limit? cheers m
-
I believe this is a limitation of AV Foundation. QuickTime just isn't a reliable, supported framework anymore.
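For what it's worth, AV Foundation exposes per-item capability flags for exactly this. Here's a minimal Swift sketch (the file path is a placeholder, and whether Isadora actually derives its speed range from these flags is my assumption):

```swift
import AVFoundation

// Each AVPlayerItem reports which playback rates its media can sustain.
// Note: the flags only become meaningful once the item is ready to play.
let item = AVPlayerItem(url: URL(fileURLWithPath: "/path/to/movie.mov"))

print(item.canPlaySlowForward)  // rates between 0.0 and 1.0
print(item.canPlayFastForward)  // rates greater than 1.0
print(item.canPlayReverse)      // a rate of -1.0
print(item.canPlaySlowReverse)  // rates between -1.0 and 0.0
print(item.canPlayFastReverse)  // rates less than -1.0
```

If the media reports false for the fast or reverse flags, rates like -10/10 can't be sustained reliably, which would explain a conservative -2/2 cap.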
Try using a classic movie player. :)
-
hmm. I'm remaking a multi-mixer sequencer patch (the foundation is DusX's sequencer, thanks DusX!) and swapping out the old CPU actors for GPU ones. It runs amazingly well and can handle much more stacked interactivity (running Interactive, not Performance, so using QT). However, the speed and Zoomer controls (both size and position within the frame) are used a lot in the show. The FFGL PanSpinZoom is also more limited in its parameters and cannot be set higher than 100 (which seems to perceptually equal 200%, nothing approaching the original Zoomer actor), so I'm using GPUtoCPU -> Zoomer -> CPUtoGPU inline to retain the zoom capabilities required.

But there's no modular CPU actor that handles speed; it's just built into the Movie Players, which cannot accept a video input (I've tried to sketch why below). So I can't figure out how to access this live without sacrificing all the usability. All the other interactive elements, especially jumping and scratching backwards and forwards, work so much more effectively that I can't see going back to the CPU version.

I get that QT is so far being retained in the new versions only because AVF does not work well interactively and a lot of the Izzy community uses those FX, and it sounds like no new QT-based interactive components will be forthcoming. But since the Movie Player can still switch between QT and AVF to fulfill interactive capabilities, do you think there's a possibility of retaining the GPU Movie Player's speed-change potential when in 'interactive' mode? ...or a speed actor that can be used inline without changing the engine for the whole patch?
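To illustrate the difference I mean, here's a rough Swift/AV Foundation sketch of my understanding (not how Isadora is actually implemented; the file path is a placeholder):

```swift
import AVFoundation
import CoreImage

// Zoom is a pure per-frame transform: it only needs the frame it is handed,
// so it can sit anywhere in a chain (which is why an inline
// GPUtoCPU -> Zoomer -> CPUtoGPU detour works).
func zoom(_ frame: CIImage, by factor: CGFloat) -> CIImage {
    frame.transformed(by: CGAffineTransform(scaleX: factor, y: factor))
}

// Speed is not a per-frame operation: it has to be set on the object that
// owns the file and the decode clock, which would explain why only the
// Movie Players expose it and no inline actor can.
let player = AVPlayer(url: URL(fileURLWithPath: "/path/to/movie.mov"))
player.rate = 2.0  // changes how fast frames are produced, not how any one frame looks
```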
-
Since you are using @DusX's method, I might ask him to shed some light on this.
In fact, he may have had to swap CPU and GPU actors around for his sequencer. Is this true, Ryan?