Types of video stream (CI / GPU) and connecting them
-
Thank you DusX for your explanations.
Does the 'Image to Texture' actor exist somewhere? -
Currently in 2.0.5, the way to convert a CI video stream to a GPU stream is to use both the 'Image to Video' actor and the 'Video to Texture' actor.
I believe this process is identical to what an 'Image to Texture' actor would be doing internally. -
Dear @stephanegattoni and @DusX,
First, some clarifications about vid-ci. @DusX is correct when he says that Apple's Core Image video can be either CPU or GPU based. But in Isadora, all CI video should end up on the GPU. (It is possible that some Quartz Composer output may not work this way... but even QC generally ends up on the GPU.)
He is also right to say that the only way to convert a CI video stream to a GPU stream is to use both the 'Image to Video' actor and the 'Video to Texture' actor. But, unfortunately, this conversion is going to be very, very inefficient (i.e., slow).
I need to create both a GPU -> CI and a CI -> GPU conversion actor. If I do the conversion internally, it will be very fast. I will make sure that the next release has these actors.
Best Wishes,
Mark -
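To get a feel for why a per-frame GPU -> CPU -> GPU round trip is so expensive, here is a rough back-of-the-envelope calculation in Python. The resolution, frame rate, and 4-bytes-per-pixel RGBA figure are illustrative assumptions, not Isadora internals:

```python
# Illustrative cost of reading video frames back from the GPU each frame.
# Assumed figures (not from Isadora): 1920x1080 RGBA frames at 30 fps.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4  # RGBA, 8 bits per channel
FPS = 30

bytes_per_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL
mb_per_second = bytes_per_frame * FPS / (1024 * 1024)

print(f"{bytes_per_frame / (1024 * 1024):.1f} MB per frame")  # ~7.9 MB
print(f"{mb_per_second:.1f} MB/s each way")                   # ~237.3 MB/s
```

A GPU -> CPU -> GPU round trip moves this data twice per frame, and readback typically stalls the GPU pipeline as well, which is why doing the conversion internally (staying on the GPU) is so much faster.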
Thank you Mark for your detailed answer.
All best,
Stephane -
Dear @stephanegattoni,
I created the Texture to Core Image and Core Image to Texture converters. We're testing them now, and they will be in the next release.
Best,
Mark -
Great THX !
Stephane
-
Still, Mark and DusX,
I don't understand the need for these conversions in the 2.0.5 version of Isadora: since we now have CPU and GPU on all the inputs and the projector, why do we need a second type of GPU treatment of the image (i.e., the CI actors)? Is this in order to be able to integrate the Quartz Composer actors that the community develops? -
I think it comes in handy if you want to mix CPU, CI, and GPU actors in the same chain.
Best Michel
-
Hi everyone,
I'm currently testing this great new Isadora 2.0.5. I think I get most of the new features and the difference between GPU and CPU preferences. I just wonder if there is (or will be) a way to use all the "classic effects" (dots, motion blur, etc...) in a GPU chain to keep the best performance?
Best,
Matt -
The 'old' effects are CPU based... and many are being ported to GPU. There will be many GPU effects available in a soon-to-be-released version; however, they are not exact clones of the previous CPU effects. I still use the old effects as needed. This is why I requested that the video converter from GPU to CPU include a size setting, so that a smaller CPU stream can easily be used alongside a GPU stream (I typically halve the size).
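The half-size trick pays off quadratically, since pixel count (and with it memory bandwidth and per-pixel effect cost) scales with width times height. A quick sketch of the arithmetic, using an assumed 1920x1080 source for illustration:

```python
# Halving each dimension of a CPU stream quarters the per-frame work,
# because pixel count scales with width * height.
# The 1920x1080 source resolution below is an illustrative assumption.
full_w, full_h = 1920, 1080
half_w, half_h = full_w // 2, full_h // 2

full_pixels = full_w * full_h  # 2,073,600 pixels per frame
half_pixels = half_w * half_h  #   518,400 pixels per frame

print(f"full: {full_pixels} px, half: {half_pixels} px")
print(f"reduction: {full_pixels // half_pixels}x")  # 4x fewer pixels
```

So a half-size CPU stream costs roughly a quarter of the processing and transfer work of the full-size stream, which is what makes running it alongside a full-resolution GPU chain practical.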