Most efficient brightness calculation
-
What is the most efficient way to obtain the brightness of a movie being played with the Movie Player actor? I am trying to obtain the brightness of three full-HD movies playing to three projectors on a 2.8 GHz MacBook Pro. I can get 25 fps until I add the Calc Brightness actor to the three movies, and then the frame rate drops significantly. I have tried reducing the incoming video with a Scaler actor (down to 10x10 pixels) or a Chopper (a 1x1920 crop), but the frame rate still drops. All actors are using GPU video. Does anyone know the best way to do this? Thank you.
Don
-
The Calc Brightness actor is inherently inefficient because it must analyze the frames on the CPU (it's not possible to do this on the GPU). To reduce its workload, use a Scaler actor to resize the video to a much smaller size; the scaling will be done on the GPU. Then measure the brightness of the output of the Scaler. Given that you're measuring the brightness of the entire frame, this should not be terribly inaccurate.
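To see why this works, here is a minimal sketch of the same downscale-then-average idea outside of Isadora, written in Python with OpenCV. The file name, output size, and loop are assumptions for illustration only; inside Isadora you would simply wire Movie Player -> Scaler -> Calc Brightness.

# Sketch: downscale first, then average a handful of pixels.
import cv2
import numpy as np

cap = cv2.VideoCapture("movie.mov")   # hypothetical file name

while True:
    ok, frame = cap.read()            # full 1920x1080 frame
    if not ok:
        break

    # Downscale first, so the brightness average only has to touch a
    # few dozen pixels instead of ~2 million.
    small = cv2.resize(frame, (16, 9), interpolation=cv2.INTER_AREA)

    # The mean of the downscaled frame approximates the mean of the
    # full frame, because INTER_AREA averages the pixels it folds in.
    gray = cv2.cvtColor(small, cv2.COLOR_BGR2GRAY)
    brightness = float(np.mean(gray)) / 255.0   # 0.0 = black, 1.0 = white

    print(f"brightness: {brightness:.3f}")

cap.release()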
Best Wishes,
Mark
-
As Mark has mentioned, scaling before measuring is your best choice.
I have found that scaling by an even factor seems to work best... so divide the width and height by 2, 4, 8, or 16, etc., until it's small enough to allow a good frame rate (for example, 1920 x 1080 divided by 8 gives 240 x 135).
You could also try adding a MultiBlocker actor before scaling, to limit the frames passing through.
A 'time' setting of 0.08 on the MultiBlocker will pass 12.5 fps [ time = 1 / (25 / 2) = 0.08, where 25 is your Isadora target frame rate ].
That's half as many frames to process. This will help, but it adds some latency.
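In other words, the 'time' value is just the period of the reduced frame rate. A tiny sketch of that arithmetic (the function name and the divisor values are only for illustration, they are not Isadora settings):

# Period of the reduced frame rate: 1 / (target_fps / divisor)
def blocker_time(target_fps: float, divisor: float) -> float:
    return 1.0 / (target_fps / divisor)

print(blocker_time(25.0, 2.0))   # 0.08 -> lets 12.5 fps through
print(blocker_time(25.0, 4.0))   # 0.16 -> lets 6.25 fps through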