OK, glad you solved it. As long as you use only vid-gpu, you shouldn't encounter any of these problems. But again, take a look at the input or output ports. If they are green, they will "mutate" to whatever video type you use. But once you make a connection, they'll turn blue to indicate that they can no longer be changed.
(I know it's all a bit of a pain, but it was necessary to allow old patches to run.)
"Syphon Receiver" is already available in Isadora as a built-in actor. The device running the EpocCam app and the computer running Isadora must be on the same network. The connection is established automatically and is indicated in the EpocCam app on your device.
Click into the 'Server' field of the 'Syphon Receiver' actor to select the EpocCam stream.
Gotcha, good to know! I think they already have the projectors they plan to use, but unfortunately I wasn't given those specs. So I was trying to figure out the easiest and most familiar aspect ratio to work with. Thanks for the info!!
I have been quite lucky with my projector placement, I'll be honest. When it's off axis, I know you can have problems with focus and anti-aliasing, etc. But it obviously depends on how far off axis we are talking.
@bonemap -- I haven't read your thread yet, but we figured out that AirServer takes possession of the screen on the computer whenever the iDevice's screen is rotated. The fix is to lock the screen orientation on the device.
@barneybroomer -- Setting the iPhones into Guided Access mode is essential to avoid the screen getting locked or the app getting closed by a touch of the Lock or Home buttons. Guided Access also lets you disable parts of the screen which the performer might accidentally touch, causing the app to open a display you don't want to appear.
Regardless, if possible I recommend training performers on how to use AirPlay and Guided Access. In our process the performers were quite overwhelmed by it, so that burden fell to our diligent ASM. However, if your actors comfortably understand how Guided Access and AirPlay mirroring work, they can easily reestablish a lost connection.
Thanks for the info! I'm in need of an upgrade, and deciding between the Blade and a MacBook too. I'm grateful for all the insight, and it's good to hear from people who are moving away from MacOS, and who are pretty pleased with the Blade.
The most attractive thing to me about eGPUs is getting more outputs, but more powerful cards will see a bigger reduction in performance over TB3. The Blade, for example, is difficult to upgrade with an eGPU, because cards faster than the built-in 1060 want more bandwidth than TB3 can provide. It's still possible, though, and at least offers the extra outputs and quieter fans.
I solved this issue using LG Plasma TVs (specifically the 50PK350) and controlling them via RS232... there was a hex code you could send to disable the OSD, and then do all the input switching over serial, but sadly I can't find the command anymore in my archives.
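For anyone wanting to try this approach: LG's serial protocol is a short ASCII frame (a two-letter command, a set ID, and a data byte, terminated by a carriage return). Here is a minimal Python sketch of building such a frame. The two-letter commands and data values below are placeholders I made up for illustration, not the real LG codes -- you'd need the 50PK350's RS-232C manual for those.

```python
def lg_command(cmd: str, set_id: int, data: int) -> bytes:
    """Build an LG-style serial frame: '<cmd> <set id> <data>\\r'.

    set_id and data are sent as two-digit hex, per LG's frame layout.
    """
    return f"{cmd} {set_id:02x} {data:02x}\r".encode("ascii")

# PLACEHOLDER codes -- look up the real ones in the TV's serial manual:
osd_off = lg_command("xx", 1, 0x00)      # hypothetical "disable OSD"
select_input = lg_command("yy", 1, 0x90) # hypothetical "switch input"

# Send the bytes with any serial library, e.g. pyserial:
#   import serial
#   with serial.Serial("/dev/ttyUSB0", 9600) as port:
#       port.write(osd_off)
```

From Isadora itself you could trigger the same bytes with a Serial Out watcher instead of Python; the frame layout is the part that matters.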
I'd be very interested in your application, especially if it will be usable with minimal programming. The possibility of having the tweet, user name, and tweet time as separate OSC messages would offer useful flexibility. Could there be an option to have the tweets available as soon as they are posted, as opposed to every 30 secs?
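To illustrate what "separate OSC messages" could look like on the wire, here is a small stdlib-only Python sketch that encodes each field as its own OSC message (null-terminated strings padded to 4-byte boundaries, a ",s" type tag per message). The addresses are hypothetical -- whatever the app ends up exposing:

```python
def osc_string(s: str) -> bytes:
    """Encode an OSC string: UTF-8, null-terminated, padded to 4 bytes."""
    b = s.encode("utf-8") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, text: str) -> bytes:
    """Encode a single-string OSC message: address + ',s' type tag + arg."""
    return osc_string(address) + osc_string(",s") + osc_string(text)

# Hypothetical addresses for the three fields, one message each:
packets = [
    osc_message("/tweet/text", "hello world"),
    osc_message("/tweet/user", "@someone"),
    osc_message("/tweet/time", "12:00:00"),
]
# Each packet would go out as its own UDP datagram, e.g.:
#   import socket
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   for p in packets:
#       sock.sendto(p, ("127.0.0.1", 1234))
```

On the Isadora side, three OSC Listener channels (one per address) would then pick up the text, user, and time independently.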
I'm planning an interactive performance in which the public/audience contribute text messages that are integrated into the performance. I wasn't thinking of Twitter, but I suppose using social media exponentially extends the time/space/memory of participation beyond the present, which is a good thing.
@Maximortal I can make it work, but a streamlined solution would be a great addition: same place as the current fade controls at the bottom of the scene, just with separate audio and video times. Other solutions are not easy to reorder; the current fades are. The fades are actually calculated separately now, but we can only set one time for both audio and video.