Hello Troikatronix Team,
Thank you for version 4... I'm enjoying the new features.
One issue I'm having with the upgrade:
- I use the 'artnet receive' actor in a user actor to control my theatre lighting. In V3, I could use a 'User Input' to specify which channels to listen to. With V4, the 'channels' input no longer accepts an input from the 'User Input' (even if it's changed to a text input).
Since I am using one user actor to control a bunch of different lights, it was great to be able to specify which channels to listen to from outside the user actor. Is there any way for me to set the 'channels' on the 'artnet receive' from outside the user actor?
Thank you for any suggestions you might have.
Sincerely,
William
@dusx As I work through learning more about Python development, I have learned about using a "message bus" to send things back and forth between modules.
I'm making a modular Python project for some housekeeping here, and have made some stateless utility modules to do things like generate xxhash values, read CSV files, update SQLite records, etc. I learned to work with these modules using a "message bus" via Python pubsub. It's super cool: you can make things that don't need to be integrated into everything else; you just publish messages to a topic in one module and then subscribe to that topic in another to get the operation's return values. Mind blown.
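Here's a minimal sketch of that pattern using the PyPubSub library; the topic and argument names are just made up for illustration, not from my actual project:

```python
# Minimal publish/subscribe sketch with PyPubSub (pip install pypubsub).
# The topic name "files.hashed" and its arguments are illustrative only.
from pubsub import pub

# Subscriber module: reacts to results without knowing who produced them.
def on_hash_ready(path, digest):
    print(f"xxhash for {path}: {digest}")

pub.subscribe(on_hash_ready, "files.hashed")

# Publisher module: announces a result to whoever happens to be listening.
pub.sendMessage("files.hashed", path="show_media/movie.mov", digest="a1b2c3d4")
```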
This made me think about the Isadora Control Panels and our conversation about using a text-based addressing system. Something flexible like Python pubsub would be amazing for sending and receiving parameter values to and from the Control Panels. It might also be a useful, flexible alternative for sending things around an Isadora project alongside globals and broadcaster/listener networks, in a programmatic way.
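In the meantime, the closest text-addressed route I know of is OSC. A hypothetical sketch with the python-osc library, assuming Isadora's default OSC input port (1234) and the default /isadora/<channel> address scheme:

```python
# Hypothetical sketch: pushing a parameter value into Isadora over OSC
# using python-osc (pip install python-osc). Port 1234 and the
# /isadora/<channel> address scheme are Isadora's defaults; adjust as needed.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 1234)  # machine running Isadora

# An OSC Listener actor set to channel 1 receives this value, which can
# then be linked to a Control Panel control.
client.send_message("/isadora/1", 0.75)
```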

There are also multiple ways of triggering cues without the "Edit Go Triggers". Here are some files of mine that I believe are relevant to that (and some related to the Cue Sheet function):
spacebar_go_non-linear_cueing_example-2023-10-16-3.2.6.zip
sequential_trigger_with_double-trigger_prevention-2023-05-12-3.2.6.izz
cue_sheet_mock-up-2023-05-12-3.2.6.izz
cueing-cue-stack-user-actor.izz
using-comparators-for-cueing-v2-2023-02-07-3.2.6.izz
control-watcher-scene-transitions-v2-2024-07-19-3.2.6.izz
cueing_example-2023-05-12-3.2.6.izz
manual-override-for-automatic-cues-2024-02-06-3.2.6.izz
Additionally, if you're just talking about Scene Navigation, the "Scene Select" control can be clicked to jump back to a Scene, and the "Next Cue" and "Prev Cue" controls can be used to jump to the next and previous Scene, respectively.

I replied to your other post, but based on this description, and for the benefit of other people wanting to do the same thing who see this post, I'm going to point you to:
- The Alpha Mask Tutorial that we have on our Add-Ons Page: https://troikatronix.com/add-ons/tutorial-alpha-mask/
  - This should be helpful if you want to use the body of the performer as an alpha mask rather than having an image follow them around.
  - I believe this or the tutorial mentioned below also has a "video follow spot" Scene near the end that causes a Shapes actor to follow the performer around. Replace the Shapes actor with a Picture Player, then use a Zoomer's 'horz center' and 'vert center' inputs, and you should be able to move your .png around in the same way.
- The Motion Tracking Tutorial that we have on our Add-Ons Page: https://troikatronix.com/add-ons/tutorial-basic-motion-tracking/

I think Communications > Serial Port Setup > Port # > Speed is what you are looking for, but it looks like the maximum speed is 230400 baud.

@bonemap said:
Many Isadora users will have developed techniques for isolating a performer from the background of a live video feed.
@gmk0318 I think these tutorials may be helpful, particularly the Freeze method for motion tracking:
- Basic Motion Tracking Tutorial for Isadora: https://troikatronix.com/add-ons/tutorial-basic-motion-tracking/
- Basic Alpha Mask Tutorial for Isadora: https://troikatronix.com/add-ons/tutorial-alpha-mask/

There are many considerations when developing your work with these techniques. The use of live video input of a performer has a lot of potential for tracking with particles. But if it is the silhouetted outline (the shapes of the performer's body, in motion and in real time) that you want to use to 'draw' with particles, it will be important to establish a threshold video effect first.
Many Isadora users will have developed techniques for isolating a performer from the background of a live video feed. But it remains a challenge due to the many variables associated with video capture. There are techniques for separating a moving performer from a background that require the camera to be completely stationary/locked down, so that a difference of changed pixels can be used to isolate movement. Depth cameras, like those used with the OpenNI Isadora plugin, can produce a silhouette calibrated to remove a background at a defined distance from the device. More recent techniques using OpenCV are now possible through the Isadora Pythoner integration. However, there are very few step-by-step tutorials available, and much of what you are seeking to achieve will come through trial and error, making gains based on your specific circumstances and variables: spatial, physical, and in the preparation and programming of your project.
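As a rough sketch of the OpenCV route, assuming the completely stationary camera described above, background subtraction will give you a basic silhouette mask. Inside Isadora's Pythoner the frames would come from the actor's video input rather than a capture device, but the core calls are the same:

```python
# Rough sketch: silhouette isolation via OpenCV background subtraction
# (pip install opencv-python). Assumes a locked-down camera, as noted above.
import cv2

cap = cv2.VideoCapture(0)  # first attached camera
subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)   # white pixels = change (the performer)
    mask = cv2.medianBlur(mask, 5)   # suppress single-pixel noise
    cv2.imshow("silhouette", mask)
    if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```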
Best wishes
Russell
@bonemap I am trying to use live video, so that wouldn't work the same way, would it?
Hi there,
I would like to share a project I am trying to bring to life which might be of interest to some of you. If you have ever tried to get sensors and/or buttons into software, or would like to, this project is for you.
It’s a plug-and-play sensing platform for creative and audiovisual applications, with MIDI, OSC, and wireless capabilities, and a convenient web interface.
You can turn movement, touch, distance, buttons or other sensors into wireless controllers quickly and reliably.
After years of using Arduino to tinker with sensors, and running into so many limitations and struggles, I decided to develop and share a much easier, more convenient, and more robust solution for everyone.
The project will be open source and is currently preparing to launch. There will be a pre-order campaign soon on Crowd Supply (a funding platform for open-source electronics).
If you are interested, check out the pre-launch page and follow the project updates!
https://www.crowdsupply.com/pi...
You can also check the project's Instagram.
Any feedback, suggestions, support, or questions are welcome!