@zimt said:
Isadora 4.0.9 (Rosetta):
I haven't tried this on a Mac in Rosetta mode, to be honest. It's possible that Python checks the system architecture during the install of the Python modules, so the builds it fetches are ARM-specific; Rosetta may then introduce a mismatch.
Have you opened a support ticket? If you do, I'll go through it with you to figure out what's going on.
@zimt said:
Do you have an idea what's wrong?
- Have you tried this outside of Rosetta?
- The line in the error in the second screenshot that says "but is incompatible architecture (have 'arm64', need 'x86_64')" makes me think pip installed arm64 builds of those modules, while Isadora running under Rosetta needs the x86_64 versions (see the quick check sketched below).
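This is only a sketch on my side, but a quick way to narrow it down is to print which architecture each Python actually reports: run the snippet below once inside a Pythoner actor and once from the Terminal with the same Python the install scripts use (standard library only, so no extra modules are needed).

```python
# Quick architecture check, standard library only.
import platform
import struct

# 'arm64' = native Apple Silicon build, 'x86_64' = Intel/Rosetta build.
print("machine:", platform.machine())

# Full OS/architecture string, e.g. 'macOS-15.2-arm64-arm-64bit'.
print("platform:", platform.platform())

# Pointer size in bits; 64 on both architectures, printed just to confirm the build.
print("pointer size:", struct.calcsize("P") * 8, "bit")

# If the install-side Python reports 'arm64' while the Pythoner side (running
# inside Rosetta) reports 'x86_64', pip has been fetching arm64 wheels that the
# x86_64 process cannot load. One possible fix (assuming the installer script
# ultimately just calls pip) is to reinstall under the x86_64 slice, e.g.:
#   arch -x86_64 python3 -m pip install --force-reinstall numpy pillow mediapipe
```

If the two runs report different machines, that mismatch lines up exactly with the "(have 'arm64', need 'x86_64')" line in your screenshot.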
These files/tutorials may help:
My Personal Files (should be a quick and dirty way to get started)
- Scaling Isadora Example File: automatic-scaling-explained-2023-07-19-3.2.6.izz
- Color Tracking Isadora Example Files: color-tracking-and-chroma-key.zip
- Live Drawing Isadora Example File: https://troikatronix.com/add-ons/live-drawing-example
TroikaTronix Materials (more in-depth information)
- Basic Motion Tracking Isadora Tutorial File: https://troikatronix.com/add-ons/tutorial-basic-motion-tracking/
- Alpha Mask Isadora Tutorial File: https://troikatronix.com/add-ons/tutorial-alpha-mask/
- Value Scaling Isadora 101 Introduction Video: https://youtu.be/MG3pbjPHOGc?si=EAw29nukhJaoc59H
- 1 Hour Isadora Live Drawing Livestream Recording: https://www.youtube.com/watch?v=LHn5g6hlHNU
- 1.5 Hour Isadora Guru Session on Motion Tracking with the Eyes++ Actor: https://www.youtube.com/watch?v=RKSwV4vjq7o&t=5s
Just wanted to share with everybody what we ended up using for our latest A/V show (2x Apple Silicon MBPs, one running Ableton Live, one running Isadora) to trigger scenes and events from Ableton into Isadora. This solution proved to be reliable and never failed us. We brought it on tour, and it worked 100% of the time from the moment we plugged it in. The interface has USB-C, so no need for adapters.
mioXC - USB-C 1x1 MIDI interface from iConnectivity. https://www.iconnectivity.com/...
We had tried many other solutions (Ableton Link, network MIDI over wired and wireless connections), and while some worked most of the time, they randomly failed, which didn't make for a reliable enough solution. MIDI often got saturated when sent over the network. And the specific scenario (short soundcheck times, our gear sitting on tables backstage between soundcheck and changeover without continuous power, having to reconnect in the dark...) was already stressful enough with everything that could go wrong...
In Isadora I received MIDI notes from Ableton Live on a dedicated channel to trigger scenes; other inputs were used to trigger transitions/events. Some MIDI notes also triggered lighting cues directly in Lightkey on the same laptop as Isadora, which allowed for more complex lighting looks (especially with moving heads, that was gold!).
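Not something we ran during the show, but for anyone who wants to sanity-check the channel mapping before soundcheck, here is a minimal Python sketch (assumptions: the mido and python-rtmidi packages are installed, and the mioXC shows up with "mioXC" in its port name) that sends a scene-trigger note on one channel and a transition note on another:

```python
# Minimal MIDI channel-mapping test.
# Assumes: pip install mido python-rtmidi
import time
import mido

# Pick the mioXC output port, or fall back to the first available port.
names = mido.get_output_names()
if not names:
    raise SystemExit("No MIDI output ports found")
port_name = next((n for n in names if "mioXC" in n), names[0])

with mido.open_output(port_name) as port:
    # Channel 0 (displayed as channel 1 in most software): scene triggers.
    port.send(mido.Message("note_on", channel=0, note=60, velocity=100))
    time.sleep(0.1)
    port.send(mido.Message("note_off", channel=0, note=60))

    # Channel 1 (displayed as channel 2): transitions/events, kept separate.
    port.send(mido.Message("note_on", channel=1, note=61, velocity=100))
    time.sleep(0.1)
    port.send(mido.Message("note_off", channel=1, note=61))
```

On the Isadora side, you can set the note watcher to listen only on the scene channel, so transition notes never collide with scene changes.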
That's all! Hope it helps!
Re: [Connecting eyes++ to Live drawing](/topic/9033/connecting-eyes-to-live-drawing)
I'm having an issue getting eyes++ and chroma tracking to work with live drawing. I am new to Isadora and only know the basics. If anyone has a quick tutorial/run-through, I'd very much appreciate it.
Thanks
Dear TroikaTronix,
Thanks for your YouTube tutorials!
I encountered a problem on macOS Sequoia 15.2 / MBP 2021 M1 Pro (ARM) / Isadora 4.0.9 (Rosetta):
I followed both Pythoner tutorials, Getting Started with pythoner and Beginner’s Guide to MediaPipe Tracking in Isadora.
The installation and activation of the virtual environments with the Apple scripts worked fine (numpy/pillow/mediapipe all reported "...install OK").
"Get a list of IMAGE URLS via webscraping" in the example file works.
BUT the interpreter shows a red (!!) sign whenever numpy or pillow is involved. When I click [OK], the interpreter in the example files (e.g. pythoner_examples_v0.4.izz / "Load an Image from the WEB and output to Isadora") says:
ImportError: Error importing numpy: you should not try to import numpy from
its source directory; please exit the numpy source tree, and relaunch
your python interpreter from there. (Module_A0LRGrXqdi.py, line 1)
Pillow also produces a strange error. I did not set any arm64 or x86_64 variables. Shouldn't Isadora automatically choose x86_64 when running in Rosetta? Or did the "installer script" with pip recognize the M1 chip and install the ARM versions, which do not work because I'm in Rosetta? (see screenshots)
Do you have an idea what's wrong?
Thanks

@mschwenker said:
variables that are created outside the main function?
Yes, those variables are globals scoped to the active instance of the Javascript actor. The Javascript actor does not have a built-in method of sharing globals between actor instances.
You can bridge this by outputting values to a Set Global Values actor and feeding them into another Javascript actor via a Get Global Values actor. This works well when you mainly want to share values in one direction.
I updated my file and User Actors to make them even easier to use: python-rename-scene-after-media-v3-2025-08-19-4.1.3.zip
@mschwenker said:
I have one more question about the memory management of the JavaScript actor. What about variables that are created outside the main function? I would assume that these are only available to the actor instance.
@liminal_andy Didn't you have a method for passing global values between JavaScript actors?
@DusX and @Woland, thank you very much for your very detailed feedback and also for creating this 8 Notes On and Off Watcher.
This helped me a lot to better understand how Isadora works, especially when it comes to data flow and actor execution.
I was able to use the actor as a basis and adapt it to my needs. I changed it from 8 to 12 notes and reduced it to only the pitch and note on/off information, which I combined into an on/off integer, because I only need to track the currently pressed notes.
I have one more question about the memory management of the JavaScript actor. What about variables that are created outside the main function? I would assume that these are only available to the actor instance.
Thank you very much.
Malte