
Hi Bruno. It might be a good solution to use IzzyMap to crop a selection of the Input and Output of your Projector.
If you make a User Actor with a GPU Input and a Projector inside it, and use that whenever you need video in a scene, you can update the mapping in one place; when you close the User Actor, every instance of it will update.
(I've got a bunch of User Actors called: Shop Projector, Rave Projector, Tiny Theatre Projector, etc....)
Best, john
@citizenjoe Yes, that's pretty good, thanks! (I said it was a dumb question.) Is there anything like that which can be applied globally to all scenes?
I want to have four separate light sensors each trigger a separate video. I've attached my Arduino code. I'm trying to use the Serial In Watcher actor in Isadora, but I'm not entirely sure of the connections or what to use to get it to work. I've attached what I have at the moment, but it isn't much yet.
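(For anyone landing here with the same question, a minimal sketch of the sending side might look like the following; the pins, baud rate, and line format are assumptions, so adapt them to your wiring and to the parse string you set up in Isadora's Serial In Watcher - Text actor.)

```cpp
// Hypothetical sketch: four light sensors (e.g. LDR voltage dividers) on A0-A3.
// Each pass through loop() sends one plain text line over serial, which the
// Serial In Watcher - Text actor can then match against a parse pattern.

const int sensorPins[4] = {A0, A1, A2, A3};  // assumption: sensors wired to A0-A3

void setup() {
  Serial.begin(9600);  // must match the speed chosen in Isadora's serial port setup
}

void loop() {
  // Sends e.g. "S 512 300 87 1023" followed by a newline.
  Serial.print("S");
  for (int i = 0; i < 4; i++) {
    Serial.print(' ');
    Serial.print(analogRead(sensorPins[i]));  // 0-1023 on a standard Uno
  }
  Serial.println();  // the newline marks the end of one message
  delay(50);         // ~20 updates per second is plenty for triggering videos
}
```

On the Isadora side the idea is then to pull the four numbers out of each incoming line and compare each one against a threshold (for example with a Comparator actor), so that crossing the threshold triggers the corresponding video.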



I seem to have fixed it by triggering the 'read' input once. All the zeros disappeared!
Hiya,
Feeling a bit of a noob here: is it possible to crop the stage? My output ends up on a 4:3(ish) projection screen (sigh). I can use Stage Setup to drag in the sides/corners of the image, BUT this squeezes the image. Is it possible to crop it instead?

Hello folks,
I have a .tsv file with two columns of numbers (nearly 2500 lines, tab separated) and would like to read this file with the Data Array actor.
But when I recall different rows, some of the values in the output are accurate and some are 0.
Looking at the .tsv file, it is clear that none of the numbers is actually zero.
Any ideas what's going wrong?
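(Before digging into the Data Array actor itself, it can be worth ruling out the file: stray spaces instead of tabs, unusual line endings, or empty fields are the kind of formatting quirks that often produce stray zeros. A small, hypothetical C++ check along these lines lists every row that does not contain exactly two numeric, tab-separated fields; the filename "data.tsv" is an assumption.)

```cpp
// Quick sanity check for a two-column, tab-separated file:
// prints any line that does not contain exactly two numeric fields.
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>

int main() {
    std::ifstream in("data.tsv");                  // assumption: your file name
    std::string line;
    int lineNo = 0;
    while (std::getline(in, line)) {
        ++lineNo;
        if (!line.empty() && line.back() == '\r')  // strip a Windows-style CR
            line.pop_back();
        std::istringstream fields(line);
        std::string a, b, extra;
        bool ok = std::getline(fields, a, '\t') &&
                  std::getline(fields, b, '\t') &&
                  !std::getline(fields, extra, '\t');   // exactly two fields
        if (ok) {
            try {
                size_t pa = 0, pb = 0;
                std::stod(a, &pa);
                std::stod(b, &pb);
                if (pa != a.size() || pb != b.size())   // trailing junk, e.g. "1,5"
                    ok = false;
            } catch (...) {
                ok = false;                             // empty or non-numeric field
            }
        }
        if (!ok)
            std::cout << "check line " << lineNo << ": \"" << line << "\"\n";
    }
    return 0;
}
```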

I just connected the output from my Mac Mini to the HDMI input of the USB Blackmagic Intensity Shuttle, which is connected to my Windows 11 machine.
I opened Blackmagic Media Express to find the correct video format; in my case it is 1080p30. I closed Media Express, opened Isadora 4, set up live capture with the Blackmagic drivers and the correct input format, hit Start Live Capture, and everything worked as expected.
So it seems that, with the current Blackmagic software installed, it works with Isadora 4 as expected.

You asked what I was doing with the software. Here is a short video about it:
https://www.guiton.de/seite-2....
It works pretty well when there is only one person in the installation, but as soon as there are several people, the detection keeps popping on and off, even when the people are not covering each other. What can I do about this? Is it a lighting problem, a distance problem, or both?
I have several questions:
What does “Minimum score” do?
Is it normal that I don't get z parameters in MoveNet Multi-Pose Lightning mode?
If I select “MoveNet Multi-Pose Lightning” and Bundle xyz Array, I still get 33 parameters for x, y, z (with z carrying no data). Is there a way to get only the parameters for the bounding box (x, y of the centre, width and height)?
How many people can be recognised at the same time?
Is there a maximum distance?
It seems that the detection depends on the lighting, but what about infrared light?
And a completely different question: do you know of a method/software to recognise people from above? I tried with MovementOSC and it doesn't work when only the head and shoulders are visible from above, which is quite understandable.
Thanks a lot
Best regards,
Jean-François

Hi all, I'm the creator of MovementOSC. I'm super excited to see that it's useful to you all.
First, @Armando, thank you so much for posting about it and sharing how you've set up Isadora with it.
I actually designed the "Bundled Message Per Axis" format with @princeCarlosthe5 specifically for Isadora, and we used it extensively in a residency at Jacob's Pillow a couple weeks ago to track dancers (both from archival footage and live), visualize the data, and transform it into a format suitable for a cable robot. I'm not an Isadora expert (yet!), but as far as I know you should be able to use that format directly without having to use the JSON Parser actor or do any other kind of string parsing that might slow things down. Hopefully Carlos might be able to share what his configuration looked like.
I responded to @jfg in the other thread about how settings aren't yet saved when you quit the application, but it is a feature I'm planning to add when I have some time.
@dbini, I'm glad to hear it was interesting to you, and I'd love to hear how your robotics students used it!