
Welcome to the Isadora community :)
- Part of what will help us help you is to know the specs of your computer and the version of Isadora with which you are working: https://community.troikatronix.com/topic/6764/everyone-read-add-your-operating-system-and-hardware-to-your-signature-line
- Have you tried all these methods for finding/placing the actors? Gif: https://jmp.sh/dpYV5HBR
- Comparator and Eyes++ should never be missing from the program; they're always included in the download. In the past five years I can remember maybe 2-3 support tickets where people were missing base actors, and I believe the solution then was to re-install the program, so that's definitely something else to try
Best wishes,
Woland
@skuven Thank you very much for your reply. I'm using the OpenNI Tracker actor and trying to connect it to an actor that can measure differences in my scene, so I can trigger an output that sends signals to another program. As I mentioned before, I'm a newcomer to Isadora, but I'm very surprised that I can't access Isadora's base actor set (for example, I don't have actors like Comparator or Eyes++, or the math actors, etc.). Oddly, I can't access more actors; that could well be my problem, but I've tried many approaches. So far I've tried to use the actors that are available on the Isadora page, looking for a workaround to achieve what I'm after, but it hasn't really been possible. Any comment would really help me at this point.
@dusx said:
If you need fast access to it, please open a support ticket, and I can help you out.
That would be great! I am neck deep in this now and hitting some roadblocks. While I have made headway with the Python build based on the prototype here, I would love to be up and running. I have been playing with Touch a bit for my Blender workflow and it looks pretty straightforward. Isadora would be the treat, though.
I have opened a ticket.
Thanks,
- J

We have an updated Leap Motion actor coming soon. If you need fast access to it, please open a support ticket, and I can help you out.

@skuven said:
maybe I should do a scripting course
If this is something you are considering, you might look at Python rather than JavaScript. When working in Isadora, Python will open up many more possibilities, and once you learn one language, the other comes much more easily.

Since it appears your EOL is causing the buffer to fill, you could try not using Serial.println() to automatically add an EOL (end of each text segment), and instead use Serial.print("\r") to add an EOL that you control. Often "\r\n" is used; these correspond to the ASCII values 13 (CR) and 10 (LF). In this case we only need to add/find "\r", i.e. value 13.
Setting EOL = 13 tells the actor to look for a carriage return (13).
You can learn a bit about using these here: newline - Difference between \n and \r? - Stack Overflow
Once you are adding your own EOL, you can make changes and test in Isadora until you get the output you are expecting.
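To make the framing idea concrete, here is a small Python sketch of what any CR-based receiver (Isadora's serial watcher with EOL = 13, or your own test script) effectively does with the byte stream; the numeric values are made up for illustration:

```python
# Illustrative sketch: framing an incoming serial byte stream on a
# carriage return (ASCII 13), the same delimiter EOL = 13 looks for.
CR = b"\r"  # ASCII 13

buffer = b""   # bytes received so far with no EOL yet
segments = []  # completed text segments

def feed(chunk: bytes) -> None:
    """Accumulate raw serial bytes and emit complete CR-terminated segments."""
    global buffer
    buffer += chunk
    while CR in buffer:
        segment, buffer = buffer.split(CR, 1)
        segments.append(segment.decode("ascii"))

# Serial data arrives split at arbitrary points, not on neat line boundaries:
feed(b"12")
feed(b".5\r10")
feed(b"3\r")
print(segments)  # ['12.5', '103']
```

On the Arduino side, that corresponds to replacing `Serial.println(value)` with `Serial.print(value); Serial.print("\r")`, so the CR only appears where you put it.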
I am working on a couple of projects using the Leap Motion 2 controller: parameter modulations and position coordinates for Isadora, and motion capture for Blender. Since the current Gemini drivers don't seem to work with the Leap Motion Actor, I started researching alternative approaches and OSC seemed the best way to transmit the Leap data to both Isadora and Blender.
I thought I'd document my progress in some posts here, hoping it helps other users. Using GitHub Copilot and the Ultraleap SDK, I developed a simple Leap-to-OSC prototype Python script, attached below as a zip (I am unable to upload .py or even .txt documents). It's heavily commented with setup and implementation instructions. You'll need some familiarity with Python and Git to get this running. While I'm not a Python developer, I am familiar with Python structures and working in VS Code. I am certain experienced developers will be horrified by the inefficiencies and overall inelegance of the code.
This script outputs hand positions and orientations:
For each hand detected:
- Determines hand type (left/right)
- Sends position as [x, y, z]
- Sends orientation as quaternion [w, x, y, z]
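As a sketch of what actually goes over the wire for that per-hand data, here is a minimal, stdlib-only encoder for OSC messages with float arguments. The `/leap/hand/...` addresses and port 8000 are assumptions for illustration, not necessarily what the attached script uses:

```python
import socket
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are NUL-terminated, then padded to a 4-byte boundary.
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, values: list[float]) -> bytes:
    """Encode one OSC message whose arguments are all float32 (big-endian)."""
    msg = _pad(address.encode("ascii"))
    msg += _pad(("," + "f" * len(values)).encode("ascii"))  # type tag string
    for v in values:
        msg += struct.pack(">f", v)
    return msg

# Hypothetical addresses mirroring the per-hand data listed above:
position_msg = osc_message("/leap/hand/left/position", [0.1, 1.2, -0.3])
orientation_msg = osc_message("/leap/hand/left/orientation", [1.0, 0.0, 0.0, 0.0])

# Each message goes out as one UDP datagram (port 8000 is an assumption;
# point it at whatever port Protokol/Isadora/Blender is listening on):
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(position_msg, ("127.0.0.1", 8000))
sock.sendto(orientation_msg, ("127.0.0.1", 8000))
```

In practice a library like python-osc does this encoding for you; the sketch just shows why every OSC message is a multiple of 4 bytes and how the address, type tags, and floats are laid out.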
I use Protokol for OSC testing: https://hexler.net/protokol
I've also created a Blender script that receives and can record this information. I think it will be a powerful tool for graphic animation work, camera movement, etc.
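The receiving half is equally small. The actual Blender script isn't shown here, but this stdlib-only sketch decodes an all-float OSC message (inside Blender you would typically poll a non-blocking UDP socket from a timer or modal operator and feed each datagram through something like this):

```python
import struct

def _take_padded_string(data: bytes, offset: int) -> tuple[str, int]:
    """Read a NUL-terminated OSC string; return it and the next 4-aligned offset."""
    end = data.index(b"\x00", offset)
    text = data[offset:end].decode("ascii")
    return text, (end + 4) & ~3  # skip the NUL, align to 4 bytes

def decode_osc_floats(data: bytes) -> tuple[str, list[float]]:
    """Decode one OSC message whose arguments are all float32."""
    address, offset = _take_padded_string(data, 0)
    tags, offset = _take_padded_string(data, offset)  # e.g. ",fff"
    count = tags.count("f")
    values = list(struct.unpack(">" + "f" * count, data[offset:offset + 4 * count]))
    return address, values

# A hand-built sample message ("/hand" with two float arguments):
sample = b"/hand\x00\x00\x00" + b",ff\x00" + struct.pack(">ff", 1.5, -2.0)
print(decode_osc_floats(sample))  # ('/hand', [1.5, -2.0])
```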
My next step is to expand the Leap-to-OSC script with a Tkinter interface featuring status messages and the ability to select between sending all hand tracking data or just fingertips for motion graphics work.
Prototype script here:
It's a zip file as I was unable to upload the .py

As you're new, I highly recommend that you take advantage of the tutorials here: troikatronix - YouTube. Going through these will teach you all the basics.
For more advanced tutorials try here: Guru Sessions : TroikaTronix
Cheers,
Hugh
Thanks for this, it's gone over my head though. I simply don't know enough about scripting. Thanks for your efforts though!
I'd love to say this makes sense, but it doesn't; however, it's very helpful to have the solve. Thank you.
maybe I should do a scripting course...