Seeking Beta Testers: Rokoko Smartsuit Pro and Isadora
-
OOOh! WoW! This looks exciting!
-
Hi,
it would be great to see a screen capture demonstrating the Smartsuit running through the Isadora plugin. Some evidence of the system in action would make it easier to assess and evaluate its effectiveness.
Best wishes
Russell
-
Hi there,
In the next week I'm going to experiment with the suit with my colleagues at the HKU. I will make screen recordings and record the movement.
-
I am interested in exploring this with a local dance troupe. A couple of quick questions: (1) In terms of live integration, would the input data coming from the suit be able to go to 3D objects to, say, animate an Unreal Engine model? (2) What are the anticipated data sets designed to integrate to in Izzy? (3) What is the current latency?
Looking forward to exploring the possibilities.
-
As a follow up: Is there an investigation of using Izzy and Unreal Engine 4 in this capacity via their development of Live Link, as discussed in this video at about 17:47? UE4 mentioned they are looking for software partners. Izzy, Rokoko, and UE4 would be a spectacular interconnection of arts technology with an amazing amount of creative possibilities. Hope this is or can be integrated sometime down the line.
-
Here is that link from Unreal Engine:
-
@kdobbe said:
(1) In terms of live integration, would the input data coming from the suit be able to go to 3D objects to, say, animate an Unreal Engine model?
Wow! That is an ambitious expectation from Isadora. In the screen grab provided at the beginning of this thread we see the data module output to the Skeleton Decoder. This is also implemented with the beta OpenNI plugin - we can then anticipate that the output is a series of xyz floats for each joint. Unfortunately, the 3D Player in Isadora does not pass grouped geometry or implement any hinged constraints. You would have to use a 3D Player module for each bone and apply your own math to the transforms of each bone - and that is not a great workflow IMHO. It would seem more efficient to stream real-time rendering into Isadora from a 3D animation engine using NDI, Syphon, or Spout. Well, that is my guess based on working with 3D models in Isadora over the last couple of years.
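To make the "apply your own math per bone" idea concrete, here is a minimal sketch of the kind of calculation involved: deriving rotation angles for one bone from two joint positions, which could then drive the rotation inputs of a 3D Player actor. The function name and the yaw/pitch convention are my own assumptions for illustration, not anything Isadora or Rokoko defines.

```python
import math

def bone_rotation(parent, child):
    """Given parent and child joint positions as (x, y, z) tuples,
    return (yaw, pitch) in degrees: the heading and elevation of the
    bone vector. A hypothetical helper, not an Isadora API."""
    dx = child[0] - parent[0]
    dy = child[1] - parent[1]
    dz = child[2] - parent[2]
    yaw = math.degrees(math.atan2(dx, dz))                     # rotation about the vertical axis
    pitch = math.degrees(math.atan2(dy, math.hypot(dx, dz)))   # elevation above the ground plane
    return yaw, pitch

# A bone pointing straight up (e.g. hip to chest in a T-pose)
# comes out at 90 degrees of pitch:
print(bone_rotation((0.0, 0.0, 0.0), (0.0, 1.0, 0.0)))
```

Multiply this by one 3D Player per bone and the workflow concern above becomes clear.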
Where I would be excited to work with the mocap data is through the 3D Model Particles module in Isadora, because it does pass grouped geometry. It is not too difficult to imagine some credible real-time effects that take advantage of the dynamics associated with the Isadora 3D particle system.
I would be happy to hear of future developments with the 3D geometry line up in Isadora.
Best Wishes
Russell
-
@kdobbe said:
A couple of quick questions: (1) In terms of live integration, would the input data coming from the suit be able to go to 3D objects to say animate an Unreal Engine Model? (2) What are the anticipated data sets designed to integrate to in Izzy? (3) What is the current latency?
As @bonemap correctly indicated, what you're going to get in Isadora is a list of X/Y/Z points, one for each of the 19 sensors on the Smartsuit Pro. Isadora's 3D capabilities are relatively basic -- and certainly very basic when compared to something as comprehensive as Unreal Engine. It is not our intention to try to build something comparable to Unreal Engine in Isadora -- that simply isn't realistic, and that functionality is already thoroughly covered by software dedicated to it. In short, you shouldn't expect to see the ability to rig up animated character skeletons coming to Isadora any time soon.
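As a rough illustration of the shape of that data, here is a sketch of one frame treated as 19 named joints, each an (x, y, z) triple, flattened into the 57 floats one would route to actor inputs. The joint names match the sensor points Rokoko uses; the dict layout itself is an assumption for the sketch, not Isadora's actual output format.

```python
# The 19 sensor points on the Smartsuit Pro.
JOINTS = [
    "head", "neck",
    "left-shoulder", "left-upper-arm", "left-lower-arm", "left-hand",
    "right-shoulder", "right-upper-arm", "right-lower-arm", "right-hand",
    "stomach", "chest", "hip",
    "left-upper-leg", "left-lower-leg", "left-foot",
    "right-upper-leg", "right-lower-leg", "right-foot",
]

def frame_to_floats(frame):
    """Flatten a {joint: (x, y, z)} frame into 57 floats
    (19 joints x 3 axes) in a fixed joint order."""
    floats = []
    for name in JOINTS:
        floats.extend(frame[name])
    return floats

# Example: a zeroed frame yields 19 * 3 = 57 values.
zero_frame = {name: (0.0, 0.0, 0.0) for name in JOINTS}
print(len(frame_to_floats(zero_frame)))  # 57
```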
There is already a link between Rokoko and Unreal -- you can check out this video tutorial below to see it in action. That would seem to be a far more profitable route to go if you wish to do animated avatars in real time.
In terms of latency, I haven't measured it. But my personal experience is that it's pretty fast -- certainly as fast as any camera tracking system I've used.
One limitation of all systems that rely on a combination of gyroscopic, acceleration, and magnetic measurements (e.g., XSens, Perception Neuron, and Rokoko) is that the sensors are sensitive to magnetic interference. That means that if there's a lot of ferrous metal (steel, iron) in your environment, you can run into problems. How well such systems perform in traditional theaters is something I cannot yet address, because I've not used the suit in such an environment. But for sure, it is a topic you should research carefully before investing in a suit that uses this kind of sensor technology.
I hope that helps.
Best Wishes,
Mark
-
hi,
The OpenNI tracking skeleton is up to 15 points. Could anyone clarify what the 19 points of this smart suit are - or just the differences? For example, does it add wrists and ankles?
Best wishes
Russell
-
@bonemap said:
The OpenNI tracking skeleton is up to 15 points. Could anyone clarify what the 19 points of this smart suit are - or just the differences? For example, does it add wrists and ankles?
It's kind of a different setup entirely. The names they give the sensor points are as follows:
"head"
"neck"
"left-shoulder"
"left-upper-arm"
"left-lower-arm"
"left-hand"
"right-shoulder"
"right-upper-arm"
"right-lower-arm"
"right-hand"
"stomach"
"chest"
"hip"
"left-upper-leg"
"left-lower-leg"
"left-foot"
"right-upper-leg"
"right-lower-leg"
"right-foot"
Best Wishes,
Mark
-
I was just looking at the XSENS website and they claim that their system is immune to magnetic interference...maybe a new development?
@mark, do you know anything about that system, and if it outputs data that can stream into Isadora similar to a Kinect sensor, or would it need its own custom Actor?
John
-
According to the Rokoko people, the claim about being immune is not very well proven -- but of course they would be inclined to take this position. I know for a fact that one performance I attended that used Perception Neuron suffered numerous failures because it was in an underground location surrounded by tons of metal. But in the end I can say neither yes nor no to XSens's claim.
Regarding input from XSens: unless they offer OSC, it would probably take another custom actor. I've contacted them to find out if this exists.
Best Wishes,
Mark
-
@mark Does Rokoko output OSC?
-
@jtoenjes said:
Does Rokoko output OSC?
No, which is why I made the actor to allow a user to get the data from the suit. It is certainly possible, however, to use this actor to convert the Rokoko values to OSC in Isadora and send them on somewhere else.
Best Wishes,
Mark
-
@mark Hi Mark- Kathleen from the NYC workshop. Was wondering if I would be able to get the Rokoko actor as I will have access to a suit in a few weeks. And please put me on a list (if you have one) to test out the perception neuron actor if you end up making one. Cheers-k
-
Please open a support ticket using the link in my signature and we can get started on the process of adding you to the beta-testing program.
-
Hi Mark and Team,
I'm trying to experiment with the Rokoko Smartsuit and Isadora but am having some issues. Since I'm using a MacBook Pro (Intel Core i9-9880H CPU @ 2.30 GHz) running Monterey, I had to install Windows 10 and Rokoko Studio Legacy - the BETA version does not support Monterey... So, I reinstalled Isadora on Windows 10 and tried to install Rokoko Studio Live Watcher. Sadly, it is not installing. Here is a screenshot of the unzipped plugin folder. Aren't those folders supposed to look like Isadora logos...? When you have a chance, please help! Thank you very much.
-
@ytanokura
On Windows they look just like plain folders. Nothing is wrong - just drag them onto the target the installer window tells you to drag them onto.