Ni-Mate > Blender > Syphoner > Isadora, or Adobe face tracking > Isadora
-
Hi,
Has anyone used the Ni-Mate add-on for Blender and then ported to Isadora using Syphoner? Is that the way to do it, or is there a more direct pipeline? For example, a direct Syphon add-on for Blender?
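(For context while testing: Ni-Mate publishes its tracking data as plain OSC, so the stream can be sanity-checked before deciding where to route it. A minimal python-osc sketch, assuming Ni-Mate's default output port of 7000; the actual port and address layout depend on your Ni-Mate OSC settings.)

```python
# Minimal OSC dump of the Ni-Mate stream (python-osc).
# ASSUMPTION: Ni-Mate is sending OSC to this machine on port 7000 --
# check/adjust the port in Ni-Mate's OSC output settings.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def print_message(address, *args):
    # Print every incoming message so the actual address layout
    # (typically one message per tracked joint) is visible.
    print(address, args)

dispatcher = Dispatcher()
dispatcher.set_default_handler(print_message)

server = BlockingOSCUDPServer(("0.0.0.0", 7000), dispatcher)
print("Listening for Ni-Mate OSC on port 7000...")
server.serve_forever()
```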
cheers
bonemap
-
I have posted to the Blender forums asking for Syphon and/or Spout support, without a reply.
I don't believe anything exists. There's a GitHub "awesome" list for the Blender Game Engine with many BGE resources, but nothing for Spout or Syphon.
-
Yes, I could not find anything either, but I thought someone may have dug a bit deeper. Thank you for your advice, as always.
Cheers
Bonemap
-
You can easily output Syphon from Unity 3D to Isadora and get all the features you would hope for from Blender, plus others.
I have been using it on a touring show for two years, because I needed 3D mapping that follows movement.
Unity is free if you make less than US$100K of revenue with it…
Best, Jacques
-
Ah yes! Thank you for the tip. I have used the Unity engine a little but not with Isadora.
cheers
bonemap
-
Hi,
In the same vein as motion tracking with Ni-Mate in Blender, I have also started looking at the face tracking functionality in Adobe Character Animator. It has Syphon and NDI output, as well as input triggers (MIDI & OSC) for activating animated events, aimed at realtime output of face tracking 'performances'. Just seeing if anyone has gone down this path?
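For the OSC trigger side, a few lines of python-osc should be enough to test firing events at it. A minimal sketch; the host, port, and trigger address below are placeholders, to be matched to however Character Animator's trigger input is configured:

```python
# Fire a test OSC trigger at Character Animator (python-osc).
# PLACEHOLDERS: host, port, and address must match the OSC trigger
# configuration in Character Animator.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # assumed host/port

# Press and release an assumed trigger address.
client.send_message("/trigger/blink", 1)
client.send_message("/trigger/blink", 0)
```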
cheers
bonemap
-
@bonemap said:
Just seeing if anyone has gone down this path?
Hello, yes, I have been. We've wanted to use Character Animator in a piece for ages, and getting the output into Isadora has been a kludge until now. I've had a bit of preview time combining the NDI output for Adobe Creative Cloud with Zeal's Spout to NDI to bring output from Character Animator and Premiere into Isadora: it is looking reliable and pretty lag-free.
I think it's a bit of a game changer: for example, I can do a quick edit in Premiere and play it back from the timeline directly into Isadora, then switch to Isadora and manipulate it further there for live output to stage. The current Premiere will keep playing back even when it doesn't have focus, so as long as your machine has enough grunt and you aren't playing a lot of different movies, you could conceivably reduce rendering out of Premiere or even After Effects.
I don't know if you've seen the announcements of new After Effects features from IBC, but one is the ability to bring live JSON data in and visualise it in After Effects. I'm keen to try this out with the JSON data from a Kinect via @DusX's Kinect2Go utility.
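A rough sketch of what I have in mind, untested: write the tracking data out as a .json file, import it as footage in After Effects, and read it from an expression with footage("tracking.json").sourceData. The file name, joint names, and schema below are assumptions for illustration only; the real Kinect2Go output will dictate the actual structure.

```python
# Sketch: dump Kinect-style joint data to a JSON file that After Effects
# could import as footage and read via footage("tracking.json").sourceData.
# ASSUMPTIONS: file name, joint names, and schema are illustrative only --
# the real Kinect2Go output defines the actual format.
import json

frames = []
for frame_index in range(3):  # stand-in for a live capture loop
    frames.append({
        "frame": frame_index,
        "joints": {
            "head":       {"x": 0.0,  "y": 1.7, "z": 2.0},
            "left_hand":  {"x": -0.4, "y": 1.1, "z": 1.9},
            "right_hand": {"x": 0.4,  "y": 1.1, "z": 1.9},
        },
    })

with open("tracking.json", "w") as f:
    json.dump({"frames": frames}, f, indent=2)
```
-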
Thanks for sharing your experience with these tools. I am only just starting to work with expressions in After Effects, mostly through the DUIK character animation plugin. I have seen that After Effects has realtime output (including Syphon). I know it also has the Adobe face tracking functionality. I imagine it will be far more extensible than Character Animator, but the area of realtime output and expressions in After Effects is still a bit opaque to me. If you come across any useful tutorials, I would appreciate the link.
best wishes,
bonemap