Using an online web stream as a trigger
You can't stream websites live in Isadora. If this is required, you have to use external tools and a technology like Spout or NDI to get the image into Isadora.
If you wish to go down this route, let us know; we know some people who can build this for you.
- Edited -
Didn't mean to come across as rude; I was just trying to answer the question about live streaming feeds inside Isadora, namely that you can't really fetch live feeds from websites inside Isadora.
bonemap last edited by bonemap
when a spectator enters a certain zone in a room, the web stream just automatically starts projecting...
Your enquiry really has two parts: 1) using the web video stream in Isadora, and 2) detecting a body in a defined space as a trigger.
Isadora has options to quickly attain both of these conditions and build a functioning prototype.
For displaying the web video stream have a look at this knowledge base article:
For detecting a body in a particular place in a room have a look at this tutorial article by Graham @Skulpture :
These techniques will get you a good working prototype. If, like me, you are not skilled at programming, I would recommend working with someone like @Juriaan to realise a more critical, show-ready implementation.
Fred last edited by Fred
@leonor You have a few options. Once you have the stream in Isadora, you can use the built-in tracking tools to look for movement. However, reliably identifying whether the movement came from a person or not is going to be very tough, and beyond what someone you pay could solve without a lot of money: this kind of work is done with machine learning at the moment, this scenario is a hard one, and the toolchains are difficult to set up, especially outside of Linux. (@Juriaan, let's answer questions on the forum and reply to work offers when they are solicited.) If detecting movement is enough, then you just need to get that feed into Isadora, and there are some simple options: you can have a second screen and use Syphon screen capture to get the video in; you can run the video stream full screen on a second computer and capture it via an HDMI capture device; or you can even run it on a phone and cast the screen via Google Cast or Apple TV and capture that (note that Google Cast is HDCP protected and cannot be captured without a device that strips HDCP, which is pretty easy to solve, as some HDMI splitters do this by "accident").
Checking this stream: the address is HTTPS, and the embed and video source URLs are only players, not raw streams, so some stream handling would be needed. It is probably more expensive to have someone create software to get this directly into Isadora than it is to just find an old laptop and capture it, or to just use Syphon screen capture.
The second part of your problem, reliable person detection from such a distance with an RGB-only camera feed, is much harder to solve. This work is being done now, but it is mostly research and not much is ready for prime time, let alone anything that will easily interface with Isadora and pick up people at such a low resolution. You can try to filter out cars and trucks by size, which may help. Dealing with changing light conditions will also be a problem: you will have to regularly update your background subtraction.
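To make the filter-by-size idea concrete, here is a minimal Python sketch of the same logic the built-in tracking applies (this is an illustration with synthetic frames and made-up thresholds, not Isadora's actual algorithm): difference the current frame against a background, count changed pixels inside a defined trigger zone, and ignore blobs whose changed-pixel count is below a minimum area, so small movers like distant cars don't fire the trigger.

```python
def motion_in_zone(background, frame, zone, diff_thresh=30, min_area=500):
    """Return True if enough pixels changed inside `zone`.

    background, frame: grayscale images as lists of rows of 0-255 values.
    zone: ((row_start, row_end), (col_start, col_end)) trigger region.
    diff_thresh: per-pixel difference needed to count as "changed".
    min_area: minimum number of changed pixels; raising it filters out
              small movers (distant cars) while keeping larger bodies.
    """
    (r0, r1), (c0, c1) = zone
    changed = sum(
        1
        for r in range(r0, r1)
        for c in range(c0, c1)
        if abs(background[r][c] - frame[r][c]) > diff_thresh
    )
    return changed >= min_area

# Synthetic demo: a 240x320 scene with a 40x30 bright "person" blob
# (1200 pixels) entering the trigger zone.
H, W = 240, 320
bg = [[0] * W for _ in range(H)]
cur = [row[:] for row in bg]
for r in range(100, 140):
    for c in range(150, 180):
        cur[r][c] = 200
zone = ((80, 160), (140, 200))

print(motion_in_zone(bg, cur, zone))                 # True: blob is big enough
print(motion_in_zone(bg, cur, zone, min_area=2000))  # False: filtered as too small
```

Refreshing `background` periodically is the "regularly update your background subtraction" step: slow light changes then stay below `diff_thresh` instead of accumulating into false triggers.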
@bonemap The tutorials you link to are a great start, but they do not detect bodies in space, just movement (Processing has the same limitation); still, they are a good place to start understanding. The tracking in Isadora is pretty good, and even with some clever work, Processing will not get you much further.
I would suggest breaking the problem down: make a screen recording of a few hours of the video, load it into Isadora, and treat it like a live stream; follow the tracking tutorials and see how close you can get. If that is working OK, then move on to getting the stream in real time and dealing with the changing light.
Let us know how you go and how deep you want to dive into solving this.
dbini last edited by
http://www.tsps.cc/ might be helpful.
Fred last edited by
@dbini TSPS is pretty much dead: the last work was done two years ago and nothing since, and the release notes even say it is buggy, which it is (it will not run on High Sierra or later). Also, without a depth camera and skeleton tracking, or some heavy ML, it is just a cool blob detector: it cannot understand what a person is from an RGB camera (or a depth camera, as far as I remember), only whether things move in the image compared to the background. So Isadora's built-in tracking, with some well-made patches, will do a similar job.
bonemap last edited by bonemap
The second part of your problem, reliable person detection from such a distance
I guess I understood the question very differently. As Leonor suggests, "when a spectator enters a certain zone in a room, the web stream just automatically starts projecting..." My interpretation, though perhaps I am wrong, is about tracking motion in a room that then triggers a projection of the web stream. I did not read the question as tracking bodies in the web stream feed. But you always give such rich answers; it is a pleasure to read them.
If you have access to the web development part of this project, you may be able to share your data via WebSockets.
Then, using an intermediary language (Python, or maybe Processing), you can watch the real-time stream and send OSC commands to Isadora.
Again, I don't know what exactly you are trying to do, but WebSockets are another possible bridge between your Website application and Isadora.
I have done this in the past to stream real-time update data from Wikipedia, and it worked great.
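If you go the Python-bridge route, the last hop into Isadora is just OSC over UDP. A library such as python-osc makes this a one-liner, but the wire format is simple enough to sketch with the standard library alone. Note the assumptions here: the address `/isadora/1` and port 1234 are Isadora's usual OSC defaults, but you should check the OSC input port configured in your own Isadora preferences.

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as OSC requires."""
    data += b"\x00"
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC message with int, float, and string arguments."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, bool):
            raise TypeError("send 0/1 as int, not bool")
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # 32-bit big-endian int
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # 32-bit big-endian float
        elif isinstance(a, str):
            tags += "s"
            payload += osc_pad(a.encode())
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return osc_pad(address.encode()) + osc_pad(tags.encode()) + payload

def send_trigger(host="127.0.0.1", port=1234):
    # Port 1234 is an assumption: use the OSC input port set in
    # Isadora's preferences, and pick up the value with an OSC
    # Listener actor on the matching channel.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_message("/isadora/1", 1), (host, port))
    sock.close()

print(len(osc_message("/trigger", 1)))  # 20 bytes: padded address + tags + int
```

The same `send_trigger` call can sit in the WebSocket client's message handler, so each real-time update from the website becomes an OSC value change inside Isadora.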
@bonemap Thank you so much for the help and the sites! I'll go check right away.
@bonemap Yes, in fact you understood it right! It was about tracking motion in a room that triggers a projection of the web stream!
But both answers were actually really helpful!
Thank you both so much for taking the time to reply to my problem!