How to get precision and wireless/remote control with IzzyMap?



  • Hi,

    I am doing an architectural facade projection that will use 3 projectors across the front face of a building. The middle projector is perpendicular to the building, while the two outside projectors cross over at oblique angles to cover the remaining left and right sections of the facade.

    I am wondering how I can get precision using IzzyMap for this task. Is the only way to do the alignment by eye? Or am I missing something?
    Also, the features associated with the 'Stage Setup', such as the 'alignment grid' and numerical calibration inputs, are what I am missing in IzzyMap. Or is it possible / intended that we use 'Stage Setup' first and then do the mapping with IzzyMap?
    While it is great to have IzzyMap 'Output' points publishable, I am struggling to find enough IzzyMap 'Input' features to work remotely from the main computer, linking to it wirelessly through an interface such as TouchOSC or a laptop, to complete the alignment tasks on a large-area projection mapping. My issue is that the computer serving the projectors can't be in front of the building, so I don't have good line of sight when using the IzzyMap 'Input' interface.
    Previously (prior to IzzyMap) I often ran into sightline issues when aligning and mapping projection in media arts installations and theatres. I have used TouchOSC to map '3D Quad Distort' actors across multiple projectors. This got me in and around large sets and buildings to map projector alignment, although still limited to output points, and I can reuse that TouchOSC patch with published IzzyMap 'Output' points. Using a shared screen over a LAN is another option. However, being able to publish points for initial alignment work, using 'Stage Setup' and 'Quad Distort'-style 'input' to IzzyMap, would be a welcome feature, particularly where it is impossible to view the projector alignment from the media server's physical location.
    A couple of recent projects I have worked on were set up by a team using '[Watchout](http://www.dataton.com/watchout)', and they were able to wander around buildings with a wireless/remote calibration interface. I know Isadora is fantastic; I am just wondering if there are enough publishable mapping points in IzzyMap or 'Stage Setup' to facilitate large-scale spatial projections where sight lines are limited?
    I will persist with this, as I may have missed a critical piece of the workflow for using IzzyMap. Linked here is a [photo of a projection](http://www.cairns.qld.gov.au/festival/news-events/events/visual-arts/enlightenment-ocean-and-earth2) we did recently using Watchout - is it possible to do this with IzzyMap?
    Regards,
    bonemap




  • It seems that a few projects would benefit from publishable input points. Let's make it a feature request…



  • @dbini,

    Done

  • Tech Staff

    I think TouchOSC is a good solution, but the problem you seem to be describing is getting very accurate steps.

    It's a long workaround, but you could try TouchOSC with back-and-forth buttons: you would go through each point one by one (1 > 2 > 3 > 4 > 5, etc.), and if you wanted to go back to a point you'd have to step through in reverse order.
    Also, the sliders in TouchOSC, while great, can move as you lift your finger. So again, buttons with very small increments may be best.
    The Isadora patch would be quite large and require Router actors, etc.
    ^ All of the above is quite hard to explain in text here. I will try to explain it better, or create something to demonstrate it for you, at some point.
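    As a rough sketch of that step-through-and-nudge logic (the corner names, step size, and the -100..100 percent range are my assumptions, not anything defined by Isadora; a real patch would send each resulting value on to Isadora over OSC):

    ```python
    # Hypothetical sketch: select one corner at a time with next/prev buttons,
    # then nudge it in small fixed increments with +/- buttons.

    class QuadRemote:
        """Cycle through the four corners of one quad and nudge X/Y in small steps."""

        CORNERS = ["top_left", "top_right", "bottom_right", "bottom_left"]

        def __init__(self, step=0.1):
            self.step = step                  # increment per button press, in percent
            self.index = 0                    # currently selected corner
            # default rectangle: every offset starts at 0 percent
            self.points = {c: {"x": 0.0, "y": 0.0} for c in self.CORNERS}

        def next_point(self):
            self.index = (self.index + 1) % len(self.CORNERS)
            return self.CORNERS[self.index]

        def prev_point(self):
            self.index = (self.index - 1) % len(self.CORNERS)
            return self.CORNERS[self.index]

        def nudge(self, axis, direction):
            """Move the selected corner by one step; direction is +1 or -1."""
            corner = self.CORNERS[self.index]
            value = self.points[corner][axis] + direction * self.step
            self.points[corner][axis] = max(-100.0, min(100.0, value))  # clamp
            return self.points[corner][axis]
    ```

    Each '+' or '-' button in TouchOSC would trigger one `nudge()`, and the clamped result is the value you would publish back to the mapping input.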


  • @skulpture, Yes, you are right. TouchOSC is going to be frustratingly inaccurate with 2D sliders. I have already created a patch (2011) in Isadora and TouchOSC that selects between four projector stages. It is a huge Isadora patch that returns the 'Quad Distort' actor settings of four Isadora stages back to TouchOSC and allows each corner to be manipulated through an iPad. What would be really useful is a wireless interface for the 'Stage Setup' feature of Isadora. Perhaps Screen Sharing is the only way to go, but a tablet/mobile app would be really useful. Cheers, Bonemap


  • Tech Staff

    I know what you mean now. If you are doing pure mapping, then you can bypass the Stage Setup and just use IzzyMap. But I know how/why the Stage Setup is useful (stage blending, for example). This is an interesting conversation for sure; lots to think about. Whilst a tablet/portable PC would be handy, there are obvious limitations such as Wi-Fi connectivity and lag.



  • @skulpture, I didn't really get the relationship between 'Stage Setup' and IzzyMap until recently, but the Stage Setup features are critical for multi-projector alignment. Once the Stage Setup alignment is done, IzzyMap is much simpler to wrangle. My issue is that the main Isadora machine is often located where it is impossible to see the projector alignment, so some kind of remote interface is imperative to the success of stitching and blending multiple projectors. So perhaps I am not thinking of 'pure mapping', but even so, optimising projectors for the different facets of architectural projection is part of the mapping experience. Cheers, Bonemap



  • Hello,

    I would opt for 3D Quad Distort, because you start from a geometrically accurate measure (0,0,0,0,0,0,0,0 is the rectangle). After that you have to work in percent, not pixels, which is a strange way of measuring (you need to reverse some entries), but you can obtain quite usable remote mapping.
    I used this for a performance with a projector in the left wing projecting onto the right wing. I stayed on stage with my iPad, TouchOSC, and a dedicated Wi-Fi router, and was able to adjust the projection with pixel accuracy by touching the screen.
    In TouchOSC I used + and - buttons for each of the eight parameters, and in Isadora I used a Data Array actor to record the settings.
    It is also very useful to do this with OSCulator, which makes routing the OSC easier.
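    The pixel-to-percent arithmetic that makes this accurate can be sketched in a few lines. This is only an illustration: the 1920x1080 resolution is an assumed example, and which of the eight entries actually need their sign reversed must be checked against the actor itself.

    ```python
    # Sketch: convert a desired shift in pixels into the percent units that a
    # percent-based quad distort input expects. Resolution and the reversal
    # flags are illustrative assumptions, not Isadora's documented behaviour.

    WIDTH, HEIGHT = 1920, 1080   # assumed projector output resolution

    def pixels_to_percent(dx_px, dy_px, reverse_x=False, reverse_y=False):
        """Return (x, y) offsets as a percentage of the stage size."""
        x = 100.0 * dx_px / WIDTH
        y = 100.0 * dy_px / HEIGHT
        if reverse_x:            # some entries run the opposite way
            x = -x
        if reverse_y:
            y = -y
        return x, y
    ```

    So a '+' button wired to a 1-pixel step would send `pixels_to_percent(1, 0)`, i.e. roughly 0.052 percent horizontally at this resolution, which is why percent values feel so coarse compared with pixels.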
    Jacques 


  • Thanks Jacques. Yes, that is the same method I have been using all these years. However, after using the Stage Setup, I can see there is potential for a more integrated way to interact remotely with these Isadora features. Regards, Bonemap


  • Tech Staff

    Yeah I hear you @bonemap

    I guess the difference here is that a lot of software/media servers use UV and texture mapping, whereas Isadora uses what I consider the more traditional mapping method of slicing and placing images onto a surface manually. Know what I mean?
    So those other software packages (and hardware, such as d3) don't really need a user to walk around a space to accurately map a structure or surfaces.


  • Thanks @skulpture

    You may have more experience/exposure to the latest software for architectural projection, so thanks for sharing your experience. Perhaps I should do more homework. I have found the Isadora approach a limitation considering what it takes to emulate UV point mapping. Software that matches 3D modelling and virtual space to physical objects and spaces is well beyond what I am doing. But then I get really excited about what Isadora might do in the space of architectural projection in terms of realtime visual compositing and responsive, sensor-driven interaction. I don't know of other software that offers so much control while beginning to approach architectural projection; however, it could be that I haven't looked around enough to see what alternatives are out there. Another issue has got to be the cost of commercial-grade 'virtual projector' software. I guess Isadora remains within reach of the independent artist.
    Thanks again for your insights 
    Bonemap

  • Tech Staff

    @bonemap

    Have you looked at the "Edge Blend Mask" and "Global Edge Blend Mask" actors?
    You could set up an OSC interface to these actors. This would help with the lack of line of sight for the initial projector/stage setup.


  • Thanks @Dusx

    Now that you point to these I will definitely check them out....
    Regards,
    bonemap


  • We have a "recent" habit of using an app called TeamViewer on a laptop to remotely control the main computer. It works much better than Remote Desktop, and you have full control of your Isadora patch. The nicest feature is that you can choose which screen you see on your remote computer, and that works really well with "stage live edit".

    No complicated OSC to set up, and full control of everything… I personally find it the best way to do precise mapping. On the down side, you need a second computer…

    Cheers

    Jerome



  • @zedociel Thanks for your advice. I have used TeamViewer for operations/administration, so I will check it out for this purpose as well. Reducing the amount of kit required for setup is sometimes a real bonus. I guess it depends on so many factors around what attributes a project has or is trying to achieve, and how often a similar situation is repeated. I have an ongoing annual festival commission that has a central, high-spec architectural projection running over 10 days, with a series of one-night guerrilla projection events as 'suburban satellites' produced by student teams. For the one-off outdoor events it is good to limit the amount of gear to set up and tear down. It would be great to have a mobile/tablet interface running on a wireless hotspot for those situations where multiple-projector alignment needs to happen quickly as the sunlight fades. However, for a few days a year, I am not sure if it is worth the development of a dedicated app. Cheers, Bonemap



  • Hi,

    I have tested the use of TeamViewer for working within the Isadora interface from a remote device, and I can confirm that it will work for this purpose. I used an iOS install of the TeamViewer software and a hotspot generated by an iPhone 6, at a distance of around 30 m (not line of sight). TeamViewer for macOS and iOS is free for personal use, and versions for other desktop and mobile operating systems are also available, so I can't see why it wouldn't work for PC and Android.
    It is not perfect (latency and accuracy), but it appears to do the job. That said, publishable points and calibration settings within Stage Setup would still offer a way to customise the stages remotely using OSC transmissions.
    cheers
    bonemap


  • @bonemap

    I've been using TeamViewer for some things as well. Another method I've used (not quite on the same scale as you, of course) is setting up a private network with a wireless router, connecting my Mac Pro and my MacBook Pro to it, and then using the [built-in Screen Sharing](http://osxdaily.com/2012/10/10...) on Macs to control my Mac Pro while wandering around with my MacBook Pro adjusting my mapping. Best wishes, Woland


  • @woland

    Thanks. I will look at the suite of actors provided in Isadora (Global Keystone and Global Edge Blend Mask) and put something together with TouchOSC that I can push around on a mobile hotspot. It just means less kit to deal with for projection projects that are meant to be quick-and-dirty, guerrilla-style events.
    cheers
    bonemap

  • Tech Staff

    Perhaps a feature request would be an official OSC API for any global/scene setting. I can see this being useful for stage setup, at least.

