    Ensemble BLURB: The Nazi Strike

    Showcase
    • Woland (Tech Staff), last edited by Woland

      Hello all,

      I'm updating my resume, LinkedIn, website, and whatnot, and so I'm digging up links to past performances and thought I'd share a few.

      This piece was called "The Nazi Strike", as it was influenced by and draws heavily from a public domain film of the same name. For this piece I was the projection designer, but also a performer, operating the projections and manipulating sound live onstage.

      I sourced, edited, and performed live projection manipulation of the public domain film "The Nazi Strike", Part II of "Why We Fight", a seven-part WWII propaganda series commissioned by the US Department of War (the animations for which were created by Disney :D). I also included a section from Orson Welles' "The Stranger".

      I created an iPad interface to control the projections during rehearsals (so that the others could control aspects of the projections), and had the dancers manipulate the projections with a Wii remote during the performance. I also constructed a MIDI instrument out of a bandolier (ammo belt), plugs from theatrical lighting instruments, scrap wire, and a Makey-Makey, which I wore and played (some of you saw me wearing it, fine-tuning it, and playing with it at Isadora Werkstatt Berlin 2017).
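      For anyone curious how a game controller like a Wiimote can end up driving projections, one common route is sending OSC over UDP into Isadora's OSC listener. Below is a minimal sketch (not our actual show patch — the address `/isadora/1`, the port, and the roll-to-channel mapping are all assumptions for illustration) that hand-encodes a single-float OSC message using only the Python standard library:

      ```python
      import socket
      import struct

      def osc_message(address: str, value: float) -> bytes:
          """Encode a minimal OSC message: padded address, ",f" type tag, big-endian float32."""
          def pad(b: bytes) -> bytes:
              # OSC strings are null-terminated, then padded out to a 4-byte boundary
              return b + b"\x00" * (4 - len(b) % 4)
          return pad(address.encode("ascii")) + pad(b",f") + struct.pack(">f", value)

      def roll_to_channel(roll: float) -> float:
          """Map a controller roll value in [-1, 1] to a channel value in [0, 100]."""
          clamped = max(-1.0, min(1.0, roll))
          return (clamped + 1.0) * 50.0

      # Fire one value at a local OSC listener (host and port are assumptions)
      sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
      sock.sendto(osc_message("/isadora/1", roll_to_channel(0.25)), ("127.0.0.1", 1234))
      sock.close()
      ```

      In practice a bridge app reads the Wiimote over Bluetooth and emits messages like this many times per second; inside Isadora you then route the incoming value to whatever projection parameter you like.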

      BLURB is a multi-discipline and multi-media ensemble composed of a media artist, two musicians/composers, and two actors/dancers. We create and perform original works with NYC-based composer Arthur Kampela. Our performances are generally composed of pre-determined sections in which we all improvise.

      Video Link

      Note: The video is almost two hours long, but our performance is only the first 25 minutes or so.

      Some points of interest in the video:

      Wiimote: ~6:15 - 7:27
      Bandolier: 7:40 - 9:18
      Bandolier: 14:30 - ~15:30
      Bandolier: 23:00 - 24:45

      Best wishes,
      Woland (Lucas Wilson-Spiro)

      TroikaTronix Technical Support
      New Support Ticket Link: https://support.troikatronix.com/support/tickets/new
      TroikaTronix Support Policy: https://support.troikatronix.com/support/solutions/articles/13000064762
      TroikaTronix Add-Ons Page: https://troikatronix.com/add-ons/

      | Isadora 2.6.1 + 3 | Mac Pro (Late 2013), macOS 10.14.6, 3.5GHz 6-core, 1TB SSD, 64GB RAM, Dual AMD FirePro D700s | Macbook Pro (Retina, 15", Mid 2015), macOS 10.11.4, 2.8GHz Intel Core i7, 16GB RAM, Intel Iris Pro 1536 MB |

      • bonemap, replying to @Woland, last edited by bonemap

        @woland said:

        Our performances are generally composed of pre-determined sections in which we all improvise.

        Thanks for sharing that. I was really interested in the use of disruption by the dancers and musician: how they would at times invade each other's space and appear to undermine the performance. Similarly, the manipulation of the film disrupted its flow and montage.

        We use a lot of structured improvisation in our performances. But I find it challenging to develop visual systems that can approach the fluidity and variation apparent when dancers and musicians improvise. Using live feeds to transpose the improvised moment is one obvious solution. Using generative visuals that respond to the dynamics of the performance is another. I am still looking for a technique for a visual engine that can match the nuance of improvising dancers and musicians.
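        One small building block toward that kind of responsiveness is an envelope follower: smooth a live signal (audio amplitude, movement energy from a camera, sensor data) with separate attack and release times, so the visual parameter it drives rises quickly with the performers but decays gracefully rather than flickering. A minimal sketch, with arbitrary coefficients chosen purely for illustration:

        ```python
        class EnvelopeFollower:
            """One-pole smoother with separate attack/release coefficients in (0, 1]."""

            def __init__(self, attack: float = 0.5, release: float = 0.1):
                self.attack = attack
                self.release = release
                self.level = 0.0

            def step(self, x: float) -> float:
                # Rise fast (attack) when the input exceeds the current level,
                # fall slowly (release) when it drops below it
                coeff = self.attack if x > self.level else self.release
                self.level += coeff * (x - self.level)
                return self.level

        # A burst of "energy" followed by silence: output rises fast, then decays
        env = EnvelopeFollower(attack=0.5, release=0.1)
        trace = [env.step(x) for x in [1.0, 1.0, 1.0, 0.0, 0.0, 0.0]]
        ```

        Run per frame against whatever feature you extract from the performers, it gives a control signal organic enough to drive opacity, playback speed, or a generative parameter without the robotic snap of the raw input.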

        Conversely, the challenge of developing responsive systems has also formed in me an appreciation for an approach which is more controlled and defined: storyboarding, scoring, and rendering a fixed video file for playback. However, as GPU performance improves and we develop the capacity for responsiveness and real-time rendering, our visual engines will appear more and more alive to improvisation.

        Best wishes

        Bonemap

        http://bonemap.com | Australia
        Izzy 3 STD/USB 3.2.5 | MBP 16” 2019 2.4 GHz Intel i9 64GB AMD Radeon Pro 5500 8 GB 4TB SSD | Mac Studio 2022 M1 Max 32GB | OSX 12.5.1 Monterey
