Ensemble BLURB: The Nazi Strike
I'm updating my resume, LinkedIn, Website, and whatnot, and so I'm digging up links of past performances and thought I'd share a few.
This piece was called "The Nazi Strike", as it was influenced by and draws heavily from a public domain film of the same name. For this piece I was the projection designer, but also a performer, operating the projections and manipulating sound live onstage.
I sourced, edited, and performed live projection manipulation of the film, which is Part II of "Why We Fight", a seven-part WWII propaganda series commissioned by the US Department of War (the animations for which were created by Disney :D). I also included a section from Orson Welles' "The Stranger". For rehearsals I created an iPad interface so that the others could control aspects of the projections, and during the performance the dancers manipulated the projections with a Wii remote. I also constructed a wearable MIDI instrument out of a bandolier (ammo belt), plugs from theatrical lighting instruments, scrap wire, and a Makey-Makey, which I wore and played (some of you saw me wearing it, fine-tuning it, and playing with it at Isadora Werkstatt Berlin 2017).
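For anyone curious how a Makey-Makey becomes a MIDI instrument: the board enumerates as an ordinary USB keyboard, so each contact (in this case, a plug on the bandolier) arrives as a key event that a small script can translate into MIDI messages. Here is a minimal sketch of that translation; the key names, note numbers, and channel are illustrative assumptions, not the actual mapping used in the piece.

```python
# Sketch: translate Makey-Makey key events into raw MIDI note bytes.
# The key-to-note mapping and channel below are illustrative only.

NOTE_MAP = {
    "up": 60,      # C4
    "down": 62,    # D4
    "left": 64,    # E4
    "right": 65,   # F4
    "space": 67,   # G4
    "click": 69,   # A4
}

def key_to_midi(key, pressed, channel=0, velocity=100):
    """Return a 3-byte MIDI note-on/note-off message, or None if unmapped."""
    note = NOTE_MAP.get(key)
    if note is None:
        return None
    # 0x90 = note-on, 0x80 = note-off; low nibble carries the channel.
    status = (0x90 if pressed else 0x80) | (channel & 0x0F)
    return bytes([status, note, velocity if pressed else 0])
```

The resulting bytes would then be handed to whatever MIDI output the patch exposes; the advantage of keeping the mapping in one table is that re-wiring the bandolier only means editing the dictionary.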
BLURB is a multidisciplinary, multimedia ensemble composed of a media artist, two musicians/composers, and two actors/dancers. We create and perform original works with NYC-based composer Arthur Kampela. Our performances are generally composed of pre-determined sections in which we all improvise.
Note: The video is almost two hours long, but our performance is only the first 25 minutes or so.
Some points of interest in the video:
Wiimote: ~6:15 - 7:27
Bandolier: 7:40 - 9:18
Bandolier: 14:30 - ~15:30
Bandolier: 23:00 - 24:45
Woland (Lucas Wilson-Spiro)
bonemap
Our performances are generally composed of pre-determined sections in which we all improvise.
Thanks for sharing that. I was really interested in the use of disruption by the dancers and musicians, how they would at times invade each other's space and appear to undermine the performance. Similarly, the manipulation of the film disrupted its flow and montage.
We use a lot of structured improvisation in our performances. But I do find it challenging to develop visual systems that can approach the fluidity and variation that is apparent when dancers and musicians improvise. Using live feeds to transpose the improvised moment is one obvious solution. Using generative visuals that respond to the dynamics of the performance is another approach. I am still looking for a technique for a visual engine that can match the nuance of improvising dancers and musicians.
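One common building block for visuals that respond to performance dynamics is an envelope follower: smooth the incoming audio level into a slowly varying control value, then drive a visual parameter (brightness, playback speed, particle count) from it. A minimal sketch of the idea, with a fast attack and slow release; the coefficients are illustrative assumptions that any real patch would tune to the room and the material:

```python
import math

class EnvelopeFollower:
    """Smooth blocks of audio samples into a 0..1 control value.

    A fast attack lets the visuals jump with an accent; a slow release
    lets them decay gracefully. Coefficients here are illustrative.
    """

    def __init__(self, attack=0.5, release=0.05):
        self.attack = attack    # how quickly the envelope rises
        self.release = release  # how quickly it falls
        self.level = 0.0

    def process(self, block):
        """Feed one block of samples (floats in -1..1); return the envelope."""
        rms = math.sqrt(sum(s * s for s in block) / len(block))
        coeff = self.attack if rms > self.level else self.release
        self.level += coeff * (rms - self.level)
        return min(max(self.level, 0.0), 1.0)
```

Mapping that single value to a visual parameter each frame gives the "obvious" responsiveness; the harder problem raised above, matching the nuance of improvisers rather than just their loudness, remains open.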
Conversely, the challenge of developing responsive systems has also given me an appreciation for a more controlled and defined approach: storyboarding, scoring, and rendering a fixed video file for playback. However, as GPU performance improves and we develop the capacity for responsiveness and real-time rendering, our visual engines will appear more and more alive to improvisation.