At the start of this project, we threw around multiple proposals before settling on the idea of a hybrid acoustic/electronic performance whose two halves were intertwined and interdependent. This, we decided, would convey the feeling we were after: when we performed our piece, it would seem as if one person were putting on an electronic performance while the other two, playing acoustic instruments, were simply part of the first performer’s “tool belt” or, more appropriately, instrument rack. We wanted to design our piece so that the performer (Ticha) could change something digitally in TouchOSC (explained below) that would affect the live instrumentalists (Matt and Coby).
We also decided fairly early on that we would use a melody and accompaniment Matt composed in Logic as the “backing track” of our piece. This track would be ever-present, and Ticha would be able to layer audio effects from Ableton Live onto it to change its sound and mood.
The other half of the sound of our performance would be covered by Matt and Coby on guitar and bassoon, respectively. They would improvise a new melody over the backing track, but only when Ticha “switched them on.” She would be able to do this, as well as adjust the intensity of their playing, entirely from the TouchOSC interface. Below, Matt and Coby are represented by the Warrior and the Philosopher.
The bulk of the difficulty in this project lay in the integration of TouchOSC with Ableton, and later with the Media Lab lighting grid. Our goal was for Ableton to recognize TouchOSC as just another MIDI controller and let us map various buttons within Ableton to the TouchOSC interface, which was difficult in itself even before we decided we wanted to do it wirelessly. Ticha was invaluable during this stage of the process: she was the one familiar with TouchOSC and with a “software sketchbook” called Processing, which allowed her (after many hours of programming wizardry) to link TouchOSC’s buttons to the on/off switches of the audio effects in Ableton. After many more hours of wizardry, she figured out how to control two lights in the grid individually through TouchOSC.
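Ticha’s actual bridge was written in Processing and is not reproduced here, but the core idea, translating each incoming OSC message from TouchOSC into a MIDI control change that Ableton can MIDI-map, can be sketched. This is a minimal illustration in Python; the address names and CC numbers below are hypothetical, not the ones used in the piece:

```python
# Hypothetical sketch of the OSC-to-MIDI translation at the heart of a
# TouchOSC -> bridge -> Ableton setup. Address patterns and CC numbers
# are illustrative only.

# Map TouchOSC control addresses to MIDI CC numbers (hypothetical layout).
OSC_TO_CC = {
    "/1/toggle_warrior": 20,      # switch the "Warrior" effect on/off
    "/1/toggle_philosopher": 21,  # switch the "Philosopher" effect on/off
    "/1/fader_intensity": 22,     # continuous intensity control
}

def osc_to_midi_cc(address, value):
    """Translate one OSC message (address, float 0.0-1.0) into a
    (cc_number, cc_value) pair, where cc_value is 0-127 as MIDI expects.
    Returns None for addresses the bridge does not recognize."""
    cc = OSC_TO_CC.get(address)
    if cc is None:
        return None
    # Scale TouchOSC's 0.0-1.0 range onto MIDI's 0-127 and clamp.
    return (cc, max(0, min(127, round(value * 127))))
```

In a live setup, the resulting CC messages would be sent over a virtual MIDI port (for example, the IAC driver on macOS) so that Live sees the bridge as just another MIDI controller, which is exactly the goal described above.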
It took us a while to agree on how we wanted to present our piece: how we would convey our efforts to the audience, how we would demonstrate that Matt and Coby were simply part of Ticha’s electronic performance, and how, if at all, we would let the audience interact with and affect the way the piece sounded. We ended up with a simple narrative, one in which Ticha’s performance appeared to tell a story, with Matt and Coby representing characters in that story. We were not trying to be too complex, but we also needed a way for the audience to see the TouchOSC interface in order to follow the story more easily. Seeing as the interface had also received a huge chunk of our attention leading up to this point, we wanted to display it in a prominent spot throughout our performance.
We worked through several options, but strangely enough the only one that seemed feasible was mounting a projector upside down on a tripod and using an Apple TV to receive Ticha’s iPad screen and send it to the projector.
This way, the audience could see what Ticha was doing in TouchOSC, and immediately see what effect it was having on the performance as a whole. This was our final product:
The roles for this project were as follows:
Ticha Sethapakdi – Software Programmer/Experience Director
Matthew Turnshek – Sound Designer/Experience Director
Coby Rangel – Sound Editor/Scribe