Electric Counterpoint: Part 2

This is the second post of a five-part blog series on my recent performance of Steve Reich’s Electric Counterpoint, a classic minimalist composition written for electric guitar and tape. In order to make the 15-minute piece more accessible to fresh listeners, I dressed up the performance by visualizing the score in real time. In this post, I’ll chat about the process of encoding the score and getting things set up in Ableton.


After listening to a batch of Steve Reich tunes in my 20th Century Music Theory course, I decided to cruise our university library stacks to see what scores we owned. I was thrilled to find a copy of “Electric Counterpoint,” a piece that I had considered performing many times. The only disappointment was that our library’s copy was missing the CD with the backing tracks on it. I suppose it was a sign. Rather than simply contacting another library for the CD (which would have been way too easy), I started scheming… I knew that I would either have to record all of the parts myself (Metheny-style) or program all 12 parts, 83 pages in all (yikes!), into a MIDI file.


No substitute for the score.

Recording all the parts seemed like the proper thing to do, and my first thought was that I would perform the entire piece through a multi-channel system in which each part was assigned to its own speaker. For some reason, encoding the score just seemed like cheating. However, encoding the score also offered a few unique possibilities: if the score were available in MIDI format, there would be nearly unlimited control over the final performance. Changing timbres, tweaking tempos, and expressing fine control over the dynamics could be done with a few clicks (or even on the fly).
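Just to give a flavor of that kind of control, here’s a rough sketch in Python using the mido library of what “a few clicks” can look like once the score lives as MIDI data. To be clear, my actual tweaking happened inside Ableton; the file name below is just a placeholder and the scaling values are arbitrary.

```python
# Sketch: nudge the tempo and soften the dynamics of an encoded score.
# Assumes the Ableton-encoded parts were exported as a standard MIDI file;
# "electric_counterpoint.mid" is a hypothetical file name.
import mido

mid = mido.MidiFile("electric_counterpoint.mid")
new_mid = mido.MidiFile(ticks_per_beat=mid.ticks_per_beat)

for track in mid.tracks:
    new_track = mido.MidiTrack()
    for msg in track:
        if msg.type == "set_tempo":
            # Push the performance tempo up by roughly 5%.
            bpm = mido.tempo2bpm(msg.tempo)
            msg = msg.copy(tempo=mido.bpm2tempo(bpm * 1.05))
        elif msg.type == "note_on" and msg.velocity > 0:
            # Scale every velocity down to soften the overall dynamics.
            msg = msg.copy(velocity=max(1, int(msg.velocity * 0.8)))
        new_track.append(msg)
    new_mid.tracks.append(new_track)

new_mid.save("electric_counterpoint_adjusted.mid")
```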

The idea that finally pushed things over the edge for me was the possibility of using the encoded score data to simultaneously “visualize” the score. In previous installations and performances, I had experimented with mapping sounds to sights via DMX light control data. The idea of visualizing the score proved to be too cool; the end result could be a rich multi-modal performance that not only engaged the audience, but also guided the performer through the score (always nice to make performing easier…).

One catch: I really did not want to undertake the time-consuming process of encoding the score. To be honest, I tried everything to avoid it. I spent hours scouring the web looking for rogue MIDI transcriptions. I contacted handfuls of internet strangers whose blog posts or YouTube videos seemed to suggest they had the data I needed. No joy. Eventually, I did manage to find a small excerpt from the third movement that had been transcribed into a MIDI file, but I still needed the remaining 90% of the piece.


Page 1 of the score. The first of 83 pages (many of which were not nearly this kind).

Encoding the score was actually one of the most valuable experiences of this whole project. Not only did I discover how the piece unfolds over time, but I also learned quite a bit about the horizontal aspects of the piece that I likely would have overlooked if I were merely playing my part. To be honest, sitting with a cup of coffee and whipping the pencil tool across the piano roll editor in Ableton Live started to become a relaxing daily ritual. I encoded the piece one rehearsal number at a time, starting from the top of the score and working toward the bottom. As you can imagine (for minimalist music), there was a lot of copying and pasting, not only within parts but also between parts. After encoding all of the melodic patterns, I would return to Ableton’s dynamic/intensity view and use the pencil tool to draw in the dynamic markings listed in the score. I repeated this process one rehearsal number at a time, typically three or four rehearsal numbers per day, and about a month later, VOILÀ! I had my very own fully encoded digital score.
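Incidentally, if all that copying and pasting makes your scripting fingers itch, the same kind of repetition is easy to generate programmatically. Here’s a rough sketch in Python with mido; the notes, tempo, and repeat count below are made-up placeholders, not material from Reich’s score (my own encoding was done entirely by hand in Ableton).

```python
# Sketch: build one looping cell and "copy/paste" it by repetition --
# the programmatic equivalent of dragging clips around in Ableton.
# The pattern below is a placeholder, not an excerpt from the piece.
from mido import Message, MetaMessage, MidiFile, MidiTrack, bpm2tempo

TICKS_PER_BEAT = 480
EIGHTH = TICKS_PER_BEAT // 2

pattern = [64, 66, 71, 73, 74, 66, 64, 73]  # one bar of eighth notes (placeholder)
repeats = 16                                # how many times the cell loops

mid = MidiFile(ticks_per_beat=TICKS_PER_BEAT)
track = MidiTrack()
mid.tracks.append(track)
track.append(MetaMessage("set_tempo", tempo=bpm2tempo(192), time=0))

for _ in range(repeats):
    for note in pattern:
        track.append(Message("note_on", note=note, velocity=80, time=0))
        track.append(Message("note_off", note=note, velocity=0, time=EIGHTH))

mid.save("pattern_cell.mid")
```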

Session view in Ableton Live. You can see each rehearsal number listed in the last column. At the bottom of the screen, you can see the shape of the amplitude curve drawn with the pencil tool. This data was ultimately converted into DMX light data.

Looking at the piece from Ableton’s Arrangement view (below), you can see some of the global structural features of the piece. The three movements are fairly distinguishable, and in general, they all progress from textural sparsity to a richly woven musical fabric. Again, I should reiterate that taking the time to encode your scores (regardless of what you are working on) might lead to a number of insights that would otherwise be impossible to find. I know a number of pedagogues who swear by assigning their students the task of making a handwritten copy of a score.


The entire piece as seen from Ableton’s Arrangement view. Can you pick out the three movements?

The last step in prepping the Ableton file for performance involved duplicating each part so that it could eventually be assigned to a DMX-controlled light. Therefore, when looking at the encoding pictures above, note that the textures are doubled in “thickness,” as they include both sound and light data.
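For the curious, the sound-to-light mapping itself is conceptually simple: each duplicated part gets pointed at a DMX channel, and note intensities get rescaled from MIDI’s 0–127 range to DMX’s 0–255. Here’s a rough sketch of the idea in Python with mido; it is not my actual signal chain, the channel assignments are made up, and send_frame() is a stand-in for whatever DMX interface you happen to be driving.

```python
# Sketch: map each encoded part to one DMX dimmer channel, scaling MIDI
# velocity (0-127) to DMX intensity (0-255). Not my actual signal chain;
# send_frame() is a hypothetical stand-in for a real DMX interface.
import mido

PART_TO_DMX = {0: 1, 1: 2, 2: 3}  # MIDI channel -> DMX channel (example patch)

def send_frame(frame):
    """Hypothetical output stage: hand the 512-value frame to your DMX interface."""
    pass

frame = [0] * 512  # one DMX universe; channel N lives at index N - 1

with mido.open_input() as port:  # listen to MIDI coming from the encoded score
    for msg in port:
        if msg.type == "note_on" and msg.channel in PART_TO_DMX:
            frame[PART_TO_DMX[msg.channel] - 1] = int(msg.velocity / 127 * 255)
            send_frame(frame)
        elif msg.type == "note_off" and msg.channel in PART_TO_DMX:
            frame[PART_TO_DMX[msg.channel] - 1] = 0
            send_frame(frame)
```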

There are all sorts of fascinating details about the musical score that were revealed through the encoding process, but I’ll save them for one of the remaining posts in this series. The upshot is that, in hindsight, I’m really glad my quest to avoid encoding the score failed. There really is no substitute for getting intimate with the music that you’ll be performing. That being said, if you are looking to perform this piece and you’re now thinking about writing to me to ask for the encoded files, you can already imagine what my sermon will sound like…
