Visual Music: Audio-driven DMX Light Control

Back in 2012, a good friend and I decided to shake things up in the concert hall and explore the possibility of transforming sound into light control information. This is a quick write-up on our efforts.

I won’t cover the rich story here, but the notion of visualizing music has a long history, stretching back at least to the first mechanical color organs of the 18th century. In these early attempts, it was common to map sound frequencies onto light frequencies, often as a way for synesthete composers to share their unique experiences of music. Beyond frequency-to-frequency mapping, there are a number of other ways to map sound into visual information. As an initial foray into the many possibilities, we started with a simple mapping of sound intensity to light intensity.

The setup for our experiment was very straightforward:

  1. Collect and digitize real-time instrument amplitude information using contact microphones and a simple USB audio interface (1 mic for each guitar).
  2. Write a simple Max/MSP program (see the screenshot below) that converts the amplitude information (values between 0.0 and 1.0) into DMX light control information (values between 0 and 255); a rough sketch of this mapping appears just after this list.
  3. Output the DMX control data via David Butler’s imp.dmx external, Nullmedium’s dmxusbpro external, and a USB DMX interface (Enttec Pro); the Enttec packet framing is sketched below the patch screenshot.
  4. Display the DMX data on simple DMX LED spotlights.
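Our original patch was built in Max/MSP, which doesn’t translate directly to text, so here is only a minimal sketch in Python of steps 1 and 2 under stated assumptions: the sounddevice library stands in for the USB audio interface capture, RMS over short blocks stands in for whatever envelope follower a real patch would use, and the channel count and block size are illustrative.

```python
import numpy as np
import sounddevice as sd

def amplitude_to_dmx(amplitude: float) -> int:
    """Clamp an amplitude to [0.0, 1.0] and scale it to the DMX range 0-255."""
    return round(max(0.0, min(1.0, amplitude)) * 255)

def audio_callback(indata, frames, time, status):
    # indata has shape (frames, channels): one channel per contact mic.
    rms = np.sqrt(np.mean(indata ** 2, axis=0))   # per-guitar RMS amplitude
    levels = [amplitude_to_dmx(a) for a in rms]
    print(levels)  # in the actual patch, values like these drove the DMX output

# Two input channels (one per guitar), read in ~23 ms blocks at 44.1 kHz.
with sd.InputStream(channels=2, samplerate=44100, blocksize=1024,
                    callback=audio_callback):
    sd.sleep(10_000)  # listen for ten seconds
```

In practice you would likely smooth or rescale the raw RMS values (a full-scale sine only reaches about 0.707 RMS) so the lights can use the whole 0-255 range.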

[Screenshot: the Max/MSP patch mapping guitar amplitude to DMX values]
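Inside Max, the imp.dmx and dmxusbpro externals took care of the wire format for us. For the curious, here is a hedged sketch, using pyserial, of the framing the Enttec USB Pro expects for its documented “Output Only Send DMX” message (label 6); the serial-port path and channel assignments are illustrative, not from our setup.

```python
import serial  # pyserial

def send_dmx(port: serial.Serial, levels: bytes) -> None:
    """Frame one DMX universe as an Enttec USB Pro packet and write it out.

    Packet layout (from Enttec's DMX USB Pro API): start byte 0x7E,
    label 6 (Output Only Send DMX), payload length LSB/MSB, the payload
    (DMX start code 0x00 plus up to 512 channel bytes), end byte 0xE7.
    """
    payload = b"\x00" + levels[:512].ljust(512, b"\x00")  # pad to a full universe
    packet = (bytes([0x7E, 0x06, len(payload) & 0xFF, len(payload) >> 8])
              + payload + bytes([0xE7]))
    port.write(packet)

# Hypothetical usage: channels 1 and 2 carry the two guitars' light levels.
# The interface shows up as a serial port; "/dev/ttyUSB0" is illustrative.
with serial.Serial("/dev/ttyUSB0", baudrate=57600) as dmx:
    send_dmx(dmx, bytes([180, 42]))
```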

We learned a lot from this super simple setup. First off, when rehearsing for our performance, we were able to “see” the balance between our instruments, and further, we were able to shape our musical dynamics by using the real-time feedback from the lights. Secondly, on stage, the audience was able to make the critical link between sounds and sound sources by attending to the visual feedback. Lastly, with the classical guitar being a relatively quiet instrument, we were able to convey a certain type of dynamic expressiveness that could only be seen rather than heard. Nice!

The video at the bottom of this post shows the end result. Note that this was an acoustic performance; the contact microphones on the guitars were used only to drive the lights. Feel free to contact me if you are interested in our other experiments with real-time audio-driven light control. A more recent musical example includes a similar sight/sound mapping within a performance of Steve Reich’s “Electric Counterpoint”; details on that performance can be found in a five-part blog that starts here.
