How Convoluted Is Your Reverb?

As an educator, I find my brain running a constant background task: searching for better and better ways to explain what I know to people.

Some knowledge is conveyed easily; some is best conveyed with a humorous analogy; and for some topics, it is just plain hard to find the right words to get the ideas across concisely and accurately.  After all, we’re talking about audio, which you can’t point a finger at, or smell or taste for that matter.  It’s like the aphorism “Writing about music is like dancing about architecture” (although, truth be told, I have seen some great dances about architecture, but that is another story).  But like it or not, if you are teaching music production, you’ll be doing a lot of writing — and talking — about music.  It’s pretty much the job description.


So when I stumble on something that illustrates a concept better than I can explain it, it is cause for celebration, and I do a little dance (although not about architecture).  It’s akin to the kind of joy you feel when you find and deploy a missing piece of a puzzle: SNAP… ahhhh!

Take this geek-tastic cornucopia of flash animations by Dan Russell illustrating the propagation of (sound) waves in a medium (such as air particles).  When you want to visually understand what sound waves look like and how they behave, this is a great place to start.  Even if you drown in the math used to create them, you can still absorb a lot from viewing these simple, effective animations. (Yes, the fluid puns were intentional. Soak it in.)

But what spurred this particular post is the intriguing WikiDrummers video.

I’m not sure exactly what their original intent was with the video — my French is a bit rusty from high school — but it is a fantastic demonstration of the methodology behind convolution reverb.

Reverb plugins come in two flavors: algorithmic and convolution.

Both do the same job — they give your recording an artificial sense of space.  The first does it through an algorithm (a.k.a. math) built from a series of tightly packed delays that simulate a room.  A convolution reverb (such as the one included in Live 9 Suite’s Max for Live Essentials pack) uses an impulse response of an actual space.
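To make “a series of tightly packed delays” concrete, here is a minimal sketch of the basic building block Schroeder-style algorithmic reverbs chain together: a single feedback comb filter.  The parameter values are hypothetical; a real plugin runs many of these in parallel, plus allpass filters for diffusion.

```python
def comb_filter(signal, delay_samples, feedback):
    """Feed a delayed, attenuated copy of the output back into the input."""
    out = []
    for i, x in enumerate(signal):
        # Tap the output from delay_samples ago (silence before the delay fills).
        delayed = out[i - delay_samples] if i >= delay_samples else 0.0
        out.append(x + feedback * delayed)
    return out

# A single unit impulse (one "clap") goes in...
impulse = [1.0] + [0.0] * 9

# ...and comes out as a decaying train of echoes:
# 1.0, 0, 0, 0.5, 0, 0, 0.25, 0, 0, 0.125
wet = comb_filter(impulse, delay_samples=3, feedback=0.5)
```

Each pass around the feedback loop is one more “echo,” which is why a handful of these filters is so cheap to compute compared with convolving against a full sampled room.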

The beauty of algorithmic reverb is how much it can do with a comparatively small overhead of processing power.  Back in the 1970s and ’80s, when digital signal processing (DSP) technology was just becoming inexpensive enough to reach into the pro audio market, digital reverberation devices defined the sound of that era.  Although far from perfect reproductions of real spaces, their sound was dreamy and alluring, and many of these units are as sought after as a TB-303.

Since the 1980s, processing power has grown at an alarming rate, and now algorithmic reverb plugins running in a DAW — such as Ableton’s Reverb Audio Effect Device — are standard.  The past decade has seen the advent of the convolution reverb, which requires significantly greater RAM and CPU to apply impulse responses to your audio.  Think of an impulse response as an “audio picture” of a place, one that captures its sonic properties.  A convolution reverb plugin, then, is simply a Rolodex of sampled spaces that can be superimposed on your signal.  Want to place your mix in the Taj Mahal?  No problem: find an impulse response that someone has made of that space and apply it via convolution reverb.
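Here is the core of that “superimposing” operation sketched in a few lines, with a toy impulse response I made up for illustration: every sample of the dry signal plays back a scaled copy of the impulse response, and all the copies sum together.  (Real plugins use FFT-based convolution for speed; this direct form just shows the idea — and why it costs so much more CPU than a few delay lines.)

```python
def convolve(dry, impulse_response):
    """Direct-form convolution: superimpose the sampled space onto the signal."""
    out = [0.0] * (len(dry) + len(impulse_response) - 1)
    for i, x in enumerate(dry):
        for j, h in enumerate(impulse_response):
            out[i + j] += x * h  # each input sample "plays back" the whole IR
    return out

# Toy impulse response: a direct sound followed by two decaying reflections.
ir = [1.0, 0.0, 0.5, 0.25]

# A two-hit "drum" pattern as the dry signal.
dry = [1.0, 0.0, 0.0, 0.8]

wet = convolve(dry, ir)  # each hit now trails the room's reflections
```

Note that the output is longer than the input: the reverb tail of the sampled space rings out past the last dry sample, exactly as it would in the real room.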


In my mixing class I talk about imagining a mix as a performance in three-dimensional space: yes, sometimes it is a “band” in a virtual “space” (say, for rock music), and other times it is an intentional journey through many different spaces (in the case of synthesized electronic music).  Either way, having a solid vision of the “space” your mix “takes place in” is essential for creating a cohesive mix.

Which brings us back to the WikiDrummers video.

Watching this single performance of the same drummer on the same kit playing the same beat in multiple spaces sounds a lot like flipping through impulse response presets on your convolution reverb, doesn’t it?  The contrasts are vivid and delicious.  You can clearly make out the differences in pre-delays, initial reflections, diffusion, decay time, and absorption properties.  And each one gives this simple drum kit a completely different vibe.  Which is a great reminder: even when hearing a solo instrument, what you are listening to is really a duet between the player and the room the performance takes place in.  I believe space is that important.

So when selecting a virtual space for your mix — regardless of what method you use to create it — keep this kind of visual Rolodex playing on the inside of your eyelids while you listen.  The more intentional your selection of space, the more your listeners will connect with your music.