LiquidSonics main man Matt Hill has a penchant for convolution reverb. As well as developing the world's first such effect for iOS, his headline Mac and PC offering, Reverberate, was recently updated to version 2, promising to up the realism of an already impressive plugin with the addition of the intriguing Fusion-IR technology. We pinned him down for a chat…
What's your background in music and in programming?
"I used to be a trance producer - back in its heyday - and had a few remixes on vinyl under my belt by the time I started studying Computer Systems Engineering at university. Convolution reverb had just touched down on the Mac, and my tutor agreed I'd impress him if I could make his office sound like a local cathedral in real-time on a PC. I used some of my own music gear to take some IRs from a local cathedral, wrote a basic convolution plugin over the Christmas break, and eventually got that smile from my tutor."
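For readers wondering what a 'basic convolution plugin' involves: a convolution reverb multiplies the spectrum of the dry signal by the spectrum of a recorded impulse response, then mixes the result back in. Here's a minimal Python sketch of the idea - a generic illustration, not Matt's actual plugin code, and the `wet_mix` parameter is our own addition:

```python
import numpy as np

def convolve_reverb(dry, ir, wet_mix=0.3):
    """Convolve a mono dry signal with an impulse response via the FFT,
    then blend the wet result back in with the dry signal."""
    n = len(dry) + len(ir) - 1                  # length of the full convolution
    size = 1 << (n - 1).bit_length()            # next power of two for the FFT
    wet = np.fft.irfft(np.fft.rfft(dry, size) * np.fft.rfft(ir, size))[:n]
    wet /= max(np.max(np.abs(wet)), 1e-12)      # normalise the tail to avoid clipping
    out = np.zeros(n)
    out[:len(dry)] = (1.0 - wet_mix) * dry      # dry signal ends before the tail does
    return out + wet_mix * wet
```

A real-time plugin would do the same maths in overlapping blocks (partitioned convolution) to keep latency low, but the core operation is exactly this spectral multiply.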
Reverberate 2 features Fusion-IR technology. How is a 'fusion' impulse response better than a normal IR file?
"A single impulse response can only capture the state of a reverb at a single point in time, but a great reverb algorithm will never produce the same response twice. A Fusion-IR is sampled differently, and multiple times, so you get lots of snapshots of the source reverb in different states that can be processed and modulated before fusing them together, hence the name. This brings back that rich, dynamic reverb that sits perfectly in the mix, retaining much more character of the source than is possible with static convolution. Fusion-IR offers early and late components - gaining control over these two crucial elements is key to how reverb sits in a mix."
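LiquidSonics hasn't published how Fusion-IR works internally, but the idea Matt describes - several snapshots of the same reverb, modulated and blended rather than one static capture - could be sketched along these lines. This is purely illustrative: the sinusoidal crossfade gains, modulation rate and equal-length snapshots are all our assumptions, not the real algorithm:

```python
import numpy as np

def fused_reverb(dry, ir_snapshots, rate_hz=0.5, sample_rate=44100.0):
    """Blend convolutions with several snapshots of the same reverb using
    slowly drifting gains, so the combined tail never repeats exactly.
    Illustrative only - not LiquidSonics' actual Fusion-IR algorithm.
    All snapshots are assumed to be the same length."""
    n = len(dry) + len(ir_snapshots[0]) - 1
    t = np.arange(n) / sample_rate                # time axis for the gain LFOs
    out = np.zeros(n)
    for k, ir in enumerate(ir_snapshots):
        wet = np.convolve(dry, ir)                # one static convolution per snapshot
        phase = 2.0 * np.pi * k / len(ir_snapshots)
        gain = 0.5 + 0.5 * np.sin(2.0 * np.pi * rate_hz * t + phase)
        out += gain * wet                         # slow crossfade between snapshots
    return out / len(ir_snapshots)
```

The point of the sketch is the contrast with static convolution: because the per-snapshot gains drift over time, two identical input notes produce subtly different tails, which is the behaviour Matt attributes to a great reverb algorithm.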
Is it possible for somebody to capture their own Fusion-IRs? How is the process different from normal IR sampling?
"It is absolutely possible, but it's more complex than traditional sampling approaches. Not only is it more time consuming, but getting a great signal-to-noise ratio is more challenging. I'm working behind the scenes with some people to bring more Fusion-IRs to market, and I intend to produce some tools that will make this process much easier for people that want to capture their own gear in future."
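For reference, traditional IR sampling usually means playing a sine sweep through the space or device, recording the result, and deconvolving the sweep back out - the signal-to-noise ratio Matt mentions depends on how cleanly that deconvolution comes off. A minimal sketch of that standard technique (not the Fusion-IR capture workflow, and the `1e-9` regularisation constant is an arbitrary choice):

```python
import numpy as np

def deconvolve_ir(recorded, sweep):
    """Recover an impulse response from a sweep measurement by dividing
    the recording's spectrum by the sweep's spectrum. A generic sketch of
    standard sweep-based IR capture, not the Fusion-IR workflow."""
    size = 1 << (len(recorded) + len(sweep) - 1).bit_length()
    spec = np.fft.rfft(recorded, size) / (np.fft.rfft(sweep, size) + 1e-9)
    return np.fft.irfft(spec)[:len(recorded)]    # IR estimate, trimmed
```

In practice longer sweeps and multiple averaged passes buy you a better signal-to-noise ratio, which is where capturing many consistent snapshots for a Fusion-IR gets laborious.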
You took convolution reverb onto iOS with your Mobile Convolution app - where would you like to see the iOS music-making scene go now?
"To me, the mobile music scene can feel like my early days using a basic DAW with a rack of outboard gear. It's exciting, but it's an unrefined, raw experience because preset integration and user interface management are still quite poor. In iOS 9, Apple introduced a new Audio Unit specification that will give hosts good preset management capabilities and a way to embed a plugin interface - just like on desktop. Standardisation like this is what's needed to take the capabilities of mobile music apps to the next level and to really fulfil the mobile music scene's potential."
What's next from LiquidSonics?
"There's been a lot of great user feedback about Reverberate 2 and Fusion-IR, along with some really strong ideas for the next version. I'm working on these but also looking at how to use the core algorithms behind Fusion-IR with some novel techniques to develop something completely new. My notebook is currently full of system diagrams, and I can't wait to be the first to hear what it sounds like!"

Find out more on the LiquidSonics website.