Kate Simko: “I looked at the screen and for the first time in my life, I saw music as a waveform”


Kate Simko grew up in a very musical family. One of her earliest memories is of her grandmother, who played piano and organ in the local church until the fine old age of 92. Her childhood was filled with Bach, Chopin, Mozart, scales, music theory, and lots and lots of practising - but at 16, there was a sharp left-turn. 

It was at an underground rave in her home city of Chicago that she came across a strange British record label called Warp. Soaking up the twisted grooves of Autechre and Aphex Twin, she began to form the germ of an idea. 

“At Miami University, I applied to study piano,” explains Simko, “but it didn’t take me long to realise that I was never going to be a concert pianist. At first, I felt defeated, but then I came across Logic. It was a total game-changer.

“From a classical music point of view, Logic made complete sense, because I could see what was happening on the screen, but I also felt closer to the music that I’d been listening to at the raves. With Logic, I felt like I could bring the two together.”

Simko immediately dropped out of uni, headed back to Chicago and enrolled on a music technology course. Her first remix project, a reworking of Philip Glass’ Houston Skyline, made the Billboard Classical Top 100. Suddenly, Simko’s strange mix of electronic experimentation, classical orchestration and stylish minimalism - all guided by the spirit of her grandmother - began to make sense. 

Fast-forward a few years and Simko is now based in London, leading the appropriately named London Electronic Orchestra - a live ensemble with Simko on keyboards and beats. Alongside acclaimed shows at Latitude, the National Gallery, Ibiza et al, the LEO recently played to a sold-out Royal Albert Hall, and have just released the soundtrack to US indie movie 20 Weeks.

In between days, she’s remixed Chicago legend DJ Pierre; collaborated with the likes of Katy B and Jamie Jones; paid her dues as a house DJ; and… spent a couple of hours with MusicRadar.com.

It’s interesting to hear about your reaction when you first saw Logic. Traditional music gets caught off guard by technology…

“Absolutely. I looked at the screen and for the first time in my life, I saw music as a waveform. And if you looked at a piece of recorded music on the screen, it didn’t seem so different to notation. You could see the gaps between notes; the difference between louder and quieter bits of music. I had spent many years studying music theory in a very traditional way, but Logic made complete sense to me. 

“I was maybe 18 or 19 at the time and, like most teenagers, I started to overanalyse what Logic might mean to me and my music. Miami University has one of the best music schools in the US, and most of the students on my course were, like… child prodigies. They’d had all the fancy lessons and been at music camp every summer since they were born!  

“Because everything up until that point had been focused on the piano, it felt like I’d completely screwed up my life. All those years of practice, for what? But seeing Logic opened up a different avenue. I had no idea what it was going to sound like, but I suddenly had this crazy idea of making my own music. With the help of technology.”

Classical music and technology do have a history. Musique concrète, the work of Stockhausen, Schaeffer, Morton Subotnick et al…

“Yeah, if you go back a few years, technology and classical music came together, and it was considered a valid musical form. But by the time I was at Miami, it felt like it had been kicked off its little throne. It was no longer cool to use technology; we had gone full circle. If you were playing a synthesiser, people would look down their noses. That isn’t ‘performing’ music, it’s just a toy.

“When I eventually went back to Chicago and started my music technology class, no one was interested in us. There were only three other people and me, and we were the black sheep. As far as I was concerned, that was an advantage, because it meant we had complete freedom with all the equipment they had there. 

“Along with places like Stanford and Columbia, my college, Northwestern, had been part of the original electronic music explosion, and they had a whole room full of things like the Moog Modular and Arp 2600, plus all the new stuff like Pro Tools and Logic.”

You’ve had over ten years of classical music training, and now you’ve been let loose with synths and DAWs. What did your music end up sounding like? 

“Very experimental. I was listening to a lot of Philip Glass and trying to emulate that repetitive, rhythmic feel. When you’re young, cocky and you haven’t had much life experience, you do that sometimes. You listen to a piece of music and you say, ‘That sounds easy’.

“Even with all my musical theory and all the technology, I soon hit the wall. I found out my limitations.”

But you made the Billboard Classical Chart nonetheless! 

“And it took me four months! I was learning how to use the equipment, learning how to remix, learning how it was supposed to sound… all without any help. My production technique was simple A-B referencing. Listening to other tracks and adjusting mine till it sounded ‘right’. 

“The fact that it did get noticed gave me tons of confidence. And it also reminded me that, sometimes, the best tool you can have is your ears.”


You eventually moved to London because you were studying film composition at the Royal College of Music. I’m sure that some people would question whether you really need to study film composition. If you can write music, all you need to do is write something that fits with the images, no?

“I’m certainly not saying that you have to study film composition if you want to write film soundtracks. I actually did my first soundtrack seven years before I went to the Royal College of Music. It was a wonderfully nerdy film called The Atom Smashers. It was about the search for the Higgs boson.

“I made that soundtrack in the way you’d expect: a MIDI keyboard and some plugins. But, even as I was doing it, I remember thinking, ‘Is this OK?’ I’d done years and years of music theory, but did I really know how to compose? I’m a perfectionist and I wanted to do it right.”

In an age where producers and musicians can struggle to earn money, soundtracks - for both films and games - are now seen as a legitimate way to make a living. 

“Yeah, but I was also interested in it from an artistic point of view. Film soundtracks offer you so much opportunity. So much scope. You’ve only got to look at something like the London Symphony Orchestra. A lot of the new works that they’re playing are film scores!”

From a purely audio perspective, are today’s plugins good enough to recreate a real orchestra?

“For me, it depends on the context. If you’ve got a big solo coming up, chances are that you’ll go for the real thing, just because it’ll be so exposed. If it’s a string pad sitting underneath everything else, you don’t need the real thing. In fact, there are times when I prefer the sound of an artificial string. I’m sure every composer would love to work with an 80-piece orchestra, but most of the time the budget simply isn’t there. So you compromise. Use the plugins for the long, extended strings, and record one or two real instruments over the top. 

“A two-minute cello solo from a plugin? That’s a tough one. You’ve got all those little scratches and buzzes… the humanity. That would take a lot of programming.”

Ah, the age-old philosophical conundrum. For Descartes, it was Mind-Body Dualism. For Sartre, the existence of human nature. For readers of MusicRadar.com, how do I make that budget cello plugin sound like it’s being played by Yo-Yo Ma?

“A lot will depend on the plugin you’re actually using. I’ve got EastWest’s Hollywood Strings and you get plenty of articulations with them… lots of parameters that you can play around with. Like I said before, sometimes, the best tool you can have is your ears. Does it sound OK?”

Where and when did you get the idea for the London Electronic Orchestra? Basically, it’s you handling the electronics, plus two cellos, two violins, double bass and harp. 

“When I arrived at the Royal College of Music, I was, for all intents and purposes, an electronic musician. I assumed that, at the RCM, I’d only be working with real instruments and I’d have to ditch the electronic stuff. I didn’t complain, because that was the reason I was there. I was studying under [veteran film and TV composer] Howard Davidson, and he was the one who kind of encouraged me.

“Instead of ditching the electronics, he said, ‘Look, you’ve already found your sound with the electronics, so why not combine that with the orchestra?’ Initially, I was worried that my music wasn’t skilled enough… it didn’t have enough virtuoso elements. But it was Howard who kept saying, ‘You’ve got your sound. Stick with it’.”

How do you write for the LEO? Computer or piano? On your own or together with the whole ensemble? 

“I use Pro Tools to record any live instruments, but I write with Logic. I’ve kind of used it ever since I first saw it back in Miami. I did have a slight detour to Ableton Live around 2011, but I now look back and think of that as two wasted years.

“I got persuaded because everybody kept saying to me, ‘Ableton is so quick’. I am, by nature, quite slow in the studio, so I was immediately interested in anything that could speed up the process. Unfortunately, Live never worked for me and I ended up sending everything back into Logic. It sounded better. I do use Ableton for live shows; that’s its strength.

“I generally start with a fairly tough beat and then build up the chords to create a solid eight-bar loop. That’s usually at the heart of an LEO song. I tend to work with the Garritan Aria Player because it saves me having to worry about finding a piano sound or a flute sound… it’s all there. When I’m writing, I’m not too worried about getting the different elements to sound perfect. What I need is a version of the song that is good enough to take to the cellists or the harpist; something that provides a template for them to work with.

“I write a few synth parts, too, using Logic’s Retro Synth, but they usually get replaced with hardware when it comes to finishing the track.”

How about hardware synths?

“I share a studio with Jamie Jones in East London, and we’ve got a couple of racks of cool synths and drum machines… TR-909, TR-707, Juno, SH-101, TR-8, the Korg EMX2.”

And lots of effects?

“It’s mainly EQ, reverb and delay. For EQ, it’s the FabFilter Pro-Q 2; I could use the Logic EQs, but I find this one easier to work with. The Waves H-Delay is good for adding a bit of movement and space to things like pizzicato strings. Be gentle, though! You only need, like, 10 or 15%. Reverbs are either Lexicon or the Waves R-Verb. The R-Verb always works well on big string sounds.

“Anything else I use is generally there as an ‘effect’. I use it to try and make something sound different or weird. Soundtoys is my go-to… Decapitator to add grit, or Crystallizer if I want to go crazy.”

And how does all that work when LEO plays live?

“I had a lot of help from Dan Bora, who is Philip Glass’ live engineer - we first met back in Chicago. When I started to think about LEO live shows, he came into the studio for a couple of days and helped us look at mics and monitors. When you’ve got all those live instruments on stage, you need to hear what you’re playing, but you’ve also got to watch out for feedback.

“I run everything that’s not being played live from Ableton, but tend to focus most of my energy on playing live piano and keyboards from a Nord Stage 2. I’d love to be able to start manipulating the live sound, too, but I’d probably need a couple of extra pairs of hands for that.”

For as long as musicians have been recording, sampling or even playing electric guitars and synths, there’s been a debate about what is ‘real’ music, but it now feels like we’re entering an era where that debate is becoming irrelevant. As technology advances, it all becomes ‘music’. Where will you take it from here?

“Jamie Jones and I are already working on a project that we hope might get a release towards the end of the year. More of an orchestra… with more electronics. I guess it’s musical evolution.” 

Kate Simko’s original music score for 20 Weeks is out now.
