Aimi is the generative music platform using AI to reimagine how we listen to and produce electronic music

(Image credit: Aimi)

In an ongoing series exploring how artificial intelligence, machine learning and music-making continue to intersect in 2022, we’re speaking with leading industry figures to find out more about how AI powers their technology and what the future holds for AI in music-making. 

Artificial intelligence and machine learning are growing increasingly prevalent in the music production world - in fact, you may already be using them. Almost every day, it seems, a new AI-powered plugin pops up in our inbox, with these intelligent tools claiming to be capable of everything from drum sequencing and tape emulation to mastering and sample organisation.

Most of these tools use AI to do things that were already possible, only faster, more efficiently or more effectively. Some are taking things a step further, however, repurposing these advanced technologies in an attempt to create entirely new music-making paradigms.

That’s what CEO Edward Balassanian is hoping to accomplish with Aimi. A generative music platform that seeks to “fundamentally change the way the art form is created, consumed and monetized”, Aimi uses AI to generate endless and immersive compositions constructed from musical material provided by artists. They purport to offer a radically new form of listening experience, one which is infinitely evolving and adapts in response to the listener’s feedback. 

Our platform allows artists to create generative programs that effectively mix, master, and produce music in real-time, replacing much of the tedious process of hand-assembling music in studios

Here’s how it works. A producer presents the software with a library of individual stems, which are analysed, organised and reassembled by the AI to create a generative musical ‘experience’. The source material is heavily manipulated and restructured by the software in real-time according to algorithms determined by Aimi and tweaked by the artist, resulting in a perpetually unfolding composition that should almost never repeat itself. The listener can offer feedback to the software (currently in the rather rudimentary form of a thumbs up or down) and it will adjust the composition in response, shaping the music to suit the listener’s preferences. 
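Aimi hasn't made its engine public, but the feedback loop described above can be sketched in a few lines. The Python below is purely illustrative - the class, method names and weighting scheme are our own invention, not Aimi's code - and shows how a thumbs up or down might nudge the probability of hearing similar material again:

```python
import random

# Hypothetical sketch of feedback-weighted stem selection.
# Aimi's real engine is proprietary; names and logic here are illustrative.

class StemPool:
    def __init__(self, stems):
        # Every stem starts with a neutral weight.
        self.weights = {stem: 1.0 for stem in stems}

    def pick(self):
        # Weighted random choice: higher-weighted stems appear more often.
        stems = list(self.weights)
        return random.choices(stems, weights=[self.weights[s] for s in stems])[0]

    def feedback(self, stem, liked, step=0.25):
        # A thumbs up/down nudges the weight up or down, with a floor
        # so nothing is ever excluded entirely.
        factor = 1 + step if liked else 1 - step
        self.weights[stem] = max(0.05, self.weights[stem] * factor)

pool = StemPool(["bass_a", "bass_b", "pad_warm", "pad_dark"])
current = pool.pick()
pool.feedback(current, liked=True)   # listener taps thumbs up
```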

Aimi’s mobile app offers generic experiences categorised by mood, in addition to a paid-for tier of artist-branded experiences generated from material provided by some pretty respectable producers and DJs, working across various genres and scenes. 

According to Balassanian, though, the app is just the beginning. They're working on expanding the technology into a set of advanced tools for all producers to use, in the hopes of building a creative ecosystem in which all artists can use Aimi to create their own experiences, share individual musical ideas with collaborators across compositions and monetize their stems using smart contracts. 

Aimi can express you in an infinite number of ways, using your own words

It seems as if they’re not the only ones who believe in the idea, either: late last year, Aimi raised $20 million in a Series B round of funding that saw participation from Founders Fund, a tech-focused investment group with a portfolio that includes Spotify and SpaceX. 

Their ultimate aim is to reimagine how we consume and produce electronic music. No longer will the focus be on static, fixed tracks that are enjoyed individually and in sequence. Instead, we will consume and contribute to an ever-evolving tapestry of ideas, in which the musical threads are recreated and recontextualised in infinitely variable configurations by a creative partnership of man and machine.

We spoke with Aimi’s CEO to hear more about how the software works, what their long-term ambitions are for the platform, and where the convergence of AI and music-making might lead us in the future.

Could you tell us more about the story behind Aimi, and where the idea came from?

“The concept for me really was born out of watching longform performances by DJs. I'm talking about four or five-hour sets, where you'll watch an artist essentially weave a story using music samples from all kinds of different sources that they've been collecting over their career. 

“The way that those artists tell the story and weave together music, and do it in a way that really takes into consideration the audience and the genre that they're expressing, is really where the idea was born. The basic concept was to give artists a way to do that, even if they're not performing live - to essentially allow them to have that same expressive quality without having to be behind the DJ booth.”

When an artist or producer is tasked with creating an Aimi experience, how does the process work? 

“We work really closely with artists right now. There's a fair bit of education that goes on in helping the artists understand this new world of programming music as opposed to playing music. That's really a new model. We're essentially telling artists that, in addition to being able to play your music, you can now program it. And by programming it, the software can express you in an infinite number of ways, using your own words. 

That's a critical part of this genre of music, that you're constantly stretching and twisting and repurposing content and presenting it in new ways that are sonically interesting to the listener. Aimi does a lot of that in real time

“So we work really closely with them to help them understand the way the system works. We have a set of desktop tools that let us import their content, and our AI will actually organise that content into groups like beats, bass, pads, harmony, melody, effects, etc. 

“It will also identify different textures and qualities associated with those loops. So we can understand that one might be more melodic, or another may have some vocal quality to it. All of those attributes are then used by the engine when it stitches together the music in real time. 
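To make that concrete, here's one way the kind of categorised, tagged loop library Balassanian describes might be modelled. This is our own illustrative schema in Python, not Aimi's:

```python
from dataclasses import dataclass, field

# Illustrative only: one way to model the categories and textural tags
# described in the interview. Field names are assumptions, not Aimi's schema.

@dataclass
class Loop:
    path: str
    category: str                            # e.g. "beats", "bass", "pads", "melody"
    bpm: float
    key: str                                 # e.g. "A min"
    tags: set = field(default_factory=set)   # e.g. {"melodic", "vocal"}

library = [
    Loop("kick_group_01.wav", "beats", 124.0, "A min", {"driving"}),
    Loop("voc_chop_03.wav", "melody", 124.0, "A min", {"vocal", "melodic"}),
]

# The engine could then query by category and texture when stitching music together:
vocal_melodies = [l for l in library if l.category == "melody" and "vocal" in l.tags]
```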

“Right now artists work very closely with us to help configure the AI, and to help with the tuning of the experience, once everything's in there. We do a fair bit of listening to make sure everything is tuned properly to the artist’s liking.”

How much material do you need from them to produce an experience?

“We typically ask artists to give us about 100 loops. The more content we have, the more expressive we can be. So you can imagine there's this kind of combinatorial explosion that happens once you give us a lot of content. With the artists that have given us 100 loops, these loops are usually four bars in length.

I see a future where the art of music is elevated by the science of technology

“So you can kind of get an idea of the total amount of content. That amount will play for a very long time, hours and hours, before you get any kind of sense of track fatigue, or the feeling that you've been hearing things over and over again.”
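Some back-of-envelope arithmetic illustrates the 'combinatorial explosion' he mentions. If we assume - our assumption, not Aimi's figure - that 100 four-bar loops are split evenly across five layers and the engine stacks one loop per layer:

```python
# Back-of-envelope: 100 four-bar loops spread across 5 hypothetical layers
# (20 per layer). If the engine stacks one loop from each layer at a time,
# the number of distinct combinations is 20^5 - before any pitch shifts,
# effects or structural variation multiply it further.
loops_per_layer = 100 // 5
combinations = loops_per_layer ** 5
print(combinations)  # 3,200,000 distinct five-layer stacks
```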

Does the material they provide need to be the same tempo, or the same key? Or does the technology account for those factors? 

“Aimi is effectively mastering and producing in real-time. So we do have pitch-shifting and tempo-shifting ability. We're constantly aligning things to the grid that we're maintaining. It depends on how the artist wants their experience to sound: sometimes they're all in the same key, sometimes they have compatible keys in there. 

“Now, what Aimi’s not going to do is wholesale rewrite your music. So we're not going to transpose or shift anything wildly to change the nature of the music itself. So we're really doing that pitch-shifting, both for harmonic compatibility and also to provide some variation. So we might pitch up or down a little bit, just to create some variation in how we present different musical ideas throughout the performance.”
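The two alignments described here - conforming loops to a master tempo grid, and small pitch shifts for compatibility and variation - boil down to simple ratios. A minimal sketch, in our own formulation rather than Aimi's code:

```python
# Sketch of the two alignments described above (our formulation, not Aimi's).

def stretch_ratio(loop_bpm: float, grid_bpm: float) -> float:
    # Time-stretch factor that conforms a loop to the master tempo grid.
    return grid_bpm / loop_bpm

def pitch_factor(semitones: float) -> float:
    # Frequency multiplier for a given shift; kept small (a semitone or two)
    # so the character of the music isn't rewritten wholesale.
    return 2 ** (semitones / 12)

print(stretch_ratio(120.0, 124.0))  # ~1.033: play ~3.3% faster to hit the grid
print(pitch_factor(-1))             # ~0.944: one semitone down
```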

What feedback have you had from producers on their experience in making music for the app?

“We’ve actually just started a series of videos on Instagram showcasing some of the artists that we’re working with. We just posted one with a producer out of Berlin, who's a meticulous craftsman, and it's really interesting that somebody who's spent so much of his career being in complete control of the music that he's creating is now working with a generative music platform, where part of it is giving up control. 

We're mastering and essentially producing it on-the-fly, based on what's happened and what's going to happen in the music

“That's been a really rewarding part of this, seeing how that's become part of the art now. Instead of it being intimidating, or making them feel like they're disenfranchising themselves, I think they feel the opposite, where they're really empowered. The reason for that is because we always put the artist first whenever we talk about the music. We talk about the artists doing something, not an AI doing something. The AI is really an instrument that the artist uses to create the music.”

The software is mixing together many different stems. In music production, this process of combining layers is usually accompanied by processes like EQ and compression. Does Aimi do any of that?

“That's a great question. The goal is that we are able to reuse these musical ideas in a number of different ways outside of the experience that the original artist has created. So ultimately, artists will be able to sample each other's musical ideas. That necessitates that Aimi does the mixing, the mastering and the producing in real time. So all that stuff is happening when you hit play. 

“We're typically doing it faster than real-time, so that we can be ahead of the music, so we know what's coming up. We're mastering and essentially producing it on-the-fly, based on what's happened and what's going to happen in the music.”
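Rendering 'faster than real-time' implies a look-ahead buffer: the engine produces upcoming bars before the playhead reaches them, so mixing and mastering decisions can account for what's about to happen. Here's a conceptual sketch of that idea - an assumed design inferred from the interview, not Aimi's implementation:

```python
from collections import deque

# Conceptual look-ahead rendering: keep a buffer of upcoming bars filled
# faster than playback consumes them. Assumed design, not Aimi's code.

LOOKAHEAD_BARS = 8
TOTAL_BARS = 32          # short demo run

def render_bar(n: int) -> str:
    return f"mixed+mastered audio for bar {n}"   # stand-in for real DSP

buffer, next_bar = deque(), 0
for playhead in range(TOTAL_BARS):
    # Top up the look-ahead buffer before playback reaches it, so mastering
    # decisions for the current bar can "see" the bars that follow.
    while len(buffer) < LOOKAHEAD_BARS and next_bar < TOTAL_BARS:
        buffer.append(render_bar(next_bar))
        next_bar += 1
    playing = buffer.popleft()   # playback consumes one rendered bar
```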

The goal is to hide the complexity of this generative platform from the artists so they don't have to get buried in software development

Does it have the ability to do effects processing on-the-fly too? Adding reverb and delay on build-ups, that sort of thing?

“It can definitely do that. That's part of what the artist is able to tune. They can tune the frequency of how often the build-ups happen, how often breakdowns happen, how often risers happen. There's all kinds of parameters that the artist can tweak to shape this multi-dimensional music space. 

“There’s also a lot of real-time effects and modifications in the music to create variation. That's a critical part of this genre of music, that you're constantly stretching and twisting and repurposing content and presenting it in new ways that are sonically interesting to the listener. Aimi does a lot of that in real time.”
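The tunable parameters Balassanian describes might look something like a configuration of probabilities and depths. The names and values below are entirely hypothetical:

```python
# Hypothetical experience parameters of the kind described in the interview:
# how often the engine schedules build-ups, breakdowns and risers, and how
# aggressively it applies real-time effects. All names/values are invented.
experience_params = {
    "buildup_probability_per_16_bars": 0.4,
    "breakdown_probability_per_16_bars": 0.25,
    "riser_attach_probability": 0.6,   # chance a riser precedes a build-up
    "fx_variation_depth": 0.5,         # 0..1 amount of real-time FX variation
}
```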

You’re working with a real spread of electronic artists. Can you imagine Aimi working for other genres of music?

“Definitely. In December, we announced that we've created a new version of Aimi that's based on a scripting language called Aimi Script. This essentially gives us programmatic control to express any kind of music that we want. Having said that, right now, we're a culture play, and this is as much about the culture of electronic music, as it is about this generative platform. We really want to stay true to that for now. 

We always put the artist first whenever we talk about the music. The AI is really an instrument that the artist uses to create the music

“As we get closer to the end of the year, and into next year, you'll see us broaden the styles that we're incorporating. Aimi is not going to be a singer-songwriter platform anytime soon. We're not going to be doing rock ‘n’ roll or folk music anytime soon. But you will start to see a lot more diversity in the genres under the electronic music umbrella.”

Have you considered creating something that more amateur producers can use, a tool that can be used to input their own content into Aimi and create their own experiences?

“Absolutely. We’re working on a set of desktop tools that we're going to release. One of them is called Aimi Studio. The goal behind this is to really hide the complexity of this generative platform from the artists so they don't have to get buried in software development. The idea there is to really unleash the inner artist in all of us. 

“We feel like electronic music uniquely gives us the ability to be curators of different sounds and tastes, and if we have easy ways to express that using a generative music platform, then everybody's an artist. That's really our long term goal. We want everyone to feel empowered to create music.”

It says in your press release that Aimi is looking to fundamentally change the way that music is created, consumed and monetized. Could you expand on that?

“Look, we're not trying to be overly ambitious in our goals. But the basic point here is that to date, the song is what's been monetized. That's created a difficult relationship between the creators of the songs, the owners of the copyright for the songs, and then the people consuming those songs. 

“In Aimi, what's monetized is the musical idea, the building blocks to these songs, and it's not about copyright, per se, it's about you creating a sound and that sound being part of your experience. In our model, every one of those musical ideas has a smart contract associated with it. So wherever it goes, even if another artist samples your melody, or harmony or your bassline, you're going to get paid your pro rata share of the time quantum that your musical idea represents. 
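The pro-rata, time-quantum model is easy to illustrate with made-up numbers. If a session's revenue were split by how long each artist's idea was actually heard - all figures and field names below are our own assumptions:

```python
# Illustrative pro-rata split by "time quantum": revenue divides by each
# musical idea's share of total played time. Numbers are invented.

plays = {"artist_a_bassline": 1800, "artist_b_melody": 900, "artist_c_pads": 300}  # seconds heard
revenue = 10.00  # session revenue in dollars

total = sum(plays.values())
payouts = {stem: round(revenue * secs / total, 2) for stem, secs in plays.items()}
print(payouts)  # {'artist_a_bassline': 6.0, 'artist_b_melody': 3.0, 'artist_c_pads': 1.0}
```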

The heartbreak from a breakup, the loss of a loved one, the triumph of success, the sadness of depression... these are human conditions that can only be approximated by machines

“To do that, we have to build the whole ecosystem. Aimi is much more than just the player. In fact, the player is just really an MVP for us right now. We have a lot more functionality that we need to build in the player; there's also this whole system behind it that allows for the uploading of these musical ideas, attributing them, capturing them with smart contracts, and then allowing them to be repurposed and sampled into other experiences by other artists. 

“So that's what we mean when we say that traditionally, the creation process has really been focused on creating songs, and we're moving away from that. The monetization, or I should say, the publication side of things has been about publishing that recorded song, and we're moving away from that. Ultimately, in Aimi, you're making money off your musical idea, not the publication, because there is nothing published. There is no song.”

There’s no doubt that AI is playing an increasingly significant role in many areas of the music industry. Where do you see that going over the next decade?

“AI will have applications across the creative process (tools for artists), how music is distributed (recommendations), how music is performed (performance tools for DJs), and how music is generated (generative algorithms).”

And when it comes to music-making and music production specifically, can you see any further applications for AI technology opening up?

“Aimi's focus as a platform for generative music is largely on music-making and music production. We see AI being applicable in ways that augment and elevate the creative process for artists. 

As we teach computers to do what musicians do today, we will naturally elevate the musicians to do things that computers can not do today

“This includes organizing and categorizing audio content to allow artists to leverage large bodies of sounds more intuitively, providing programmatic ways to express generative music that aligns with the artist's style, using AI to complement procedural solutions for generative music (e.g. combining learning with rules), and leveraging AI to create generative instruments that can play alongside recorded sounds (e.g. vocals) to ensure music is organic and humanized.”

Do you think a computer or AI will ever be able to do what a musician does, to the same level? Can machines be creative?

“Yes, I do. But as we teach computers to do what musicians do today, we will naturally elevate the musicians to do things that computers cannot do today. Said another way, we are teaching computers to do things musicians do today, so musicians don't have to do things that machines can do. 

“Having said this, I do believe that the soul of creativity comes from human emotion. The heartbreak from a breakup, the loss of a loved one, the triumph of success, the sadness of depression... these are human conditions that can only be approximated by machines. This is why an artist like Bon Iver, who locked himself in a cabin to sing about a breakup, will speak to us in ways that a computer can never do. 

While parts of the music production process will be replaced with generative platforms like Aimi, I do not see a future where art goes away

“I also believe that we are drawn to artists because we can empathize with their emotional stories as told in music. As fans, we might listen to computer-generated music, but without the artist involved it's hard for us to empathize or follow a computer. This is one of the reasons why Aimi believes strongly in empowering artists to create music rather than replacing artists with computers.”

Do you think it’s plausible that one day rather than a music producer in a studio, there might be an AI-controlled assistant working with artists to produce their music?

“Not only plausible, this is largely what Aimi is delivering to our artists today. Our platform allows artists to create generative programs that effectively mix, master, and produce music in real-time, replacing much of the tedious process of hand-assembling music in studios. In essence, we are enabling artists to program music, rather than simply play and record music.”

Detractors might be concerned about people being replaced by computers thanks to this technology, even within the music industry - is there any justification to that?

“I do not believe pure AI music will ever replace the artist and the connection the artist has with a fan. While parts of the music production process will be replaced with generative platforms like Aimi, I do not see a future where art goes away. Rather I see a future where the art of music is elevated by the science of technology.”

Visit Aimi's website to find out more. 

Matt Mullen
Tech Editor

I'm MusicRadar's Tech Editor, working across everything from product news and gear-focused features to artist interviews and tech tutorials. I love electronic music and I'm perpetually fascinated by the tools we use to make it. When I'm not behind my laptop keyboard, you'll probably find me behind a MIDI keyboard, carefully crafting the beginnings of another project that I'll ultimately abandon to the creative graveyard that is my overstuffed hard drive.
