DAACI is an innovative new ‘meta-composition’ tool that its creators say will revolutionise producer workflows across the board. But unlike other models, they claim, this one will put human beings right back at the centre. We spoke to DAACI CEO Rachel Lyske to find out more...
Can you give us an overview of just how DAACI works, and what benefits it brings for musicians and composers?
“We created DAACI to be the next step in the evolution of music creation and to be an industry leader when it comes to AI and music. Our goal is to empower the composers of today and tomorrow, allowing them to create music like never before with the power of collaborative and assistive AI technology.
“One of the major innovations with DAACI is that it isn’t just a generative AI that ingests loads of pre-existing music and puts out its idea of what that sounds like. DAACI’s team is made up of composers and musicians who all respect the creative process and the rights of artists, and there are serious questions to ask about the ethics of training AI on other musicians’ copyrighted work.
“Our AI isn’t trained on other people’s catalogues. DAACI is different in that it composes, arranges, orchestrates and produces music with authentic and high-quality output, empowering producers and composers and giving them the opportunity to take on projects that they may not have had the capacity to deliver before.
“Traditionally, composers have had to compose by inputting specific note choices in a DAW, which puts all kinds of limitations on their creative process. With DAACI, composers still compose by encoding their musical choices, or ‘meta-composing’, allowing the AI to compose for them on the edge. So, if you’re a game developer looking for your next soundtrack, even a dynamic soundtrack catered to the individual, DAACI can assist you to create the sonic environment you envision to help make your game-worlds dynamic, engaging, and highly personal.
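To make the ‘meta-composition’ idea concrete: rather than entering literal notes, the composer encodes high-level musical choices, and the system resolves them into notes at runtime (for instance, driven by game state). The sketch below is purely a hypothetical illustration of that workflow; the class, fields and method names are our assumptions, not DAACI’s actual API.

```python
# Hypothetical sketch of "meta-composition": the composer encodes
# high-level musical choices; a generator realises them as concrete
# MIDI notes on demand. Illustrative only, not DAACI's real system.
from dataclasses import dataclass
import random

@dataclass
class MetaComposition:
    key: str            # e.g. "A minor"
    scale: list[int]    # allowed pitch classes as semitone offsets
    tempo_bpm: int
    tension: float      # 0.0 (calm) .. 1.0 (intense), set by context

    def realise_bar(self, length: int = 8) -> list[int]:
        """Pick MIDI notes from the encoded scale; higher tension
        pushes the register upward. The specific note choices are
        made by the system, not input by hand."""
        base = 57 + int(self.tension * 12)   # A3, raised with tension
        return [base + random.choice(self.scale) for _ in range(length)]

# A game engine could vary `tension` over time and re-realise the music:
calm = MetaComposition("A minor", [0, 2, 3, 5, 7, 8, 10], 90, tension=0.1)
battle = MetaComposition("A minor", [0, 2, 3, 5, 7, 8, 10], 90, tension=0.9)
print(calm.realise_bar())
print(battle.realise_bar())
```

The point of the sketch is the division of labour: the human fixes the key, scale and intent once, while the note-level realisation happens ‘on the edge’, per listener or per game session.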
“DAACI will also open doors for artists and composers to get paid whenever their music is used. Whether you’re using those AC/DC-inspired guitar riffs or a funky drum pattern inspired by Chad Smith, DAACI can trace where every note comes from and will circulate the rights back to the original source of inspiration. This is the truly unique aspect and is what positions DAACI at the forefront of the industry, and really enhances our message of ‘created by composers, for composers’.”
How do DAACI’s learning algorithms differ from those of other AI music-generating software?
“Most music generation startups set out to build end-to-end systems that can generate a full music track based on very little user input, like a few seconds of music or a set of rough keywords. Like, if you tell the software that you want music that is ‘happy’ and ‘classical’, it will try to create its own interpretation of what those words mean musically.
“Most of these systems rely on deep learning methods trained on large existing collections or, alternatively, play ‘musical Tetris’ with pre-composed loops or static elements. This takes human creativity almost fully out of the loop, and it’s actually ethically questionable. We respond to music because of the human connection and the creators behind it.
“Unlike other solutions in the AI and generative music market, DAACI doesn’t rely on pre-recorded tracks or edited audio samples. Instead, it writes musical elements and textures directly through the encoding of musical ideas. So, it’d really be more accurate to say that DAACI has been taught rather than having learnt, and that’s an important distinction when we’re discussing the ethics of those solutions that feed on catalogues of music.”
Where do you see DAACI being deployed in the industry? Do you see it being central to a producer’s workflow?
“Absolutely! We’re excited by the power of this system and what it will do across the music industry and many others. We see DAACI being deployed to various markets, each with differing needs. We’re able to develop interfaces for composing with different levels of user control depending on who our end user is and what they need.
“Take content creators, for example. They could be provided with a library of ready-made tracks generated by DAACI’s technology, or perhaps a bit more control through an interface that lets them specify what track they want in terms of genre, mood, and maybe instrumentation.
“On the other hand, more musically-inclined users, like songwriters and producers, will want a finer degree of control over certain musical elements that DAACI generates. So, an interface designed for them would include more parameters related to musical elements within a track and the ability to export the output in an editable format.
“We see it being used as an assistive tool integrated into professional media production processes, from film and TV production to game development, as we spoke about before, to the enhancement of virtual worlds. Composers and commissioners would adopt DAACI to scale the amount of music for a brief, and to speed up delivery in order to meet tight production deadlines.
“Ultimately, we aim to offer solutions for creative people and creative companies, and to be central to the way music is created and monetised.”
Can you tell us a little bit more about the bespoke synth module and how it works to create ‘environments’?
“The synth module is what DAACI uses to generate high-quality and coherent audio from MIDI data. It is, in essence, a DAW without any note information. It contains what we could call ‘sonic worlds’ made up of distinct production environments. The environments themselves are made up of various combinations of software instruments, samplers, effects and parameter data, like fader levels, that work together to form a musical arrangement that leads to a coherent mix. The environments also include the necessary routing for buses and groups – all of those aspects of the output’s mix that we’re familiar with in the recording booth, like reverb, delay, group compression, mix bus processing, etc.
“The synth module also allows for each track or ‘element’ to be viewed as a separate and distinct entity from its environment, meaning it can be combined with any different number of tracks from other environments, creating new composite environments.
“The producers and composers are the ones creating all of the elements, so they’re the ones engineering this vast and growing ecosystem of interconnected worlds. This is as powerful as it sounds. It means we have the principles of composition for any musical texture that we can call and combine in any circumstance. It’s the ultimate solution to collaboration and empowering composers to meet briefs on the edge.”
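The environment-and-element model described above can be pictured as a small data structure: an environment bundles instruments, stored parameter data and shared bus processing, while individual elements can be lifted out and recombined into new composite environments. The following is a minimal sketch under our own assumptions; every name in it is hypothetical and does not reflect the actual DAACI synth module.

```python
# Hypothetical sketch of the "environments" idea: an environment bundles
# elements (tracks), their parameter data, and shared mix-bus processing;
# elements from different environments can be recombined into composites.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Element:
    name: str          # e.g. "brush kit", "string pad"
    instrument: str    # software instrument / sampler patch
    fader_db: float    # stored parameter data, like a fader level
    sends: tuple = ()  # bus routing, e.g. ("reverb",)

@dataclass
class Environment:
    name: str
    elements: list = field(default_factory=list)
    buses: dict = field(default_factory=dict)  # shared mix processing

def combine(name: str, *sources: Environment, picks: dict) -> Environment:
    """Build a composite environment from elements of other environments,
    carrying each source's bus routing across with it."""
    composite = Environment(name)
    for env in sources:
        wanted = picks.get(env.name, [])
        composite.elements += [e for e in env.elements if e.name in wanted]
        composite.buses.update(env.buses)
    return composite

jazz = Environment("jazz_club",
                   [Element("brush kit", "sampler:brushes", -6.0, ("reverb",))],
                   {"reverb": "plate 1.8s"})
cinema = Environment("cinematic",
                     [Element("string pad", "synth:strings", -9.0, ("reverb",))],
                     {"reverb": "hall 3.5s"})

# An element is distinct from its environment, so it can be recombined:
hybrid = combine("hybrid", jazz, cinema,
                 picks={"jazz_club": ["brush kit"], "cinematic": ["string pad"]})
print([e.name for e in hybrid.elements])
```

The design point the sketch illustrates is the one Lyske makes: because each element is a separate entity from its environment, the set of environments behaves like a growing, recombinable ecosystem rather than a collection of fixed mixes.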
How has the experience of working with Abbey Road Studios (who recently absorbed DAACI into their Red incubation programme) helped to develop DAACI and has it engendered deeper connections with the wider industry?
“The team at Abbey Road are incredibly supportive and sensitive to what we are striving to achieve with DAACI. We’re building our system as a creative tool for the likes of the very people that walk their halls, so advice, guidance and feedback from within the industry are crucial to our mission. We have an incredibly powerful system and together with the industry we can build something special that is value-adding for artists, composers, producers and music makers of all kinds.
“We have been able to speak to incredible people within Abbey Road Studios and UMG, pitch our ideas and technology, get their feedback and get a steer on how we should develop it, whilst respecting artist rights and ethics and so much more. We are looking forward to that continuing throughout the programme.”
Are you thinking of rolling out a subscription-based model for DAACI?
“We’re developing a range of different products. Many of them will involve us integrating our technology into other applications and platforms to extend their musical offering. We’re also working on a software application that’s aimed at helping artists and songwriters in their creative process, and that could well be a subscription-based application.”
It seems like keeping a grasp on the ethics of AI is central to DAACI. Do you think that artists, producers and performers will stay at the heart of music creation in the future?
“Absolutely. DAACI has been built by composers, artists and producers. They’re crucial to this whole ecosystem working. It won’t exist without composers. We aim to assist, amplify and streamline their process so that they can write faster and write more. It’s our view that artists, producers and composers will continue to be the cornerstone of music creation. DAACI is interpreting what has come from a composer’s brain rather than trying to imitate something that’s come from them, and that’s an important distinction.”
Finally, what’s next for DAACI, and how (or when) can our readers get hands-on with what DAACI can do?
“We’re working towards rolling out our technology in various forms by the end of this year. You can find out how to access DAACI’s technology by watching out for announcements on product launches across our channels. Exciting times!”