Daniel Anstandig

AI Will Give Rise to New Music Genres and Styles


For many, the concept of AI-generated music might feel like a futuristic gimmick: something soulless, something to criticize. But the reality is that AI in music is not new; it’s already a significant part of the creative toolkit for many artists and producers.


Despite the debates… artists are embracing AI as part of the production process. In a recent talk at MIT, musician and futurist Will.i.am shared the origins of his journey into AI-generated music, emphasizing how AI doesn’t stifle his creativity but rather “liberates” it.


After all, technology has always fueled music. And every time there’s been a new innovation, there’s also been skepticism or outright fear.


As superstar DJ David Guetta said to the BBC: “Probably there would be no rock ‘n’ roll if there was no electric guitar. There would be no acid house without the Roland TB-303 [bass synthesizer] or the Roland TR-909 drum machine. There would be no hip-hop without the sampler…”


I believe AI will actually give rise to entirely new styles, and perhaps even genres, of music in the coming years. And it seems Guetta agrees: “I think really AI might define new musical styles. I believe that every new music style comes from a new technology.”


It’s already happening. Today, Digital Audio Workstations (DAWs) and Auto-Tune are used by nearly every modern music producer and artist, in the studio and very often onstage. But when Pro Tools hit the market in the ’90s, alongside Auto-Tune, it stirred up quite a bit of controversy. Musical “purists” were up in arms, claiming these technologies were a shortcut, a way for artists to bypass the hard work of nailing correct pitch and timing. But now these tools are so ubiquitous that we barely notice anymore. Most vocalists assume that their tracks are going to be “tuned” as a matter of course.


Music that is entirely created by AI is not the norm, but music partially created by AI is more common than most realize. Identifying AI-generated music is difficult because of how good the technology has become. Even in songs we might label as entirely “human-made,” there are likely elements that have been shaped or enhanced by AI. We’re gradually merging with the tech, so it’s challenging to untangle the human from the machine in the creative process. And in my view, that makes the music, and the songs we choose to play, all the more interesting.


Let’s clear up a common misconception: AI-generated music doesn’t mean we’re about to replace artists with robots. AI tools are essential aids for musicians and producers, helping with ideation, sound creation, and even mimicking and enhancing unique sounds. AI can even create sounds that aren’t possible for humans to produce, opening up a wellspring of creative possibilities.


Artists have embraced AI beyond Auto-Tune, DAWs, and even full song generation. For example, Paul McCartney used AI to isolate John Lennon’s vocal from an old recording to create the “final Beatles record,” 2023’s “Now and Then.” Björk has used AI for both music and visual art projects. Her installation in a New York hotel lobby harnessed AI to change her pre-recorded choral arrangements in relation to the weather outside the building and the time of day.
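
To make that kind of environment-driven music concrete, here is a minimal, purely hypothetical sketch of how weather and time of day could steer which pre-recorded choral layers play. The weather conditions, layer names, and mappings below are all invented for illustration; they do not describe Björk’s actual installation or any real product.

```python
# Hypothetical sketch: pick which pre-recorded choral layers to play based on
# the weather outside and the time of day. All condition names, layer names,
# and mappings are invented for illustration only.

from datetime import datetime

# Pre-recorded stems the installation could fade between (invented names).
CHORAL_LAYERS = {
    "clear":  ["soprano_sustain", "alto_shimmer"],
    "cloudy": ["alto_shimmer", "tenor_drone"],
    "rain":   ["tenor_drone", "bass_pulse"],
    "snow":   ["soprano_sustain", "bass_pulse"],
}

def time_of_day_bucket(now: datetime) -> str:
    """Map the clock to a coarse bucket that shapes the density of the mix."""
    hour = now.hour
    if 5 <= hour < 11:
        return "morning"
    if 11 <= hour < 17:
        return "afternoon"
    if 17 <= hour < 22:
        return "evening"
    return "night"

def choose_arrangement(weather: str, now: datetime) -> dict:
    """Return a simple 'arrangement': which stems to play and how densely."""
    layers = CHORAL_LAYERS.get(weather, CHORAL_LAYERS["clear"])
    bucket = time_of_day_bucket(now)
    # Quieter, sparser mixes at night; fuller mixes in the afternoon.
    density = {"morning": 0.6, "afternoon": 1.0, "evening": 0.8, "night": 0.4}[bucket]
    return {"layers": layers, "time_of_day": bucket, "density": density}

if __name__ == "__main__":
    # In a real installation these values would come from a weather feed and a
    # clock; here they are hard-coded for the example.
    print(choose_arrangement("rain", datetime(2024, 6, 1, 21, 30)))
```

The point of the sketch is simply that the “AI” in a piece like this is a mapping from the outside world to musical decisions, applied to material a human already composed and recorded.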


I can speak from personal experience about the power of human musicianship and creativity combined with technology. Several years ago, I started a publishing company that produces music for commercial sync and up-and-coming artists. You can check out some of the music I’ve written under the name Rhythm&Truth here. As a lifelong drummer, I still prefer a drum set over samples. We have yet to use AI to create music, but we have certainly utilized AI-driven production tools for editing and creating new and different sounds. This blend of traditional and modern techniques exemplifies how AI can enhance rather than replace human creativity.


There are more AI-fueled use cases around the corner. Imagine a live performance where AI adjusts the music in real time based on the audience’s reactions, or set pieces that change with audience smartphone participation. Music is not a static thing; it’s evolving and experiential, and AI can supercharge that experience.
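
As a thought experiment only, here is a tiny hypothetical sketch of the feedback loop such a performance might use: an audience “energy” reading (however it were measured, which is an assumption here) nudging tempo and intensity from bar to bar. None of this reflects any real show or product.

```python
# Hypothetical sketch of a real-time feedback loop: an audience "energy"
# reading (0.0-1.0) nudges tempo and intensity bar by bar. The signal source
# and parameter ranges are invented for illustration.

from __future__ import annotations

def update_performance(tempo_bpm: float, intensity: float,
                       audience_energy: float) -> tuple[float, float]:
    """Move tempo and intensity a small step toward the audience's energy level."""
    target_tempo = 90 + 60 * audience_energy   # 90-150 BPM, assumed range
    target_intensity = audience_energy         # 0.0 (ambient) to 1.0 (full mix)
    # Smooth the changes so the music evolves rather than jumps.
    tempo_bpm += 0.2 * (target_tempo - tempo_bpm)
    intensity += 0.2 * (target_intensity - intensity)
    return tempo_bpm, intensity

if __name__ == "__main__":
    tempo, intensity = 120.0, 0.5
    for energy in [0.3, 0.5, 0.8, 0.9]:        # stand-in for live readings
        tempo, intensity = update_performance(tempo, intensity, energy)
        print(f"energy={energy:.1f} -> tempo={tempo:.1f} BPM, intensity={intensity:.2f}")
```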


Why is AI in music controversial?


Besides artistic preferences and scruples, there is controversy over how some AI music models are trained.


Recently, major labels including Universal Music Group, Sony Music Entertainment, and Warner Records have taken legal action against the AI music-generation companies Suno and Udio, arguing that their software was trained on copyrighted material and therefore violates copyright law. The labels want to define what “training data” is in order to protect themselves and ensure artists get paid. Many musicians stress the importance of ethical data practices, where artists own their data and AI systems are designed to respect and protect that information.


Copyright-respectful AI is going to be a big topic in the coming years. That’s why Futuri’s SpotOn (automatic commercial/spec spot creation) guarantees that all the music in our commercials is cleared for use.


The controversy about “training data” is that “training” takes so many forms. As Will.i.am put it: “What did Michael Jackson train on? He trained on James Brown. Did Michael Jackson pay James Brown for everything he did that was inspired by him? No. Prince was also inspired by James Brown. If inspiration is considered training data, it’s the same concept.”


Creativity is iterative, and everything we create using our brains is likely just a repatterning or remix of things we’ve absorbed before. For generative AI that has trained on copyrighted material, the case will inevitably be made that “artists have trained on other people’s data for generations.” Why is AI different?


While I don’t foresee stations sprinting to play fully AI-generated music (at least not until certain AI artists or songs become wildly popular), the integration of AI in music production is a clue for broadcasters. AI tools can help producers create unique, personalized imaging for stations and client campaigns, enhancing the listening experience.


If a legend like Paul McCartney can embrace AI, why shouldn’t the radio industry explore AI voices for various shows or to create better commercial campaigns? Or create entertaining hybrid human/AI shows?


As AI continues to evolve, we will likely see certain audiences that value the unique touch of human-created music. However, just as synthesizers and drum machines have become accepted tools in music, AI will become seamlessly integrated into the creative process. The key is not to see AI as a replacement but as an enhancement, a way to unlock new levels of creativity.


This article originally ran as a column on Inside Radio.
