AI audio giant ElevenLabs, best known for its cutting-edge text-to-speech tools, has taken a bold step into music. On Tuesday, the company unveiled a new AI music model and says its output is cleared for commercial use.
The launch marks a major milestone for the company. Founded in 2022, ElevenLabs built its name on voice synthesis, realistic dubbing, and multilingual audio translation. Now it is expanding into AI-generated music, a fast-growing field already stirring debates over creativity, copyright, and ethics.
Music at the Push of a Button, but Is It Legal?
As part of the rollout, ElevenLabs released several examples of music created using its new tool. One track features an AI-generated voice rapping lines like:
“Came up through the cracks with ambition in my pocket, from Compton to the Cosmos.”
The synthesized lyrics mimic the cadence and themes of hip-hop icons such as Dr. Dre, N.W.A., and Kendrick Lamar, artists whose work draws on personal experience and lived struggle. Critics argue that AI risks copying those cultural narratives without authenticity or consent.
This issue goes beyond style. The core legal and ethical debate centers on training data: how are these models learning to make music, and what copyrighted material are they being fed to do so?
Legal Precedents: Suno, Udio, and the RIAA Lawsuits
The tension between AI music and copyright law is growing. In 2024, Suno and Udio were sued by the RIAA, which represents the major U.S. record labels, over claims that the startups trained their models on copyrighted songs without permission or payment.
The cases are still being litigated, but they have already sent a clear message: AI music tools must handle copyright with care, and without proper licensing, companies risk serious legal trouble.
ElevenLabs’ Strategy: Licensing and Partnerships
To avoid similar pitfalls, ElevenLabs is proactively building partnerships with music rights holders. The company announced licensing agreements with Merlin Network and Kobalt Music Group, two of the largest publishing and distribution platforms for independent artists.
- Merlin Network represents a global catalog of independent artists, including Adele, Nirvana, Mitski, Phoebe Bridgers, and Carly Rae Jepsen.
- Kobalt Music Group manages the publishing rights for artists such as Beck, Bon Iver, and Childish Gambino.
A spokesperson for Kobalt told TechCrunch that its licensing agreement with ElevenLabs includes strict opt-in policies, meaning only artists who explicitly agree to have their music used for AI training will be included.
“Our clients benefit directly from this agreement,” the Kobalt rep explained. “It opens a new revenue stream, includes revenue sharing, and provides safeguards against misuse.”
These deals are crucial for ElevenLabs’ credibility. By securing licensed content and sharing revenue with rights holders, the company aims to balance innovation with legal responsibility—a strategy that could set a new standard in the AI audio space.
What This Means for Creators and the Future of AI Music
The commercial clearance of ElevenLabs' music tool has wide-reaching implications. It gives creators, advertisers, game developers, and indie filmmakers a new option that can be cheaper and more flexible than traditional music licensing.
But ethical concerns remain. Can machines make "authentic" music, especially in genres rooted in personal identity and cultural struggle? And fairness is still an open question: artists deserve compensation when their work helps train these models.
As AI music grows, ElevenLabs is stepping into a space where laws, ethics, and cultural expectations are still shifting. The coming years will decide whether AI-generated music becomes a useful creative tool or a disruptive controversy.



