July 10, 2025
THE INDUSTRY MUST UNITE FOR FAIR & ETHICAL USE OF AI
COME WITH ME IF YOU WANT YOUR CAREER TO LIVE
Sometimes it feels like absolutely nobody asked for it, but generative AI is here. And clearly, it has the potential to transform the way music is made, the music industry, and the relationship between artists and their art. Building on a standout panel from IMS Ibiza - Gen AI & Music Rights: Don’t Get Played, Get Paid - Hosted By AFEM: Association For Electronic Music - whether you're a creator of music, a rights holder or a fan, and whether you're excited about the creative possibilities, concerned about the impact on your livelihood, or both, there are things you can do now to better prepare.
Of course, music is one of the most resilient art forms and cultures - and we've had panics before. The sync function on CDJs was going to kill the art of DJing. Streaming was going to destroy record labels. When the first synthesizers started rolling out of factories in Japan, the Musicians' Union in the UK wanted them banned as a threat to 'proper musicians'. Somehow, in each case, the world continued to turn, and great music and performances continued. Mostly.
So why do 74% of producers and musicians surveyed in the UK, and 71% in Germany and France, worry about the impact of gen AI on their livelihood? The biggest issue is that gen AI needs to be trained on existing, human-made music, and it's staggeringly obvious - from the scores of lawsuits happening all over the world, from the rare peeks under the veil of secrecy that most developers operate beneath, and of course from what these gen AIs actually generate - that the vast majority of tech companies have already trained on millions of tracks without paying the creators and rights holders of that music for the privilege. Or even asking. That's not terribly encouraging for the relationship between human artists and creators and the pioneers of the brave new AI future - an industry projected to be worth US$66.89bn in 2025.
Add to that studies like the German rights org GEMA's excellent AI report (a key basis for their recent lawsuits vs Suno and OpenAI) suggesting that the global market for AI music will increase by 60% by 2028. That has huge implications for the development of new genres (will AI ever be able to invent a dubstep, a techno, or even a schlager? Or will it simply churn out more of what's popular, trapping us in a self-sustaining 'You Might Like' algorithm loop for the rest of eternity?) and for the diversity of music and the people who get to make it. What - who - is that AI music going to replace? And will it mean fewer opportunities for humans to make a living making music?
It's not surprising so many in our community are worried. So if you're a rights holder, a producer, or a creator of music… what can you actually do about it?
1 Audit your contracts
If there's one piece of advice that the Association For Electronic Music wants artists and creatives to take to heart, says Jay Ahern (Chief Growth Officer at AFEM and a pivotal figure behind AFEM's AI guidelines), it's this: bring your contracts up to date. The standard boilerplate does not generally feature anything about the use of your music for AI training - and artists, labels, distributors and anyone else in the IP chain need to make sure that's in place, whatever the future holds. Close the loopholes, get it in writing, and you'll be far better prepared to face what's coming.
2 Unify with your community
If that sounds potentially complicated and expensive, that's where number two comes in. There are communities out there, formal and informal, that have resources (like contract framing advice), expertise (AFEM's membership ranges from festival promoters to agents and lawyers), and, maybe most importantly, different perspectives. Not to get all PLUR on you, but electronic music is a unifying force full of people who will help out if you ask them. There are class action lawsuits in the US challenging AI companies, but in places like Germany and France it's rights organisations like GEMA and SACEM doing the hard yards of research, lobbying governments, and filing lawsuits on behalf of their members. Not a 'joiner'? There are still people out there on your side, Sarah Connor. Not least because the entire electronic music industry - all the agents, managers, labels, promoters, lawyers, marketers and journalists - rests on a bedrock of what Jay calls 'the creative class': the people who make music.
3 Support ethical AI tools - and encourage your peers to do the same
There are some great AI tools out there, and there are producers in electronic music already using them to create new and exciting things. But maybe we should start looking at tools and services that don't operate on a fair basis - that don't pay fairly for training - as sitting morally somewhere between battery eggs and blood diamonds. Set a good example, have a clear conscience, and just maybe, don't shy away from a little bit of mild social shaming of your peers. That'll be easier, of course, if and when a convincing and trustworthy certification for 'fair trade AI' is introduced - something Jay says is a potential endgame for AFEM's AI principles.
4 It starts with transparency - and it could be solved by technology
Earlier this year, US publication The Atlantic launched a database authors could use to find out if their books or publications had been used to train Meta's AI. The 'Have I Been Trained' website does the same for images. But right now there's no comprehensive, reliable way to find out if your music has been used - "there's no transparency at the moment at all," says AI law expert Marco Erler. If there's one single starting point for any real, sustainable relationship between rights holders and artists on one side and AI tech companies on the other, it's this transparency: who has used what, when and how. Support the organisations, lobbyists and legislators pushing for that transparency first - and then we can have a real dialogue. And it could be technology rather than the law that helps find a new equilibrium. Companies like AIxchange (as just one example) are trying to build an ethical marketplace for the use of music in training sets. Cloudflare, which handles traffic for roughly 20% of the web, is now blocking AI web scrapers with the aim of creating a 'permission-based model'. There are ongoing attempts to watermark music in its metadata or even (eek) on the blockchain. Seek them out.
5 Lean into what makes you unique - and how you connect with your audience
The dominance of streaming algorithms has already demonstrated that depending on someone else's technology as your main point of connection to your audience is a dangerous and precarious strategy. The rise of AI-generated music will only exacerbate that. But it will also underline just how important it is to be unique and innovative, to find new ways of building a resilient, authentic, human connection with your fans whether IRL or digitally, to do all those things that no string of code will ever truly be able to replicate. You should be doing all that already, but there's never been a better time to double down.
6 Don't panic
As with so many things in the 2025 zeitgeist, it's easy to feel powerless. But it might be worth remembering that many AI models have an insatiable need for new data (i.e. new, human-produced music); if that dries up, and they have to start cannibalising (or poisoning themselves with) other AI-generated music, the decline in quality could be catastrophic. Whether or not tech companies are ready to admit it, or are even thinking that far ahead, if they impoverish the people they depend on - you - the bubble will burst. So who really holds the cards?
This piece couldn't have been written without the great input of three of the panellists from the IMS Ibiza panel: Jay Ahern (Chief Growth Officer, AFEM), Marco Erler (Lawyer, Lausen Rechtsanwälte) and Sina Wahnschaffe (Director International Artist Relations, GEMA).
Words: Duncan JA Dick, IMS Contributor. Opinions expressed are solely the author's own.