By Lionel Laurent / Bloomberg Opinion
The song “Echoes of Tomorrow” is a laid-back, catchy tune that might happily slot into a summertime playlist on Spotify or Apple Music. Only the lyrics, which make curious references to “algorithms,” reveal its non-human creator: artificial intelligence.
The track’s mimicry of flesh-and-blood pop is pretty unsettling. Yet what’s really disturbing is the sheer quantity of similar AI tunes sloshing around online. Tools like Udio and Suno, trained on millions of songs crafted by human artists, are now churning out millions of their own tunes at the click of a button. Deezer SA, a rival of Spotify Technology SA, estimates 20,000 AI tracks are uploaded to its platform daily, or 18 percent of the total. While they account for only 0.5 percent of total listens, they are earning real royalties, and often fraudulently so, judging by the spread of bots used to amplify listens. This may not be a Napster-scale issue yet, but the $20 billion music market is clearly vulnerable.
Which is why Deezer is now trying a little more sunlight to disinfect its platform. It’s going to start labeling AI-generated content, based on proprietary software. On a recent visit to the firm’s Paris headquarters, I watched on a laptop as the detection tool quickly spotted the telltale signs of a computer-composed song — in this case, “Echoes of Tomorrow” — with what it says is 100 percent accuracy. It turns out that while human ears can be fooled, AI-generated music can be detected from statistical patterns used in its creation. That’s helped the fight against fraud behind the scenes; now it’s going to empower listeners.
Deezer deserves two cheers for this, and maybe one nervous gulp. Increased transparency about the provenance of music is one way to ensure a fairer playing field in a market whose pay-per-stream model already felt unequal for artists lower down the food chain. It’s also a good way to indirectly put pressure on bigger platforms like Spotify to follow suit and show users what they’re paying for. Much of Spotify’s $145 billion market cap is built on expectations of price hikes and premium subscription tiers; these would be harder to justify if built on AI content masquerading as the real thing.
Yet what remains worrying is the extent to which AI music is overwhelmingly cannibalizing, not feeding, the human artists whose work it was trained on without compensation. As Deezer’s own experience attests, the utopian view of AI empowering creators by taking care of low-value tasks isn’t what’s happening: Instead, royalty-collecting society CISAC estimates that AI music’s growth through 2028 will come largely at the expense of humans, generating an estimated 10 billion euros ($11.5 billion) of revenue by substituting for artists’ work. And while streaming platforms have a role to play here, so do governments and regulators if AI firms are also to improve transparency about the sources of their training data.
“We’re seeing AI music shrink the royalty pool for human artists,” says Ed Newton-Rex, an AI music specialist and founder of nonprofit lobby group Fairly Trained. “There are real economic consequences to this technology.”
Detecting and flagging AI music at the point of distribution is just the start. What’s also needed is a model that protects artists at the point of generation, where the threat arises, such as the paid licensing deals between copyright holders and tech platforms like Suno that are currently under discussion. Newton-Rex says detection tools like Deezer’s could be used by streaming platforms to sanction AI tools that don’t respect musicians’ rights, by removing those tools’ uploaded content. He has a point. If human creativity really is going to get a boost from new tech tools, “Echoes of Tomorrow” has to be yesterday’s news.
Lionel Laurent is a Bloomberg Opinion columnist writing about the future of money and the future of Europe. Previously, he was a reporter for Reuters and Forbes.