The AI Threat to Musicians That’s Happening Now

In the April 9 issue, I wrote about how AI is invading the music industry at a rate and to a degree that, even accounting for my alarmist feelings about AI, is shocking. Since then, I’ve been spending a bit of my reading/research time each day checking out what’s new in AI music. And there’s quite a bit.

One of the things we discussed at the Cigar Bar on Friday was a story I’d read in The Free Press. It was about Murphy Campbell, a singer-songwriter and banjo player from North Carolina who was eking out a modest living videotaping herself sitting on a log or in a rocking chair and performing her original compositions. Her fan base was steadily increasing when, a few months ago, she noticed that songs were appearing on her Spotify page that were attributed to her, but were not hers. She hadn’t written them. She hadn’t performed them. And yet, they sounded eerily familiar.
 
She eventually realized that they were AI-generated, probably created by someone who was feeding an AI with snippets of her published songs and asking it to create other songs that were similar.
 
Understandably, this irked her. But when she received a notice that she was “sharing” royalties for these counterfeits that were being played on platforms all over the world, she was flummoxed. Who was selling this music? And what, if anything, could she do about it?
 
A few of the people in our little group on Friday had heard of this scam going on in the music industry. “It’s a new thing,” said B, “so it’s not well known. But it isn’t rare either. It’s not a huge issue right now, but it could easily become one.”
 
The problem, B explained, is that if you know what you are doing, you can feed in any sort of music you want and generate a troupe of AI singer-songwriters producing and performing “original” music for you. All you have to do beyond that is a bit of video cutting and pasting, plus using an AI to draft a royalty-sharing contract, either with the Murphy Campbells of the world or even with your own AI avatars.
 
The conversation moved on to the potential of this – good and bad.
 
On the good side is the possibility that the music industry could grow geometrically as millions of kinda-like songs and singers are produced and promoted by thousands of AI-equipped operators working from their basements or kitchens.
 
On the bad side is the eventual (but not that eventual) collapse of the music culture we enjoy now, with human-generated music becoming, at best, a personal hobby with very little monetizable value, and with 80% to 90% of the money going to the musicians and dealmakers who insert themselves into the AI music industry now and figure out what needs to be done.
 
I know a few musicians and would-be musicians who either make a living or hope to make a living composing and performing their own music. Most of them are hanging on to the hope that AI will never be able to capture a large swath of the marketplace by selling fake music to real people.
 
And maybe they will be proven right. 
 
But what if they are wrong? What will they be left with? 
 
If you are in the music industry now or would like to be in the future, you need to hedge your bet by continuing to do your own thing while learning about and even testing out AI music. You’ve got nothing but a bit of time to lose… and you’ve got an exciting and remunerative future to win.