Is the latest song you had on repeat actually sung by Blackpink or Justin Bieber? According to a new study from musicMagpie, aptly titled Bop or Bot?, there’s a surprisingly good chance it’s a deepfake voice clone created to fool you. The study found a staggering 1.63 million AI-generated versions of songs on YouTube alone. Listeners can’t always tell the difference, and the clones could have a real financial impact on the artists whose voices are being copied.
The biggest victims of these deepfake tracks are K-pop groups, which make up 35% of the top twenty most-streamed AI-generated artists. Blackpink tops the list with over 17.3 million views of AI-generated content mimicking the group; an AI cover of BabyMonster’s ‘Batter Up’ accounts for 2.5 million of those views on its own. Justin Bieber is second with over 13 million views, led by his biggest fake hit, a cover of George Benson’s ‘Nothing’s Gonna Change My Love For You’, at 10.1 million views. Rounding out the top three stolen voices is Kanye West, with 3.4 million views of AI-generated tracks, including a cover of ‘Somebody That I Used to Know’ with 2.6 million streams.
There’s a more literal theft involved, too. The financial implications of AI-generated music are substantial, according to musicMagpie. The company estimated that the rise in AI-generated content could translate into more than $13.5 million in lost revenue for the original creators. That represents a loss of roughly $500,000 for Blackpink, while Bieber and West lost $202,964 and $130,000, respectively.
Voice Tricks
Not even death can save artists from AI theft, as the 8.9 million views racked up by an AI ghost of Frank Sinatra and the 3.55 million for Freddie Mercury can attest. Unlicensed fictional voices hold an unexpected appeal, too: songs performed by SpongeBob have garnered 10.2 million views for the yellow cartoon character. His biggest hit? Don McLean’s ‘American Pie’.
Part of the problem is that people aren’t good at distinguishing AI-generated music from human-made music. In a subsequent study, musicMagpie found that 72% of participants were confident they could tell an AI-produced song from a human-made one, yet 49% failed to do so when tested. Nor is it a question of age; Gen Z participants were actually the easiest to fool. All of this is fodder for the ongoing legal battles facing AI music startups like Suno and Udio over their alleged use of unlicensed material to train their AI models. If the Recording Industry Association of America (RIAA) and the labels can successfully argue that there’s a real monetary loss involved, they’ll likely have a stronger case against AI model developers.
“These findings highlight a growing challenge in the music industry: as AI technology becomes more sophisticated, music lovers across generations are struggling to discern between what is real and what has been artificially created,” the study’s authors note. “If nearly half of listeners can’t tell the difference between a human artist and an AI, what does this mean for the value of human creativity? How will this impact the way we create, perceive, and appreciate music in the years to come? These are questions the industry must grapple with as AI continues to evolve.”