Your K-Pop Favorites May Be Illegal AI Voice Clones
Is your latest song on repeat really being sung by Blackpink or Justin Bieber? There’s a shockingly high chance it’s a deepfake voice clone designed to trick you, according to a new study from musicMagpie aptly titled Bop or Bot? The study found an astonishing 1.63 million AI covers on YouTube alone. Listeners may not always be able to tell the difference, and it could even have a financial impact on the artists whose voices are used on the songs.
The biggest victims of these deepfake tracks are K-pop groups, which account for 35% of the top 20 most-streamed AI-generated artists. Blackpink tops the list with over 17.3 million views of AI-generated content impersonating the group; an AI cover of BabyMonster's "Batter Up" alone has amassed 2.5 million views. Justin Bieber comes in second with over 13 million views, including his biggest fake hit, George Benson's "Nothing's Gonna Change My Love For You," at 10.1 million views. Rounding out the top three for stolen voices is Kanye West, with 3.4 million views for AI-generated tracks, including a cover of "Somebody That I Used to Know" with 2.6 million streams.
There’s also a more literal theft at play. The financial implications of AI-generated music are substantial, according to musicMagpie. The firm estimated that the rise in AI-generated content could result in more than $13.5 million in lost revenue for original creators. That’s about a $500,000 loss for Blackpink, while Bieber and West lost out on $202,964 and $130,000, respectively.
Voice tricks
Even death can't save artists from AI theft, as the AI ghosts of Frank Sinatra (8.9 million views) and Freddie Mercury (3.55 million views) can attest. As for unlicensed fictional voices, there's an unexpected appeal in SpongeBob SquarePants performing songs: AI covers featuring the yellow cartoon character have racked up 10.2 million views. His biggest hit? Don McLean's "American Pie."
Part of the problem is that humans aren't good at distinguishing AI-generated music from human-made music. In another study, musicMagpie found that 72% of respondents were confident they could tell an AI-generated song from a human-made one, yet 49% couldn't. And it's not a matter of age; Gen Z participants were actually the easiest to fool. All of this fuels the ongoing legal battles facing AI music startups like Suno and Udio, which are accused of relying heavily on unlicensed material to train their AI models. If the Recording Industry Association of America (RIAA) and music labels can successfully argue that there's a genuine financial loss, they'll likely have a stronger case against the developers of those models.
“These findings highlight a growing challenge in the music industry: as AI technology becomes more advanced, music fans across generations are struggling to distinguish what is real from what is artificially created,” the study’s authors point out. “If nearly half of listeners cannot tell the difference between a human artist and an AI, what does this mean for the value of human creativity? How will this impact the way we create, perceive, and appreciate music in the years to come? These are questions the industry will have to grapple with as AI continues to evolve.”