Entertainment

Folk Artist Finds Stolen, Altered Songs Uploaded to Her Spotify Without Permission

A North Carolina folk musician found two AI-generated songs on her Spotify profile, exposing how easily voice-cloning tools and unverified distributors can impersonate independent artists at scale.

Sarah Chen · 4 min read

Murphy Campbell did not know she had released new music until the messages started arriving. Fans of the North Carolina folk singer flooded her inbox asking about the fresh tracks on her Spotify profile, so she opened the app and pressed play on something she had never recorded.

"It was this computer mimicking my voice," Campbell, who specializes in traditional folk music rooted in Appalachian banjo and dulcimer traditions, said, "and trying to play the banjo and dulcimer really poorly." The two songs were AI-generated imitations built from the public record she had spent years building on YouTube, uploaded to her streaming profile by someone she still cannot name. "I laughed for a long time," she added. "And then I was hard to be around for a few days because I was so frustrated. It feels so out of your hands: 'Who did this?' At the end of the day, there was a human somewhere that had to prompt the AI to do this."

The mechanics behind what happened to Campbell follow a pattern that has become grimly familiar across the independent music world. An actor identifies a working musician with public recordings, typically sourced from YouTube or SoundCloud, and feeds those performances into a commercially available AI voice-cloning tool. The output, a synthetic version of the artist's vocal and instrumental identity, is then formatted as a distributable audio file. Because streaming platforms like Spotify do not accept direct uploads from the public, the actor routes the tracks through one of dozens of third-party music distributors, services that require little more than an email address and a credit card to begin delivering songs. Distributors assign an ISRC code and barcode to each track and push it live, often within 24 to 72 hours, attached to whatever artist name the uploader specifies. Spotify's own systems have no front-end mechanism to verify that the person submitting music under a given name is actually that artist.
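The ISRC mentioned above has a fixed, machine-checkable structure: two letters for the country, a three-character registrant code, a two-digit reference year, and a five-digit designation, usually written with hyphens (e.g. US-S1Z-99-00001). For readers who want to sanity-check a code appearing in a distributor record, here is a minimal Python sketch; the parse_isrc helper is ours for illustration and is not part of any platform's tooling.

    import re

    # ISRC layout (ISO 3901): CC-XXX-YY-NNNNN
    #   CC    = two-letter country code
    #   XXX   = three-character alphanumeric registrant code
    #   YY    = two-digit year of reference
    #   NNNNN = five-digit designation code
    ISRC_RE = re.compile(
        r"^(?P<country>[A-Z]{2})-?"
        r"(?P<registrant>[A-Z0-9]{3})-?"
        r"(?P<year>[0-9]{2})-?"
        r"(?P<designation>[0-9]{5})$"
    )

    def parse_isrc(code):
        """Return the ISRC's components as a dict, or None if malformed."""
        match = ISRC_RE.match(code.strip().upper())
        return match.groupdict() if match else None

    print(parse_isrc("US-S1Z-99-00001"))
    # -> {'country': 'US', 'registrant': 'S1Z', 'year': '99', 'designation': '00001'}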

Once the fake tracks land, removing them is a separate ordeal. Campbell described a back-and-forth process of waiting and re-explaining her case to platform support teams. One streaming service compounded the problem by repeatedly demanding the barcode number tied to the upload, information an artist would only possess if she had uploaded the track herself. The system was built to process takedown requests from rights holders who already held distributor records, not from artists whose identities had been appropriated entirely. "I'm in this weird limbo where I'm telling robots to take down music robots made," she said. Kevin Erickson of the nonprofit Future of Music Coalition observed that schemes like this "don't have to be financially successful to create a lot of problems" for legitimate artists, whose reputations, streaming algorithms, and listener trust are all vulnerable to manipulation in the interim.

Spotify's global head of marketing and policy, Sam Duboff, acknowledged the pressure. "We reached a point where we were hearing enough from artists about this that we wanted to fix it," Duboff said, while also noting that building individual verification features across every streaming service creates its own coordination problem for artists managing careers across multiple platforms.

Independent artists trying to protect themselves have several practical steps they can take immediately. Claiming an official Spotify for Artists profile is the first line of defense, giving the artist some administrative visibility over what appears under their name. Registering directly with a music distributor, even before releasing anything commercially, establishes a prior claim in the system and makes it harder for a bad actor to anchor a fake catalog to the artist's identity. A Google Alert set to the stage name will surface new streaming appearances faster than waiting for fans to notice, and a small script polling the platform's public catalog can do the same, as sketched below. When filing a takedown, document everything before contacting support: screenshots of the offending tracks, timestamps of when they appeared, links to the original YouTube recordings, and any legitimately held ISRC codes. Without that paper trail, platform support queues tend to stall, particularly when the first point of contact is automated.
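On the monitoring point, the check does not have to wait for an alert email. The sketch below polls Spotify's public Web API for everything currently attached to an artist profile and flags anything it has not seen before. It assumes a free Spotify developer application; the credential placeholders, the artist ID, and the snapshot filename are illustrative, not real values.

    import base64
    import json
    import pathlib

    import requests

    CLIENT_ID = "your-client-id"          # from the Spotify developer dashboard (placeholder)
    CLIENT_SECRET = "your-client-secret"  # placeholder
    ARTIST_ID = "your-artist-id"          # the ID in the artist profile URL (placeholder)
    SNAPSHOT = pathlib.Path("known_releases.json")

    def get_token():
        # Client-credentials flow: sufficient for read-only catalog lookups.
        auth = base64.b64encode(f"{CLIENT_ID}:{CLIENT_SECRET}".encode()).decode()
        resp = requests.post(
            "https://accounts.spotify.com/api/token",
            data={"grant_type": "client_credentials"},
            headers={"Authorization": f"Basic {auth}"},
        )
        resp.raise_for_status()
        return resp.json()["access_token"]

    def list_releases(token):
        # Every album and single currently attached to the artist profile.
        resp = requests.get(
            f"https://api.spotify.com/v1/artists/{ARTIST_ID}/albums",
            params={"include_groups": "album,single", "limit": 50},
            headers={"Authorization": f"Bearer {token}"},
        )
        resp.raise_for_status()
        return {item["id"]: item["name"] for item in resp.json()["items"]}

    current = list_releases(get_token())
    known = json.loads(SNAPSHOT.read_text()) if SNAPSHOT.exists() else {}
    for release_id, name in current.items():
        if release_id not in known:
            print(f"Release on this profile not in the local snapshot: {name} ({release_id})")
    SNAPSHOT.write_text(json.dumps(current))

Run on a schedule, the diff doubles as the timestamped paper trail described above: the snapshot file records when each release first appeared on the profile.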

Campbell's case also underscores why copyright registration matters before a crisis, not during one. A registered copyright with the U.S. Copyright Office creates a legal record of authorship that platforms and distributors find far harder to dismiss than a complaint sent through a web form. For artists in the traditional folk and old-time space, where performances often predate any streaming release, that registration is frequently the only documentation strong enough to cut through distributor bureaucracy. The gap between how quickly AI tools can manufacture a fake identity and how slowly platforms move to restore a real one remains wide.
