Imagine needing a badge to prove you exist. That's Spotify now.

For months, users have flooded community forums demanding AI songs get labeled. Subscribers ask why they pay full price while synthetic content fills the platform.

Then there's The Velvet Sundown. The band racked up 850,000 monthly listeners and a verified page before anyone revealed it as a "synthetic music project with the support of artificial intelligence." Now they're at 126,000. Their profile finally discloses the AI connection.

Spotify's answer: a green checkmark. The "Verified by Spotify" badge starts appearing over the coming weeks. It relies on signals like linked social accounts, consistent listener activity, merch sales, and concert dates. Spotify says more than 99% of actively searched artists will get verified.

But the badge only confirms a human exists behind the profile. It says nothing about whether that human used AI to make the music.

Ed Newton-Rex, a creators' rights campaigner and former AI executive, told the BBC the approach could "punish real human artists who don't have some of the markers the verification is based on." Artists who don't tour. Artists who don't sell merch.

His suggestion: label AI-generated music instead.

Apple Music and Amazon Music haven't tackled the problem publicly. Apple uses human editors for playlists, which naturally limits AI visibility, but has no explicit labeling. Deezer, which reports a flood of synthetic tracks arriving daily, actively filters out AI content. Spotify is at least trying.

Nick Collins, Professor of Music at the University of Durham, put it well to the BBC: "AI usage is not a binary position between 'entirely authentically handmade' and 'fully AI generated' but can have lots of in-between cases."

The spectrum makes any labeling system messy. But pretending the problem doesn't exist isn't working either.