Taylor Swift wants to own her own voice. Literally. The pop star has filed three trademark applications in the US covering a photo of herself from the Eras Tour and two audio clips of her saying "Hey, it's Taylor" and "Hey, it's Taylor Swift." The recordings come from Spotify and Amazon Music promos for her album The Life of a Showgirl. The strategy is straightforward: register specific images and sounds, then use trademark law to go after AI-generated content that's "confusingly similar," not just exact copies.
Trademark lawyer Josh Gerben, who first reported the filings, explained that registration would give Swift a federal claim against synthetic versions of herself. And there's been plenty to claim against: AI-generated explicit images of Swift have circulated widely, and a fake political ad showed her endorsing Donald Trump. This isn't theoretical damage.
Matthew McConaughey went this route first, earlier this year becoming the first celebrity to use trademark filings to protect his likeness from AI misuse. The catch: trademark law requires commercial use and consumer confusion, so a random meme or a non-commercial deepfake might slip through. Tennessee passed the ELVIS Act in 2024 to fill exactly that gap, extending right-of-publicity protections to cover a person's voice, with AI misuse specifically in mind. Existing law simply wasn't built for synthetic media.
Swift can afford the lawyers. Most people can't. Her filings show an uncomfortable truth: the legal tools to fight AI impersonation exist, but they're expensive and imperfect. If one of the most powerful celebrities in the world needs to get creative with trademark applications to protect her face and voice, what does that mean for everyone else?