Gilad Yadin, a legal scholar writing on yadin.com, published an essay arguing that privacy law's "right to be forgotten" is now structurally unenforceable against the AI systems most people actually use. The mechanism he identifies isn't regulatory failure or corporate noncompliance. It's architecture.
The right dates to the 2014 Court of Justice of the European Union ruling in Google Spain v. Costeja González, which held that individuals could demand that search engines delist links to outdated or irrelevant personal data. The EU's GDPR and California's CCPA later codified similar erasure rights. Both frameworks were built around relational databases, where personal data occupies discrete, addressable storage locations. Delete the row, delete the data. Clean, surgical, auditable.
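To make that assumption concrete, here is a minimal sketch of the storage model those rules presuppose, using Python's built-in sqlite3 module. The table and column names are invented for illustration, not drawn from any real registry or ruling.

```python
# The database model the 2014-era rules assume: personal data lives at a
# discrete, addressable location, so erasure is a single statement whose
# effect can be verified afterward.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (id INTEGER PRIMARY KEY, name TEXT, detail TEXT)")
conn.execute("INSERT INTO people VALUES (1, 'M. Gonzalez', 'outdated notice')")

# The erasure request maps directly onto storage: delete the row, delete the data.
conn.execute("DELETE FROM people WHERE id = 1")

# Auditable: the data is verifiably gone.
print(conn.execute("SELECT COUNT(*) FROM people WHERE id = 1").fetchone()[0])  # 0
```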
AI models don't work that way. During training, a system like ChatGPT or Google's Gemini doesn't store facts as records — it transforms vast corpora into abstract mathematical parameters, called weights, distributed across the model. A single personal detail may shift hundreds of weights; a single weight blends the statistical influence of many different facts at once. There is no row to delete. The post-processing filters companies use instead — output guardrails that suppress certain responses — are, in Yadin's framing, cosmetic: the encoded knowledge remains in the weights, intact and potentially recoverable if filters are bypassed or degraded by future updates.
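A toy numerical sketch shows why. This is nothing like any vendor's actual training pipeline, and every name in it is invented for illustration: a tiny linear model trained by gradient descent spreads each training example's influence across every weight, and an output filter bolted on afterward suppresses responses without changing a single parameter.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 8))   # 20 training "facts", 8 features each
Y = rng.normal(size=(20, 4))   # the outputs associated with them

def train(X, Y, steps=2000, lr=0.05):
    """Plain full-batch gradient descent on squared error."""
    W = np.zeros((X.shape[1], Y.shape[1]))
    for _ in range(steps):
        W -= lr * X.T @ (X @ W - Y) / len(X)
    return W

W_all = train(X, Y)            # trained on every fact
W_wo0 = train(X[1:], Y[1:])    # trained with fact 0 withheld

# Fact 0's presence shifted essentially all 32 weights, and no single
# weight "contains" it: there is no row to delete.
delta = np.abs(W_all - W_wo0)
print("weights shifted by fact 0:", int((delta > 1e-6).sum()), "of", W_all.size)

# A post-hoc guardrail suppresses an answer while leaving W_all intact:
def guarded_predict(x, blocked):
    return None if blocked else x @ W_all   # filtering, not erasure

print(guarded_predict(X[0], blocked=True))  # None -- yet fact 0 is still encoded
```

The sketch tracks the essay's framing at miniature scale: deletion presupposes an address, and trained weights offer none, while the guardrail changes only what the model says, not what it knows.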
ChatGPT is approaching one billion active users, according to Yadin's essay. Google surfaces Gemini-powered AI Overviews as the primary result for billions of additional searches. For a large and growing share of the population, querying an AI is the first stop, and often the only one, for information about other people. GDPR erasure requests still carry legal weight against traditional search indices and source websites. Against a deployed model, they do not accomplish the thing they were designed for: actual removal of the data.
Yadin roots this in a longer arc: for most of human history, forgetting was the default. Minor indiscretions faded. People moved on. The Information Revolution inverted that default, and the right to be forgotten was the legal patch. His argument is that the patch has already been outpaced — and that nobody has figured out what replaces it.