Here's the ironic part about trying to report on a paper proving GNU libc's atanh function is correctly rounded: you can't read it. The paper sits behind Anubis, a proof-of-work system designed to stop AI scrapers from overwhelming servers.

Anubis works like Hashcash, the 1990s anti-spam proposal: each visitor's browser solves a small computational puzzle before the page loads. For one person loading one page, the cost is negligible; for a bot hitting thousands of pages, it adds up fast. The scheme requires JavaScript, so scrapers that can't execute JS are blocked outright. But privacy plugins like JShelter break it too. They disable the very JS features Anubis depends on, which means researchers running privacy tools can't reach the papers behind it.

Techaro, the company behind Anubis, calls this check a placeholder. It is working on fingerprinting headless browsers through tells like font rendering behavior. Until that ships, operators face a binary choice: add friction for every visitor, or let scraping take their servers offline. The same infrastructure built to preserve open access now walls it off. Researchers who want to verify a math function must first prove they're human.
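To make the economics concrete, here is a minimal sketch of a Hashcash-style proof-of-work puzzle, not Anubis's actual implementation. The challenge string, the difficulty, and the nonce encoding are all illustrative assumptions; the real system runs the search in the browser via JavaScript. The asymmetry is the point: solving takes many hash attempts on average, while verifying takes exactly one.

```python
import hashlib
import itertools

def solve(challenge: str, difficulty_bits: int) -> int:
    """Client side: find a nonce so that SHA-256(challenge:nonce)
    interpreted as an integer falls below a difficulty target.
    Expected work is about 2**difficulty_bits hash attempts."""
    target = 1 << (256 - difficulty_bits)
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify(challenge: str, nonce: int, difficulty_bits: int) -> bool:
    """Server side: a single hash checks the claimed solution,
    no matter how long the client spent finding it."""
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

# A human loading one page pays this cost once; a scraper pays it
# per page, across thousands of pages.
nonce = solve("example-challenge", 16)
print(verify("example-challenge", nonce, 16))
```

With 16 difficulty bits the client performs on the order of tens of thousands of hashes, a fraction of a second in a browser; doubling the bits roughly doubles the work per bit added, which is the knob operators turn against bulk scraping.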