Dave Rupert noticed something happening to developers who lean heavily on AI coding tools. They're ending each day exhausted from managing the work, not from doing it. One developer described having six worktrees open and four half-written features, with a growing sense of losing the plot entirely. Rupert, who has ADHD, recognizes this pattern. There's a thin line between feeling fast and productive and collapsing from overload.

The problem is inventory. Rupert draws on Eliyahu M. Goldratt's book "The Goal," which explains that excess inventory in manufacturing isn't free: it takes up space and it costs money to store. Applied to code, every line an AI generates becomes a future maintenance liability. More code doesn't mean more velocity. It means more pull requests lingering in review and more CI runs on branches that need rebasing. "Congratulations! You've built a factory that's world-class at producing inventory that sits on the floor and rots," Andrew Murphy wrote.

Rupert frames the bottleneck as the space between your ears. At the end of the chain of GPUs sits "the 40-watt lump of meat inside your skull." Understanding is the constraint now. He's abandoned at least two projects because the LLM generated more code than he wanted to read.

Margaret Storey calls this "cognitive debt": technical debt where the product has grown beyond your understanding of it. Security and accessibility problems lurk in that gap. The hardware side of Rupert's metaphor isn't perfectly accurate either, since no single GPU draws 10,000 watts. Commenters point out he likely means NVIDIA's DGX H100 system, an eight-GPU cluster in which individual H100 chips consume 400 to 700 watts each. But the insight stands.

The industry's answer? Hand maintenance back to the machines. Developers become architects rather than janitors, keeping the 40-watt brain focused on decisions instead of cleanup. Whether that means abdicating the responsibility to review the output is another question entirely.