LLM OneStop, a new VS Code extension, drops the monthly subscription in favor of pay-as-you-go pricing — developers pay for the tokens they consume, nothing else. The extension aggregates OpenAI's ChatGPT, <a href="/news/2026-03-14-1m-token-context-window-generally-available-claude-opus-4-6-sonnet-4-6">Anthropic's Claude</a>, and Google's Gemini into a single interface, letting users switch models on the fly based on task or cost.

The AI coding assistant market runs almost entirely on flat fees: GitHub Copilot charges $10–$19 per month, Cursor $20, Codeium $15. For hobbyists, freelancers, or developers who use AI-assisted coding occasionally rather than all day, those charges accrue regardless of actual use. That's the gap LLM OneStop is targeting.

The multi-model access may matter more than the pricing model. Committing to one provider means absorbing its pricing changes and model quality shifts — both of which have happened repeatedly across OpenAI, Anthropic, and Google over the past two years. Being able to route a boilerplate task to a cheaper model and a complex refactor to a stronger one has real practical value, even if the friction of manually switching between providers is already fairly low.
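The routing idea can be sketched in a few lines. This is purely illustrative — the model names and per-token prices below are hypothetical placeholders, not anything LLM OneStop has published:

```python
# Hypothetical model catalog: names and prices are assumptions for
# illustration, not LLM OneStop's actual configuration.
MODELS = {
    "cheap":  {"name": "small-model",    "usd_per_1m_tokens": 0.50},
    "strong": {"name": "frontier-model", "usd_per_1m_tokens": 15.00},
}

# Task kinds considered routine enough for the cheaper model (assumed).
ROUTINE_TASKS = {"boilerplate", "docstring", "rename"}

def route(task_kind: str) -> dict:
    """Send routine work to the cheap model, everything else to the strong one."""
    tier = "cheap" if task_kind in ROUTINE_TASKS else "strong"
    return MODELS[tier]

print(route("boilerplate")["name"])  # routine task -> cheap model
print(route("refactor")["name"])     # complex task -> strong model
```

The point is less the code than the cost asymmetry it encodes: under the assumed prices, a routine task routed to the cheap tier costs roughly 1/30th of what the strong tier would charge for the same tokens.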

The extension debuted through Hacker News's "Show HN" channel, the standard launch pad for developer tools seeking early feedback. The listing doesn't make clear whether this is a solo project or team-built. VS Code is the obvious integration target: it's where most developers already work, so the extension doesn't require anyone to change their setup.

The per-token model has a ceiling. For developers using AI coding assistance heavily throughout the day, a flat $20/month often wins on cost at scale. LLM OneStop's wager is that a meaningful slice of the market isn't in that category — infrequent users currently overpaying for subscriptions they don't fully use. Early Hacker News comments will be a quick signal on whether that assumption holds.
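The break-even point behind that wager is simple arithmetic. The blended per-token rate below is an assumption for illustration — actual rates vary by provider and model, and LLM OneStop hasn't published a figure:

```python
# Back-of-envelope break-even: at what monthly token volume does a flat
# $20/month subscription beat pay-as-you-go?
FLAT_MONTHLY_USD = 20.0
USD_PER_1M_TOKENS = 3.0  # assumed blended rate, not a published price

# Tokens per month at which the two pricing models cost the same.
break_even_tokens = FLAT_MONTHLY_USD / USD_PER_1M_TOKENS * 1_000_000
print(f"Break-even: ~{break_even_tokens / 1e6:.1f}M tokens/month")
```

Under that assumed rate, a developer consuming fewer than roughly 6.7 million tokens a month comes out ahead on pay-as-you-go; heavy all-day users blow past that volume and are better served by the flat fee — exactly the split the extension is betting on.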