Full-stack developer Cheikh Seck has published a design walkthrough for Godex, a self-hosted AI coding agent built to replicate the experience of OpenAI's Codex at zero cost. The project was born of practical necessity: after losing his job, and with it access to his Codex workspace, Seck experimented with alternatives including OpenCode, only to be blocked by its free-tier usage cap. The turning point came when he tested Google's gemma3 model locally via Ollama on a 16GB RAM machine and found it performed well enough on coding tasks to make a fully local agentic setup viable.

Godex is built around MCP (Model Context Protocol) servers, which serve as the integration layer between a locally running LLM — served via Ollama — and the developer's local environment, including file systems, terminals, and editors. MCP gives the LLM structured, protocol-level access to development context without requiring cloud APIs or proprietary infrastructure, a pattern that <a href="/news/2026-03-15-localagent-v0-5-0-local-first-rust-mcp-runtime">other local-first agent runtimes</a> have embraced. Seck designed Godex to be model-agnostic, meaning any Ollama-compatible LLM can be swapped in, though he notes a significant caveat: not all open-weight models reason well enough to handle the multi-step, tool-calling demands of agentic coding workflows.
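The wiring described here — a local model emitting structured tool calls that a server executes against the local filesystem and feeds back as context — can be sketched roughly as follows. This is an illustrative reduction of the pattern, not Godex's actual code: the tool names (<code>read_file</code>, <code>list_dir</code>) and the JSON call shape are assumptions for the example, though they mirror the kind of filesystem tools an MCP server typically exposes.

```python
import json
from pathlib import Path

# Illustrative tools of the kind an MCP server might expose to the model.
# In a real MCP setup these would be registered with an SDK and invoked
# over the protocol; here we model only the call/dispatch/result cycle.

def read_file(path: str) -> str:
    """Tool: return the contents of a local file."""
    return Path(path).read_text()

def list_dir(path: str) -> list:
    """Tool: list entry names in a local directory."""
    return sorted(p.name for p in Path(path).iterdir())

TOOLS = {"read_file": read_file, "list_dir": list_dir}

def dispatch(tool_call_json: str) -> str:
    """Execute one model-emitted tool call (a JSON object with a tool
    'name' and 'arguments') and return a JSON result string that would
    be appended to the conversation for the model's next turn."""
    call = json.loads(tool_call_json)
    fn = TOOLS[call["name"]]
    result = fn(**call["arguments"])
    return json.dumps({"tool": call["name"], "result": result})
```

The agentic loop then alternates between asking the local model for the next action and running <code>dispatch</code> on whatever structured call it produces — which is also why Seck's caveat about model reasoning matters: a model that emits malformed or aimless tool calls breaks the loop regardless of how clean the integration layer is.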

What's notable in the architecture is the MCP choice itself. Anthropic open-sourced the protocol for Claude tooling, but Godex uses it with no Anthropic involvement — Ollama in, local filesystem out. That Seck reached for MCP as the connective layer, rather than rolling a custom integration, suggests the protocol is finding traction on its own terms.