Apple ships a ~3 billion parameter language model on every Apple Silicon Mac running macOS 26 (Tahoe). The catch? It's locked inside Apple's walled garden, accessible only through Siri and system features. Apfel, a new open-source tool from Franz Enzenhofer, breaks that lock. It gives you three ways to use Apple's Foundation Model directly: a UNIX-style CLI tool, an OpenAI-compatible local AI server at localhost:11434, and an interactive chat mode. All inference happens on your Neural Engine and GPU. No API keys, no subscriptions, no data leaving your machine.
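Because the server speaks the OpenAI chat-completions dialect on localhost:11434, anything that can POST JSON can talk to it. Here's a minimal stdlib-only sketch; the model name "apple-foundation" is a placeholder, not confirmed by the article — query `/v1/models` on a running instance to find the real identifier.

```python
import json
import urllib.request

# Endpoint exposed by Apfel's local server (per the article).
API_URL = "http://localhost:11434/v1/chat/completions"


def build_request(prompt: str, model: str = "apple-foundation") -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for the local server.

    The "apple-foundation" model name is an assumption for illustration.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        API_URL, data=body, headers={"Content-Type": "application/json"}
    )


def ask(prompt: str) -> str:
    """Send the prompt and return the assistant's reply text."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        data = json.load(resp)
    # Standard OpenAI response shape: choices[0].message.content
    return data["choices"][0]["message"]["content"]
```

The same request shape means existing OpenAI client libraries should work too — point their base URL at `http://localhost:11434/v1` with a dummy API key.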

The tool ships with native Model Context Protocol (MCP) support, which is genuinely useful. You can attach MCP servers to give Apple's on-device model tools for math, APIs, databases, or whatever else you need, and it works across all three modes. The 4,096-token context window is tight but handles most single-turn tasks fine. Installation is straightforward via Homebrew.
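The 4,096-token window is the main thing to design around: long documents won't fit in one shot. A crude workaround is to split input into chunks that fit the budget before sending each to the model. The ~4 characters-per-token ratio below is a rough English-text heuristic I'm assuming, not Apple's actual tokenizer.

```python
def chunk_for_context(text: str, max_tokens: int = 3000,
                      chars_per_token: int = 4) -> list[str]:
    """Split text on paragraph boundaries into chunks under a token budget.

    max_tokens is kept under 4,096 to leave room for the prompt and reply.
    A single paragraph longer than the budget is kept whole (not split).
    """
    budget = max_tokens * chars_per_token
    chunks, current = [], ""
    for para in text.split("\n\n"):
        # Start a new chunk if adding this paragraph would blow the budget.
        if current and len(current) + len(para) + 2 > budget:
            chunks.append(current)
            current = para
        else:
            current = f"{current}\n\n{para}" if current else para
    if current:
        chunks.append(current)
    return chunks
```

Summarize each chunk separately, then summarize the summaries — a standard map-reduce pattern for small context windows.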

There are real questions here, though. The model Apple ships is reportedly on par with Qwen-3-4B from a year ago; open-source alternatives are moving fast, and it's unclear whether Apple can keep pace. Then there's the legal question. Apfel uses private APIs that Apple's Developer Program License Agreement explicitly bans. Apple could issue DMCA takedowns, or break compatibility in a future macOS update simply by changing how the FoundationModels framework works. For now, Apfel works. But anyone building on it should know the ground might shift.