promptcmd is an open-source tool from developer tgalal that treats LLM prompt templates as first-class Unix citizens. It uses a .prompt file format built on Google's DotPrompt specification, a YAML-frontmatter-plus-Handlebars templating standard originally created for Firebase Genkit. Developers define their prompts once, then register them with the companion binary promptctl, which exposes them as native shell commands with auto-generated argument parsing, --help text, and full stdin/stdout piping support. A prompt for summarizing text or generating bash scripts becomes functionally indistinguishable from any other <a href="/news/2026-03-15-axe-go-binary-toml-llm-agents-unix-pipes">command-line tool</a>, composable inside pipelines just like grep or awk.
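Per the DotPrompt specification, such a file pairs a YAML frontmatter block with a Handlebars template body. A minimal, illustrative example (the model name and schema here are hypothetical, not taken from promptcmd's documentation):

```
---
model: openai/gpt-4o-mini
input:
  schema:
    text: string
---
Summarize the following text in three bullet points:

{{text}}
```

Registered with promptctl, a file like this would surface as a shell command whose arguments map to the declared input schema.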
The SSH integration is the sharpest edge. Open a connection with promptctl ssh user@server instead of plain ssh, and your entire local prompt library becomes available in that remote shell session, with no server-side installation of the prompts required. Developers who work across multiple remote machines know the problem: personal LLM workflows don't travel to remote infrastructure. promptcmd makes them portable. The tool supports Ollama for local self-hosted models alongside OpenAI, Anthropic, Google, and OpenRouter, with providers groupable into load-balanced pools using equal or weighted distribution. Response caching with configurable TTL handles pipeline use cases.
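promptcmd's caching internals aren't published in detail, but the general idea of TTL-bounded response caching for pipeline use can be sketched in a few lines (the class and method names below are hypothetical, not promptcmd's API):

```python
import time

class TTLCache:
    """Cache LLM responses keyed by prompt, evicting entries older than ttl seconds."""

    def __init__(self, ttl: float):
        self.ttl = ttl
        self._store = {}  # prompt -> (timestamp, response)

    def get(self, prompt: str):
        entry = self._store.get(prompt)
        if entry is None:
            return None
        ts, response = entry
        if time.monotonic() - ts > self.ttl:
            del self._store[prompt]  # expired: drop and report a miss
            return None
        return response

    def put(self, prompt: str, response: str):
        self._store[prompt] = (time.monotonic(), response)

cache = TTLCache(ttl=60.0)
cache.put("summarize: release notes", "Three bullet points...")
print(cache.get("summarize: release notes"))  # fresh entry is returned
```

In a pipeline, a cache like this means re-running the same command on unchanged input skips the network round trip entirely, which is what makes LLM steps tolerable inside loops and Makefiles.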
The project's "inversion of control" design philosophy runs counter to how most LLM tools work. Rather than relying on a model's implicit tool-calling to gather context, promptcmd pushes developers to explicitly pre-fetch and embed relevant data before execution — improving reproducibility and auditability in automated pipelines. The "Variants" system extends this: users define named model personalities or task specializations by attaching custom system prompts to any provider configuration, effectively creating custom model aliases.
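The difference is easy to see in miniature. Instead of letting the model call a tool to read a file mid-conversation, the caller fetches the data first and renders it into the template before the request is sent. A minimal sketch using Python's stdlib templating, not promptcmd's actual code:

```python
from string import Template

def build_prompt(template_text: str, **context) -> str:
    """Render context gathered up front into the prompt template."""
    return Template(template_text).substitute(**context)

# Context is pre-fetched explicitly (e.g. via tail or grep in a pipeline),
# not pulled in by the model through implicit tool calls.
log_excerpt = "ERROR 2026-03-14 12:01:07 disk /dev/sda1 98% full"
prompt = build_prompt(
    "Explain this log line and suggest a fix:\n$log_line",
    log_line=log_excerpt,
)
print(prompt)
```

The payoff is that the fully rendered prompt is inspectable before execution: the exact context that reached the model is visible in the pipeline, which is what makes the runs reproducible and auditable.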
Adopting DotPrompt puts promptcmd in competition with at least two other prompt template formats: Microsoft's Prompty, which integrates with LangChain, Semantic Kernel, and Prompt Flow, and the independent PromptG project, which proposes a JSON-based alternative. DotPrompt is listed as Apache 2.0 on its GitHub repository (github.com/google/dotprompt), though no formal standards body has weighed in on any of these formats, and cross-company coordination between the projects is not publicly documented. For promptcmd, DotPrompt brings a multi-language reference implementation and existing editor integrations; note, however, that these competing-standards details come from publicly available repository information and have not been confirmed by the respective maintainers. Which format becomes the default across the developer toolchain is still unsettled.