Codance AI has open-sourced Multibot, a serverless multi-agent platform built on Cloudflare Workers and Durable Objects. Users deploy and manage multiple AI bots through a web dashboard, connecting them to Telegram, Discord, or Slack via a one-click setup script. A Bring Your Own Key model supports the major LLM providers (OpenAI, Anthropic, Google, and DeepSeek) as well as any OpenAI-compatible endpoint. The minimum cost is $5 per month, the price of Cloudflare's Workers Paid plan, which the project requires for Durable Objects and Cron Triggers.
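To make the BYOK model concrete, here is an illustrative sketch of how a Worker might assemble a request for any OpenAI-compatible endpoint. The config shape, field names, and `buildChatRequest` function are assumptions for illustration, not Multibot's actual code; only the chat-completions wire format and `Authorization: Bearer` header follow the OpenAI-compatible convention.

```typescript
// Hypothetical BYOK config: the user's own key and endpoint, never a
// platform-held credential. Field names are illustrative.
interface ProviderConfig {
  baseUrl: string; // e.g. "https://api.openai.com/v1" or any compatible endpoint
  apiKey: string;  // the user's own key
  model: string;
}

// Build the fetch arguments a Worker would send to an
// OpenAI-compatible /chat/completions endpoint.
function buildChatRequest(cfg: ProviderConfig, userMessage: string) {
  return {
    url: `${cfg.baseUrl}/chat/completions`,
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${cfg.apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model: cfg.model,
        messages: [{ role: "user", content: userMessage }],
      }),
    },
  };
}

const req = buildChatRequest(
  { baseUrl: "https://api.deepseek.com/v1", apiKey: "sk-...", model: "deepseek-chat" },
  "hello",
);
console.log(req.url); // https://api.deepseek.com/v1/chat/completions
```

Because the payload shape is identical across providers, swapping providers is a matter of changing `baseUrl`, `apiKey`, and `model` in the config.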
From an agent-architecture standpoint, Multibot ships several features that distinguish it from simpler bot frameworks. It supports multi-bot group chat with orchestrator coordination, sub-agent spawning for delegating complex multi-step tasks, and a two-layer LLM-driven memory system that automatically consolidates a rolling HISTORY buffer into a persistent MEMORY store. Each bot is assigned its own persistent Linux sandbox powered by Fly.io Sprites (Firecracker microVMs with persistent ext4 filesystems and checkpoint/restore capabilities), enabling shell command execution and stateful file operations that survive across sessions. Also included are a <a href="/news/2026-03-15-openclaw-superpowers-self-modifying-skill-library-for-persistent-openclaw-agents">Markdown-driven skills framework</a> with progressive loading, voice support (Cloudflare Workers AI Whisper for speech-to-text, OpenAI TTS for output), and flexible scheduling.
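The two-layer memory design can be sketched in a few lines. This is a hypothetical model of the described idea, not Multibot's implementation: in the real system the consolidation step is LLM-driven, which the placeholder `Summarizer` callback stands in for, and the class name, buffer size, and eviction policy here are all assumptions.

```typescript
// Stand-in for the LLM call that condenses old turns into a durable note.
type Summarizer = (turns: string[]) => string;

class TwoLayerMemory {
  private history: string[] = []; // rolling buffer of recent turns (HISTORY)
  private memory: string[] = [];  // long-term consolidated notes (MEMORY)

  constructor(private maxHistory: number, private summarize: Summarizer) {}

  addTurn(turn: string): void {
    this.history.push(turn);
    if (this.history.length > this.maxHistory) {
      // HISTORY overflowed: consolidate the oldest half into MEMORY.
      const oldest = this.history.splice(0, Math.ceil(this.maxHistory / 2));
      this.memory.push(this.summarize(oldest));
    }
  }

  // Context assembled for the next LLM call: consolidated notes first,
  // then the recent verbatim turns.
  context(): string[] {
    return [...this.memory, ...this.history];
  }
}

const mem = new TwoLayerMemory(4, (turns) => `summary of ${turns.length} turns`);
["a", "b", "c", "d", "e"].forEach((t) => mem.addTurn(t));
console.log(mem.context()); // [ 'summary of 2 turns', 'c', 'd', 'e' ]
```

The payoff of the two layers is bounded prompt size: recent turns stay verbatim while older context survives only as compact summaries.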
Multibot makes deliberate architectural tradeoffs. Cloudflare Durable Objects provide per-conversation agent state at the edge — one DO instance per botId-channel-chatId tuple — giving low-latency, edge-native coordination for message handling. Tool use and code execution, though, route through Fly.io Sprites, which are centralized persistent VMs rather than edge-distributed infrastructure. Every shell command a bot executes involves a round-trip from the nearest Cloudflare PoP to a Fly.io datacenter, introducing latency that the edge-native coordination layer cannot eliminate. Fly.io has indicated plans for an open-source local version of Sprites, which could eventually let self-hosters co-locate sandbox VMs and reduce that overhead.
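The one-DO-per-tuple routing above relies on a real Durable Objects property: `idFromName()` is deterministic, so the same name always resolves to the same object instance. In a Worker this would look roughly like `env.AGENT.idFromName(key)` followed by `env.AGENT.get(id)` (where `AGENT` is an assumed binding name). The sketch below uses a hypothetical in-memory namespace to demonstrate the routing property without the Workers runtime; only the tuple-to-key idea comes from the article.

```typescript
// Deterministic key for the botId-channel-chatId tuple described above.
function conversationKey(botId: string, channel: string, chatId: string): string {
  return `${botId}:${channel}:${chatId}`;
}

// Minimal in-memory stand-in for a Durable Object namespace, illustrating
// the guarantee that one name maps to exactly one instance.
class MockNamespace<T> {
  private instances = new Map<string, T>();
  constructor(private make: () => T) {}

  getByName(name: string): T {
    let inst = this.instances.get(name);
    if (inst === undefined) {
      inst = this.make();
      this.instances.set(name, inst);
    }
    return inst;
  }
}

const agents = new MockNamespace(() => ({ messages: [] as string[] }));
const a = agents.getByName(conversationKey("bot1", "telegram", "42"));
const b = agents.getByName(conversationKey("bot1", "telegram", "42"));
// a === b: every message in a conversation reaches the same agent instance,
// which is what gives each conversation single-threaded, stateful handling.
```

This determinism is what lets the coordination layer stay edge-native: no lookup table is needed to find a conversation's agent, only its name.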
The project is released under the MIT license at github.com/codance-ai/multibot. For teams deploying multi-agent infrastructure without managing servers, Multibot is a cost-efficient, production-oriented option — particularly where persistent bot state and cross-platform messaging integration matter more than minimizing round-trip latency on tool calls.
The Hacker News submission drew limited visible engagement, but Multibot is an early production deployment of Fly.io Sprites, which only launched in January 2026 — making it one of the first public projects to put the technology through real workloads.