Six layers, each independently replaceable. Deployment view: chat-channel adapters, the Manager Agent orchestrator, 5 specialist Agents, their MCP tools, PostgreSQL + a shared workspace volume, and the external LLM API layer — each layer packaged as Docker containers running on the user's own machine.
CoStaff is not a single monolithic AI. It’s a multi-agent + tool system stitched together. Each layer runs in its own container, does one thing, and can be swapped or extended independently.
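A minimal sketch of what such a layered deployment could look like in a Compose file — service names, images, and volumes here are illustrative stand-ins, not CoStaff's actual manifest:

```yaml
services:
  webchat:                            # channel adapter layer (one per chat app)
    image: costaff/webchat:latest     # hypothetical image name
    depends_on: [manager]
  manager:                            # orchestrator layer
    image: costaff/manager:latest
    depends_on: [postgres]
  coding-agent:                       # one of the specialist agents
    image: costaff/coding-agent:latest
    volumes:
      - workspace:/workspace          # shared file artifacts
  postgres:                           # conversation memory
    image: postgres:16
    volumes:
      - pgdata:/var/lib/postgresql/data
volumes:
  workspace:
  pgdata:
```

Because each layer is its own service, swapping one (say, a different database image) is a one-line change that leaves the rest untouched.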
Hand off a task in a single chat message — you give the brief, you get the result.
WebChat · Telegram · LINE · Discord · Slack — one Docker container per channel. Each channel adapter handles its platform's quirks; same agents, multiple chat apps.
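The adapter idea can be sketched as a small interface: each platform container implements the same two hooks, so the agents behind it never see platform-specific payloads. The class and field names below are hypothetical, based on Telegram's public update shape:

```python
from abc import ABC, abstractmethod

class ChannelAdapter(ABC):
    """Hypothetical adapter interface: one implementation per chat platform."""

    @abstractmethod
    def to_manager(self, raw_event: dict) -> str:
        """Normalize a platform-specific event into plain text for the Manager."""

    @abstractmethod
    def from_manager(self, reply: str) -> dict:
        """Wrap the Manager's reply in the platform's outgoing message format."""

class TelegramAdapter(ChannelAdapter):
    def to_manager(self, raw_event: dict) -> str:
        # Telegram updates nest the text under message.text
        return raw_event["message"]["text"]

    def from_manager(self, reply: str) -> dict:
        return {"method": "sendMessage", "text": reply}
```

Adding a new chat app then means writing one new adapter container, with no changes to the Manager or the specialists.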
Manager Agent — parses intent, remembers conversation, dispatches multi-agent work via A2A Protocol. The orchestrator that lets you say one sentence and have specialists collaborate.
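The parse-then-dispatch loop can be illustrated in-process. In CoStaff proper, dispatch crosses container boundaries via the A2A Protocol and intent parsing is LLM-driven; here plain function calls and a toy keyword router stand in, and all names are illustrative:

```python
def parse_intent(message: str) -> str:
    # Toy keyword router standing in for LLM-based intent parsing.
    text = message.lower()
    if "query" in text or "sql" in text:
        return "database"
    if "bug" in text or "code" in text:
        return "coding"
    return "business"

# Stand-ins for specialist agents reachable over A2A.
SPECIALISTS = {
    "database": lambda task: f"[database agent] ran: {task}",
    "coding":   lambda task: f"[coding agent] handled: {task}",
    "business": lambda task: f"[business agent] answered: {task}",
}

def manager_dispatch(message: str) -> str:
    # One user sentence in, one specialist result out.
    intent = parse_intent(message)
    return SPECIALISTS[intent](message)
```

For example, `manager_dispatch("fix this code bug")` routes to the coding specialist, while a request mentioning SQL goes to the database specialist.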
Business · Coding · Database · Twinkle Hub · Custom Agent with MCP Server — each in its own container. Manager dispatches via A2A; specialists call MCP tools to do the actual work.
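When a specialist does the actual work, it calls a tool on its MCP server. MCP is JSON-RPC 2.0 under the hood, so such a call is a `tools/call` request; the tool name and arguments below are made up for illustration:

```python
import json

def mcp_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Build an MCP tools/call request (JSON-RPC 2.0), as a specialist
    agent would send to its MCP server."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical tool on the Database agent's MCP server:
req = mcp_tool_call("run_query", {"sql": "SELECT 1"})
```

Because the tool surface is just MCP, a Custom Agent can expose new capabilities by shipping its own MCP server alongside it.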
PostgreSQL + Shared Workspace — per-agent persistent storage. Conversation memory lives in PostgreSQL; file artifacts live in the shared workspace volume.
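The memory split might look like the sketch below. The schema is hypothetical, and `sqlite3` stands in for PostgreSQL so the example is self-contained; in the real layout, file artifacts would land on the shared workspace volume rather than in the database:

```python
import sqlite3

# One row per conversation turn, keyed by agent and session.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE conversation_memory (
        agent   TEXT NOT NULL,   -- which agent owns this memory
        session TEXT NOT NULL,   -- conversation/session id
        role    TEXT NOT NULL,   -- 'user' or 'assistant'
        content TEXT NOT NULL
    )
""")
db.execute("INSERT INTO conversation_memory VALUES (?, ?, ?, ?)",
           ("manager", "s1", "user", "summarize Q3 sales"))

def recall(agent: str, session: str) -> list[str]:
    # What an agent would re-read at the start of a turn.
    rows = db.execute(
        "SELECT content FROM conversation_memory WHERE agent=? AND session=?",
        (agent, session))
    return [r[0] for r in rows]
```

Keeping memory per-agent means a specialist only reloads its own history, not every other agent's.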
Gemini · Gemma · Twinkle T1 · 3rd-party models — per-agent configurable. Each specialist can use a different LLM; data and inference stay local if you choose local models.
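Per-agent model choice reduces to a small config lookup. The mapping below is a sketch with hypothetical model identifiers; the point is that each agent resolves its own entry and unknown agents fall back to a local model:

```python
# Hypothetical per-agent model mapping (names follow the layers above).
AGENT_MODELS = {
    "manager":  {"provider": "google", "model": "gemini"},      # external API
    "coding":   {"provider": "local",  "model": "gemma"},       # stays on-machine
    "database": {"provider": "local",  "model": "twinkle-t1"},  # stays on-machine
}

def model_for(agent: str) -> str:
    # Default to a local model so data stays on the user's machine.
    cfg = AGENT_MODELS.get(agent, {"provider": "local", "model": "gemma"})
    return cfg["model"]
```

Pointing every entry at a local provider keeps both data and inference entirely on the user's own hardware.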