axon

A multi-agent CLI coding agent.

One orchestrator plans. Specialists execute. Runs in your terminal, makes real changes to real files — fully configurable, BYOK.

$ npm install -g axon-cli
The orchestration model

One planner. Many specialists.

Hard tasks are easier when one agent plans and many focused agents execute. Depth, fan-out caps, roles, and models — all configurable per project.

user prompt → Orchestrator

The Orchestrator plans complex tasks, dispatches specialists via call_agent, and synthesizes their structured returns into one answer.

researcher (read-only): read · gather · report
implementer (writes): write · scoped
reviewer (read-only): critique · convention

structured returns → synthesized answer
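In code terms, the planner/specialist loop can be sketched like this. This is a hypothetical TypeScript sketch: the names (Role, StructuredReturn, callAgent, orchestrate) and the fixed three-step plan are illustrative, not axon's actual API.

```typescript
// Hypothetical sketch of the planner/specialist loop. The names
// (Role, StructuredReturn, callAgent, orchestrate) are illustrative,
// not axon's actual API.

type Role = "researcher" | "implementer" | "reviewer";

interface StructuredReturn {
  role: Role;       // which specialist produced this
  summary: string;  // the structured report sent back to the planner
  ok: boolean;      // whether the specialist finished cleanly
}

// A specialist gets one narrow task and must answer in a fixed shape.
// A real specialist would run an LLM with a role-specific system prompt;
// here we just echo a structured result.
function callAgent(role: Role, task: string): StructuredReturn {
  return { role, summary: `${role}: ${task}`, ok: true };
}

// The orchestrator turns one prompt into a plan, fans it out to
// specialists, then synthesizes the structured returns into one answer.
function orchestrate(prompt: string): string {
  const plan: Array<[Role, string]> = [
    ["researcher", `gather context for "${prompt}"`],
    ["implementer", `apply scoped edits for "${prompt}"`],
    ["reviewer", `critique the changes for "${prompt}"`],
  ];
  const returns = plan.map(([role, task]) => callAgent(role, task));
  return returns.map((r) => r.summary).join("\n");
}
```

The fixed return shape is what lets the planner synthesize outputs mechanically instead of parsing free-form text.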
Features

Everything you need. Nothing in the way.

5 built-in roles

researcher, implementer, reviewer, tester, debugger — each with a hardened system prompt and structured return format.

Budgeted delegation

Three layers: maxDepth, softCap, hardCap. Plus per-parent and per-depth caps. Never explodes the tree.
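A per-project budget might look like the sketch below. This is hypothetical: the key names mirror the caps listed above, but axon's actual config schema is an assumption here.

```json
{
  "delegation": {
    "maxDepth": 2,
    "softCap": 6,
    "hardCap": 10,
    "perParent": 4,
    "perDepth": { "1": 6, "2": 4 }
  }
}
```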

BYOK, any provider

Anthropic, OpenAI, OpenRouter, Groq, Together, GitHub Models, Ollama, LM Studio — anything OpenAI-compatible.

Permissions, with diffs

Every mutating tool is gated. Write approvals show a diff. Always-allow per session or per project.

Plan mode

Read-only mode that forces the agent to surface a markdown plan before any edit lands. Approve, then execute.

Project memory

Drop an AXON.md at the root — standing instructions that get appended to every system prompt, every sub-agent.
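For example, an AXON.md might carry standing instructions like these (the contents are illustrative):

```markdown
<!-- AXON.md: standing project instructions (contents are illustrative) -->
- Use TypeScript strict mode; never introduce `any`.
- Run `npm test` before declaring a task done.
- Follow the existing ESLint config; do not reformat unrelated files.
```

Because the file is appended to every system prompt, including sub-agents, conventions stated once apply to every specialist in the tree.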

Sessions & exports

Every turn snapshotted. Resume by id, by last-updated, or export the whole transcript to markdown or json.

MCP support

Standard Model Context Protocol servers — register them per project (.axon/mcp.json) or per user. Tools merge automatically.
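A project-level registration might look like this. The mcpServers shape follows the common MCP client convention; axon's exact schema is an assumption, though @modelcontextprotocol/server-filesystem is a real reference server.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    }
  }
}
```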

No API key? No problem.

Run axon for free with GitHub Models.

Use a GitHub Personal Access Token (classic) to access frontier models via the GitHub Models endpoint. Free tier, no credit card.

See the free onboarding guide
axon · onboarding
? Provider           › OpenAI-compatible
? Model              › openai/gpt-4o
? Endpoint           › https://models.github.ai/inference
? API key            › ghp_••••••••••••••••••••
? Serper key (opt.)  › ••••••••••••

✓ config written to ~/.axon/config.json (0600)
Slash commands

A TUI you can actually drive.

Multi-line input, diff-preview approvals, plan mode, per-tool allow/deny, sessions, custom commands, mentions, MCP. Every escape valve is one slash away.

/plan

toggle plan mode — review the plan before any edit

/arch

show or reload .axon/architecture.json

/roles init

scaffold the five built-in role prompts

/permissions deny <tool>

block a tool for this project

/sessions

list saved sessions in this workspace

/export md

write the transcript to .axon/exports/

Full slash-command reference

Install once.
Run anywhere.

$ npm install -g axon-cli