100% Rust · Zero overhead · Multi-channel · Live config · ClawHub compatible
Install · ClawHub Skills · Config · Channels · Providers · Troubleshooting · Contributing
RantaiClaw is a production-grade multi-agent runtime written in Rust. It powers autonomous AI employees that communicate across channels (Discord, Slack, Telegram, WhatsApp), execute tools, manage memories, and run skills — all from a single binary.
Built for RantAI's digital employee platform, RantaiClaw runs inside Docker containers as the execution engine for AI agents that operate 24/7 with real-world integrations.
| Metric | RantaiClaw | Python alternatives |
|---|---|---|
| Cold start | < 200ms | 2-5s |
| Memory (idle) | ~15 MB | 200-500 MB |
| Binary size | ~12 MB | N/A (runtime + deps) |
| Concurrent channels | Thousands | Hundreds |
No garbage collector. No runtime overhead. Just async Rust with tokio.
Connect your agent to any combination of channels simultaneously:
| Channel | Status | Protocol |
|---|---|---|
| Telegram | Stable | Long-poll API |
| Discord | Stable | WebSocket gateway |
| Slack | Stable | Socket Mode / Web API |
| WhatsApp Web | Stable | Multi-device protocol |
| WhatsApp Cloud | Stable | Cloud API |
| Matrix (E2EE) | Feature-gated | Matrix SDK |
| Mattermost | Stable | WebSocket |
| Signal | Stable | signal-cli REST |
| Email (IMAP/SMTP) | Stable | IMAP + SMTP |
| IRC | Stable | IRC protocol |
| DingTalk | Stable | WebSocket |
| Lark/Feishu | Feature-gated | WebSocket |
| CLI | Built-in | stdin/stdout |
Each channel runs independently with its own lifecycle — add, remove, or update channels at runtime without restarting.
Update any configuration at runtime via HTTP:
```bash
# Hot-swap model without restart
curl -X PATCH http://localhost:8080/config/model \
  -H "Authorization: Bearer $TOKEN" \
  -d '{"provider": "anthropic", "model": "claude-sonnet-4-20250514"}'

# Add a Discord channel while running
curl -X PATCH http://localhost:8080/config/channels \
  -d '{"discord": {"bot_token": "...", "guild_id": "..."}}'

# Remove a channel gracefully
curl -X PATCH http://localhost:8080/config/channels \
  -d '{"telegram": null}'

# Start an MCP server for GitHub tools
curl -X PATCH http://localhost:8080/config/mcp-servers \
  -d '{"github": {"command": "npx", "args": ["-y", "@modelcontextprotocol/server-github"], "env": {"GITHUB_PERSONAL_ACCESS_TOKEN": "..."}}}'
```

Changes persist to `config.runtime.toml` and survive restarts.
Route to any LLM provider with automatic fallback:
- OpenRouter — access 200+ models through one API
- OpenAI — GPT-4o, o1, o3
- Anthropic — Claude Sonnet, Opus, Haiku
- Google Gemini — Gemini 2.5 Pro/Flash
- Copilot — GitHub Copilot models
- ZAI GLM — Chinese language models
- Custom OpenAI-compatible endpoints
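As an illustrative sketch only, a fallback chain could be expressed declaratively in the TOML config; the `fallback_models` key below is hypothetical (check the Config Reference for the actual schema), and only the provider/model names come from the list above:

```toml
# Hypothetical fallback configuration; key names are illustrative only
default_provider = "openrouter"
default_model = "anthropic/claude-sonnet-4-20250514"

# Tried top-to-bottom when the primary model errors or times out
fallback_models = [
  "openai/gpt-4o",
  "google/gemini-2.5-flash",
]
```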
Install community skills from ClawHub:
```bash
rantaiclaw skill install deploy-checker
rantaiclaw skill install code-reviewer
rantaiclaw skill install meeting-summarizer
```

Skills are workspace-scoped markdown files with embedded tools and instructions. Create your own:
```markdown
# SKILL.md — deploy-checker

## Description
Validates deployment readiness before release.

## Tools
- name: run_checks
  kind: shell
  command: ./scripts/pre-deploy.sh

## Instructions
- Always run pre-deploy checks before approving a release
- Report any failing checks with specific remediation steps
```

Run Model Context Protocol servers inside the container for tool integrations:
- GitHub — repositories, issues, PRs, code search
- Slack — channels, messages, users
- Notion — pages, databases, blocks
- Linear — issues, projects, cycles
- Custom — any MCP-compatible server
MCP servers are supervised with automatic restart on crash (exponential backoff, max 5 retries).
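The restart policy can be sketched as a pure delay schedule. This is an illustrative reimplementation, not the actual supervisor code; the 500 ms base delay is an assumption, and only the doubling and the 5-retry cap come from the text above:

```rust
use std::time::Duration;

/// Delay before restart attempt `attempt` (0-based): doubles from a
/// 500 ms base (assumed) and gives up after 5 retries, per the docs.
fn restart_delay(attempt: u32) -> Option<Duration> {
    const MAX_RETRIES: u32 = 5;
    const BASE_MS: u64 = 500;
    if attempt >= MAX_RETRIES {
        return None; // supervisor stops restarting this MCP server
    }
    // 500 ms, 1 s, 2 s, 4 s, 8 s
    Some(Duration::from_millis(BASE_MS << attempt))
}

fn main() {
    for attempt in 0..6 {
        println!("attempt {attempt}: {:?}", restart_delay(attempt));
    }
}
```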
Built-in tools with security boundaries:
| Tool | Description |
|---|---|
| `shell` | Execute commands (sandboxed, allowlist-controlled) |
| `file_read` | Read files from workspace |
| `file_write` | Write files to workspace |
| `web_search` | Search the web |
| `memory_store` | Persist facts to long-term memory |
| `memory_recall` | Query memory by semantic similarity |
| `cron_schedule` | Create/manage scheduled tasks |
| `send_message` | Message coworkers |
| `browser` | Web automation (optional, feature-gated) |
| `composio` | 150+ app integrations via Composio |
| Level | Behavior |
|---|---|
| L1 — Supervised | All tool calls require approval |
| L2 — Assisted | Low-risk tools auto-approved (file_read, memory, web_search) |
| L3 — Autonomous | Most tools auto-approved, dangerous commands blocked |
| L4 — Full | All tools auto-approved, agent operates independently |
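As a rough sketch, the table reduces to a single approval predicate. This is illustrative only (the real policy engine lives in `src/security/` and is more granular); the low-risk tool set mirrors the L2 row, and `dangerous` stands in for whatever command classification the engine applies:

```rust
/// The four autonomy levels from the table above.
#[derive(Clone, Copy)]
enum Autonomy {
    Supervised, // L1
    Assisted,   // L2
    Autonomous, // L3
    Full,       // L4
}

/// Illustrative approval gate: does this tool call run without a human?
fn auto_approved(level: Autonomy, tool: &str, dangerous: bool) -> bool {
    match level {
        // L1: every tool call requires approval
        Autonomy::Supervised => false,
        // L2: only the low-risk tools are auto-approved
        Autonomy::Assisted => matches!(
            tool,
            "file_read" | "memory_store" | "memory_recall" | "web_search"
        ),
        // L3: most tools pass, but dangerous commands are blocked
        Autonomy::Autonomous => !dangerous,
        // L4: the agent operates independently
        Autonomy::Full => true,
    }
}

fn main() {
    println!("{}", auto_approved(Autonomy::Assisted, "web_search", false));
    println!("{}", auto_approved(Autonomy::Autonomous, "shell", true));
}
```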
Multiple backends for persistent agent memory:
- SQLite (default) — zero-config, file-based
- Markdown — human-readable memory files
- PostgreSQL — shared memory across agents (optional)
Memory supports semantic search via embeddings for context-aware recall.
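A minimal sketch of what similarity-based recall looks like, assuming pre-computed embedding vectors; the function names here are illustrative, and the actual backend and embedding model are configurable:

```rust
/// Cosine similarity between two embedding vectors (0.0 for zero vectors).
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

/// Rank stored memories by similarity to the query embedding and
/// return the `top_k` most relevant memory texts.
fn recall<'a>(
    query: &[f32],
    memories: &'a [(&'a str, Vec<f32>)],
    top_k: usize,
) -> Vec<&'a str> {
    let mut scored: Vec<(f32, &str)> = memories
        .iter()
        .map(|(text, emb)| (cosine(query, emb), *text))
        .collect();
    // Descending by similarity; sketch assumes no NaN scores.
    scored.sort_by(|a, b| b.0.partial_cmp(&a.0).unwrap());
    scored.into_iter().take(top_k).map(|(_, text)| text).collect()
}

fn main() {
    let memories = vec![
        ("release checklist", vec![1.0f32, 0.0]),
        ("lunch order", vec![0.0f32, 1.0]),
    ];
    // Query embedding close to the first memory
    println!("{:?}", recall(&[0.9, 0.1], &memories, 1));
}
```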
```bash
curl -fsSL https://raw.githubusercontent.com/RantAI-dev/RantAIClaw/main/scripts/bootstrap.sh | bash
```

Works on Linux (x86_64, aarch64, armv7) and macOS (Intel, Apple Silicon). The installer auto-detects your platform, downloads the latest pre-built binary, verifies its SHA256 checksum, and installs it — no Rust toolchain, no compiler, no git clone.
Windows: native Windows is not yet supported by the installer. Run via WSL2, or download the `x86_64-pc-windows-msvc.zip` manually from the latest release.

PATH: the binary lands in `~/.cargo/bin` (or `~/.local/bin`). If `rantaiclaw --version` says "command not found", the installer prints the exact `export PATH=...` line for your shell.
After installation:
```bash
rantaiclaw --version
rantaiclaw setup    # guided wizard — provider, approvals, channels, persona, skills, MCP
rantaiclaw doctor   # verify the install and surface any gaps
rantaiclaw chat     # start chatting!
```

The legacy `rantaiclaw onboard` command still works as an alias for `rantaiclaw setup` through v0.5.0; new recipes should prefer `setup`.
```bash
rantaiclaw chat                  # Interactive TUI chat session
rantaiclaw setup                 # Guided wizard (or `rantaiclaw setup <topic>` for a single section)
rantaiclaw doctor                # Diagnostics: config, policy, daemon, system deps
rantaiclaw daemon                # Run gateway: HTTP API + multi-channel listeners
rantaiclaw skill install <id>    # Install a community skill from ClawHub
rantaiclaw profile list          # Manage multi-profile configs (v0.5.0+)
rantaiclaw migrate --from auto   # Import config from a legacy OpenClaw / ZeroClaw install
rantaiclaw status                # Verify install and show config health
rantaiclaw config get|set        # Inspect/update runtime config
rantaiclaw --help                # All commands
```

📖 Full install reference → · Troubleshooting → · Releases →
| Method | Command |
|---|---|
| One-liner (recommended) | `curl -fsSL https://raw.githubusercontent.com/RantAI-dev/RantAIClaw/main/scripts/bootstrap.sh \| bash` |
| Manual download | Pick a release archive, verify against `SHA256SUMS`, extract, move into PATH. |
| Build from source | `git clone https://github.com/RantAI-dev/RantAIClaw.git && cd RantAIClaw && ./bootstrap.sh --from-source` |
| Cargo | `cargo install --git https://github.com/RantAI-dev/RantAIClaw --locked` |
| Docker | `docker pull ghcr.io/rantai-dev/rantaiclaw:latest` |
| Bootstrap-managed Docker | `./bootstrap.sh --docker --interactive-onboard` |
| Homebrew (when published) | `brew install rantaiclaw` |
Step-by-step recipes for each (with checksum + cosign verification, feature flags, container persistence): see docs/install.md.
```bash
# Re-run the installer — always pulls the latest release
curl -fsSL https://raw.githubusercontent.com/RantAI-dev/RantAIClaw/main/scripts/bootstrap.sh | bash
```

To uninstall:

```bash
rm -f ~/.cargo/bin/rantaiclaw ~/.local/bin/rantaiclaw
rm -rf ~/.rantaiclaw   # config + workspace (back up first if needed)
```

RantaiClaw uses TOML configuration at `~/.rantaiclaw/config.toml`:
```toml
# Model configuration
default_provider = "openrouter"
default_model = "anthropic/claude-sonnet-4-20250514"
default_temperature = 0.7

# Autonomy
[autonomy]
level = "supervised"
auto_approve = ["file_read", "memory_recall", "web_search"]
workspace_only = true
max_actions_per_hour = 100

# Channels
[channels_config]
cli = true

[channels_config.discord]
bot_token = "..."
guild_id = "..."
mention_only = true

[channels_config.telegram]
bot_token = "..."
allowed_users = ["*"]

# MCP Servers
[mcp_servers.github]
command = "npx"
args = ["-y", "@modelcontextprotocol/server-github"]

[mcp_servers.github.env]
GITHUB_PERSONAL_ACCESS_TOKEN = "ghp_..."

# Gateway
[gateway]
enabled = true
port = 8080
allow_public_bind = true
```

See Config Reference for all options.
```
┌─────────────────────────────────────────────────────┐
│                  RantaiClaw Binary                  │
├──────────┬──────────┬───────────┬───────────────────┤
│ Channels │  Tools   │    MCP    │      Gateway      │
│ Registry │ Registry │ Registry  │   (Config API)    │
├──────────┼──────────┼───────────┼───────────────────┤
│Telegram  │ shell    │ github    │ GET  /config      │
│Discord   │ file_*   │ notion    │ PATCH /config/*   │
│Slack     │ memory_* │ linear    │ GET  /health      │
│WhatsApp  │ cron_*   │ slack     │ POST /webhook     │
│Matrix    │ browser  │ custom    │ GET  /config/     │
│...       │ composio │           │      channels     │
├──────────┴──────────┴───────────┴───────────────────┤
│              Agent Loop (src/agent/)                │
│   System Prompt → LLM → Tool Calls → Response       │
├─────────────────────────────────────────────────────┤
│     Provider Layer (OpenRouter/Anthropic/...)       │
├─────────────────────────────────────────────────────┤
│       Memory (SQLite/Markdown/PostgreSQL)           │
└─────────────────────────────────────────────────────┘
```
| Module | Path | Responsibility |
|---|---|---|
| Agent | `src/agent/` | Orchestration loop, prompt construction |
| Channels | `src/channels/` | Multi-channel communication |
| Tools | `src/tools/` | Tool execution with security boundaries |
| MCP | `src/mcp/` | MCP server process management |
| Gateway | `src/gateway/` | HTTP server, Config API, webhooks |
| Config | `src/config/` | Schema, runtime persistence |
| Memory | `src/memory/` | Multi-backend memory system |
| Security | `src/security/` | Policy engine, pairing, secrets |
| Providers | `src/providers/` | LLM provider adapters |
| Skills | `src/skills/` | Skill loading and execution |
```bash
# Default build (all common channels + tools)
cargo build --release

# With WhatsApp Web support
cargo build --release --features whatsapp-web

# With Matrix E2EE support
cargo build --release --features channel-matrix

# With hardware peripherals (RPi GPIO, Arduino)
cargo build --release --features hardware

# With browser automation
cargo build --release --features browser-native

# With OpenTelemetry observability
cargo build --release --features observability-otel

# Kitchen sink
cargo build --release --features "whatsapp-web,channel-matrix,browser-native,observability-otel"
```

```bash
# Format
cargo fmt --all

# Lint
cargo clippy --all-targets -- -D warnings

# Test
cargo test

# Full CI check
./dev/ci.sh all
```

RantaiClaw is built on the foundation of ZeroClaw, an open-source AI agent runtime. We extend our gratitude to the ZeroClaw community for their pioneering work in Rust-native agent systems.
RantaiClaw adds on top of ZeroClaw:
- Live Config API — runtime configuration changes via HTTP endpoints
- Channel Registry — per-channel lifecycle with graceful shutdown via CancellationToken
- MCP Server Management — stdio-based process supervision with exponential backoff
- Multi-agent orchestration — team communication, cross-employee task delegation and review
- ClawHub integration — skill marketplace discovery and installation
- Digital employee platform — dashboard UI, integration management, deployment automation
- Autonomy levels (L1–L4) — configurable agent independence with tool-level permissions
- Runtime config persistence — `config.runtime.toml` overlay preserving base config
- GitHub Discussions — RantAI-dev/RantAIClaw/discussions
- Issues & Feature Requests — RantAI-dev/RantAIClaw/issues
- ClawHub Skills — clawhub.ai
RantaiClaw is built and maintained by the RantAI team. If this project is useful to you, consider sponsoring to support ongoing development:
Your sponsorship helps fund:
- New channel integrations and MCP server support
- Performance optimization and security hardening
- ClawHub skills ecosystem development
- Documentation and community support
Licensed under the GNU Affero General Public License v3.0 (AGPL-3.0).
Copyright 2025–2026 RantAI.
Built with Rust by RantAI
