
# 🦞 OpenClaw Deploy

Deploy OpenClaw AI agents to Nebius Cloud.

→ Open the Deploy UI at claw.moi

## Choose Your Path

| Path | Method | Inference | Best For |
|------|--------|-----------|----------|
| 1. Local Install | `npm install -g openclaw` | Token Factory | Try it now, zero overhead |
| 2. Docker | `docker run` pre-built image | Token Factory | Portable, reproducible |
| 3. GPU Serverless | NemoClaw on Nebius GPU | Local model | Custom models, data privacy |
| 4. CPU Serverless | OpenClaw on Nebius CPU | Token Factory | Production, always-on |
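Paths 1, 2, and 4 route inference through Token Factory. As a quick sanity check of your key and endpoint, a direct request can be sketched as below; this assumes the API is OpenAI-compatible at the `TOKEN_FACTORY_URL` used in the Docker example, with the same model name.

```bash
# Sketch: build a chat-completions request for Token Factory.
# Assumes an OpenAI-compatible API at TOKEN_FACTORY_URL (not confirmed here).
TOKEN_FACTORY_URL="${TOKEN_FACTORY_URL:-https://api.tokenfactory.nebius.com/v1}"
REQUEST='{"model": "zai-org/GLM-5", "messages": [{"role": "user", "content": "Hello"}]}'
echo "$REQUEST"
# With TOKEN_FACTORY_API_KEY set, send it with:
# curl -s "$TOKEN_FACTORY_URL/chat/completions" \
#   -H "Authorization: Bearer $TOKEN_FACTORY_API_KEY" \
#   -H "Content-Type: application/json" \
#   -d "$REQUEST"
```

A `200` response with a `choices` array indicates the key and endpoint are working.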

## Quick Start

Path 1 — Install locally (30 seconds):

```bash
npm install -g openclaw
export TOKEN_FACTORY_API_KEY={your-key}
openclaw init && openclaw gateway --bind loopback --auth token
```

Path 2 — Docker (2 minutes):

```bash
docker run -e TOKEN_FACTORY_API_KEY={your-key} \
  -e TOKEN_FACTORY_URL=https://api.tokenfactory.nebius.com/v1 \
  -e INFERENCE_MODEL=zai-org/GLM-5 \
  -e OPENCLAW_WEB_PASSWORD={your-password} \
  -p 8080:8080 -p 18789:18789 \
  ghcr.io/colygon/openclaw-serverless:latest
```
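For repeatable local runs, the same container can be expressed as a Docker Compose file. This is a sketch: the service name and file layout are illustrative, and the environment variables simply mirror the `docker run` flags above.

```yaml
# docker-compose.yml (illustrative)
services:
  openclaw:
    image: ghcr.io/colygon/openclaw-serverless:latest
    environment:
      TOKEN_FACTORY_API_KEY: ${TOKEN_FACTORY_API_KEY}
      TOKEN_FACTORY_URL: https://api.tokenfactory.nebius.com/v1
      INFERENCE_MODEL: zai-org/GLM-5
      OPENCLAW_WEB_PASSWORD: ${OPENCLAW_WEB_PASSWORD}
    ports:
      - "8080:8080"
      - "18789:18789"
```

Start it with `docker compose up -d`; the secrets are read from your shell environment rather than committed to the file.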

Path 4 — Nebius CPU Serverless (3 minutes):

```bash
export TOKEN_FACTORY_API_KEY={your-key}
./install-openclaw-serverless.sh
```
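Before running the installer, a small pre-flight check can catch a missing key early. This is a sketch, not part of the install script itself:

```bash
# Sketch: warn up front if the Token Factory key is missing,
# rather than letting the installer fail mid-deploy.
if [ -z "${TOKEN_FACTORY_API_KEY:-}" ]; then
  KEY_STATUS=missing
  echo "warning: TOKEN_FACTORY_API_KEY is not set" >&2
else
  KEY_STATUS=ok
fi
echo "key check: $KEY_STATUS"
```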

## Screenshots

*Deploy UI: Create Agent Endpoints and Create Endpoints views.*

## What's Included

- **Deploy UI**: browser-based deployment wizard with endpoint management
- **Install Scripts**: one-command deploy to Nebius serverless
- **Docker Images**: pre-built public images on GHCR
- **Setup Guide**: comprehensive Nebius configuration guide

## Public Docker Images

```bash
docker pull ghcr.io/colygon/openclaw-serverless:latest   # ~400 MB
docker pull ghcr.io/colygon/nemoclaw-serverless:latest   # ~1.1 GB
```

## Related

- nebius-skill — Claude Code skill for managing Nebius infrastructure
- OpenClaw — the open-source AI agent platform
- Token Factory — Nebius-managed GPU inference API

## License

MIT
