OpenCode Manager

Mobile-first web interface for OpenCode AI agents. Manage, control, and code with OpenCode from any device - your phone, tablet, or desktop. Features Git integration, file management, and real-time chat in a responsive PWA. Deploy with Docker for instant setup. View diffs, edit files and much more.

Demo

Responsive Dashboard

Responsive Dashboard - Mobile, Tablet, Desktop

Responsive design: The dashboard adapts seamlessly from mobile to desktop. Recent Sessions and Repositories use consistent card styling with hover effects and grid layouts.

Talk Mode (Voice-to-Code)

Talk Mode Demo - Voice-to-Code workflow

Voice-powered coding: Speak your request, get AI-generated code. The demo shows the Talk Mode E2E test - voice input is transcribed via Whisper STT, sent to OpenCode AI, and the response streams back in real-time.

Feature Tour

Feature Tour - Dashboard, Search, Settings, Chat

Key features: Dashboard overview, global search (Cmd+K), voice settings (TTS/STT/Talk Mode), Cloudflare tunnel status with metrics, keyboard shortcuts, OpenCode config, AI providers, and chat sessions.

Why This Fork?

This fork of opencode-manager adds production-ready voice features, cloud deployment, and remote access:

| Feature | This Fork | Upstream |
| --- | --- | --- |
| Built-in STT | ✅ Faster Whisper server (local, no API key) | ❌ External API only |
| Built-in TTS | ✅ Coqui + Chatterbox (7+ models, local) | ❌ External API only |
| Browser Voice | ✅ Web Speech API fallback | ✅ Web Speech API |
| System Service | ✅ install-service for macOS/Linux | ❌ Manual startup |
| Status Command | ✅ YAML health output with tunnel URL | ❌ Not included |
| Cloudflare Tunnel | ✅ Built-in, auto-starts with service | ❌ Not included |
| Tunnel Metrics UI | ✅ Live connection stats + logs in Settings | ❌ Not included |
| Session Pruning | ✅ Auto-cleanup of old sessions | ❌ Manual cleanup |
| Log Rotation | ✅ Auto-rotate logs (5MB limit) | ❌ Not included |
| Cloud Deploy | ✅ One-command Azure deployment | ❌ Not included |
| Basic Auth | ✅ Caddy proxy with auth | ❌ Not included |
| E2E Voice Tests | ✅ Browser + API tests | ❌ Not included |
| Large Output Fix | ✅ Context overflow prevention | ❌ Uses official OpenCode |

Quick Start:

# Install globally
bun install -g github:dzianisv/opencode-manager

# Run as system service (auto-starts on boot, includes tunnel)
opencode-manager install-service

# Or run manually
opencode-manager start

# Access from anywhere via Cloudflare tunnel URL
cat ~/.local/run/opencode-manager/endpoints.json

Why We Use a Fork of OpenCode

This project builds OpenCode from VibeTechnologies/opencode, a fork of the official sst/opencode repository. We maintain this fork to include critical fixes that haven't yet been merged upstream.

Current Fork Enhancements

File Persistence for Large Tool Outputs (PR #6234)

The official OpenCode has a known issue where large tool outputs (WebFetch, Bash, MCP tools) can overflow the context window, causing:

  • "prompt is too long" errors (e.g., 202744 tokens > 200000 maximum)
  • Sessions becoming stuck/unresponsive
  • Loss of work when context overflows mid-conversation

Our fork includes the fix from PR #6234 which implements intelligent file persistence:

  • Tool outputs exceeding 30,000 characters are saved to disk instead of the context
  • The AI model receives a file path with instructions to explore the data using Read/Grep/jq
  • Context stays small, preventing overflow errors
  • Files are automatically cleaned up when sessions are deleted

This fix is essential for production use cases where AI agents frequently fetch documentation, analyze large codebases, or work with verbose tool outputs.
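The idea can be illustrated with a short sketch. This is not the fork's actual code (the real change lives in packages/opencode/src/session/prompt.ts); it is a minimal illustration of the persist-and-point pattern described above, with the helper name and temp-file layout invented for the example:

// Illustrative only: persist an oversized tool result to disk and hand the
// model a short pointer message instead of the raw text.
import { mkdtempSync, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

// Threshold from the description above: outputs larger than this are not
// placed into the model context directly.
const MAX_INLINE_CHARS = 30_000;

function persistLargeToolOutput(toolName: string, output: string): string {
  if (output.length <= MAX_INLINE_CHARS) return output;

  const dir = mkdtempSync(join(tmpdir(), "opencode-tool-output-"));
  const file = join(dir, `${toolName}.txt`);
  writeFileSync(file, output, "utf8");

  return [
    `Tool output was ${output.length} characters and was saved to ${file}.`,
    `Inspect it with Read, Grep, or jq instead of loading it into context.`,
  ].join("\n");
}

The key design point is that the context only ever carries the file path and a short instruction, so even multi-megabyte fetches cannot push a session past the token limit.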

Implementation Details:

  1. VibeTechnologies/opencode fork (branch: dev) contains two fixes:

    • Large tool outputs (>30k chars) are saved to disk instead of context (packages/opencode/src/session/prompt.ts)
    • Auto-allow read access to OpenCode storage directory to avoid permission prompts for reading saved tool results (packages/opencode/src/tool/read.ts)
  2. opencode-manager deploys the fork at container startup via:

    • docker-compose.yml - OPENCODE_FORK_REPO and OPENCODE_FORK_BRANCH env vars
    • scripts/docker-entrypoint.sh - install_from_fork() function

Test Results (all 3 integration tests pass):

  • 883,082 character output saved to file successfully
  • No retry loop / sessions didn't get stuck
  • Sessions can continue conversation after context-heavy operations

Staying Up-to-Date

We regularly sync our fork with upstream sst/opencode to incorporate new features and fixes. Once PR #6234 is merged upstream, we plan to switch back to the official release.

Features

Repository Management

  • Multi-Repository Support - Clone and manage multiple git repos/worktrees in local workspaces
  • Private Repository Support - GitHub PAT configuration for cloning private repos
  • Worktree Support - Create and manage Git worktrees for working on multiple branches

Git Integration

  • Git Diff Viewer - View file changes with unified diff, line numbers, and addition/deletion counts
  • Git Status Panel - See all uncommitted changes (modified, added, deleted, renamed, untracked)
  • Branch Switching - Switch between branches via dropdown
  • Branch/Worktree Creation - Create new branch workspaces from any repository
  • Ahead/Behind Tracking - Shows commits ahead/behind remote
  • Push PRs to GitHub - Create and push pull requests directly from your phone

File Browser

  • Directory Navigation - Browse files and folders with tree view
  • File Search - Search files within directories
  • Syntax Highlighting - Code preview with syntax highlighting
  • File Operations - Create files/folders, rename, delete
  • Drag-and-Drop Upload - Upload files by dragging into the browser
  • Large File Support - Virtualization for large files
  • ZIP Download - Download repos as ZIP excluding gitignored files

Chat & Session Features

  • Slash Commands - Built-in commands (/help, /new, /models, /export, /compact, etc.)
  • Custom Commands - Create custom slash commands with templates
  • File Mentions - Reference files with @filename autocomplete
  • Plan/Build Mode Toggle - Switch between read-only and file-change modes
  • Mermaid Diagram Support - Visual diagram rendering in chat messages
  • Session Management - Create, search, delete, and bulk delete sessions
  • Real-time Streaming - Live message streaming with SSE
  • CLI Session Sharing - Sessions created in terminal opencode CLI are visible in Web UI

AI Model & Provider Configuration

  • Model Selection - Browse and select from available AI models with filtering
  • Provider Management - Configure multiple AI providers with API keys or OAuth
  • OAuth Authentication - Secure OAuth login for supported providers (Anthropic, GitHub Copilot)
  • Context Usage Indicator - Visual progress bar showing token usage
  • Agent Configuration - Create custom agents with system prompts and tool permissions

MCP Server Management

  • MCP Server Configuration - Add local (command-based) or remote (HTTP) MCP servers
  • Server Templates - Pre-built templates for common MCP servers
  • Enable/Disable Servers - Toggle servers on/off with auto-restart

Scheduled Tasks

  • Cron Job Scheduling - Schedule recurring tasks with full cron expression support
  • Task Management - Create, update, delete, pause/resume tasks from the UI
  • Command Types - Run OpenCode skills, send messages to OpenCode, or execute scripts
  • Run Now - Manually trigger any scheduled task immediately
  • Status Tracking - View last run time, next scheduled run, and task status
  • Preset Schedules - Quick options for common schedules (hourly, daily, weekly)

Settings & Customization

  • Theme Selection - Dark, Light, or System theme
  • Keyboard Shortcuts - Customizable keyboard shortcuts
  • OpenCode Config Editor - Raw JSON editor for advanced configuration

Mobile & PWA

  • Mobile-First Design - Responsive UI optimized for mobile use
  • PWA Support - Installable as Progressive Web App
  • iOS Keyboard Support - Proper keyboard handling on iOS
  • Enter Key Send - Press Enter to automatically close keyboard and send messages
  • Swipe-to-Navigate - Swipe right from left edge to navigate back

Text-to-Speech (TTS)

  • Dual Provider Support - Browser-native Web Speech API + external OpenAI-compatible endpoints
  • Browser-Native TTS - Built-in Web Speech API for instant playback without API keys
  • Coqui TTS with Multi-Model Support - 7+ high-quality English voice models with runtime switching:
    • Jenny (default, fastest)
    • LJSpeech VITS, Tacotron2, Glow-TTS, FastPitch
    • VCTK VITS (109 multi-speaker voices)
    • XTTS v2 (multilingual voice cloning)
  • AI Message Playback - Listen to assistant responses with TTS
  • OpenAI-Compatible - Works with any OpenAI-compatible TTS endpoint
  • Voice & Speed Discovery - Automatic voice detection with caching (1hr TTL)
  • Voice & Speed Controls - Configurable voice selection and playback speed
  • Audio Caching - 24-hour cache with 200MB limit for performance
  • Markdown Sanitization - Filters unreadable symbols for smooth playback
  • Floating Controls - Persistent stop button for audio control
  • Custom Endpoints - Connect to local or self-hosted TTS services
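
As an illustration of the OpenAI-compatible contract, a request to a custom endpoint might look like the sketch below. The base URL, model name, and voice name are placeholders for whatever your TTS service exposes; only the /v1/audio/speech request shape follows the OpenAI convention:

// Hypothetical call to a self-hosted, OpenAI-compatible TTS endpoint.
// Replace the base URL, model, and voice with values from your own service.
import { writeFileSync } from "node:fs";

const response = await fetch("http://localhost:8000/v1/audio/speech", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "tts-1",                          // placeholder model id
    voice: "jenny",                          // placeholder voice name
    input: "The build finished without errors.",
  }),
});

// Save the returned audio to disk.
writeFileSync("speech.mp3", Buffer.from(await response.arrayBuffer()));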

Session Management

  • Session Pruning - Automatic cleanup of old sessions to save disk space
  • Auto-Prune on Startup - Configurable retention period (default: 30 days)
  • Bulk Delete - Delete multiple sessions at once from the UI

Log Management

  • Cloudflare Tunnel Logs - Logs surfaced in UI Settings → Tunnel tab
  • Log Rotation - Automatic rotation at 5MB with up to 3 backups
  • Runtime Maintenance - Log files are checked every 5 minutes and rotated as needed to prevent disk bloat

QA Testing System

  • Autonomous AI Testing - OpenCode AI agent can autonomously test the entire application
  • Quick Test Commands - Run health, API, auth, tunnel, Docker, and E2E tests with one command
  • CI/CD Ready - Integration-ready for GitHub Actions and other CI/CD pipelines
  • Comprehensive Coverage - Tests server startup, API endpoints, authentication, tunnels, Docker deployment, and more

Use @qa-tester in OpenCode or run scripts/qa-test.sh for quick tests.

Screenshots

Files (Mobile) · Files (Desktop)
Chat (Mobile) · Chat (Desktop)
Inline Diff View

Coming Soon

  • Authentication - User authentication and session management

Installation

Option 1: npm/bun (Recommended for Local Use)

Install OpenCode Manager as a global CLI tool directly from GitHub:

# Install with bun (recommended)
bun install -g github:dzianisv/opencode-manager

# Or with npm
npm install -g github:dzianisv/opencode-manager

Prerequisites:

  • Bun installed (for running the CLI)
  • OpenCode installed: curl -fsSL https://opencode.ai/install | bash
  • (Optional) cloudflared for tunnel mode: brew install cloudflared

CLI Commands:

# Start the server
opencode-manager start

# Start with Cloudflare tunnel for remote access
opencode-manager start --tunnel

# Connect to an existing opencode instance
opencode-manager start --client

# Install as a user service (runs on login, tunnel enabled by default)
opencode-manager install-service

# Install as a service without tunnel (local only)
opencode-manager install-service --no-tunnel

# Check service status (YAML output with health checks)
opencode-manager status

# View service logs
opencode-manager logs

# Uninstall the service
opencode-manager uninstall-service

# Show help
opencode-manager help

Status Command Output:

The status command provides comprehensive health information in YAML format:

status: healthy
port: 5001

backend:
  status: healthy
  database: ok
  opencode: connected
  opencode_version: 1.2.0

stt:
  status: running
  model: ggml-small-q5_1
  port: 5552

tts:
  status: running
  provider: coqui
  model: tts_models/en/jenny/jenny

tunnel:
  status: connected
  url: https://admin:password@xxx.trycloudflare.com
  edge_location: San Francisco, CA

Service Installation:

The install-service command installs OpenCode Manager as a user-level service that starts automatically on login:

  • macOS: Creates a launchd plist at ~/Library/LaunchAgents/com.opencode-manager.plist
  • Linux: Creates a systemd user service at ~/.config/systemd/user/opencode-manager.service

Configuration Files:

All configuration is stored in ~/.local/run/opencode-manager/:

| File | Description |
| --- | --- |
| auth.json | Basic auth credentials ({"username": "admin", "password": "..."}) |
| endpoints.json | Active endpoints (local URL and tunnel URL if enabled) |
| cloudflared.log | Cloudflare tunnel logs (auto-rotated at 5MB) |
| stdout.log | Service stdout (macOS only) |
| stderr.log | Service stderr (macOS only) |

On first run, credentials are automatically generated and saved. Use these to authenticate when accessing the web UI.

Option 2: Docker (Recommended for Servers)

# Simple one-liner
docker run -d -p 5003:5003 -v opencode-workspace:/workspace ghcr.io/dzianisv/opencode-manager

# Or with API keys
docker run -d -p 5003:5003 \
  -e ANTHROPIC_API_KEY=sk-... \
  -v opencode-workspace:/workspace \
  ghcr.io/dzianisv/opencode-manager

Access the application at http://localhost:5003

With Docker Compose (for persistent volumes and env vars):

git clone https://github.com/dzianisv/opencode-manager.git
cd opencode-manager

# Configure API keys (optional)
echo "ANTHROPIC_API_KEY=sk-..." > .env

# Start
docker compose up -d

The Docker setup automatically:

  • Installs OpenCode CLI on first run
  • Starts Whisper (STT) and Chatterbox (TTS) servers
  • Sets up persistent volumes for workspace and database

Docker Commands:

docker compose up -d        # Start
docker compose down         # Stop
docker compose logs -f      # View logs
docker compose restart      # Restart
docker exec -it opencode-manager sh  # Shell access

Dev Server Ports

The Docker container exposes ports 5100-5103 for running dev servers inside your repositories. Configure your project's dev server to use one of these ports and access it directly from your browser.

Example usage:

# Vite (vite.config.ts)
server: { port: 5100, host: '0.0.0.0' }

# Next.js
next dev -p 5100 -H 0.0.0.0

# Express/Node
app.listen(5100, '0.0.0.0')

Access your dev server at http://localhost:5100 (or your Docker host IP).
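
For Vite specifically, the one-line server setting above corresponds to a config file roughly like this (a minimal sketch; adapt the port to whichever of 5100-5103 you want to expose):

// vite.config.ts - minimal example for running a dev server inside the container
import { defineConfig } from "vite";

export default defineConfig({
  server: {
    port: 5100,       // one of the exposed dev-server ports (5100-5103)
    host: "0.0.0.0",  // bind to all interfaces so Docker can forward the port
  },
});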

To customize the exposed ports, edit docker-compose.yml:

ports:
  - "5003:5003"      # OpenCode Manager
  - "5100:5100"      # Dev server 1
  - "5101:5101"      # Dev server 2
  - "5102:5102"      # Dev server 3
  - "5103:5103"      # Dev server 4

Global Agent Instructions (AGENTS.md)

OpenCode Manager creates a default AGENTS.md file in the workspace config directory (/workspace/.config/opencode/AGENTS.md). This file provides global instructions to AI agents working within the container.

Default instructions include:

  • Reserved ports (5003 for OpenCode Manager, 5551 for OpenCode server)
  • Available dev server ports (5100-5103)
  • Guidelines for binding to 0.0.0.0 for Docker accessibility

Editing AGENTS.md:

  • Via UI: Settings > OpenCode > Global Agent Instructions
  • Via file: Edit /workspace/.config/opencode/AGENTS.md directly

This file is merged with any repository-specific AGENTS.md files, with repository instructions taking precedence for their respective codebases.

Option 3: Azure VM Deployment (Quick Start)

Deploy OpenCode Manager to an Azure VM with a single command. Includes automatic HTTPS via Cloudflare tunnel and Basic Auth protection.

Prerequisites:

  • Azure CLI installed and logged in (az login)
  • Bun installed
  • SSH keys configured (~/.ssh/id_rsa.pub)

Quick Deploy:

# Clone the repository
git clone https://github.com/dzianisv/opencode-manager.git
cd opencode-manager

# Install dependencies
bun install

# Deploy to Azure (creates VM, configures Docker, sets up tunnel)
bun run scripts/deploy.ts

The script will:

  1. Create an Azure resource group and VM (Standard_D2s_v5 by default)
  2. Install Docker and deploy OpenCode Manager
  3. Set up Caddy reverse proxy with Basic Auth
  4. Create a Cloudflare tunnel for HTTPS access
  5. Enable YOLO mode (auto-approve all AI permissions)

After deployment, you'll receive:

  • Tunnel URL: https://xxx-xxx.trycloudflare.com
  • Username: admin (default)
  • Password: Auto-generated or prompted

Environment Variables (optional):

Create a .env file before deploying to configure:

# Basic Auth
AUTH_USERNAME=admin
AUTH_PASSWORD=your-secure-password

# Azure Configuration
AZURE_LOCATION=westus2
AZURE_VM_SIZE=Standard_D2s_v5

# GitHub Token (for cloning private repos)
GITHUB_TOKEN=ghp_xxx

# AI Provider Keys (optional - can also configure via OAuth in UI)
ANTHROPIC_API_KEY=sk-ant-xxx
OPENAI_API_KEY=sk-xxx
GEMINI_API_KEY=xxx

# OpenCode Fork (for context overflow fix - default)
OPENCODE_FORK_REPO=VibeTechnologies/opencode
OPENCODE_FORK_BRANCH=dev

Deployment Commands:

# Deploy new VM
bun run scripts/deploy.ts

# Check status (shows tunnel URL, credentials, container status)
bun run scripts/deploy.ts --status

# Update to latest code (pulls from GitHub, rebuilds containers)
bun run scripts/deploy.ts --update

# Sync local OpenCode auth to VM (GitHub Copilot, Anthropic OAuth)
bun run scripts/deploy.ts --sync-auth

# Update environment variables
bun run scripts/deploy.ts --update-env

# Change Basic Auth password
bun run scripts/deploy.ts --update-auth

# Re-enable YOLO mode (auto-approve permissions)
bun run scripts/deploy.ts --yolo

# Destroy all Azure resources
bun run scripts/deploy.ts --destroy

Syncing Authentication:

If you have GitHub Copilot or Anthropic OAuth configured locally, sync it to your VM:

# First, authenticate locally with OpenCode
opencode
/connect github-copilot

# Then sync to your Azure VM
bun run scripts/deploy.ts --sync-auth

SSH Access:

# Get VM IP and SSH command
bun run scripts/deploy.ts --status

# SSH into VM
ssh azureuser@<VM_IP>

# View container logs
ssh azureuser@<VM_IP> "sudo docker logs opencode-manager -f"

Cost Estimate:

  • Standard_D2s_v5 (2 vCPU, 8GB RAM): ~$70/month
  • Use --destroy when not in use to avoid charges

Option 4: Native Local Development (macOS)

Run OpenCode Manager natively on macOS without Docker. This is ideal for development or when you want the web UI to connect to an existing OpenCode instance running in your terminal.

Prerequisites:

  • Bun installed
  • Node.js installed (for frontend)
  • OpenCode installed: curl -fsSL https://opencode.ai/install | bash
  • (Optional) cloudflared for tunnel mode: brew install cloudflared

Quick Start:

# Clone the repository
git clone https://github.com/dzianisv/opencode-manager.git
cd opencode-manager

# Install dependencies
pnpm install

# Copy environment configuration
cp .env.local.example .env

# Start with Cloudflare tunnel (spawns opencode serve + creates public URL)
pnpm start

# Or connect to an existing opencode instance with tunnel
pnpm start:client

# Or start without tunnel (local only)
pnpm start:no-tunnel

Available Commands:

| Command | Description |
| --- | --- |
| pnpm start | Start with Cloudflare tunnel - spawns opencode serve + public URL |
| pnpm start:client | Connect to an existing opencode instance with tunnel |
| pnpm start:no-tunnel | Start without tunnel (local only) |
| bun scripts/start-native.ts --help | Show all available options |
| pnpm tunnel:start | Start persistent Cloudflare tunnel (survives backend restarts) |
| pnpm tunnel:stop | Stop the persistent tunnel |
| pnpm tunnel:status | Check tunnel status and get URL |
| pnpm cleanup | Kill orphaned processes on managed ports (does NOT kill tunnel) |

How Local Services Work

When running OpenCode Manager locally, several services work together:

┌─────────────────────────────────────────────────────────────────┐
│                     Your Browser/Mobile Device                   │
└─────────────────────────────────────────────────────────────────┘
                              │
                              ▼
┌─────────────────────────────────────────────────────────────────┐
│              Cloudflare Tunnel (optional, persistent)            │
│              https://xxx.trycloudflare.com                       │
│              Managed by: pnpm tunnel:start/stop                  │
└─────────────────────────────────────────────────────────────────┘
                              │
                              ▼
┌─────────────────────────────────────────────────────────────────┐
│                    Backend Server (port 5001)                    │
│                    Bun + Hono REST API                           │
│                                                                  │
│  • /api/health      - Health check                               │
│  • /api/repos       - Repository management                      │
│  • /api/settings    - User preferences                           │
│  • /api/stt/*       - Speech-to-text (proxies to Whisper)        │
│  • /api/tts/*       - Text-to-speech                             │
│  • /opencode/*      - Proxies to OpenCode server                 │
└─────────────────────────────────────────────────────────────────┘
                              │
              ┌───────────────┼───────────────┐
              ▼               ▼               ▼
┌───────────────────┐ ┌───────────────┐ ┌───────────────────────┐
│  OpenCode Server  │ │ Whisper STT   │ │  SQLite Database      │
│  (port 5551)      │ │ (port 5552)   │ │  ~/.local/run/        │
│                   │ │               │ │  opencode-manager/    │
│  AI agent runtime │ │ Speech-to-    │ │  data.db              │
│  Session mgmt     │ │ text server   │ │                       │
│  Tool execution   │ │ (auto-start)  │ │  Stores settings,     │
│                   │ │               │ │  auth, preferences    │
└───────────────────┘ └───────────────┘ └───────────────────────┘

Service Responsibilities:

| Service | Port | Description |
| --- | --- | --- |
| Backend | 5001 | Main API server - handles web requests, proxies to OpenCode, manages settings |
| OpenCode | 5551 | AI agent runtime - executes tools, manages sessions, interfaces with AI providers |
| Whisper STT | 5552 | Speech-to-text server - transcribes voice input (auto-starts when needed) |
| Tunnel | - | Persistent Cloudflare tunnel for remote access (runs independently) |
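
The backend endpoints listed above can be exercised directly using the credentials from auth.json (see Configuration Files). The sketch below assumes the default backend port 5001 and prints the raw response, since its exact shape isn't documented here:

// Read the generated Basic Auth credentials and probe the backend health endpoint.
import { readFileSync } from "node:fs";
import { homedir } from "node:os";
import { join } from "node:path";

const authPath = join(homedir(), ".local/run/opencode-manager/auth.json");
const { username, password } = JSON.parse(readFileSync(authPath, "utf8"));

const res = await fetch("http://localhost:5001/api/health", {
  headers: {
    Authorization: "Basic " + Buffer.from(`${username}:${password}`).toString("base64"),
  },
});

console.log(res.status, await res.text());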

Data Storage:

All persistent data is stored in ~/.local/run/opencode-manager/:

| File | Description |
| --- | --- |
| data.db | SQLite database (settings, preferences, cached data) |
| auth.json | Basic auth credentials |
| endpoints.json | Active endpoints (local URL, tunnel URL) |
| tunnel.json | Tunnel state (PID, URL, port) |
| tunnel.pid | Tunnel process ID |
| stdout.log | Service stdout (macOS launchd only) |
| stderr.log | Service stderr (macOS launchd only) |

Client Mode:

When using --client mode, the script will:

  1. Scan for running opencode processes using lsof
  2. Check health via /doc endpoint on each discovered port
  3. Fetch version info from /global/health
  4. List all healthy instances with directory, version, and PID
  5. Let you select which instance to connect to

$ pnpm start:client

╔═══════════════════════════════════════╗
║   OpenCode Manager - Native Start     ║
╚═══════════════════════════════════════╝

🔍 Searching for running opencode servers...

📋 Found multiple opencode servers:

  [1] Port 5551
      Directory: /Users/you/project-a
      Version: 1.1.2
      PID: 12345

  [2] Port 61782
      Directory: /Users/you/project-b
      Version: 1.0.223
      PID: 67890

Select server [1]: 

This is useful when you already have opencode running in a terminal and want the web UI to connect to it without spawning a separate server.
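
The discovery steps above can be approximated with a small probe. This is an illustration, not the start-native.ts implementation; the JSON shape returned by /global/health is assumed, and the port list is just an example:

// Probe a candidate port: /doc answers if an opencode server is listening,
// and /global/health is queried for version details (response shape assumed).
async function probeOpencode(port: number) {
  const base = `http://127.0.0.1:${port}`;
  try {
    const doc = await fetch(`${base}/doc`);
    if (!doc.ok) return null;
    const health = await fetch(`${base}/global/health`);
    const info = health.ok ? await health.json() : {};
    return { port, ...info };
  } catch {
    return null; // nothing listening on this port
  }
}

for (const port of [5551, 61782]) {   // example ports from the output above
  console.log(await probeOpencode(port));
}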

Session Sharing Between CLI and Web UI

OpenCode Manager shares sessions with the opencode CLI. Sessions you create in your terminal are visible in the web UI, and vice versa.

How it works:

Both the CLI and Web UI store sessions in ~/.local/share/opencode/. When you:

  1. Create a session in CLI (opencode in terminal) → Visible in Web UI immediately
  2. Create a session in Web UI → Visible in CLI with /sessions command
  3. Continue a session → Changes sync automatically between both interfaces

Requirements:

  • Both must use the same OpenCode data directory (~/.local/share/opencode/)
  • When running as a service, OpenCode Manager uses the system default location
  • The --client mode connects to your existing terminal sessions

Verification:

# Check sessions visible to CLI
opencode
/sessions

# Check sessions visible to Web UI
curl -s -u admin:PASSWORD http://localhost:5001/api/opencode/session | jq 'length'

# Both should show the same count

Persistent Tunnel (Recommended for Remote Development):

The Cloudflare tunnel can run as a persistent background process that survives backend restarts:

# Start tunnel once (persists until explicitly stopped)
pnpm tunnel:start

# Check tunnel status and get URL (includes health check)
pnpm tunnel:status

# Check if tunnel URL is reachable
bun scripts/tunnel.ts health

# Now you can restart backend freely without losing tunnel connection
pnpm dev:backend  # Ctrl+C and restart as needed

# Stop tunnel when done
pnpm tunnel:stop

The tunnel state is stored in ~/.local/run/opencode-manager/tunnel.json. The endpoints are stored in ~/.local/run/opencode-manager/endpoints.json.

Benefits:

  • Restart backend without disconnecting mobile/remote users
  • Same tunnel URL persists across backend restarts
  • pnpm cleanup does NOT kill the tunnel
  • Health check verifies tunnel URL is actually reachable
  • Automatically updates endpoints.json when tunnel starts

Named Tunnels (Persistent URLs):

Quick tunnels generate random URLs that change on each restart. For a persistent URL, use a Cloudflare named tunnel:

# 1. Login to Cloudflare (one-time)
cloudflared tunnel login

# 2. Create a named tunnel (one-time)
cloudflared tunnel create opencode-manager
# Note the tunnel ID (UUID) from output

# 3. Create config file ~/.cloudflared/config.yml
tunnel: <TUNNEL_ID>
credentials-file: /Users/$USER/.cloudflared/<TUNNEL_ID>.json
ingress:
  - hostname: opencode.yourdomain.com
    service: http://localhost:5001
  - service: http_status:404

# 4. Add DNS record in Cloudflare dashboard
# CNAME: opencode -> <TUNNEL_ID>.cfargotunnel.com

# 5. Run the named tunnel
cloudflared tunnel run opencode-manager

With named tunnels, opencode.yourdomain.com will always point to your instance.

Without Tunnel (Local Only):

# Start without tunnel
pnpm start:no-tunnel

# Or connect to existing instance without tunnel
bun scripts/start-native.ts --client

Custom Port:

# Use a different backend port
bun scripts/start-native.ts --port 3000
bun scripts/start-native.ts --client --port 3000

Option 5: Local Development (Hot Reload)

# Clone the repository
git clone https://github.com/dzianisv/opencode-manager.git
cd opencode-manager

# Install dependencies (uses Bun workspaces)
bun install

# Copy environment configuration
cp .env.example .env

# Start development servers (backend + frontend)
npm run dev

Testing

The project includes a comprehensive QA testing system with autonomous AI testing capabilities.

Quick Testing

Run tests using the provided command script:

# Health check (quick verification)
scripts/qa-test.sh health

# API endpoint tests
scripts/qa-test.sh api

# Authentication tests
scripts/qa-test.sh auth

# Cloudflare tunnel tests
scripts/qa-test.sh tunnel

# Docker deployment tests
scripts/qa-test.sh docker

# E2E test suite
scripts/qa-test.sh e2e

# Run all tests
scripts/qa-test.sh full

# Test remote deployment
scripts/qa-test.sh health https://your-deployment.com

Autonomous AI Testing

Use OpenCode slash commands for quick testing:

/qa-test       # Run comprehensive QA tests
/qa-health     # Quick health check

Or mention the QA agent directly:

"@qa-tester run a full test suite and report any issues"
"Test the application and generate a comprehensive report"

The AI agent will autonomously:

  1. Execute all test protocols
  2. Evaluate results against expected outputs
  3. Generate a professional test report with metrics
  4. Identify issues and provide recommendations

Available Tests

  • ✅ Development server startup and health
  • ✅ Backend API endpoints (health, repos, settings, OpenCode proxy)
  • ✅ Authentication (with/without credentials, valid/invalid)
  • ✅ Cloudflare tunnel (startup, URL generation, public access)
  • ✅ Docker deployment (build, run, health checks, volumes)
  • ✅ E2E test suite (voice, talk mode, browser automation)
  • ✅ Database integrity
  • ✅ Git operations
  • ✅ Performance metrics
  • ✅ Security validation

CI/CD Integration

The QA system can be integrated into GitHub Actions:

- name: Run QA Tests
  run: |
    scripts/qa-test.sh full

See the QA agent at .opencode/agent/qa-tester.md for detailed test protocols.

OAuth Provider Setup

OpenCode Manager supports OAuth authentication for select providers, offering a more secure and convenient alternative to API keys.

Supported OAuth Providers

  • Anthropic (Claude) - OAuth login with Claude Pro/Max accounts
  • GitHub Copilot - OAuth device flow authentication

Setting Up OAuth

  1. Navigate to Settings → Provider Credentials
  2. Select a provider that shows the "OAuth" badge
  3. Click "Add OAuth" to start the authorization flow
  4. Choose authentication method:
    • "Open Authorization Page" - Opens browser for sign-in
    • "Use Authorization Code" - Provides code for manual entry
  5. Complete authorization in the browser or enter the provided code
  6. Connection status will show as "Configured" when successful

E2E Testing with CI-Built Images

  1. scripts/run-local-docker.sh - Pulls and runs the CI-built Docker image from GHCR locally: ./scripts/run-local-docker.sh
  2. scripts/run-e2e-tests.ts - Runs all E2E tests against a running instance: bun run scripts/run-e2e-tests.ts --url http://localhost:5003
  3. AGENTS.md - Documents the E2E testing workflow with CI-built images: a push to main triggers docker-build.yml in GitHub Actions, which builds the Docker image and pushes it to GHCR. On your local machine, ./scripts/run-local-docker.sh pulls that image and runs the container on port 5003, and bun run scripts/run-e2e-tests.ts then runs the Voice E2E, Talk Mode API, and Talk Mode Browser tests against it.
