Mobile-first web interface for OpenCode AI agents. Manage, control, and code with OpenCode from any device - phone, tablet, or desktop. Features Git integration, file management, and real-time chat in a responsive PWA. Deploy with Docker for instant setup. View diffs, edit files, and much more.
Responsive design: The dashboard adapts seamlessly from mobile to desktop. Recent Sessions and Repositories use consistent card styling with hover effects and grid layouts.
Voice-powered coding: Speak your request, get AI-generated code. The demo shows the Talk Mode E2E test - voice input is transcribed via Whisper STT, sent to OpenCode AI, and the response streams back in real-time.
Key features: Dashboard overview, global search (Cmd+K), voice settings (TTS/STT/Talk Mode), Cloudflare tunnel status with metrics, keyboard shortcuts, OpenCode config, AI providers, and chat sessions.
This fork of opencode-manager adds production-ready voice features, cloud deployment, and remote access:
| Feature | This Fork | Upstream |
|---|---|---|
| Built-in STT | ✅ Faster Whisper server (local, no API key) | ❌ External API only |
| Built-in TTS | ✅ Coqui + Chatterbox (7+ models, local) | ❌ External API only |
| Browser Voice | ✅ Web Speech API fallback | ✅ Web Speech API |
| System Service | ✅ install-service for macOS/Linux | ❌ Manual startup |
| Status Command | ✅ YAML health output w/ tunnel URL | ❌ Not included |
| Cloudflare Tunnel | ✅ Built-in, auto-starts with service | ❌ Not included |
| Tunnel Metrics UI | ✅ Live connection stats + logs in Settings | ❌ Not included |
| Session Pruning | ✅ Auto-cleanup old sessions | ❌ Manual cleanup |
| Log Rotation | ✅ Auto-rotate logs (5MB limit) | ❌ Not included |
| Cloud Deploy | ✅ One-command Azure deployment | ❌ Not included |
| Basic Auth | ✅ Caddy proxy with auth | ❌ Not included |
| E2E Voice Tests | ✅ Browser + API tests | ❌ Not included |
| Large Output Fix | ✅ Context overflow prevention | ❌ Uses official OpenCode |
Quick Start:

```bash
# Install globally
bun install -g github:dzianisv/opencode-manager

# Run as system service (auto-starts on boot, includes tunnel)
opencode-manager install-service

# Or run manually
opencode-manager start

# Access from anywhere via Cloudflare tunnel URL
cat ~/.local/run/opencode-manager/endpoints.json
```

This project builds OpenCode from VibeTechnologies/opencode, a fork of the official sst/opencode repository. We maintain this fork to include critical fixes that haven't yet been merged upstream.
File Persistence for Large Tool Outputs (PR #6234)
The official OpenCode has a known issue where large tool outputs (WebFetch, Bash, MCP tools) can overflow the context window, causing:
- "prompt is too long" errors (e.g., `202744 tokens > 200000 maximum`)
- Sessions becoming stuck/unresponsive
- Loss of work when context overflows mid-conversation
Our fork includes the fix from PR #6234 which implements intelligent file persistence:
- Tool outputs exceeding 30,000 characters are saved to disk instead of the context
- The AI model receives a file path with instructions to explore the data using Read/Grep/jq
- Context stays small, preventing overflow errors
- Files are automatically cleaned up when sessions are deleted
This fix is essential for production use cases where AI agents frequently fetch documentation, analyze large codebases, or work with verbose tool outputs.
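The persistence rule itself is simple to picture; here is a hedged shell sketch of the decision (the real implementation is TypeScript in `packages/opencode/src/session/prompt.ts` — only the 30,000-character threshold below comes from the description above, the file path and message are illustrative):

```shell
#!/bin/sh
# Sketch: outputs over 30,000 chars are written to disk; the model gets a path instead
THRESHOLD=30000
output=$(head -c 30001 /dev/zero | tr '\0' 'x')   # simulate a 30,001-char tool output

if [ "${#output}" -gt "$THRESHOLD" ]; then
  file=/tmp/tool-output.txt
  printf '%s' "$output" > "$file"              # persist the full output to disk
  echo "Output saved to $file - explore it with Read/Grep/jq"
else
  printf '%s\n' "$output"                      # small outputs stay in context
fi
```

The point of the pattern is that the context only ever carries the short "saved to …" message, never the multi-hundred-kilobyte payload.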
Implementation Details:

- VibeTechnologies/opencode fork (branch: `dev`) contains two fixes:
  - Large tool outputs (>30k chars) are saved to disk instead of context (`packages/opencode/src/session/prompt.ts`)
  - Auto-allow read access to the OpenCode storage directory to avoid permission prompts for reading saved tool results (`packages/opencode/src/tool/read.ts`)
- opencode-manager deploys the fork at container startup via:
  - `docker-compose.yml` - `OPENCODE_FORK_REPO` and `OPENCODE_FORK_BRANCH` env vars
  - `scripts/docker-entrypoint.sh` - `install_from_fork()` function
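For orientation, a hedged sketch of the env-var defaulting an entrypoint like this typically performs (hypothetical script — not the repo's actual `install_from_fork()`; only the variable names and default values come from the docs above):

```shell
#!/bin/sh
# Default the fork coordinates when the env vars are unset (hypothetical entrypoint logic)
OPENCODE_FORK_REPO="${OPENCODE_FORK_REPO:-VibeTechnologies/opencode}"
OPENCODE_FORK_BRANCH="${OPENCODE_FORK_BRANCH:-dev}"

echo "Installing OpenCode from $OPENCODE_FORK_REPO@$OPENCODE_FORK_BRANCH"
# A real install step would then clone and build, roughly:
#   git clone --branch "$OPENCODE_FORK_BRANCH" "https://github.com/$OPENCODE_FORK_REPO" ...
```

Overriding either variable at `docker compose up` time would point the container at a different fork or branch.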
Test Results (all 3 integration tests pass):
- 883,082-character output saved to file successfully
- No retry loops; sessions did not get stuck
- Sessions can continue the conversation after context-heavy operations
We regularly sync our fork with upstream sst/opencode to incorporate new features and fixes. Once PR #6234 is merged upstream, we plan to switch back to the official release.
- Multi-Repository Support - Clone and manage multiple git repos/worktrees in local workspaces
- Private Repository Support - GitHub PAT configuration for cloning private repos
- Worktree Support - Create and manage Git worktrees for working on multiple branches
- Git Diff Viewer - View file changes with unified diff, line numbers, and addition/deletion counts
- Git Status Panel - See all uncommitted changes (modified, added, deleted, renamed, untracked)
- Branch Switching - Switch between branches via dropdown
- Branch/Worktree Creation - Create new branch workspaces from any repository
- Ahead/Behind Tracking - Shows commits ahead/behind remote
- Push PRs to GitHub - Create and push pull requests directly from your phone
- Directory Navigation - Browse files and folders with tree view
- File Search - Search files within directories
- Syntax Highlighting - Code preview with syntax highlighting
- File Operations - Create files/folders, rename, delete
- Drag-and-Drop Upload - Upload files by dragging into the browser
- Large File Support - Virtualization for large files
- ZIP Download - Download repos as ZIP excluding gitignored files
- Slash Commands - Built-in commands (`/help`, `/new`, `/models`, `/export`, `/compact`, etc.)
- Custom Commands - Create custom slash commands with templates
- File Mentions - Reference files with `@filename` autocomplete
- Plan/Build Mode Toggle - Switch between read-only and file-change modes
- Mermaid Diagram Support - Visual diagram rendering in chat messages
- Session Management - Create, search, delete, and bulk delete sessions
- Real-time Streaming - Live message streaming with SSE
- CLI Session Sharing - Sessions created in the terminal `opencode` CLI are visible in the Web UI
- Model Selection - Browse and select from available AI models with filtering
- Provider Management - Configure multiple AI providers with API keys or OAuth
- OAuth Authentication - Secure OAuth login for supported providers (Anthropic, GitHub Copilot)
- Context Usage Indicator - Visual progress bar showing token usage
- Agent Configuration - Create custom agents with system prompts and tool permissions
- MCP Server Configuration - Add local (command-based) or remote (HTTP) MCP servers
- Server Templates - Pre-built templates for common MCP servers
- Enable/Disable Servers - Toggle servers on/off with auto-restart
- Cron Job Scheduling - Schedule recurring tasks with full cron expression support
- Task Management - Create, update, delete, pause/resume tasks from the UI
- Command Types - Run OpenCode skills, send messages to OpenCode, or execute scripts
- Run Now - Manually trigger any scheduled task immediately
- Status Tracking - View last run time, next scheduled run, and task status
- Preset Schedules - Quick options for common schedules (hourly, daily, weekly)
- Theme Selection - Dark, Light, or System theme
- Keyboard Shortcuts - Customizable keyboard shortcuts
- OpenCode Config Editor - Raw JSON editor for advanced configuration
- Mobile-First Design - Responsive UI optimized for mobile use
- PWA Support - Installable as Progressive Web App
- iOS Keyboard Support - Proper keyboard handling on iOS
- Enter Key Send - Press Enter to automatically close keyboard and send messages
- Swipe-to-Navigate - Swipe right from left edge to navigate back
- Dual Provider Support - Browser-native Web Speech API + external OpenAI-compatible endpoints
- Browser-Native TTS - Built-in Web Speech API for instant playback without API keys
- Coqui TTS with Multi-Model Support - 7+ high-quality English voice models with runtime switching:
  - Jenny (default, fastest)
  - LJSpeech VITS, Tacotron2, Glow-TTS, FastPitch
  - VCTK VITS (109 multi-speaker voices)
  - XTTS v2 (multilingual voice cloning)
- AI Message Playback - Listen to assistant responses with TTS
- OpenAI-Compatible - Works with any OpenAI-compatible TTS endpoint
- Voice & Speed Discovery - Automatic voice detection with caching (1hr TTL)
- Voice & Speed Controls - Configurable voice selection and playback speed
- Audio Caching - 24-hour cache with 200MB limit for performance
- Markdown Sanitization - Filters unreadable symbols for smooth playback
- Floating Controls - Persistent stop button for audio control
- Custom Endpoints - Connect to local or self-hosted TTS services
- Session Pruning - Automatic cleanup of old sessions to save disk space
- Auto-Prune on Startup - Configurable retention period (default: 30 days)
- Bulk Delete - Delete multiple sessions at once from the UI
- Cloudflare Tunnel Logs - Logs surfaced in UI Settings → Tunnel tab
- Log Rotation - Automatic rotation at 5MB with up to 3 backups
- Runtime Maintenance - Log files rotated every 5 minutes to prevent disk bloat
- Autonomous AI Testing - OpenCode AI agent can autonomously test the entire application
- Quick Test Commands - Run health, API, auth, tunnel, Docker, and E2E tests with one command
- CI/CD Ready - Integration-ready for GitHub Actions and other CI/CD pipelines
- Comprehensive Coverage - Tests server startup, API endpoints, authentication, tunnels, Docker deployment, and more
Use @qa-tester in OpenCode or run scripts/qa-test.sh for quick tests.
Screenshots: Files (Mobile/Desktop), Chat (Mobile/Desktop), and Inline Diff View.
- Authentication - User authentication and session management
Install OpenCode Manager as a global CLI tool directly from GitHub:

```bash
# Install with bun (recommended)
bun install -g github:dzianisv/opencode-manager

# Or with npm
npm install -g github:dzianisv/opencode-manager
```

Prerequisites:

- Bun installed (for running the CLI)
- OpenCode installed: `curl -fsSL https://opencode.ai/install | bash`
- (Optional) cloudflared for tunnel mode: `brew install cloudflared`
CLI Commands:

```bash
# Start the server
opencode-manager start

# Start with Cloudflare tunnel for remote access
opencode-manager start --tunnel

# Connect to an existing opencode instance
opencode-manager start --client

# Install as a user service (runs on login, tunnel enabled by default)
opencode-manager install-service

# Install as a service without tunnel (local only)
opencode-manager install-service --no-tunnel

# Check service status (YAML output with health checks)
opencode-manager status

# View service logs
opencode-manager logs

# Uninstall the service
opencode-manager uninstall-service

# Show help
opencode-manager help
```

Status Command Output:
The status command provides comprehensive health information in YAML format:

```yaml
status: healthy
port: 5001
backend:
  status: healthy
  database: ok
  opencode: connected
  opencode_version: 1.2.0
stt:
  status: running
  model: ggml-small-q5_1
  port: 5552
tts:
  status: running
  provider: coqui
  model: tts_models/en/jenny/jenny
tunnel:
  status: connected
  url: https://admin:password@xxx.trycloudflare.com
  edge_location: San Francisco, CA
```

Service Installation:
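Because the output is YAML, scripts can scrape fields from it with standard tools. A sketch that pulls the tunnel URL out of the example output above (saved to a file here for the demo; normally you would pipe `opencode-manager status` directly):

```shell
# Save the documented example status output (demo stand-in for: opencode-manager status)
cat > /tmp/status.yml <<'EOF'
tunnel:
  status: connected
  url: https://admin:password@xxx.trycloudflare.com
  edge_location: San Francisco, CA
EOF

# Print the url: value that appears under the tunnel: block
awk '/^tunnel:/ {in_tunnel=1; next} in_tunnel && /url:/ {print $2; exit}' /tmp/status.yml
```

For anything more nested than this, a real YAML parser (e.g. `yq`, if installed) is the safer choice.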
The `install-service` command installs OpenCode Manager as a user-level service that starts automatically on login:

- macOS: Creates a launchd plist at `~/Library/LaunchAgents/com.opencode-manager.plist`
- Linux: Creates a systemd user service at `~/.config/systemd/user/opencode-manager.service`
Configuration Files:
All configuration is stored in ~/.local/run/opencode-manager/:
| File | Description |
|---|---|
| `auth.json` | Basic auth credentials (`{"username": "admin", "password": "..."}`) |
| `endpoints.json` | Active endpoints (local URL and tunnel URL if enabled) |
| `cloudflared.log` | Cloudflare tunnel logs (auto-rotated at 5MB) |
| `stdout.log` | Service stdout (macOS only) |
| `stderr.log` | Service stderr (macOS only) |
On first run, credentials are automatically generated and saved. Use these to authenticate when accessing the web UI.
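Since auth.json uses the documented `{"username": ..., "password": ...}` shape, the generated password can be recovered from the shell — with `jq -r .password` if jq is installed, or plain sed as below (demonstrated against a sample file; your real file lives at ~/.local/run/opencode-manager/auth.json):

```shell
# Sample auth.json in the documented format (hypothetical password value)
cat > /tmp/auth.json <<'EOF'
{"username": "admin", "password": "s3cret-example"}
EOF

# jq version would be: jq -r .password /tmp/auth.json
# POSIX sed version, no dependencies:
sed -n 's/.*"password": *"\([^"]*\)".*/\1/p' /tmp/auth.json
```

The sed pattern only handles the flat single-line JSON shown above; prefer jq for anything else.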
```bash
# Simple one-liner
docker run -d -p 5003:5003 -v opencode-workspace:/workspace ghcr.io/dzianisv/opencode-manager

# Or with API keys
docker run -d -p 5003:5003 \
  -e ANTHROPIC_API_KEY=sk-... \
  -v opencode-workspace:/workspace \
  ghcr.io/dzianisv/opencode-manager
```

Access the application at http://localhost:5003
With Docker Compose (for persistent volumes and env vars):

```bash
git clone https://github.com/dzianisv/opencode-manager.git
cd opencode-manager

# Configure API keys (optional)
echo "ANTHROPIC_API_KEY=sk-..." > .env

# Start
docker compose up -d
```

The Docker setup automatically:
- Installs OpenCode CLI on first run
- Starts Whisper (STT) and Chatterbox (TTS) servers
- Sets up persistent volumes for workspace and database
Docker Commands:

```bash
docker compose up -d                    # Start
docker compose down                     # Stop
docker compose logs -f                  # View logs
docker compose restart                  # Restart
docker exec -it opencode-manager sh     # Shell access
```

The Docker container exposes ports 5100-5103 for running dev servers inside your repositories. Configure your project's dev server to use one of these ports and access it directly from your browser.
Example usage:

```
# Vite (vite.config.ts)
server: { port: 5100, host: '0.0.0.0' }

# Next.js
next dev -p 5100 -H 0.0.0.0

# Express/Node
app.listen(5100, '0.0.0.0')
```

Access your dev server at http://localhost:5100 (or your Docker host IP).
To customize the exposed ports, edit docker-compose.yml:

```yaml
ports:
  - "5003:5003"  # OpenCode Manager
  - "5100:5100"  # Dev server 1
  - "5101:5101"  # Dev server 2
  - "5102:5102"  # Dev server 3
  - "5103:5103"  # Dev server 4
```

OpenCode Manager creates a default AGENTS.md file in the workspace config directory (/workspace/.config/opencode/AGENTS.md). This file provides global instructions to AI agents working within the container.
Default instructions include:

- Reserved ports (5003 for OpenCode Manager, 5551 for the OpenCode server)
- Available dev server ports (5100-5103)
- Guidelines for binding to `0.0.0.0` for Docker accessibility
Editing AGENTS.md:

- Via UI: Settings > OpenCode > Global Agent Instructions
- Via file: Edit `/workspace/.config/opencode/AGENTS.md` directly
This file is merged with any repository-specific AGENTS.md files, with repository instructions taking precedence for their respective codebases.
Deploy OpenCode Manager to an Azure VM with a single command. Includes automatic HTTPS via Cloudflare tunnel and Basic Auth protection.
Prerequisites:
Quick Deploy:

```bash
# Clone the repository
git clone https://github.com/dzianisv/opencode-manager.git
cd opencode-manager

# Install dependencies
bun install

# Deploy to Azure (creates VM, configures Docker, sets up tunnel)
bun run scripts/deploy.ts
```

The script will:
- Create an Azure resource group and VM (Standard_D2s_v5 by default)
- Install Docker and deploy OpenCode Manager
- Set up Caddy reverse proxy with Basic Auth
- Create a Cloudflare tunnel for HTTPS access
- Enable YOLO mode (auto-approve all AI permissions)
After deployment, you'll receive:

- Tunnel URL: `https://xxx-xxx.trycloudflare.com`
- Username: `admin` (default)
- Password: Auto-generated or prompted
Environment Variables (optional):

Create a .env file before deploying to configure:

```bash
# Basic Auth
AUTH_USERNAME=admin
AUTH_PASSWORD=your-secure-password

# Azure Configuration
AZURE_LOCATION=westus2
AZURE_VM_SIZE=Standard_D2s_v5

# GitHub Token (for cloning private repos)
GITHUB_TOKEN=ghp_xxx

# AI Provider Keys (optional - can also configure via OAuth in UI)
ANTHROPIC_API_KEY=sk-ant-xxx
OPENAI_API_KEY=sk-xxx
GEMINI_API_KEY=xxx

# OpenCode Fork (for context overflow fix - default)
OPENCODE_FORK_REPO=VibeTechnologies/opencode
OPENCODE_FORK_BRANCH=dev
```

Deployment Commands:
```bash
# Deploy new VM
bun run scripts/deploy.ts

# Check status (shows tunnel URL, credentials, container status)
bun run scripts/deploy.ts --status

# Update to latest code (pulls from GitHub, rebuilds containers)
bun run scripts/deploy.ts --update

# Sync local OpenCode auth to VM (GitHub Copilot, Anthropic OAuth)
bun run scripts/deploy.ts --sync-auth

# Update environment variables
bun run scripts/deploy.ts --update-env

# Change Basic Auth password
bun run scripts/deploy.ts --update-auth

# Re-enable YOLO mode (auto-approve permissions)
bun run scripts/deploy.ts --yolo

# Destroy all Azure resources
bun run scripts/deploy.ts --destroy
```

Syncing Authentication:
If you have GitHub Copilot or Anthropic OAuth configured locally, sync it to your VM:
```bash
# First, authenticate locally with OpenCode
opencode
/connect github-copilot

# Then sync to your Azure VM
bun run scripts/deploy.ts --sync-auth
```

SSH Access:
```bash
# Get VM IP and SSH command
bun run scripts/deploy.ts --status

# SSH into VM
ssh azureuser@<VM_IP>

# View container logs
ssh azureuser@<VM_IP> "sudo docker logs opencode-manager -f"
```

Cost Estimate:
- Standard_D2s_v5 (2 vCPU, 8GB RAM): ~$70/month
- Use `--destroy` when not in use to avoid charges
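Azure bills VMs by the hour, so the monthly figure is easier to reason about as an hourly rate; rough arithmetic (assuming an average ~730 billable hours per month and the ~$70 estimate above):

```shell
# ~$70/month spread over an average 730-hour month
awk 'BEGIN { printf "~$%.3f per hour\n", 70 / 730 }'
```

In other words, a VM left running over an idle weekend costs a few dollars, which is why `--destroy` (or deallocating the VM) is worth scripting into your routine.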
Run OpenCode Manager natively on macOS without Docker. This is ideal for development or when you want the web UI to connect to an existing OpenCode instance running in your terminal.
Prerequisites:

- Bun installed
- Node.js installed (for the frontend)
- OpenCode installed: `curl -fsSL https://opencode.ai/install | bash`
- (Optional) cloudflared for tunnel mode: `brew install cloudflared`
Quick Start:

```bash
# Clone the repository
git clone https://github.com/dzianisv/opencode-manager.git
cd opencode-manager

# Install dependencies
pnpm install

# Copy environment configuration
cp .env.local.example .env

# Start with Cloudflare tunnel (spawns opencode serve + creates public URL)
pnpm start

# Or connect to an existing opencode instance with tunnel
pnpm start:client

# Or start without tunnel (local only)
pnpm start:no-tunnel
```

Available Commands:
| Command | Description |
|---|---|
| `pnpm start` | Start with Cloudflare tunnel - spawns opencode serve + public URL |
| `pnpm start:client` | Connect to existing opencode instance with tunnel |
| `pnpm start:no-tunnel` | Start without tunnel (local only) |
| `bun scripts/start-native.ts --help` | Show all available options |
| `pnpm tunnel:start` | Start persistent Cloudflare tunnel (survives backend restarts) |
| `pnpm tunnel:stop` | Stop the persistent tunnel |
| `pnpm tunnel:status` | Check tunnel status and get URL |
| `pnpm cleanup` | Kill orphaned processes on managed ports (does NOT kill the tunnel) |
When running OpenCode Manager locally, several services work together:
```
┌─────────────────────────────────────────────────────────────────┐
│                   Your Browser/Mobile Device                    │
└─────────────────────────────────────────────────────────────────┘
                                │
                                ▼
┌─────────────────────────────────────────────────────────────────┐
│             Cloudflare Tunnel (optional, persistent)            │
│                 https://xxx.trycloudflare.com                   │
│                Managed by: pnpm tunnel:start/stop               │
└─────────────────────────────────────────────────────────────────┘
                                │
                                ▼
┌─────────────────────────────────────────────────────────────────┐
│                   Backend Server (port 5001)                    │
│                      Bun + Hono REST API                        │
│                                                                 │
│  • /api/health    - Health check                                │
│  • /api/repos     - Repository management                       │
│  • /api/settings  - User preferences                            │
│  • /api/stt/*     - Speech-to-text (proxies to Whisper)         │
│  • /api/tts/*     - Text-to-speech                              │
│  • /opencode/*    - Proxies to OpenCode server                  │
└─────────────────────────────────────────────────────────────────┘
                                │
                ┌───────────────┼───────────────┐
                ▼               ▼               ▼
┌───────────────────┐ ┌───────────────┐ ┌───────────────────────┐
│  OpenCode Server  │ │  Whisper STT  │ │   SQLite Database     │
│   (port 5551)     │ │  (port 5552)  │ │   ~/.local/run/       │
│                   │ │               │ │   opencode-manager/   │
│  AI agent runtime │ │  Speech-to-   │ │   data.db             │
│  Session mgmt     │ │  text server  │ │                       │
│  Tool execution   │ │  (auto-start) │ │   Stores settings,    │
│                   │ │               │ │   auth, preferences   │
└───────────────────┘ └───────────────┘ └───────────────────────┘
```
Service Responsibilities:
| Service | Port | Description |
|---|---|---|
| Backend | 5001 | Main API server - handles web requests, proxies to OpenCode, manages settings |
| OpenCode | 5551 | AI agent runtime - executes tools, manages sessions, interfaces with AI providers |
| Whisper STT | 5552 | Speech-to-text server - transcribes voice input (auto-starts when needed) |
| Tunnel | - | Persistent Cloudflare tunnel for remote access (runs independently) |
Data Storage:
All persistent data is stored in ~/.local/run/opencode-manager/:
| File | Description |
|---|---|
| `data.db` | SQLite database (settings, preferences, cached data) |
| `auth.json` | Basic auth credentials |
| `endpoints.json` | Active endpoints (local URL, tunnel URL) |
| `tunnel.json` | Tunnel state (PID, URL, port) |
| `tunnel.pid` | Tunnel process ID |
| `stdout.log` | Service stdout (macOS launchd only) |
| `stderr.log` | Service stderr (macOS launchd only) |
Client Mode:
When using --client mode, the script will:

- Scan for running opencode processes using `lsof`
- Check health via the `/doc` endpoint on each discovered port
- Fetch version info from `/global/health`
- List all healthy instances with directory, version, and PID
- Let you select which instance to connect to
```
$ pnpm start:client

╔═══════════════════════════════════════╗
║   OpenCode Manager - Native Start     ║
╚═══════════════════════════════════════╝

🔍 Searching for running opencode servers...

📋 Found multiple opencode servers:

  [1] Port 5551
      Directory: /Users/you/project-a
      Version: 1.1.2
      PID: 12345

  [2] Port 61782
      Directory: /Users/you/project-b
      Version: 1.0.223
      PID: 67890

Select server [1]:
```

This is useful when you already have opencode running in a terminal and want the web UI to connect to it without spawning a separate server.
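The lsof-based discovery step can be approximated by parsing listener lines for their ports. A sketch over sample lsof-style output (a static sample, not a live scan, and not the project's actual parsing code):

```shell
# Sample lines shaped like `lsof -iTCP -sTCP:LISTEN` output (hypothetical columns/values)
cat > /tmp/lsof-sample.txt <<'EOF'
opencode  12345 you   23u  IPv4 0x0  0t0  TCP *:5551 (LISTEN)
opencode  67890 you   23u  IPv4 0x0  0t0  TCP *:61782 (LISTEN)
EOF

# Pull out the listening ports from the *:PORT fields
awk '{ for (i = 1; i <= NF; i++) if ($i ~ /^\*:[0-9]+$/) { sub(/^\*:/, "", $i); print $i } }' /tmp/lsof-sample.txt
```

Each discovered port would then be probed (e.g. its `/doc` endpoint) before being offered in the selection menu.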
OpenCode Manager shares sessions with the opencode CLI. Sessions you create in your terminal are visible in the web UI, and vice versa.
How it works:
Both the CLI and Web UI store sessions in ~/.local/share/opencode/. When you:

- Create a session in the CLI (`opencode` in a terminal) → Visible in the Web UI immediately
- Create a session in the Web UI → Visible in the CLI with the `/sessions` command
- Continue a session → Changes sync automatically between both interfaces
Requirements:
- Both must use the same OpenCode data directory (`~/.local/share/opencode/`)
- When running as a service, OpenCode Manager uses the system default location
- The `--client` mode connects to your existing terminal sessions
Verification:
```bash
# Check sessions visible to CLI
opencode
/sessions

# Check sessions visible to Web UI
curl -s -u admin:PASSWORD http://localhost:5001/api/opencode/session | jq 'length'

# Both should show the same count
```

Persistent Tunnel (Recommended for Remote Development):
The Cloudflare tunnel can run as a persistent background process that survives backend restarts:
```bash
# Start tunnel once (persists until explicitly stopped)
pnpm tunnel:start

# Check tunnel status and get URL (includes health check)
pnpm tunnel:status

# Check if tunnel URL is reachable
bun scripts/tunnel.ts health

# Now you can restart backend freely without losing tunnel connection
pnpm dev:backend  # Ctrl+C and restart as needed

# Stop tunnel when done
pnpm tunnel:stop
```

The tunnel state is stored in ~/.local/run/opencode-manager/tunnel.json, and the active endpoints in ~/.local/run/opencode-manager/endpoints.json.
Benefits:

- Restart backend without disconnecting mobile/remote users
- Same tunnel URL persists across backend restarts
- `pnpm cleanup` does NOT kill the tunnel
- Health check verifies the tunnel URL is actually reachable
- Automatically updates `endpoints.json` when the tunnel starts
Named Tunnels (Persistent URLs):
Quick tunnels generate random URLs that change on each restart. For a persistent URL, use a Cloudflare named tunnel:
```bash
# 1. Login to Cloudflare (one-time)
cloudflared tunnel login

# 2. Create a named tunnel (one-time)
cloudflared tunnel create opencode-manager
# Note the tunnel ID (UUID) from the output

# 3. Create config file ~/.cloudflared/config.yml
tunnel: <TUNNEL_ID>
credentials-file: /Users/$USER/.cloudflared/<TUNNEL_ID>.json
ingress:
  - hostname: opencode.yourdomain.com
    service: http://localhost:5001
  - service: http_status:404

# 4. Add DNS record in the Cloudflare dashboard
# CNAME: opencode -> <TUNNEL_ID>.cfargotunnel.com

# 5. Run the named tunnel
cloudflared tunnel run opencode-manager
```

With named tunnels, opencode.yourdomain.com will always point to your instance.
Without Tunnel (Local Only):
```bash
# Start without tunnel
pnpm start:no-tunnel

# Or connect to existing instance without tunnel
bun scripts/start-native.ts --client
```

Custom Port:
```bash
# Use a different backend port
bun scripts/start-native.ts --port 3000
bun scripts/start-native.ts --client --port 3000
```

Development Setup:

```bash
# Clone the repository
git clone https://github.com/dzianisv/opencode-manager.git
cd opencode-manager

# Install dependencies (uses Bun workspaces)
bun install

# Copy environment configuration
cp .env.example .env

# Start development servers (backend + frontend)
npm run dev
```

The project includes a comprehensive QA testing system with autonomous AI testing capabilities.
Run tests using the provided command script:

```bash
# Health check (quick verification)
scripts/qa-test.sh health

# API endpoint tests
scripts/qa-test.sh api

# Authentication tests
scripts/qa-test.sh auth

# Cloudflare tunnel tests
scripts/qa-test.sh tunnel

# Docker deployment tests
scripts/qa-test.sh docker

# E2E test suite
scripts/qa-test.sh e2e

# Run all tests
scripts/qa-test.sh full

# Test remote deployment
scripts/qa-test.sh health https://your-deployment.com
```

Use OpenCode slash commands for quick testing:
```
/qa-test      # Run comprehensive QA tests
/qa-health    # Quick health check
```
Or mention the QA agent directly:
"@qa-tester run a full test suite and report any issues"
"Test the application and generate a comprehensive report"
The AI agent will autonomously:
- Execute all test protocols
- Evaluate results against expected outputs
- Generate a professional test report with metrics
- Identify issues and provide recommendations
- ✅ Development server startup and health
- ✅ Backend API endpoints (health, repos, settings, OpenCode proxy)
- ✅ Authentication (with/without credentials, valid/invalid)
- ✅ Cloudflare tunnel (startup, URL generation, public access)
- ✅ Docker deployment (build, run, health checks, volumes)
- ✅ E2E test suite (voice, talk mode, browser automation)
- ✅ Database integrity
- ✅ Git operations
- ✅ Performance metrics
- ✅ Security validation
The QA system can be integrated into GitHub Actions:

```yaml
- name: Run QA Tests
  run: |
    scripts/qa-test.sh full
```

See the QA agent at .opencode/agent/qa-tester.md for detailed test protocols.
OpenCode WebUI supports OAuth authentication for select providers, offering a more secure and convenient alternative to API keys.
- Anthropic (Claude) - OAuth login with Claude Pro/Max accounts
- GitHub Copilot - OAuth device flow authentication
- Navigate to Settings → Provider Credentials
- Select a provider that shows the "OAuth" badge
- Click "Add OAuth" to start the authorization flow
- Choose authentication method:
- "Open Authorization Page" - Opens browser for sign-in
- "Use Authorization Code" - Provides code for manual entry
- Complete authorization in the browser or enter the provided code
- Connection status will show as "Configured" when successful
- scripts/run-local-docker.sh - Pulls and runs the CI-built Docker image from GHCR locally: `./scripts/run-local-docker.sh`
- scripts/run-e2e-tests.ts - Runs all E2E tests against a running instance: `bun run scripts/run-e2e-tests.ts --url http://localhost:5003`
- Updated AGENTS.md - Documents the E2E testing workflow with CI-built images
The workflow:

```
GitHub Actions (CI)                  Local Machine
─────────────────────                ──────────────────────
Push to main
  ↓ docker-build.yml runs
  ↓ Build Docker image
  ↓ Push to GHCR ──────────────→     ./scripts/run-local-docker.sh
                                       ↓ Pull image from GHCR
                                       ↓ Run container (port 5003)
                                       ↓ bun run scripts/run-e2e-tests.ts
                                       ↓ ✅ Voice E2E tests
                                         ✅ Talk Mode API tests
                                         ✅ Talk Mode Browser tests
```







