Phase 1 — OpenCode Server integration (Mode A)
Goal: Real LLM responses via OpenCode → Ollama
- Add `opencode-server` container to compose stack
- Implement `server-mode.ts` (OpenCode HTTP/WS API client)
- Wire gateway → `runOpencodeAgent()` → OpenCode Server → Ollama
- Stream response tokens back to Matrix thread
- Session state serialisation (transcript persistence)
- Handle OpenCode session errors, retries, timeouts
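The first task could look roughly like the fragment below. This is a hypothetical sketch: the image names, the `4096` port, the `serve` command flags, and the `OLLAMA_HOST` variable are all assumptions, not confirmed OpenCode configuration.

```yaml
# Hypothetical compose fragment — image names, port, and env vars are assumptions.
services:
  opencode-server:
    image: opencode/opencode:latest            # assumed image name
    command: ["opencode", "serve", "--hostname", "0.0.0.0", "--port", "4096"]
    ports:
      - "4096:4096"
    environment:
      OLLAMA_HOST: http://ollama:11434         # assumed variable for pointing at Ollama
    depends_on:
      - ollama
  ollama:
    image: ollama/ollama:latest
    volumes:
      - ollama-data:/root/.ollama              # persist pulled models across restarts
volumes:
  ollama-data:
```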
Exit criteria: interactive Matrix conversation with Ollama-backed responses, and session continuity across multiple turns in the same thread.
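Two of the building blocks `server-mode.ts` would need — incremental token parsing for streaming, and a retry/timeout wrapper for session errors — can be sketched independently of the real OpenCode API. Everything here is an assumption: the NDJSON event shape (`{type, text}`), the function names, and the retry parameters are illustrative, not OpenCode's actual wire format.

```typescript
// Hypothetical helpers for server-mode.ts. The event shape and names are
// assumptions for illustration, not the real OpenCode API.

// Split an NDJSON stream buffer into token strings, returning any trailing
// partial line so the caller can prepend it to the next network chunk.
function parseNdjsonChunk(buffer: string): { tokens: string[]; rest: string } {
  const lines = buffer.split("\n");
  const rest = lines.pop() ?? ""; // last element may be an incomplete line
  const tokens: string[] = [];
  for (const line of lines) {
    if (!line.trim()) continue;
    const event = JSON.parse(line); // assumed shape: { type: "token", text: "..." }
    if (event.type === "token") tokens.push(event.text);
  }
  return { tokens, rest };
}

// Retry an async operation with a per-attempt timeout and exponential backoff.
async function withRetry<T>(
  op: () => Promise<T>,
  attempts = 3,
  timeoutMs = 10_000,
): Promise<T> {
  let lastErr: unknown;
  for (let i = 0; i < attempts; i++) {
    let timer: ReturnType<typeof setTimeout> | undefined;
    try {
      return await Promise.race([
        op(),
        new Promise<never>((_, reject) => {
          timer = setTimeout(() => reject(new Error("timeout")), timeoutMs);
        }),
      ]);
    } catch (err) {
      lastErr = err;
    } finally {
      clearTimeout(timer); // avoid leaking the timeout on success
    }
    if (i < attempts - 1) {
      await new Promise((r) => setTimeout(r, 2 ** i * 250)); // 250ms, 500ms, ...
    }
  }
  throw lastErr;
}
```

Keeping the parser pure (buffer in, tokens plus leftover out) makes the streaming path unit-testable without a live OpenCode server, which should help while the WS/HTTP details are still settling.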