OpenACP connects to agents, not models directly. The model choice depends on which agent you use:

  • Aider supports Ollama and any OpenAI-compatible endpoint (vLLM, LM Studio, text-generation-webui). Point it at localhost:11434 and it works through OpenACP (see the sketch after this list).
  • Claude Code and Codex CLI are locked to their respective providers (Anthropic, OpenAI).
  • goose (by Block) supports local models via OpenAI-compatible APIs.
  • Cline supports local models through its configuration.
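
For context, here's a minimal sketch of what "OpenAI-compatible" means in practice: any client that speaks the OpenAI chat API can be pointed at a local Ollama server on port 11434. The model name and prompt below are placeholders; swap in whatever you're running locally.

```python
from openai import OpenAI

# Ollama exposes an OpenAI-compatible API under /v1 on its default port.
client = OpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",  # Ollama ignores the key, but the client requires one
)

# Placeholder model and prompt; use whatever model you've pulled locally.
resp = client.chat.completions.create(
    model="llama3.1",
    messages=[{"role": "user", "content": "Summarize what this repo does."}],
)
print(resp.choices[0].message.content)
```

Agents like Aider or goose do essentially this under the hood, so any of them can target a local model without the request ever leaving your machine.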

So yes — use an agent that supports local models, and OpenACP will bridge it to your chat platform like any other agent. The privacy benefit is real: your code never leaves your machine.

Performance note: local models on consumer …
