kimi-cli fails with local ollama: ✨ hi LLM provider error: The API returned an empty response. #1338
minghe-zty started this conversation in General
Replies: 1 comment
After verifying yesterday, I found that it works in simple scenarios (hello world); I don't know whether advanced features will break. My guess is that the problem occurs during the context-compression stage.
What version of Kimi Code CLI is running?
1.17.0
Which open platform/subscription were you using?
/login
Which model were you using?
ollama
What platform is your computer?
Darwin 25.1.0 arm64 arm
What issue are you seeing?
I want to point kimi-cli's API at a self-deployed kimi 2.5 model. After trying, it errors with `LLM provider error: The API returned an empty response.`. Switching the API to a third-party OpenAI-compatible relay gives the same error. When pointing at local ollama, the `kimi-k2.5:cloud` model in ollama works fine, but switching to ollama's `llama3.2` fails with `LLM provider error: The API returned an empty response.`.
Here is my config.toml:

```toml
default_model = "llama3.2"

[models."llama3.2"]
provider = "ollama"
model = "llama3.2"
max_context_size = 32768
capabilities = ["image_in"]

[models."kimi-k2.5:cloud"]
provider = "ollama"
model = "kimi-k2.5:cloud"
max_context_size = 32768
capabilities = ["image_in"]

[providers.ollama]
type = "openai_legacy"
base_url = "http://localhost:11434/v1"
api_key = "ollama"
```
Here is the output:

```
~ kimi
╭───────────────────────────────────────────────────────╮
│                                                       │
│  ▐█▛█▛█▌  Welcome to Kimi Code CLI!                   │
│  ▐█████▌  Send /help for help information.            │
│                                                       │
│  Directory: ~                                         │
│  Session: 618fc273-210d-43bd-8441-d62952e30bf7        │
│  Model: llama3.2                                      │
│  Tip: send /login to use our latest kimi-k2.5 model   │
│                                                       │
│  Tip: Kimi Code Web UI, a GUI version of Kimi Code,   │
│  is now in technical preview.                         │
│  Type /web to switch, or next time run kimi web       │
│  directly.                                            │
╰───────────────────────────────────────────────────────╯
✨ hi
LLM provider error: The API returned an empty response.
✨ /model
Select a model (↑↓ navigate, Enter select, Ctrl+C cancel):
```
What steps can reproduce the bug?
1. Install ollama locally.
2. Run `ollama pull llama3.2`.
3. Edit kimi-cli's config.toml:

```toml
default_model = "llama3.2"

[models."llama3.2"]
provider = "ollama"
model = "llama3.2"
max_context_size = 32768
capabilities = ["image_in"]

[models."kimi-k2.5:cloud"]
provider = "ollama"
model = "kimi-k2.5:cloud"
max_context_size = 32768
capabilities = ["image_in"]

[providers.ollama]
type = "openai_legacy"
base_url = "http://localhost:11434/v1"
api_key = "ollama"
```

4. Start kimi-cli again.
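To separate an ollama-side problem from a kimi-cli-side one, it may help to hit ollama's OpenAI-compatible endpoint directly and check whether the response body is actually empty. Below is a minimal probe sketch; the base URL, port 11434, model name, and dummy `Bearer ollama` token are taken from the config above and are assumptions about the reporter's setup:

```python
import json
import urllib.request
import urllib.error

def probe(base_url="http://localhost:11434/v1", model="llama3.2"):
    """POST a minimal chat request to the OpenAI-compatible endpoint
    and report what the server actually returns."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": "hi"}],
    }).encode()
    req = urllib.request.Request(
        base_url + "/chat/completions",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer ollama",  # placeholder key from the config
        },
    )
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            body = resp.read().decode()
            # An empty body here would reproduce kimi-cli's error server-side.
            return "empty body" if not body.strip() else body
    except (urllib.error.URLError, OSError) as exc:
        return f"unreachable: {exc}"

print(probe())
```

If this prints a normal JSON completion, the backend is fine and the empty response is being produced (or mis-parsed) inside kimi-cli's `openai_legacy` provider path; if it prints `empty body`, the problem is on the ollama side.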
What is the expected behavior?
I expect kimi-cli to be able to connect to other APIs, or at least to a self-deployed kimi k2.5.
Additional information
No response