What version of Kimi Code CLI is running?
1.17.0
Which open platform/subscription were you using?
/login
Which model were you using?
ollama
What platform is your computer?
Darwin 25.1.0 arm64 arm
What issue are you seeing?
I want to point kimi-cli at a self-hosted Kimi 2.5 model API, but after trying I get the error "LLM provider error: The API returned an empty response." Pointing it at a third-party OpenAI-compatible relay API fails with the same error. With local Ollama, the kimi-k2.5:cloud model works fine, but switching to llama3.2 in Ollama fails again with "LLM provider error: The API returned an empty response."
Here is my config.toml:
default_model = "llama3.2"

[models."llama3.2"]
provider = "ollama"
model = "llama3.2"
max_context_size = 32768
capabilities = ["image_in"]

[models."kimi-k2.5:cloud"]
provider = "ollama"
model = "kimi-k2.5:cloud"
max_context_size = 32768
capabilities = ["image_in"]

[providers.ollama]
type = "openai_legacy"
base_url = "http://localhost:11434/v1"
api_key = "ollama"
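As a sanity check (not part of the original report), the OpenAI-compatible endpoint from this config can be queried directly with curl, bypassing kimi-cli, to see whether llama3.2 itself returns an empty body. The base URL, model name, and api_key come from the config; the rest is a plain chat-completions request:

```shell
# Query Ollama's OpenAI-compatible chat endpoint directly.
# BASE_URL and the model name match the [providers.ollama] and
# [models."llama3.2"] entries in the config.
BASE_URL="http://localhost:11434/v1"
PAYLOAD='{"model":"llama3.2","messages":[{"role":"user","content":"hi"}]}'
echo "$PAYLOAD"
# Uncomment to run against a live Ollama server:
# curl -s "$BASE_URL/chat/completions" \
#   -H "Content-Type: application/json" \
#   -H "Authorization: Bearer ollama" \
#   -d "$PAYLOAD"
```

If this request returns a normal JSON body, the empty-response error likely originates in kimi-cli's handling of the openai_legacy provider type rather than in Ollama itself.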
Here is the output:
~ kimi
╭───────────────────────────────────────────────────────╮
│ │
│ ▐█▛█▛█▌ Welcome to Kimi Code CLI! │
│ ▐█████▌ Send /help for help information. │
│ │
│ Directory: ~ │
│ Session: 618fc273-210d-43bd-8441-d62952e30bf7 │
│ Model: llama3.2 │
│ Tip: send /login to use our latest kimi-k2.5 model │
│ │
│ Tip: Kimi Code Web UI, a GUI version of Kimi Code, │
│ is now in technical preview. │
│ Type /web to switch, or next time run kimi web │
│ directly. │
│ │
╰───────────────────────────────────────────────────────╯
✨ hi
LLM provider error: The API returned an empty response.
✨ /model
Select a model (↑↓ navigate, Enter select, Ctrl+C cancel):
1. kimi-k2.5:cloud (ollama)
2. llama3.2 (ollama) (current)
Switched to kimi-k2.5:cloud with thinking off. Reloading...
╭───────────────────────────────────────────────────────╮
│ │
│ ▐█▛█▛█▌ Welcome to Kimi Code CLI! │
│ ▐█████▌ Send /help for help information. │
│ │
│ Directory: ~ │
│ Session: 618fc273-210d-43bd-8441-d62952e30bf7 │
│ Model: kimi-k2.5:cloud │
│ Tip: send /login to use our latest kimi-k2.5 model │
│ │
│ Tip: Kimi Code Web UI, a GUI version of Kimi Code, │
│ is now in technical preview. │
│ Type /web to switch, or next time run kimi web │
│ directly. │
│ │
╰───────────────────────────────────────────────────────╯
✨ hi,what's your name
• Used Shell (sysctl -n hw.ncpu)
• Used ReadFile (dijkstra.py)
• Let me first look for the relevant files.
• Used Glob (111)
• Used Glob (111/.py)
• I don't have access to previous conversation history. Let me look for any game repositories in the current working directory that you might be referring to.
• Used Glob (game/**/)
• Used Glob (new_game//)
• Used Glob (anthropic_test//)
• I'll check the existing dijkstra.py file first to see what's already there.
• Used ReadFile (dijkstra.py)
• I'll create a stunning, animated personal portfolio website for you. Let me start by exploring the current directory and then build this step by step.
• Used Glob ()
• Used ReadMediaFile (.kimi/avatar.jpg)
File not found
• I'll help you design and implement a simple 2D platformer game. Let me first explore the working directory to see what's already there, and then I'll create the game.
• Used Shell (ls -la /Users/zty/game/ &...s empty or doesn't exist")
⠦ Composing...
context: 20.3% (6.7k/32.8k)
What steps can reproduce the bug?
Install ollama locally.
Run ollama pull llama3.2.
Edit kimi-cli's config.toml:

default_model = "llama3.2"

[models."llama3.2"]
provider = "ollama"
model = "llama3.2"
max_context_size = 32768
capabilities = ["image_in"]

[models."kimi-k2.5:cloud"]
provider = "ollama"
model = "kimi-k2.5:cloud"
max_context_size = 32768
capabilities = ["image_in"]

[providers.ollama]
type = "openai_legacy"
base_url = "http://localhost:11434/v1"
api_key = "ollama"

Then start kimi cli again.
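The reproduction steps can be sketched as a short script. The config path here is an assumption, not taken from the report; substitute wherever your kimi-cli installation actually reads config.toml from:

```shell
# Repro sketch for the steps above. CONFIG is an assumed location --
# adjust to the path your kimi-cli actually reads config.toml from.
CONFIG="${KIMI_CONFIG:-$HOME/.kimi/config.toml}"
echo "config will be edited at: $CONFIG"
# Uncomment to run the actual steps:
# ollama pull llama3.2
# "$EDITOR" "$CONFIG"  # paste the config.toml contents shown above
# kimi                 # then send any message, e.g. "hi"
```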
What is the expected behavior?
I expect kimi-cli to be able to connect to other APIs, or at least to a self-hosted kimi k2.5.
Additional information
No response