
Add Ollama provider support for LLM and chat assistant flows#497

Open
pPyrius wants to merge 1 commit into GenAI-Security-Project:main from pPyrius:fix/ollama-support

Conversation


pPyrius commented May 13, 2026

Summary

Adds first-class Ollama support across the FinBot LLM stack.

Changes

  • Register ollama as a supported LLM_PROVIDER
  • Add OLLAMA_MODEL config, used only when LLM_PROVIDER=ollama
  • Add Ollama Python SDK dependency
  • Route chat assistant requests through the configured LLM provider instead of always using OpenAI
  • Convert OpenAI-style tool definitions/history into Ollama-compatible chat/tool messages
  • Auto-pull the configured Ollama model on first use if it is not installed locally
  • Update README and .env.example with Ollama configuration
  • Add regression tests for Ollama routing, model selection, model pull behavior, and chat provider routing
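The message-conversion and auto-pull changes above could look roughly like the sketch below. This is a hypothetical illustration, not the PR's actual code: function names (`to_ollama_messages`, `ensure_model`) are invented, and the key assumed difference is that Ollama's chat API takes tool-call arguments as a parsed dict while OpenAI serialises them as a JSON string, and has no `tool_call_id` field on tool-result messages.

```python
import json

def to_ollama_messages(history):
    """Convert OpenAI-style chat history into Ollama-compatible messages.

    Assumed differences handled here: tool-call arguments are parsed
    from a JSON string into a dict, and the OpenAI-only ``tool_call_id``
    field is dropped from tool-result messages.
    """
    out = []
    for msg in history:
        role = msg["role"]
        if role == "assistant" and msg.get("tool_calls"):
            out.append({
                "role": "assistant",
                "content": msg.get("content") or "",
                "tool_calls": [
                    {"function": {
                        "name": tc["function"]["name"],
                        "arguments": json.loads(tc["function"]["arguments"]),
                    }}
                    for tc in msg["tool_calls"]
                ],
            })
        elif role == "tool":
            # Ollama tool results carry only role and content.
            out.append({"role": "tool", "content": msg["content"]})
        else:
            out.append({"role": role, "content": msg.get("content") or ""})
    return out


def ensure_model(model: str) -> None:
    """Auto-pull the configured model on first use if it is missing locally.

    Sketch only: probes the model with ``ollama.show`` and pulls it when
    the server reports it as unknown.
    """
    import ollama  # lazy import so other providers do not need the SDK
    try:
        ollama.show(model)
    except ollama.ResponseError:
        ollama.pull(model)
```

`ensure_model` requires a running Ollama server; `to_ollama_messages` is pure and can be unit-tested in isolation, which is presumably what the regression tests exercise.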

Testing

  • git diff --check
  • python -m py_compile finbot/agents/chat.py finbot/core/llm/client.py finbot/core/llm/ollama_client.py finbot/config.py
  • uv lock --check
  • env DEBUG=true .venv/bin/pytest tests/unit/llm/test_llm_client.py tests/unit/llm/test_ollama_client.py tests/unit/agents/test_chat_provider_routing.py -q

Result:

45 passed, 2 skipped
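The provider-routing behaviour those tests cover can be sketched as a small registry lookup. This is a hypothetical reconstruction, not FinBot's actual code: the `resolve_provider` name and `_PROVIDERS` table are invented for illustration, assuming routing is driven by the `LLM_PROVIDER` environment variable described in the changes above.

```python
import os

# Hypothetical registry mapping provider names to client identifiers.
_PROVIDERS = {"openai": "OpenAIClient", "ollama": "OllamaClient"}

def resolve_provider(env=os.environ):
    """Resolve the configured LLM client from LLM_PROVIDER.

    Defaults to OpenAI for backwards compatibility and fails fast on
    unknown provider names.
    """
    name = env.get("LLM_PROVIDER", "openai")
    try:
        return _PROVIDERS[name]
    except KeyError:
        raise ValueError(f"Unsupported LLM_PROVIDER: {name!r}")
```

A regression test would then assert that `LLM_PROVIDER=ollama` selects the Ollama client, that the default remains OpenAI, and that an unknown value raises rather than silently falling back.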

