Remove Ollama And Use Gemini API #5

@waygeance

Description

Is your feature request related to a problem? Please describe.

  • Right now, the project relies on Ollama for LLM functionality. This adds extra setup steps (installing and running Ollama locally), makes it harder to deploy in different environments (e.g., cloud, CI, shared servers), and tightly couples the app to a local runtime.

Describe the solution you'd like

  • Replace the existing Ollama integration with the Google Gemini API (or add Gemini as the primary/default LLM backend).
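One way to frame the swap is a thin backend abstraction so the rest of the app never talks to Ollama or Gemini directly. A minimal sketch is below; the request shapes follow the public Ollama `/api/generate` and Gemini `generateContent` REST endpoints, but the model names, class names, and defaults are illustrative assumptions, not this project's actual code:

```python
# Sketch of an LLM-backend abstraction so Ollama can be swapped for the
# Gemini API behind one interface. Only builds request payloads; sending
# them (e.g. with requests.post) is left to a single call site.
from dataclasses import dataclass


@dataclass
class OllamaBackend:
    model: str = "llama3"  # hypothetical default model
    url: str = "http://localhost:11434/api/generate"

    def build_request(self, prompt: str) -> dict:
        # Ollama's /api/generate takes the model and prompt at the top level.
        return {
            "url": self.url,
            "json": {"model": self.model, "prompt": prompt, "stream": False},
        }


@dataclass
class GeminiBackend:
    api_key: str
    model: str = "gemini-1.5-flash"  # hypothetical default model

    def build_request(self, prompt: str) -> dict:
        # Gemini's generateContent wraps the prompt in contents -> parts.
        url = (
            "https://generativelanguage.googleapis.com/v1beta/"
            f"models/{self.model}:generateContent?key={self.api_key}"
        )
        return {
            "url": url,
            "json": {"contents": [{"parts": [{"text": prompt}]}]},
        }
```

With this shape, Gemini can be made the default backend while Ollama stays available as an opt-in for local development, and the coupling to a local runtime is confined to one class.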

Metadata

Assignees

No one assigned

    Labels

    enhancement (New feature or request)

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
