
support for Ollama is added. #19

Open
sinakhatibi wants to merge 3 commits into pixegami:main from sinakhatibi:main

Conversation

@sinakhatibi

Hi,

I have added a similar file with the suffix _ollama, which uses only Ollama models for creating embeddings and responses.

- Replaced OpenAIEmbeddings with OllamaEmbeddings for embedding generation.
- Updated the model to use OllamaLLM instead of ChatOpenAI.
- Adjusted import statements to include necessary Ollama modules.
- Added prompt formatting and response handling using OllamaLLM.
- Ensured the query processing and response generation utilize the Ollama model.
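The steps above can be sketched roughly as follows. This is a minimal illustration, not the PR's exact code: it assumes the `langchain-ollama` package, and the prompt template and model names are placeholders.

```python
# Sketch of the OpenAI -> Ollama swap described above.
# Assumes the langchain-ollama package; model names are illustrative.

PROMPT_TEMPLATE = """Answer the question based only on the following context:

{context}

---

Answer the question based on the above context: {question}
"""


def build_prompt(context_chunks, question):
    """Join retrieved chunks and format them with the question into one prompt."""
    context = "\n\n---\n\n".join(context_chunks)
    return PROMPT_TEMPLATE.format(context=context, question=question)


def query_ollama(prompt, model_name="llama3"):
    """Send the formatted prompt to a local Ollama model and return its reply."""
    from langchain_ollama import OllamaLLM  # requires the langchain-ollama package

    model = OllamaLLM(model=model_name)
    return model.invoke(prompt)
```

The embedding side is the same substitution: `OllamaEmbeddings(model="nomic-embed-text")` in place of `OpenAIEmbeddings()` wherever the vector store is built or queried.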
@ahmedovelshan

ahmedovelshan commented May 6, 2025

I ran Ollama as a Docker container on an EC2 instance and used your code with some modifications. When I run it, the size of the DB does not change. Did I make any mistakes?


```python
import os
import shutil

from langchain_community.vectorstores import Chroma
from langchain_ollama import OllamaEmbeddings


def save_to_chroma(chunks):
    # Clear out the existing database first.
    if os.path.exists(CHROMA_PATH):
        shutil.rmtree(CHROMA_PATH)
    embeddings = OllamaEmbeddings(model="nomic-embed-text", base_url=os.getenv("OLLAMA_BASE_URL"))
    db = Chroma.from_documents(chunks, embeddings, persist_directory=CHROMA_PATH)
    db.persist()
    print(f"Saved {len(chunks)} chunks to {CHROMA_PATH}.")
```

@sinakhatibi
Author

Hi,

In my case, Ollama was installed on the same machine as the code. Do you get any error or warning?
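One way to narrow this down is to probe the embedding endpoint directly, bypassing langchain. The sketch below calls Ollama's REST API (`POST /api/embeddings`) using only the standard library; it assumes the same `OLLAMA_BASE_URL` environment variable used in the snippet above, falling back to Ollama's default port. If the request fails or the vector comes back empty, that would explain why nothing is written to the DB.

```python
import json
import os
import urllib.request


def embeddings_url(base_url=None):
    """Resolve the Ollama embeddings endpoint from OLLAMA_BASE_URL (or a default)."""
    base = (base_url or os.getenv("OLLAMA_BASE_URL") or "http://localhost:11434").rstrip("/")
    return base + "/api/embeddings"


def probe_embeddings(model="nomic-embed-text"):
    """Request one embedding from Ollama; an error or empty vector points to the server."""
    payload = json.dumps({"model": model, "prompt": "hello"}).encode()
    req = urllib.request.Request(
        embeddings_url(), data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        vector = json.loads(resp.read()).get("embedding", [])
    print(f"embedding length: {len(vector)}")
    return vector
```

With Ollama in a Docker container, also check that the port is published and that `OLLAMA_BASE_URL` points at the container from wherever the script runs, not at `localhost` inside it.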

