offline-ollama-assistant is a fully offline Python project that turns your local Ollama models into a configurable coding assistant and general-purpose AI tool.
- Discover local Ollama models automatically.
- Route prompts to different model roles using JSON settings.
- Read, summarize, and edit local files with built-in guardrails.
- Extract text from PDF and DOCX files when optional dependencies are installed.
- Keep prompts, model selection, permissions, and routing rules in simple config files.
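Model discovery works against Ollama's local HTTP API, which exposes pulled models at the /api/tags endpoint. As a rough sketch of that mechanism (the endpoint is part of Ollama's public API, but the helper names here are illustrative, not this project's actual code):

```python
import json
import urllib.request


def extract_model_names(tags_payload: dict) -> list:
    """Pull model names out of an Ollama /api/tags response payload."""
    return [entry["name"] for entry in tags_payload.get("models", [])]


def list_local_models(base_url: str = "http://127.0.0.1:11434") -> list:
    """Query a running Ollama instance for its locally pulled models."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return extract_model_names(json.load(resp))
```

Running `list_local_models()` while Ollama is up should return the same names that `ollama list` prints.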
Before you start, make sure you have:
- Python 3.10 or newer
- Ollama installed and running locally
- At least one Ollama model pulled on your machine
Example model install:
```
ollama pull llama3.2:1b
```

Check that Ollama is running:

```
ollama list
```

Follow these steps to set up the project on your machine.
```
git clone https://github.com/abhinav25232354/CodingAgent.git
cd CodingAgent
```

Windows PowerShell:
```
python -m venv .venv
.venv\Scripts\Activate.ps1
```

macOS/Linux:
```
python3 -m venv .venv
source .venv/bin/activate
```

Basic install:
```
pip install -e .
```

Install with document support for PDF and DOCX:

```
pip install -e .[docs]
```

If you have not already downloaded a model, pull one now:
```
ollama pull llama3.2:1b
```

Then verify:

```
ollama list
```

Open settings/settings.json and check these values:
- ollama.base_url: should usually stay as http://127.0.0.1:11434
- model_selection.active_model: set this to a model you already pulled
- permissions.read_roots: directories the assistant can read
- permissions.write_roots: directories the assistant can modify
Default example:
```json
"model_selection": {
  "mode": "fixed",
  "active_model": "llama3.2:1b",
  "allow_multi_model_fallback": false,
  "default_role": "general"
}
```

You can start the assistant in either of these ways:
```
python main.py
```

or:

```
offline-assistant
```

If you want to use this project as your own local coding assistant setup, use this flow:
Make a folder for the projects you want the assistant to work with.
Example:
```
DevWorkspace/
|-- CodingAgent/
|-- my-project-1/
|-- my-project-2/
```
In settings/settings.json, update:
- read_roots to include folders the assistant may inspect
- write_roots to include folders the assistant may edit
If you want the assistant to work only inside this repository, keep both as ".".
If you want it to help inside another local project, point those settings to that project folder.
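For illustration, a permissions block pointing the assistant at a sibling project might look something like this (the read_roots/write_roots keys come from settings.json as described above; the exact surrounding layout may differ in your version):

```json
"permissions": {
  "read_roots": ["../my-project-1", "."],
  "write_roots": ["../my-project-1"]
}
```

Keeping write_roots narrower than read_roots is a sensible default: the assistant can inspect more than it can modify.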
Suggested starting point:
- Small local setup: llama3.2:1b
- Stronger coding help: choose a larger coding-capable Ollama model you already have installed
Set the chosen model in:
settings/settings.json -> model_selection.active_model
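If you switch models often, a small script can flip that setting for you. This is a generic JSON-editing sketch, not part of the project itself:

```python
import json
from pathlib import Path


def set_active_model(settings_path: str, model_name: str) -> None:
    """Rewrite model_selection.active_model in a settings.json file."""
    path = Path(settings_path)
    settings = json.loads(path.read_text())
    settings["model_selection"]["active_model"] = model_name
    path.write_text(json.dumps(settings, indent=2))
```

For example, `set_active_model("settings/settings.json", "llama3.2:1b")` swaps the active model without touching any other setting.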
You can customize:
- settings/system_prompts.json for assistant behavior
- settings/settings.json for routing, permissions, and tool access
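As a hypothetical illustration, a prompts file keyed by role might take a shape like this (the role names and structure here are assumptions; check the shipped settings/system_prompts.json for the real layout):

```json
{
  "general": "You are a concise, fully offline coding assistant. Prefer minimal diffs when editing files.",
  "summarizer": "Summarize the given document in plain language, keeping key technical details."
}
```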
This lets you create your own coding environment for:
- code explanation
- refactoring help
- local file editing
- document summarization
- offline project assistance
Activate your virtual environment, make sure Ollama is running, and launch the assistant.
Use this checklist if you are setting up for the first time:
- Install Python 3.10+
- Install Ollama
- Pull at least one model with ollama pull <model-name>
- Create and activate a virtual environment
- Run pip install -e .
- Optionally run pip install -e .[docs]
- Update settings/settings.json
- Run python main.py
```
.
|-- pyproject.toml
|-- README.md
|-- main.py
|-- settings/
|   |-- settings.json
|   `-- system_prompts.json
|-- src/
|   `-- offline_assistant/
|       |-- __init__.py
|       |-- assistant.py
|       |-- cli.py
|       |-- config.py
|       |-- models.py
|       |-- ollama_client.py
|       |-- prompting.py
|       |-- routing.py
|       |-- utils.py
|       `-- tools/
|           |-- __init__.py
|           |-- base.py
|           |-- documents.py
|           |-- files.py
|           |-- registry.py
|           |-- textops.py
|           `-- web.py
`-- examples/
    |-- sample_session.txt
    `-- tasks.md
```
settings/settings.json controls:
- Ollama connection
- model selection
- routing rules
- read/write permissions
- tool availability
- generation settings
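As a purely hypothetical sketch of what JSON routing rules could look like (the key names and matching scheme here are invented for illustration; consult the shipped settings.json for the actual schema):

```json
"routing": {
  "rules": [
    { "match": "summarize", "role": "summarizer" },
    { "match": "refactor", "role": "coder" }
  ]
}
```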
settings/system_prompts.json controls the assistant's behavior and system prompt templates.
- The assistant works even if you only have one local model.
- Document extraction features need pip install -e .[docs].
- The project is designed for local, offline usage with Ollama running on your machine.
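The optional document support can be thought of as a lazy import behind a file-type check. The following is a simplified sketch under the assumption that the [docs] extra provides pypdf and python-docx; the project's real logic lives in src/offline_assistant/tools/documents.py and may differ:

```python
from pathlib import Path

# File types that require the optional [docs] dependencies.
DOC_EXTENSIONS = {".pdf", ".docx"}


def needs_docs_extra(path: str) -> bool:
    """True if reading this file requires the optional [docs] dependencies."""
    return Path(path).suffix.lower() in DOC_EXTENSIONS


def extract_text(path: str) -> str:
    """Extract plain text, importing document libraries only on demand."""
    if not needs_docs_extra(path):
        return Path(path).read_text()
    try:
        if path.lower().endswith(".pdf"):
            from pypdf import PdfReader  # assumed to come from the [docs] extra
            return "\n".join(p.extract_text() or "" for p in PdfReader(path).pages)
        from docx import Document  # python-docx, assumed from the [docs] extra
        return "\n".join(p.text for p in Document(path).paragraphs)
    except ImportError as exc:
        raise RuntimeError(
            "Install document support with: pip install -e .[docs]"
        ) from exc
```

Plain text files are read directly; only PDF and DOCX paths trigger the optional imports, so the base install stays dependency-light.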