Karpathy’s LLM Wiki, 100% local with Ollama. Drop Markdown notes → AI extracts concepts → your Obsidian wiki auto-links and grows. Zero sharing. Your notes stay yours.
Updated May 2, 2026 - Python
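The "drop notes → extract concepts → auto-link" pipeline above can be sketched in a few lines. This is a minimal illustration, not the project's code: a regex stands in for the LLM concept-extraction step (the real pipeline would ask Ollama), and `extract_concepts`/`auto_link` are hypothetical helper names.

```python
import re

def extract_concepts(note: str) -> set[str]:
    # Stand-in for the LLM step: treat capitalized multi-word phrases
    # as candidate concepts. A real run would prompt a local model.
    return set(re.findall(r"\b[A-Z][a-z]+(?: [A-Z][a-z]+)+\b", note))

def auto_link(note: str, concepts: set[str]) -> str:
    # Wrap each known concept in Obsidian-style [[wikilinks]],
    # skipping text that is already linked.
    for concept in sorted(concepts, key=len, reverse=True):
        pattern = rf"(?<!\[)\b{re.escape(concept)}\b(?!\])"
        note = re.sub(pattern, f"[[{concept}]]", note)
    return note

note = "Retrieval Augmented Generation pairs a retriever with a generator."
linked = auto_link(note, extract_concepts(note))
```

Each new note processed this way gains links to concepts the vault already knows, which is how the wiki "auto-links and grows."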
Local-first RAG platform — Ollama + Qdrant + Redis + MinIO. No cloud, no API keys, runs entirely on your machine.
Local RAG (FAISS + SQLite + Ollama) for offline querying of sensitive documentation with evidence citations. PySide6 UI.
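The retrieval half of a local RAG stack like the one above can be sketched without any services running. This is a toy illustration under stated assumptions: a bag-of-words cosine score stands in for FAISS vector search and for Ollama embeddings, and `embed`/`retrieve` are hypothetical names, not the project's API.

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": word counts. A real local stack would call an
    # Ollama embedding model and index the vectors in FAISS or Qdrant.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 1) -> list[str]:
    # Rank chunks by similarity to the query; the top hits (with their
    # source metadata) would be cited as evidence in the answer.
    q = embed(query)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:k]

docs = [
    "FAISS stores dense vectors for fast similarity search.",
    "SQLite keeps chunk metadata and source citations.",
    "Ollama serves the local language model.",
]
top = retrieve("which component serves the language model", docs)
```

In the full offline workflow, the retrieved chunks are stuffed into a prompt for the local model, and their stored source paths become the evidence citations.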
Maintain a centralized knowledge wiki for LLMs by compiling raw documentation into actionable, versioned markdown files.
Automate your documentation by using LLMs to read files, structure a searchable wiki, and keep content current with source tracking.
Build a personal wiki from your files, messages, and bookmarks using an automated LLM pipeline for knowledge management.
Query your Obsidian vault using local LLMs to generate text and retrieve information.
Build a structured knowledge base with LLM agents to automate documentation, link concepts, and preserve information long-term in Markdown.