Karpathy’s LLM Wiki, 100% local with Ollama. Drop Markdown notes → AI extracts concepts → your Obsidian wiki auto-links and grows. Zero sharing. Your notes stay yours.
Python · Updated May 2, 2026
Modular Context | Karpathy LLM Knowledge Base + Gmail & G-Cal — a multi-account MCP server for Claude Code; encrypted and local-first.
Structured marketing knowledge base optimized for AI, LLMs, agents, and prompt workflows. Modular, reusable, and high-density content for sales, marketing, and copywriting.
The Karpathy LLM Wiki, production-ready. Zero-RAG personal knowledge base with MCP server, multi-agent support, and hallucination enforcement. No embeddings, no vector DBs — just markdown + git.
Query your Obsidian vault using local LLMs to generate text and retrieve information.
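A minimal sketch of the vault-query pattern in the line above: rank notes by keyword overlap with the question, then assemble the best matches into a prompt for a local model. The file layout, the naive scoring, and the prompt template are all assumptions for illustration, not the repo's actual code; the resulting prompt would be sent to Ollama as in the sketch further up.

```python
from pathlib import Path

def score(question: str, text: str) -> int:
    """Naive relevance: count distinct question words that appear in the note."""
    words = {w.lower() for w in question.split() if len(w) > 3}
    lower = text.lower()
    return sum(1 for w in words if w in lower)

def top_notes(question: str, vault: Path, k: int = 3) -> list[tuple[str, str]]:
    """Return the k most relevant (filename, content) pairs from the vault."""
    notes = [(p.name, p.read_text(encoding="utf-8")) for p in vault.rglob("*.md")]
    notes.sort(key=lambda n: score(question, n[1]), reverse=True)
    return notes[:k]

def build_prompt(question: str, notes: list[tuple[str, str]]) -> str:
    """Assemble a grounded prompt from the retrieved notes (hypothetical template)."""
    context = "\n\n".join(f"## {name}\n{text}" for name, text in notes)
    return f"Answer using only these notes:\n\n{context}\n\nQuestion: {question}"
```

Keyword overlap is deliberately crude — it needs no embeddings or vector database, matching the markdown-plus-git philosophy several of these repos advertise.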
Automate your documentation by using LLMs to read files, structure a searchable wiki, and maintain current content with source tracking.
Maintain a centralized knowledge wiki for LLMs by compiling raw documentation into actionable, versioned markdown files.
Manage multiple Gmail accounts with an MCP server that lets AI agents read, search, label, archive, and unsubscribe from email.
Manage your personal knowledge base using a framework of markdown files that AI coding agents read and maintain within an Obsidian vault.