Built on LangChain v1.2 & LangGraph v1.0
with persistent memory, dynamic prompts, and seamless tool integration
FinchBot is a lightweight, modular AI Agent framework built on LangChain v1.2 and LangGraph v1.0. It's not just another LLM wrapper—it's a thoughtfully designed architecture focused on three core challenges:
- How can an Agent be extended without limit? — Through a dual-layer extension mechanism of Skills and Tools
- How can an Agent be given real memory? — Through a dual-layer storage architecture plus Agentic RAG
- How can Agent behavior be customized? — Through a dynamic prompt file system
- Why FinchBot?
- System Architecture
- Core Components
- Quick Start
- Tech Stack
- Extension Guide
- Documentation
| Pain Point | Traditional Approach | FinchBot Solution |
|---|---|---|
| Hard to Extend | Requires modifying core code | Inherit FinchTool base class or create Markdown skill files |
| Fragile Memory | Relies on LLM context window | SQLite + Vector dual storage + Agentic RAG + Weighted RRF |
| Rigid Prompts | Hardcoded in source code | Bootstrap file system, user-customizable prompts, hot reload |
| Slow Startup | Synchronous blocking I/O | Fully async + Thread pool concurrency, 3-5x faster startup |
| Outdated Architecture | Based on old LangChain APIs | LangChain v1.2 + LangGraph v1.0 state graph orchestration |
graph BT
classDef roof fill:#ffebee,stroke:#c62828,stroke-width:3px,color:#b71c1c,rx:10,ry:10;
classDef pillar fill:#e3f2fd,stroke:#1565c0,stroke-width:2px,color:#0d47a1,rx:8,ry:8;
classDef base fill:#e8f5e9,stroke:#2e7d32,stroke-width:3px,color:#1b5e20,rx:10,ry:10;
Roof("FinchBot Framework<br/>Lightweight • Flexible • Extensible"):::roof
subgraph Pillars [Core Philosophy]
direction LR
P("Privacy First<br/>Local Embedding<br/>No Cloud Upload"):::pillar
M("Modularity<br/>Factory Pattern<br/>Decoupled Design"):::pillar
D("Dev Friendly<br/>Type Safety<br/>Rich Documentation"):::pillar
S("Fast Startup<br/>Fully Async<br/>Thread Pool"):::pillar
O("Out of Box<br/>Zero Config<br/>Auto Fallback"):::pillar
end
Base("Tech Foundation<br/>LangChain v1.2 • LangGraph v1.0 • Python 3.13"):::base
Base === P & M & D & S & O
P & M & D & S & O === Roof
FinchBot integrates with LangBot for multi-platform messaging - develop once, reach everywhere:
LangBot (15k+ GitHub Stars) is a production-grade multi-platform bot framework supporting 12+ messaging platforms.
Quick Start with LangBot:
# Install LangBot
uvx langbot
# Access WebUI at http://localhost:5300
# Configure your platforms and connect to FinchBot

FinchBot uses the official langchain-mcp-adapters library for MCP integration:
# Configure MCP servers in config
finchbot config
# Select "MCP Configuration" option

MCP Features:
- Dynamic tool discovery and registration
- Standardized tool calling interface
- Support for stdio and HTTP transports
- Multiple MCP servers support
FinchBot provides a full-featured command-line interface; three commands get you started:
# Step 1: Configure API keys and default model
uv run finchbot config
# Step 2: Manage your sessions
uv run finchbot sessions
# Step 3: Start chatting
uv run finchbot chat

| Feature | Description |
|---|---|
| Environment Variables | All configurations can be set via environment variables (OPENAI_API_KEY, ANTHROPIC_API_KEY, etc.) |
| i18n Support | Built-in Chinese/English support, auto-detects system language |
| Auto Fallback | Web search automatically falls back through Tavily → Brave → DuckDuckGo |
FinchBot is built on LangChain v1.2 and LangGraph v1.0, serving as an Agent system with persistent memory, dynamic tool scheduling, and multi-platform messaging support.
graph TB
classDef uiLayer fill:#ffebee,stroke:#c62828,stroke-width:2px,color:#b71c1c;
classDef coreLayer fill:#e3f2fd,stroke:#1565c0,stroke-width:2px,color:#0d47a1;
classDef infraLayer fill:#e8f5e9,stroke:#2e7d32,stroke-width:2px,color:#1b5e20;
subgraph UI [User Interaction Layer]
CLI[CLI Interface]:::uiLayer
Channels[Multi-platform Channels<br/>Discord/DingTalk/Feishu/WeChat/Email]:::uiLayer
end
subgraph Core [Agent Core]
Agent[LangGraph Agent<br/>Decision Engine]:::coreLayer
Context[ContextBuilder<br/>Context Building]:::coreLayer
Tools[ToolRegistry<br/>15 Built-in Tools + MCP]:::coreLayer
Memory[MemoryManager<br/>Dual-layer Memory]:::coreLayer
end
subgraph Infra [Infrastructure Layer]
Storage[Dual-layer Storage<br/>SQLite + VectorStore]:::infraLayer
LLM[LLM Providers<br/>OpenAI/Anthropic/DeepSeek]:::infraLayer
end
CLI --> Agent
Channels --> Agent
Agent --> Context
Agent <--> Tools
Agent <--> Memory
Memory --> Storage
Agent --> LLM
sequenceDiagram
autonumber
participant U as User
participant C as Channel
participant B as MessageBus
participant F as AgentFactory
participant A as Agent
participant M as MemoryManager
participant T as Tools
participant L as LLM
U->>C: Send Message
C->>B: InboundMessage
B->>F: Get/Create Agent
F->>A: Return Compiled Agent
Note over A: Build Context
A->>M: Recall Relevant Memories
M-->>A: Return Context
A->>L: Send Request
L-->>A: Stream Response
alt Tool Call Needed
A->>T: Execute Tool
T-->>A: Return Result
A->>L: Continue with Result
L-->>A: Final Response
end
A->>M: Store New Memories
A->>B: OutboundMessage
B->>C: Route to Channel
C->>U: Display Response
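The InboundMessage/OutboundMessage flow in the sequence above can be sketched with simplified stand-ins for the models in channels/schema.py (the field names here are illustrative, not FinchBot's actual schema):

```python
from dataclasses import dataclass

# Illustrative stand-ins for the message models in channels/schema.py;
# the real field names and types may differ.
@dataclass
class InboundMessage:
    channel: str     # e.g. "discord", "cli"
    session_id: str  # routing key used to find or create the Agent
    text: str

@dataclass
class OutboundMessage:
    channel: str
    session_id: str
    text: str

def route_reply(msg: InboundMessage, reply_text: str) -> OutboundMessage:
    """Echo the routing metadata back so the bus can deliver to the right channel."""
    return OutboundMessage(channel=msg.channel, session_id=msg.session_id, text=reply_text)
```

The key design point is that the MessageBus never needs platform-specific logic: the channel and session identifiers travel with every message, so the reply is routed by data, not by code.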
finchbot/
├── agent/ # Agent Core
│ ├── core.py # Agent creation and execution
│ ├── factory.py # AgentFactory for component assembly
│ ├── context.py # ContextBuilder for prompt assembly
│ ├── capabilities.py # CapabilitiesBuilder for capability building
│ └── skills.py # SkillsLoader for Markdown skills
├── channels/ # Multi-Platform Messaging (via LangBot)
│ ├── base.py # BaseChannel abstract class
│ ├── bus.py # MessageBus async router
│ ├── manager.py # ChannelManager coordinator
│ ├── schema.py # Message models
│ └── langbot_integration.py # LangBot integration guide
├── cli/ # CLI Interface
│ ├── chat_session.py
│ ├── config_manager.py
│ ├── providers.py
│ └── ui.py
├── config/ # Configuration Management
│ ├── loader.py
│ ├── schema.py # Includes MCPConfig, ChannelsConfig
│ └── ...
├── constants.py # Unified constants definition
├── i18n/ # Internationalization
│ ├── loader.py # Language loader
│ └── locales/
├── memory/ # Memory System
│ ├── manager.py
│ ├── types.py
│ ├── services/
│ └── storage/
├── providers/ # LLM Providers
│ └── factory.py
├── sessions/ # Session Management
│ ├── metadata.py
│ ├── selector.py
│ └── title_generator.py
├── skills/ # Skill System
│ ├── skill-creator/
│ ├── summarize/
│ └── weather/
├── tools/ # Tool System
│ ├── base.py
│ ├── factory.py # MCP tools via langchain-mcp-adapters
│ ├── registry.py
│ ├── config_tools.py # Configuration tools
│ ├── tools_generator.py # Tool documentation generator
│ ├── filesystem.py
│ ├── memory.py
│ ├── shell.py
│ ├── web.py
│ ├── session_title.py
│ └── search/
└── utils/ # Utility Functions
├── cache.py
├── logger.py
└── model_downloader.py
FinchBot implements a dual-layer memory architecture that addresses LLM context-window limits and long-term forgetting.
| Dimension | Traditional RAG | Agentic RAG (FinchBot) |
|---|---|---|
| Retrieval Trigger | Fixed pipeline | Agent autonomous decision |
| Retrieval Strategy | Single vector retrieval | Hybrid retrieval + dynamic weight adjustment |
| Memory Management | Passive storage | Active remember/recall/forget |
| Classification | None | Auto-classification + importance scoring |
| Update Mechanism | Full rebuild | Incremental sync |
flowchart TB
classDef businessLayer fill:#e3f2fd,stroke:#1565c0,stroke-width:2px,color:#0d47a1;
classDef serviceLayer fill:#fff9c4,stroke:#fbc02d,stroke-width:2px,color:#f57f17;
classDef storageLayer fill:#e8f5e9,stroke:#2e7d32,stroke-width:2px,color:#1b5e20;
MM[MemoryManager<br/>remember/recall/forget]:::businessLayer
RS[RetrievalService<br/>Hybrid Retrieval + RRF]:::serviceLayer
CS[ClassificationService<br/>Auto Classification]:::serviceLayer
IS[ImportanceScorer<br/>Importance Scoring]:::serviceLayer
ES[EmbeddingService<br/>FastEmbed Local]:::serviceLayer
SQLite[(SQLiteStore<br/>Source of Truth<br/>Precise Query)]:::storageLayer
Vector[(VectorStore<br/>ChromaDB<br/>Semantic Search)]:::storageLayer
DS[DataSyncManager<br/>Incremental Sync]:::storageLayer
MM --> RS & CS & IS
RS --> SQLite & Vector
CS --> SQLite
IS --> SQLite
ES --> Vector
SQLite <--> DS <--> Vector
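The hybrid retrieval step (RetrievalService in the diagram) fuses keyword and semantic rankings with weighted Reciprocal Rank Fusion. A minimal, self-contained sketch of that fusion — the real implementation lives in memory/services/ and may differ in detail:

```python
def weighted_rrf(
    keyword_ranked: list[str],
    semantic_ranked: list[str],
    w_keyword: float,
    w_semantic: float,
    k: int = 60,  # standard RRF smoothing constant
) -> list[str]:
    """Fuse two ranked lists: score(d) = w_kw/(k + rank_kw) + w_sem/(k + rank_sem)."""
    scores: dict[str, float] = {}
    for weight, ranking in ((w_keyword, keyword_ranked), (w_semantic, semantic_ranked)):
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + weight / (k + rank)
    # Highest fused score first
    return sorted(scores, key=scores.__getitem__, reverse=True)
```

With factual-style weights (0.8/0.2), documents that rank well on keywords dominate the fused order; conceptual-style weights (0.2/0.8) invert that preference.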
FinchBot uses Weighted RRF (Weighted Reciprocal Rank Fusion) strategy:
class QueryType(StrEnum):
    """Query type determines retrieval weights"""

    KEYWORD_ONLY = "keyword_only"    # Pure keyword (1.0/0.0)
    SEMANTIC_ONLY = "semantic_only"  # Pure semantic (0.0/1.0)
    FACTUAL = "factual"              # Factual (0.8/0.2)
    CONCEPTUAL = "conceptual"        # Conceptual (0.2/0.8)
    COMPLEX = "complex"              # Complex (0.5/0.5)
    AMBIGUOUS = "ambiguous"          # Ambiguous (0.3/0.7)

FinchBot's prompt system uses a file system + modular assembly design.
~/.finchbot/
├── config.json # Main configuration file
└── workspace/
├── bootstrap/ # Bootstrap files directory
│ ├── SYSTEM.md # Role definition
│ ├── MEMORY_GUIDE.md # Memory usage guide
│ ├── SOUL.md # Personality settings
│ └── AGENT_CONFIG.md # Agent configuration
├── config/ # Configuration directory
│ └── mcp.json # MCP server configuration
├── generated/ # Auto-generated files
│ ├── TOOLS.md # Tool documentation
│ └── CAPABILITIES.md # Capabilities info
├── skills/ # Custom skills
├── memory/ # Memory storage
└── sessions/ # Session data
flowchart TD
classDef startEnd fill:#ffebee,stroke:#c62828,stroke-width:2px,color:#b71c1c;
classDef process fill:#e3f2fd,stroke:#1565c0,stroke-width:2px,color:#0d47a1;
classDef file fill:#fff9c4,stroke:#fbc02d,stroke-width:2px,color:#f57f17;
classDef output fill:#e8f5e9,stroke:#2e7d32,stroke-width:2px,color:#1b5e20;
A([Agent Startup]):::startEnd --> B[Load Bootstrap Files]:::process
B --> C[bootstrap/SYSTEM.md]:::file
B --> D[bootstrap/MEMORY_GUIDE.md]:::file
B --> E[bootstrap/SOUL.md]:::file
B --> F[bootstrap/AGENT_CONFIG.md]:::file
C --> G[Assemble Prompt]:::process
D --> G
E --> G
F --> G
G --> H[Load Always-on Skills]:::process
H --> I[Build Skill Summary XML]:::process
I --> J[Generate Tool Docs TOOLS.md]:::process
J --> K[Generate Capabilities CAPABILITIES.md]:::process
K --> L[Inject Runtime Info]:::process
L --> M[Complete System Prompt]:::output
M --> N([Send to LLM]):::startEnd
Tools are the bridge between the Agent and the external world. FinchBot ships 15 built-in tools and makes adding new ones straightforward.
flowchart TB
classDef registry fill:#e3f2fd,stroke:#1565c0,stroke-width:2px,color:#0d47a1;
classDef builtin fill:#e8f5e9,stroke:#2e7d32,stroke-width:2px,color:#1b5e20;
classDef custom fill:#fff9c4,stroke:#fbc02d,stroke-width:2px,color:#f57f17;
classDef agent fill:#f3e5f5,stroke:#7b1fa2,stroke-width:2px,color:#7b1fa2;
TR[ToolRegistry<br/>Global Registry]:::registry
Lock[Single-Lock Pattern<br/>Thread-Safe Singleton]:::registry
File[File Operations<br/>read_file / write_file<br/>edit_file / list_dir]:::builtin
Web[Network<br/>web_search / web_extract]:::builtin
Memory[Memory<br/>remember / recall / forget]:::builtin
System[System<br/>exec / session_title]:::builtin
Config[Configuration<br/>configure_mcp / refresh_capabilities<br/>get_capabilities / get_mcp_config_path]:::builtin
Inherit[Inherit FinchTool<br/>Implement _run]:::custom
Register[Register to Registry]:::custom
Agent[Agent Call]:::agent
TR --> Lock
Lock --> File & Web & Memory & System & Config
Lock --> Inherit --> Register
File --> Agent
Web --> Agent
Memory --> Agent
System --> Agent
Config --> Agent
Register --> Agent
| Category | Tool | Function |
|---|---|---|
| File Ops | `read_file` | Read local files |
| | `write_file` | Write local files |
| | `edit_file` | Edit file content |
| | `list_dir` | List directory contents |
| Network | `web_search` | Web search (Tavily/Brave/DDG) |
| | `web_extract` | Web content extraction |
| Memory | `remember` | Proactively store memories |
| | `recall` | Retrieve memories |
| | `forget` | Delete/archive memories |
| System | `exec` | Secure shell execution |
| | `session_title` | Manage session titles |
| Configuration | `configure_mcp` | Dynamically configure MCP servers (enable/disable/add/update/remove/list) |
| | `refresh_capabilities` | Refresh capabilities file |
| | `get_capabilities` | Get current capabilities |
| | `get_mcp_config_path` | Get MCP config path |
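The "Single-Lock Pattern / Thread-Safe Singleton" shown in the tool-architecture diagram refers to double-checked locking. A generic sketch of the pattern — not FinchBot's actual ToolRegistry code:

```python
import threading

class Registry:
    """Double-checked locking: check once without the lock for speed,
    then re-check inside the lock for correctness."""

    _instance = None
    _lock = threading.Lock()

    def __new__(cls):
        if cls._instance is None:          # fast path: no lock on the common case
            with cls._lock:
                if cls._instance is None:  # re-check: another thread may have won
                    inst = super().__new__(cls)
                    inst.tools = {}
                    cls._instance = inst
        return cls._instance
```

The first unlocked check keeps repeated lookups cheap; the second check inside the lock guarantees only one instance is ever created even under concurrent first access.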
flowchart TD
classDef check fill:#fff9c4,stroke:#fbc02d,stroke-width:2px,color:#f57f17;
classDef engine fill:#e8f5e9,stroke:#2e7d32,stroke-width:2px,color:#1b5e20;
classDef fallback fill:#e3f2fd,stroke:#1565c0,stroke-width:2px,color:#0d47a1;
Start[Web Search Request]:::check
Check1{TAVILY_API_KEY<br/>Set?}:::check
Tavily[Tavily<br/>Best Quality<br/>AI-Optimized]:::engine
Check2{BRAVE_API_KEY<br/>Set?}:::check
Brave[Brave Search<br/>Privacy Friendly<br/>Large Free Tier]:::engine
DDG[DuckDuckGo<br/>Zero Config<br/>Always Available]:::fallback
Start --> Check1
Check1 -->|Yes| Tavily
Check1 -->|No| Check2
Check2 -->|Yes| Brave
Check2 -->|No| DDG
| Priority | Engine | API Key | Features |
|---|---|---|---|
| 1 | Tavily | Required | Best quality, AI-optimized, deep search |
| 2 | Brave Search | Required | Large free tier, privacy-friendly |
| 3 | DuckDuckGo | Not required | Always available, zero config |
How it works:
- If `TAVILY_API_KEY` is set → Use Tavily (best quality)
- Else if `BRAVE_API_KEY` is set → Use Brave Search
- Else → Use DuckDuckGo (no API key needed, always works)
This design ensures web search works out of the box even without any API key configuration!
FinchBot's Agent can autonomously manage MCP servers through the configure_mcp tool, enabling dynamic capability expansion without manual configuration file editing.
Supported Operations:
| Operation | Description |
|---|---|
| `add` | Add a new MCP server |
| `update` | Update an existing server configuration |
| `remove` | Delete an MCP server |
| `enable` | Enable a disabled MCP server |
| `disable` | Temporarily disable an MCP server |
| `list` | List all configured servers |
Dynamic Prompt Updates:
When MCP configuration changes, the Agent can refresh capability descriptions through refresh_capabilities, ensuring the system prompt always reflects current capabilities.
flowchart LR
classDef config fill:#e8f5e9,stroke:#2e7d32,stroke-width:2px,color:#1b5e20;
classDef system fill:#e3f2fd,stroke:#1565c0,stroke-width:2px,color:#0d47a1;
classDef prompt fill:#fff9c4,stroke:#fbc02d,stroke-width:2px,color:#f57f17;
MCP[MCP Config<br/>configure_mcp]:::config --> Refresh[refresh_capabilities]:::system --> Builder[CapabilitiesBuilder<br/>Regenerate]:::system --> Write[CAPABILITIES.md]:::prompt --> Load[Next Session<br/>Auto-Load]:::prompt
The session_title tool embodies FinchBot's out-of-the-box philosophy:
| Method | Description | Example |
|---|---|---|
| Auto Generate | After 2-3 turns, AI automatically generates title based on content | "Python Async Programming Discussion" |
| Agent Modify | Tell Agent "Change session title to XXX" | Agent calls tool to modify automatically |
| Manual Rename | Press the `r` key in the session manager to rename | User manually enters a new title |
This design lets users manage sessions, automatically or manually, without worrying about technical details.
Skills are FinchBot's unique innovation—defining Agent capabilities through Markdown files.
FinchBot includes a built-in skill-creator skill, the ultimate expression of the out-of-the-box philosophy:
Just tell the Agent what skill you want, and it will create it automatically!
User: Help me create a translation skill that can translate Chinese to English
Agent: Okay, I'll create a translation skill for you...
[Invokes skill-creator skill]
✅ Created skills/translator/SKILL.md
You can now use the translation feature directly!
No manual file creation, no coding—extend Agent capabilities with just one sentence!
skills/
├── skill-creator/ # Skill creator (Built-in) - Core of out-of-the-box
│ └── SKILL.md
├── summarize/ # Intelligent summarization (Built-in)
│ └── SKILL.md
├── weather/ # Weather query (Built-in)
│ └── SKILL.md
└── my-custom-skill/ # Agent auto-created or user-defined
└── SKILL.md
| Feature | Description |
|---|---|
| Agent Auto-Create | Tell Agent your needs, auto-generates skill files |
| Dual Skill Source | Workspace skills first, built-in skills fallback |
| Dependency Check | Auto-check CLI tools and environment variables |
| Cache Invalidation | Smart caching based on file modification time |
| Progressive Loading | Always-on skills first, others on demand |
FinchBot integrates with LangBot for production-grade multi-platform messaging.
Why LangBot?
- 15k+ GitHub Stars, actively maintained
- Supports 12+ platforms: QQ, WeChat, WeCom, Feishu, DingTalk, Discord, Telegram, Slack, LINE, KOOK, Satori
- Built-in WebUI for easy configuration
- Plugin ecosystem with MCP support
flowchart LR
classDef bus fill:#e3f2fd,stroke:#1565c0,stroke-width:2px,color:#0d47a1;
classDef manager fill:#fff9c4,stroke:#fbc02d,stroke-width:2px,color:#f57f17;
classDef channel fill:#e8f5e9,stroke:#2e7d32,stroke-width:2px,color:#1b5e20;
FinchBot[FinchBot<br/>Agent Core]:::bus
LangBot[LangBot<br/>Platform Layer]:::manager
QQ[QQ]:::channel
WeChat[WeChat]:::channel
Feishu[Feishu]:::channel
DingTalk[DingTalk]:::channel
Discord[Discord]:::channel
Telegram[Telegram]:::channel
Slack[Slack]:::channel
FinchBot <--> LangBot
LangBot <--> QQ & WeChat & Feishu & DingTalk & Discord & Telegram & Slack
# Install LangBot
uvx langbot
# Access WebUI at http://localhost:5300
# Configure your platforms and connect to FinchBot

For more details, see LangBot Documentation.
FinchBot is built on LangChain v1.2 and LangGraph v1.0, using the latest Agent architecture.
from collections.abc import Sequence
from pathlib import Path

from langchain.agents import create_agent
from langchain_core.language_models import BaseChatModel
from langchain_core.tools import BaseTool
from langgraph.checkpoint.memory import MemorySaver
from langgraph.checkpoint.sqlite import SqliteSaver
from langgraph.graph.state import CompiledStateGraph

def create_finch_agent(
    model: BaseChatModel,
    workspace: Path,
    tools: Sequence[BaseTool] | None = None,
    use_persistent: bool = True,
) -> tuple[CompiledStateGraph, SqliteSaver | MemorySaver]:
    # 1. Initialize checkpointer (persistent state); db_path is derived
    #    from workspace (definition elided in this excerpt)
    if use_persistent:
        checkpointer = SqliteSaver.from_conn_string(str(db_path))
    else:
        checkpointer = MemorySaver()
    # 2. Build system prompt
    system_prompt = build_system_prompt(workspace)
    # 3. Create Agent (using the official LangChain API)
    agent = create_agent(
        model=model,
        tools=list(tools) if tools else None,
        system_prompt=system_prompt,
        checkpointer=checkpointer,
    )
    return agent, checkpointer

| Provider | Models | Features |
|---|---|---|
| OpenAI | GPT-5, GPT-5.2, O3-mini | Best overall capability |
| Anthropic | Claude Sonnet 4.5, Opus 4.6 | High safety, long context |
| DeepSeek | DeepSeek Chat, Reasoner | Chinese, cost-effective |
| Gemini | Gemini 2.5 Flash | Google's latest |
| Groq | Llama 4 Scout/Maverick | Ultra-fast inference |
| Moonshot | Kimi K1.5/K2.5 | Long context, Chinese |
| Item | Requirement |
|---|---|
| OS | Windows / Linux / macOS |
| Python | 3.13+ |
| Package Manager | uv (Recommended) |
# Clone repository (choose one)
# Gitee (recommended for users in China)
git clone https://gitee.com/xt765/finchbot.git
# or GitHub
git clone https://github.com/xt765/finchbot.git
cd finchbot
# Install dependencies
uv sync

Note: the embedding model (~95 MB) is automatically downloaded to the local cache the first time you run the application (e.g., `finchbot chat`). No manual intervention is required.
Development Installation
For development, install with dev dependencies:
uv sync --extra dev

This includes: pytest, ruff, basedpyright
# Step 1: Configure API keys and default model
uv run finchbot config
# Step 2: Manage your sessions
uv run finchbot sessions
# Step 3: Start chatting
uv run finchbot chat

That's it! These three commands cover the complete workflow:
- `finchbot config` — Interactive configuration for LLM providers, API keys, and settings
- `finchbot sessions` — Full-screen session manager for creating, renaming, and deleting sessions
- `finchbot chat` — Start or continue an interactive conversation
FinchBot provides official Docker support for easy deployment:
# Clone repository
git clone https://github.com/xt765/finchbot.git
cd finchbot
# Create .env file with your API keys
cp .env.example .env
# Edit .env and add your API keys
# Build and run
docker-compose up -d
# Access the Web interface
# http://localhost:8000

| Feature | Description |
|---|---|
| One-command Deploy | docker-compose up -d |
| Persistent Storage | Workspace and model cache via volumes |
| Health Check | Built-in container health monitoring |
| Multi-arch Support | Works on x86_64 and ARM64 |
# Or set environment variables directly
export OPENAI_API_KEY="your-api-key"
uv run finchbot chat

# Default: Show WARNING and above logs
finchbot chat
# Show INFO and above logs
finchbot -v chat
# Show DEBUG and above logs (debug mode)
finchbot -vv chat

# For memory system semantic search (optional but recommended)
uv run finchbot models download

# Create skill directory
mkdir -p ~/.finchbot/workspace/skills/my-skill
# Create skill file
cat > ~/.finchbot/workspace/skills/my-skill/SKILL.md << 'EOF'
---
name: my-skill
description: My custom skill
metadata:
finchbot:
emoji: ✨
always: false
---
# My Custom Skill
When user requests XXX, I should...
EOF

| Layer | Technology | Version |
|---|---|---|
| Core Language | Python | 3.13+ |
| Agent Framework | LangChain | 1.2.10+ |
| State Management | LangGraph | 1.0.8+ |
| Data Validation | Pydantic | v2 |
| Vector Storage | ChromaDB | 0.5.0+ |
| Local Embedding | FastEmbed | 0.4.0+ |
| Search Enhancement | BM25 | 0.2.2+ |
| CLI Framework | Typer | 0.23.0+ |
| Rich Text | Rich | 14.3.0+ |
| Logging | Loguru | 0.7.3+ |
| Configuration | Pydantic Settings | 2.12.0+ |
| Web Backend | FastAPI | 0.115.0+ |
| Web Frontend | React + Vite | Latest |
Inherit from the FinchTool base class, implement the _run() method, then register it with the ToolRegistry.
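A sketch of the pattern, using a stand-in base class — FinchTool's exact interface lives in tools/base.py, and the attributes assumed here (name, description, _run) are illustrative:

```python
class FinchTool:  # stand-in for finchbot.tools.base.FinchTool
    name: str = ""
    description: str = ""

    def _run(self, **kwargs):
        raise NotImplementedError

class WordCountTool(FinchTool):
    """Example custom tool: count words in a string."""

    name = "word_count"
    description = "Count the number of words in a text string."

    def _run(self, text: str) -> int:
        return len(text.split())

# Registration would then be something like (registry API assumed):
# ToolRegistry().register(WordCountTool())
```

The name and description are what the Agent sees when deciding whether to call the tool, so keep them short and unambiguous.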
Configure MCP servers in finchbot config or directly in the config file. MCP tools are automatically loaded via langchain-mcp-adapters.
Create a SKILL.md file in ~/.finchbot/workspace/skills/{skill-name}/.
Add a new Provider class in providers/factory.py.
Add a new .toml file under i18n/locales/.
Use LangBot for multi-platform support. See the LangBot Documentation for details.
| Advantage | Description |
|---|---|
| Privacy First | Uses FastEmbed locally for vector generation, no cloud upload |
| True Persistence | Dual-layer memory storage with semantic retrieval and precise queries |
| Production Ready | Double-checked locking, auto-retry, timeout control mechanisms |
| Flexible Extension | Inherit FinchTool or create SKILL.md to extend without modifying core code |
| Model Agnostic | Supports OpenAI, Anthropic, Gemini, DeepSeek, Moonshot, Groq, etc. |
| Thread Safe | Tool registration uses double-checked locking pattern |
| Multi-Platform | Via LangBot: QQ, WeChat, Feishu, DingTalk, Discord, Telegram, Slack, etc. |
| MCP Support | Official langchain-mcp-adapters for stdio and HTTP transports |
| Document | Description |
|---|---|
| User Guide | CLI usage tutorial |
| API Reference | API reference |
| Configuration Guide | Configuration options |
| Extension Guide | Adding tools/skills |
| Architecture | System architecture |
| Deployment Guide | Deployment instructions |
| Development Guide | Development environment setup |
| Contributing Guide | Contribution guidelines |
Contributions are welcome! Please read the Contributing Guide for more information.
This project is licensed under the MIT License.
If this project is helpful to you, please give it a Star ⭐️
