# Pix

AI-powered dependency solver that thinks fast, explains clearly.

Pix is a CLI tool that uses Ollama (local AI) to solve dependency issues for package managers such as pip, npm, Maven, and Cargo.
## Features

- Fast Mode - Instant solutions based on common patterns (~0.2s)
- AI Mode - Detailed AI-powered solutions with web search (~60s)
- Multi-package - Works with pip, npm, Maven, and Cargo
- Offline-first - Runs entirely on your local machine with Ollama
## Installation

```bash
pip install git+https://github.com/12britz/pix.git
```

### Requirements

- Python 3.10 or higher
- Ollama installed and running
Install Ollama:

```bash
# macOS
brew install ollama

# Linux
curl -fsSL https://ollama.com/install.sh | sh

# Windows
# Download from https://ollama.ai/download
```

Start Ollama:

```bash
ollama serve
```

Pull a model (first time only):

```bash
ollama pull qwen3.5
```

### Install from source

```bash
# Clone the repository
git clone https://github.com/12britz/pix.git
cd pix

# Install with pip
pip install -e .
```
Or install with pipx (recommended for CLI tools):

```bash
pipx install -e .
```

### Install from GitHub

```bash
# Using pip
pip install git+https://github.com/12britz/pix.git

# Using pipx (recommended)
pipx install git+https://github.com/12britz/pix.git
```

### Verify the installation

```bash
pix check
```

Expected output:

```
Ollama is available at http://localhost:11434
Models (1): qwen3.5:latest
```
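For scripting, you can reproduce roughly what `pix check` reports without pix itself: Ollama exposes an HTTP API, and `GET /api/tags` lists the locally installed models. A minimal sketch in Python (the helper names here are illustrative, not pix internals):

```python
import json
from urllib.request import urlopen
from urllib.error import URLError

OLLAMA_URL = "http://localhost:11434"  # Ollama's default address


def format_models(payload: dict) -> str:
    """Render Ollama's /api/tags response like the `pix check` output above."""
    names = [m["name"] for m in payload.get("models", [])]
    return f"Models ({len(names)}): {', '.join(names)}"


def check_ollama(url: str = OLLAMA_URL) -> str:
    """Return a one-line status for a (possibly absent) Ollama server."""
    try:
        with urlopen(f"{url}/api/tags", timeout=3) as resp:
            payload = json.load(resp)
    except (URLError, OSError):
        return f"Ollama is NOT reachable at {url}"
    return f"Ollama is available at {url}\n" + format_models(payload)


if __name__ == "__main__":
    print(check_ollama())
```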
## Quick Start

```bash
# Check Ollama is working
pix check

# Get instant fix suggestions
pix solve -e "gcc failed exit status 1"

# Get detailed AI solution
pix solve -e "gcc failed" --ai

# Resolve dependencies from a file
pix resolve requirements.txt
```

## Commands

### pix check

Verify that Ollama is installed and show the available models.
```bash
pix check
# Output:
# Ollama is available at http://localhost:11434
# Models (2): qwen3.5:latest, llama3.2
```

### pix solve

Solve dependency issues with smart suggestions.
```bash
# Fast mode - instant fixes (~0.2s)
pix solve -e "gcc failed"

# AI mode - detailed solution (~60s)
pix solve -e "gcc failed" --ai

# With dependency file context
pix solve -e "version conflict" -f requirements.txt --ai

# Specify package manager
pix solve -e "peer dependency conflict" -m npm
pix solve -e "artifact not found" -m maven
```

### pix resolve

Resolve all dependencies from a package file.
```bash
pix resolve requirements.txt
```
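Whatever the solver does afterwards, resolution has to start by parsing the dependency file. pix's internals aren't shown in this README, but for a pip-style file the first step might look like this sketch (the function and regex are illustrative assumptions, not pix's actual code):

```python
import re

# Matches lines like "requests>=2.28", "numpy==1.26.4", or a bare "flask".
_REQ = re.compile(r"^([A-Za-z0-9][A-Za-z0-9._-]*)\s*([<>=!~].*)?$")


def parse_requirements(text: str) -> list[tuple[str, str]]:
    """Split requirements.txt content into (package, version_spec) pairs,
    skipping blank lines and comments."""
    pairs = []
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments
        if not line:
            continue
        match = _REQ.match(line)
        if match:
            pairs.append((match.group(1), (match.group(2) or "").strip()))
    return pairs
```

For example, `parse_requirements("requests>=2.28\n# dev\nflask\n")` yields `[("requests", ">=2.28"), ("flask", "")]`.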
### pix init

Create an empty dependency file.

```bash
pix init -m pip requirements.txt
pix init -m npm package.json
pix init -m maven pom.xml
pix init -m cargo Cargo.toml
```

## Example Errors

### pip

```bash
pix solve -e "error: Microsoft Visual C++ 14.0 is required"
pix solve -e "python.h: No such file or directory"
pix solve -e "Could not build wheels for scipy"
pix solve -e "SSL certificate verification failed"
pix solve -e "ReadTimeoutError: pip install"
```

### npm

```bash
pix solve -e "UNABLE_TO_VERIFY_LEAF_SIGNATURE"
pix solve -e "peer dependency conflict"
pix solve -e "EACCES permission denied"
pix solve -e "ERESOLVE could not resolve"
```

### Cargo

```bash
pix solve -e "Cargo.lock is out-of-date"
pix solve -e "failed to run custom build command"
pix solve -e "error: linking with cc failed"
```

### Maven

```bash
pix solve -e "Could not resolve dependencies"
pix solve -e "Missing artifact"
pix solve -e "Non-resolvable parent POM"
```

## How It Works

```
┌───────────────────────────────────────────────┐
│                   pix solve                   │
│                                               │
│  1. Parse error message                       │
│  2. Match against known patterns              │
│  3. Search web for solutions (optional)       │
│  4. Generate solution with local AI (Ollama)  │
│  5. Present colorful, actionable output       │
└───────────────────────────────────────────────┘
```
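Steps 1–2 are what make fast mode possible: there is no model call, just a lookup of the error text against a table of known patterns. A toy sketch of that matching step (this three-entry table is illustrative and the quick fixes echo the fast-mode example in this README; pix's real pattern set is presumably much larger):

```python
import re

# Illustrative pattern table: (error regex, quick fixes). Not pix's real data.
KNOWN_PATTERNS = [
    (re.compile(r"gcc.*failed", re.I),
     ["sudo apt install build-essential", "macOS: brew install gcc"]),
    (re.compile(r"peer dependency", re.I),
     ["npm install --legacy-peer-deps"]),
    (re.compile(r"Cargo\.lock.*out-of-date", re.I),
     ["cargo update"]),
]


def quick_fixes(error_message: str) -> list[str]:
    """Fast mode: return the quick fixes for every known pattern that
    matches, without calling the AI model at all."""
    fixes = []
    for pattern, suggestions in KNOWN_PATTERNS:
        if pattern.search(error_message):
            fixes.extend(suggestions)
    return fixes
```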
Fast mode:

```
$ pix solve -e "gcc failed"
Searching... [0.2s]

============================================================
ISSUE gcc failed
============================================================

QUICK FIXES

[!] Install build tools
    sudo apt install build-essential
    macOS: brew install gcc

[!] Use prebuilt wheels
    pip install --upgrade pip && pip install --only-binary :all: package

>> Run with --ai for detailed AI solution

Total: 0.2s
```
AI mode:

```
$ pix solve -e "gcc failed" --ai
Searching... [0.1s]
Getting detailed solution... [55.3s]

============================================================
DETAILED SOLUTION
============================================================

To fix the `pip` error where `gcc` fails...

### 1. Install Compiler Tools
...

Total: 55.4s
```
## Supported Package Managers

| Manager | Files | Flag |
|---------|-------|------|
| pip | requirements.txt, *.in | `-m pip` |
| npm | package.json | `-m npm` |
| Maven | pom.xml | `-m maven` |
| Cargo | Cargo.toml | `-m cargo` |
## Troubleshooting

Start the Ollama server if it is not running:

```bash
ollama serve
```

Pull a model if none is available:

```bash
ollama pull qwen3.5
ollama pull llama3.2
```

Point pix at a custom Ollama URL:

```bash
pix --ollama-url http://localhost:11434 check
```

## License

MIT
## Contributing

Contributions welcome! Feel free to submit issues and pull requests.