Pi Infinity

npm i -g @codex-infinity/pi-infinity

Pi Infinity is a coding agent that can run forever.

Run locally or on bare metal GPU hardware.


What makes Pi Infinity different?

Two flags turn Pi into a fully autonomous coding agent:

  • --auto-next-steps -- After each response, the agent automatically continues with the next logical steps (including testing)
  • --auto-next-idea -- The agent generates and implements new improvement ideas for your codebase

# Autonomous coding -- completes tasks then moves to the next one
pinf --auto-next-steps "fix all lint errors and add tests"

# Fully autonomous -- dreams up and implements improvements forever
pinf --auto-next-steps --auto-next-idea

Quickstart

npm install -g @codex-infinity/pi-infinity

Then run pinf to get started.

Authentication

Set your API key for any supported provider:

export OPENAI_API_KEY=sk-...
export ANTHROPIC_API_KEY=sk-ant-...
export GOOGLE_API_KEY=...
pinf "your prompt"

Features

  • Autonomous operation -- --auto-next-steps keeps it working without intervention
  • Idea generation -- --auto-next-idea brainstorms and implements improvements
  • AnyLLM -- OpenAI, Anthropic, Google, local models, bring your own provider
  • Local execution -- runs entirely on your machine
  • GPU cloud -- deploy on bare metal GPU hardware for long-running sessions
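Putting the pieces together, an end-to-end session might look like the sketch below. It uses only the install command, environment variables, and flags documented in this README; substitute your own provider key for the elided `sk-...` placeholder.

```shell
# Install the CLI globally
npm install -g @codex-infinity/pi-infinity

# Authenticate with any supported provider (OpenAI shown here)
export OPENAI_API_KEY=sk-...

# Start with a concrete task; --auto-next-steps keeps the agent
# working through follow-up steps (including testing) on its own
pinf --auto-next-steps "fix all lint errors and add tests"

# Fully autonomous: the agent also brainstorms and implements
# its own improvement ideas until you stop it
pinf --auto-next-steps --auto-next-idea
```

The same flow works with `ANTHROPIC_API_KEY` or `GOOGLE_API_KEY` in place of the OpenAI key.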

Share your OSS coding agent sessions

If you use pi or other coding agents for open source work, please share your sessions.

Public OSS session data helps improve coding agents with real-world tasks, tool use, failures, and fixes instead of toy benchmarks.

For the full explanation, see this post on X.

To publish sessions, use badlogic/pi-share-hf. Read its README.md for setup instructions. All you need is a Hugging Face account, the Hugging Face CLI, and pi-share-hf.

You can also watch this video, where I show how I publish my pi-mono sessions.

I regularly publish my own pi-mono work sessions.

Packages

Package                        Description
@codex-infinity/pi-infinity    Interactive coding agent CLI
@mariozechner/pi-ai            Unified multi-provider LLM API (OpenAI, Anthropic, Google, etc.)
@mariozechner/pi-agent-core    Agent runtime with tool calling and state management
@mariozechner/pi-mom           Slack bot that delegates messages to the pi coding agent
@mariozechner/pi-tui           Terminal UI library with differential rendering
@mariozechner/pi-web-ui        Web components for AI chat interfaces
@mariozechner/pi-pods          CLI for managing vLLM deployments on GPU pods

Development

npm install          # Install all dependencies
npm run build        # Build all packages
npm run check        # Lint, format, and type check
./test.sh            # Run tests (skips LLM-dependent tests without API keys)
./pi-test.sh         # Run pi from sources (must be run from repo root)

Contributing

See CONTRIBUTING.md for contribution guidelines and AGENTS.md for project-specific rules.

License

MIT

About

AI agent toolkit: coding agent CLI, unified LLM API, TUI & web UI libraries, Slack bot, vLLM pods. Website: https://codex-infinity.com
