FastAPI proxy for OpenAI- and Anthropic-compatible clients backed by the GigaChat API
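Proxies like the FastAPI/GigaChat one above all accept the same OpenAI-style /chat/completions request shape, which is what lets stock OpenAI clients talk to them unchanged. A minimal sketch of that payload (the base URL, port, and model name are illustrative assumptions, not taken from any specific project):

```python
import json

# Illustrative local proxy endpoint -- not from any specific project above.
BASE_URL = "http://localhost:8000/v1/chat/completions"

def build_chat_request(model: str, messages: list) -> dict:
    """Build a standard OpenAI-style /chat/completions payload."""
    return {"model": model, "messages": messages, "stream": False}

payload = build_chat_request(
    "GigaChat-Pro",  # model name is an assumption; the proxy maps it to a backend
    [{"role": "user", "content": "Hello"}],
)
print(json.dumps(payload, indent=2))
```

Any OpenAI SDK pointed at such a proxy via its base-URL setting emits exactly this shape.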
A unified AI proxy server for free access to multiple LLM providers through the Puter.js SDK, with no paid API keys required.
Free AI Gateway — use Claude (Kiro) and GPT-4.1 (Copilot) as an OpenAI-compatible API. No paid API keys. Works with Cursor, Cline, Codex CLI, Aider, Claude Code, and any OpenAI/Anthropic tool. Single binary.
Job-hunting agent: connects to LLM APIs to automatically search for jobs, submit resumes, reply to messages, and schedule interviews from a mobile client. Supports MiniMax API keys, OpenAI, Anthropic, and others. Uses Boss直聘 to automatically find HR contacts matching your resume, with no manual operation required.
Central HTTP LLM proxy that routes inference requests to authenticated remote workers over WebSocket — queueing, streaming, and cancellation included.
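The queueing-and-cancellation behavior the entry above advertises can be sketched with an asyncio queue: requests enter the queue as futures, a worker drains them, and a request cancelled while still queued is skipped rather than dispatched. This is a minimal illustration of the pattern under stated assumptions, not the project's actual code:

```python
import asyncio

async def worker(queue: asyncio.Queue, results: list) -> None:
    """Drain queued (name, future) jobs; skip jobs cancelled while waiting."""
    while True:
        name, fut = await queue.get()
        if fut.cancelled():                 # client gave up before dispatch
            results.append((name, "cancelled"))
        else:
            fut.set_result(f"done:{name}")  # stand-in for real inference work
            results.append((name, "done"))
        queue.task_done()

async def main() -> list:
    queue: asyncio.Queue = asyncio.Queue()
    results: list = []
    loop = asyncio.get_running_loop()

    f1, f2 = loop.create_future(), loop.create_future()
    await queue.put(("a", f1))
    await queue.put(("b", f2))
    f2.cancel()                             # cancel "b" while it is still queued

    task = asyncio.create_task(worker(queue, results))
    await queue.join()                      # wait until both jobs are processed
    task.cancel()
    return results

results = asyncio.run(main())
print(results)  # [('a', 'done'), ('b', 'cancelled')]
```

A real proxy would additionally propagate the cancellation over the WebSocket to an in-flight worker; the queue-side skip shown here only covers jobs not yet dispatched.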
Patch-X local runnable CLI fork. main preserves the source snapshot; codex/runnable-local-cli maintains the runnable local build.
Run Claude Code through a local Anthropic-compatible gateway backed by Codex OAuth or the OpenAI Responses API.
Proxy server that turns the GitHub Copilot API into an OpenAI/Anthropic-compatible endpoint, with Responses translation, multi-account rotation, caching, rate limiting, and a WebUI.
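Gateways advertising Anthropic compatibility, like the Claude Code and Copilot proxies above, accept the Anthropic Messages API request shape at /v1/messages. A sketch of that payload (URL and model name are illustrative assumptions):

```python
import json

# Illustrative local gateway endpoint -- not from any specific project above.
BASE_URL = "http://localhost:8080/v1/messages"

payload = {
    "model": "claude-sonnet-4",  # the gateway maps this to its real backend
    "max_tokens": 1024,          # required field in the Anthropic Messages API
    "messages": [{"role": "user", "content": "Hello"}],
}
print(json.dumps(payload))
```

The required `max_tokens` field and the top-level `messages` list are the main differences a translation layer has to bridge when fronting an OpenAI-style backend.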