All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
- External Tool Bridge: Added proxy-level bridging for external OpenAI-compatible tools across `/v1/chat/completions` and `/v1/responses`.
- Streaming Tool Call Parity: Added streaming support for external tool calls in both the Chat Completions and Responses APIs.
- Explicit External Tool Config: Added an explicit `EXTERNAL_TOOLS_MODE=proxy-bridge` and `EXTERNAL_TOOLS_CONFLICT_POLICY=namespace` configuration surface and documentation.
- Project Version: Bumped the repository version to `1.5.0` across package metadata and documentation badges.
- Jest Test Shutdown: Removed a lingering queue-rescheduling timer from the proxy request lock flow and switched the default test command to the verified clean Jest invocation, eliminating the previous generic open-handle warning during `npm test`.
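The external tool configuration surface above can be sketched as an environment snippet. This is a hypothetical `.env`-style excerpt: the variable names and values come from this changelog, but the surrounding file layout is illustrative.

```shell
# Hypothetical environment configuration; names/values as documented above.
export EXTERNAL_TOOLS_MODE=proxy-bridge          # bridge external tool calls at the proxy level
export EXTERNAL_TOOLS_CONFLICT_POLICY=namespace  # namespace tool names on conflict
```

These could equally live in `config.json` or a container environment; the changelog does not specify which takes precedence.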
- OpenAI-compatible API: `/v1/models`, `/v1/chat/completions`, and `/v1/responses` endpoints
- Streaming Support: Full SSE streaming for the Chat Completions and Responses APIs
- Model Aliases: GPT-style model aliasing (e.g., `gpt5-nano` → `gpt-5-nano`)
- Docker Deployment: Complete Docker setup with healthcheck and volume management
- Configuration: Environment variables and `config.json` support
- Auto Cleanup: Configurable automatic conversation/session storage cleanup
- Default Security: `DISABLE_TOOLS` defaults to `true` for safer out-of-the-box behavior
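The model aliasing above can be sketched as a lookup table. This is a minimal, hypothetical illustration: only the `gpt5-nano` → `gpt-5-nano` mapping is documented in this changelog; the table structure and function name are assumptions, not the project's actual code.

```javascript
// Hypothetical alias table; only the first entry is documented above.
const MODEL_ALIASES = {
  "gpt5-nano": "gpt-5-nano",
};

// Resolve an incoming model name, passing unknown names through unchanged.
function resolveModel(name) {
  return MODEL_ALIASES[name] ?? name;
}

console.log(resolveModel("gpt5-nano")); // → "gpt-5-nano"
console.log(resolveModel("gpt-4o"));    // → "gpt-4o" (no alias, passes through)
```

Pass-through for unknown names keeps the proxy transparent: requests for canonical model names are forwarded as-is.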