An OpenAI API-compatible wrapper for Claude Code, allowing you to use Claude Code with any OpenAI client library. Now powered by the official Claude Code Python SDK with enhanced authentication and features.
🎉 Production Ready - All core features working and tested:
- ✅ Chat completions endpoint with official Claude Code Python SDK
- ✅ Streaming and non-streaming responses
- ✅ Full OpenAI SDK compatibility
- ✅ OpenAI Function Calling - Complete support for tools via OpenAI format!
- ✅ Multi-provider authentication (API key, Bedrock, Vertex AI, CLI auth)
- ✅ System prompt support via SDK options
- ✅ Model selection support with validation
- ✅ Fast by default - Tools disabled for OpenAI compatibility (5-10x faster)
- ✅ Optional tool usage (Read, Write, Bash, etc.) when explicitly enabled
- ✅ Real-time cost and token tracking from SDK
- ✅ Session continuity with conversation history across requests
- ✅ Session management endpoints for full session control
- ✅ Health, auth status, and models endpoints
- ✅ Development mode with auto-reload
- OpenAI-compatible `/v1/chat/completions` endpoint
- Support for both streaming and non-streaming responses
- Compatible with OpenAI Python SDK and all OpenAI client libraries
- Automatic model validation and selection
- OpenAI Function Calling support - Use Claude's tools via OpenAI's function calling format
- Official Claude Code Python SDK integration (v0.0.14)
- Real-time cost tracking - actual costs from SDK metadata
- Accurate token counting - input/output tokens from SDK
- Session management - proper session IDs and continuity
- Enhanced error handling with detailed authentication diagnostics
- Automatic detection of authentication method
- Claude CLI auth - works with an existing `claude auth` setup
- Direct API key - `ANTHROPIC_API_KEY` environment variable
- AWS Bedrock - enterprise authentication with AWS credentials
- Google Vertex AI - GCP authentication support
- System prompt support via SDK options
- Optional tool usage - Enable Claude Code tools (Read, Write, Bash, etc.) when needed
- Fast default mode - Tools disabled by default for OpenAI API compatibility
- Development mode with auto-reload (`uvicorn --reload`)
- Interactive API key protection - Optional security with auto-generated tokens
- Comprehensive logging and debugging capabilities
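Authentication detection works environment-first: provider-specific variables are checked before falling back to CLI credentials. A minimal sketch of that ordering (illustrative only; the wrapper's actual logic and the exact opt-in variable names such as `CLAUDE_CODE_USE_BEDROCK` are assumptions here):

```python
def detect_auth_method(env: dict) -> str:
    """Pick an authentication method from environment variables (sketch)."""
    if env.get("CLAUDE_CODE_USE_BEDROCK") == "1":
        return "bedrock"
    if env.get("CLAUDE_CODE_USE_VERTEX") == "1":
        return "vertex"
    if env.get("ANTHROPIC_API_KEY"):
        return "api_key"
    # Fall back to credentials stored by `claude auth login`
    return "claude_cli"

print(detect_auth_method({"ANTHROPIC_API_KEY": "sk-..."}))  # api_key
print(detect_auth_method({}))                               # claude_cli
```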
Get started in under 2 minutes:
# 1. Install Claude Code CLI (if not already installed)
npm install -g @anthropic-ai/claude-code
# 2. Authenticate (choose one method)
claude auth login # Recommended for development
# OR set: export ANTHROPIC_API_KEY=your-api-key
# 3. Clone and setup the wrapper
git clone https://github.com/RichardAtCT/claude-code-openai-wrapper
cd claude-code-openai-wrapper
poetry install
# 4. Start the server
poetry run uvicorn main:app --reload --port 8000
# 5. Test it works
poetry run python test_endpoints.py

🎉 That's it! Your OpenAI-compatible Claude Code API is running on http://localhost:8000
- Claude Code CLI: Install Claude Code CLI
  # Install Claude Code (follow Anthropic's official guide)
  npm install -g @anthropic-ai/claude-code
- Authentication: Choose one method:
  - Option A: Authenticate via CLI (Recommended for development)
    claude auth login
  - Option B: Set environment variable
    export ANTHROPIC_API_KEY=your-api-key
  - Option C: Use AWS Bedrock or Google Vertex AI (see Configuration section)
- Python 3.10+: Required for the server
- Poetry: For dependency management
  # Install Poetry (if not already installed)
  curl -sSL https://install.python-poetry.org | python3 -
- Clone the repository:
  git clone https://github.com/RichardAtCT/claude-code-openai-wrapper
  cd claude-code-openai-wrapper
- Install dependencies with Poetry:
poetry install
This will create a virtual environment and install all dependencies.
- Configure environment:
  cp .env.example .env
  # Edit .env with your preferences
Edit the .env file:
# Claude CLI path (usually just "claude")
CLAUDE_CLI_PATH=claude
# Optional API key for client authentication
# If not set, server will prompt for interactive API key protection on startup
# API_KEY=your-optional-api-key
# Server port
PORT=8000
# Timeout in milliseconds
MAX_TIMEOUT=600000
# CORS origins
CORS_ORIGINS=["*"]

The server supports interactive API key protection for secure remote access:
- No API key set: Server prompts "Enable API key protection? (y/N)" on startup
  - Choose No (default): Server runs without authentication
  - Choose Yes: Server generates and displays a secure API key
- Environment API key set: Uses the configured `API_KEY` without prompting
# Example: Interactive protection enabled
poetry run python main.py
# Output:
# ============================================================
# 🔐 API Endpoint Security Configuration
# ============================================================
# Would you like to protect your API endpoint with an API key?
# This adds a security layer when accessing your server remotely.
#
# Enable API key protection? (y/N): y
#
# 🔑 API Key Generated!
# ============================================================
# API Key: Xf8k2mN9-vLp3qR5_zA7bW1cE4dY6sT0uI
# ============================================================
# ⚠️ IMPORTANT: Save this key - you'll need it for API calls!
# Example usage:
# curl -H "Authorization: Bearer Xf8k2mN9-vLp3qR5_zA7bW1cE4dY6sT0uI" \
# http://localhost:8000/v1/models
# ============================================================

Perfect for:
- Local development - No authentication needed
- Remote access - Secure with generated tokens
- VPN/Tailscale - Add security layer for remote endpoints
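The generated key in the example output above has the shape of a URL-safe random token. A minimal sketch of how such a key could be produced (the wrapper's actual generator may differ):

```python
import secrets

def generate_api_key(nbytes: int = 24) -> str:
    """Return a URL-safe random token suitable for bearer auth."""
    return secrets.token_urlsafe(nbytes)

key = generate_api_key()
print(f"API Key: {key}")
# Clients then send: Authorization: Bearer <key>
```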
- Verify Claude Code is installed and working:
  claude --version
  claude --print --model claude-3-5-haiku-20241022 "Hello"  # Test with fastest model
- Start the server:
Development mode (recommended - auto-reloads on changes):
poetry run uvicorn main:app --reload --port 8000
Production mode:
poetry run python main.py
Port Options for production mode:
- Default: Uses port 8000 (or PORT from .env)
- If port is in use, automatically finds next available port
- Specify custom port: `poetry run python main.py 9000`
- Set in environment: `PORT=9000 poetry run python main.py`
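The automatic port fallback can be sketched as a scan for the first bindable port (illustrative only, not the server's actual implementation):

```python
import socket

def find_available_port(start: int = 8000, max_tries: int = 100) -> int:
    """Return `start` if free, otherwise the next bindable port."""
    for port in range(start, start + max_tries):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            try:
                s.bind(("127.0.0.1", port))
                return port  # bind succeeded, so the port is free
            except OSError:
                continue  # port in use, try the next one
    raise RuntimeError("no free port found")

print(find_available_port(8000))
```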
# Basic chat completion (no auth)
curl -X POST http://localhost:8000/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "claude-3-5-sonnet-20241022",
"messages": [
{"role": "user", "content": "What is 2 + 2?"}
]
}'
# With API key protection (when enabled)
curl -X POST http://localhost:8000/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer your-generated-api-key" \
-d '{
"model": "claude-3-5-sonnet-20241022",
"messages": [
{"role": "user", "content": "Write a Python hello world script"}
],
"stream": true
}'

from openai import OpenAI
# Configure client (automatically detects auth requirements)
client = OpenAI(
base_url="http://localhost:8000/v1",
api_key="your-api-key-if-required" # Only needed if protection enabled
)
# Alternative: Let examples auto-detect authentication
# The wrapper's example files automatically check server auth status
# Basic chat completion
response = client.chat.completions.create(
model="claude-3-5-sonnet-20241022",
messages=[
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "What files are in the current directory?"}
]
)
print(response.choices[0].message.content)
# Output: Fast response without tool usage (default behavior)
# Enable tools when you need them (e.g., to read files)
response = client.chat.completions.create(
model="claude-3-5-sonnet-20241022",
messages=[
{"role": "user", "content": "What files are in the current directory?"}
],
extra_body={"enable_tools": True} # Enable tools for file access
)
print(response.choices[0].message.content)
# Output: Claude will actually read your directory and list the files!
# Use OpenAI Function Calling format
tools = [{
"type": "function",
"function": {
"name": "list_directory",
"description": "List contents of a directory",
"parameters": {
"type": "object",
"properties": {
"path": {"type": "string", "description": "Directory path"}
}
}
}
}]
response = client.chat.completions.create(
model="claude-3-5-sonnet-20241022",
messages=[{"role": "user", "content": "List files in the current directory"}],
tools=tools,
tool_choice="auto"
)
# Check if Claude wants to use tools
if response.choices[0].message.tool_calls:
    print("Claude wants to call:", response.choices[0].message.tool_calls[0].function.name)
# Check real costs and tokens
print(f"Estimated cost: ${response.usage.total_tokens * 0.000003:.6f}")  # rough per-token estimate; actual costs come from SDK metadata
print(f"Tokens: {response.usage.total_tokens} ({response.usage.prompt_tokens} + {response.usage.completion_tokens})")
# Streaming
stream = client.chat.completions.create(
model="claude-3-5-sonnet-20241022",
messages=[
{"role": "user", "content": "Explain quantum computing"}
],
stream=True
)
for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")

- claude-sonnet-4-20250514 (Recommended)
- claude-opus-4-20250514
- claude-3-7-sonnet-20250219
- claude-3-5-sonnet-20241022
- claude-3-5-haiku-20241022
The model parameter is passed to Claude Code via the --model flag.
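On the server side, model validation amounts to rejecting unknown names before invoking Claude Code. A minimal sketch (the wrapper's real error shape may differ):

```python
# Supported model IDs, taken from the list above
SUPPORTED_MODELS = {
    "claude-sonnet-4-20250514",
    "claude-opus-4-20250514",
    "claude-3-7-sonnet-20250219",
    "claude-3-5-sonnet-20241022",
    "claude-3-5-haiku-20241022",
}

def validate_model(model: str) -> str:
    """Return the model name, or raise for unknown models."""
    if model not in SUPPORTED_MODELS:
        raise ValueError(f"The model `{model}` does not exist")
    return model

print(validate_model("claude-3-5-haiku-20241022"))
```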
The wrapper now supports OpenAI's function calling format, allowing you to use Claude's powerful tools (file operations, web search, command execution) through the standard OpenAI API.
- OpenAI Function Calling Format (Recommended for compatibility):
tools = [{
"type": "function",
"function": {
"name": "read_file",
"description": "Read the contents of a file",
"parameters": {
"type": "object",
"properties": {
"path": {"type": "string", "description": "File path"}
},
"required": ["path"]
}
}
}]
response = client.chat.completions.create(
model="claude-3-5-sonnet-20241022",
messages=[{"role": "user", "content": "Read the README.md file"}],
tools=tools,
tool_choice="auto" # or "none", or specific function
)- Enable All Claude Tools (Simple but Claude-specific):
response = client.chat.completions.create(
model="claude-3-5-sonnet-20241022",
messages=[{"role": "user", "content": "What's in this directory?"}],
extra_body={"enable_tools": True}
)

- Legacy Function Format (For older OpenAI clients):
functions = [{
"name": "get_weather",
"description": "Get weather for a location",
"parameters": {
"type": "object",
"properties": {
"location": {"type": "string"}
}
}
}]
response = client.chat.completions.create(
model="claude-3-5-sonnet-20241022",
messages=[{"role": "user", "content": "What's the weather?"}],
functions=functions,
function_call="auto"
)

- read_file - Read file contents
- write_file - Write content to files
- edit_file - Edit files by replacing text
- run_command - Execute bash commands
- list_directory - List directory contents
- search_files - Search for files by pattern
- search_in_files - Search within file contents
- web_search - Search the web
- fetch_url - Fetch content from URLs
When Claude uses a tool, you'll receive a response with tool_calls:
message = response.choices[0].message
if message.tool_calls:
    messages.append(message)  # Add assistant message with tool calls
    for tool_call in message.tool_calls:
        print(f"Tool: {tool_call.function.name}")
        print(f"Arguments: {tool_call.function.arguments}")
        # Execute the tool and append its result
        tool_result = execute_tool(tool_call)  # Your implementation
        messages.append({
            "role": "tool",
            "tool_call_id": tool_call.id,
            "content": json.dumps(tool_result)
        })
# Get final response
final_response = client.chat.completions.create(
model="claude-3-5-sonnet-20241022",
messages=messages
)

See examples/tools_example.py for complete examples of using tools with the OpenAI SDK.
The wrapper now supports session continuity, allowing you to maintain conversation context across multiple requests. This is a powerful feature that goes beyond the standard OpenAI API.
- Stateless Mode (default): Each request is independent, just like the standard OpenAI API
- Session Mode: Include a `session_id` to maintain conversation history across requests
import openai
client = openai.OpenAI(
base_url="http://localhost:8000/v1",
api_key="not-needed"
)
# Start a conversation with session continuity
response1 = client.chat.completions.create(
model="claude-3-5-sonnet-20241022",
messages=[
{"role": "user", "content": "Hello! My name is Alice and I'm learning Python."}
],
extra_body={"session_id": "my-learning-session"}
)
# Continue the conversation - Claude remembers the context
response2 = client.chat.completions.create(
model="claude-3-5-sonnet-20241022",
messages=[
{"role": "user", "content": "What's my name and what am I learning?"}
],
extra_body={"session_id": "my-learning-session"} # Same session ID
)
# Claude will remember: "Your name is Alice and you're learning Python."

# First message (add -H "Authorization: Bearer your-key" if auth enabled)
curl -X POST http://localhost:8000/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "claude-3-5-sonnet-20241022",
"messages": [{"role": "user", "content": "My favorite color is blue."}],
"session_id": "my-session"
}'
# Follow-up message - context is maintained
curl -X POST http://localhost:8000/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "claude-3-5-sonnet-20241022",
"messages": [{"role": "user", "content": "What is my favorite color?"}],
"session_id": "my-session"
}'

The wrapper provides endpoints to manage active sessions:
- `GET /v1/sessions` - List all active sessions
- `GET /v1/sessions/{session_id}` - Get session details
- `DELETE /v1/sessions/{session_id}` - Delete a session
- `GET /v1/sessions/stats` - Get session statistics
# List active sessions
curl http://localhost:8000/v1/sessions
# Get session details
curl http://localhost:8000/v1/sessions/my-session
# Delete a session
curl -X DELETE http://localhost:8000/v1/sessions/my-session

- Automatic Expiration: Sessions expire after 1 hour of inactivity
- Streaming Support: Session continuity works with both streaming and non-streaming requests
- Memory Persistence: Full conversation history is maintained within the session
- Efficient Storage: Only active sessions are kept in memory
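The expiration behavior above can be sketched as a small TTL-based store that drops sessions after an hour of inactivity (illustrative only; the wrapper's actual session manager may differ):

```python
import time

SESSION_TTL_SECONDS = 3600  # sessions expire after 1 hour of inactivity

class SessionStore:
    """Minimal sketch of TTL-based in-memory session storage."""

    def __init__(self):
        self._sessions = {}  # session_id -> (messages, last_active)

    def touch(self, session_id, message):
        """Append a message and refresh the session's activity timestamp."""
        messages, _ = self._sessions.get(session_id, ([], 0.0))
        messages.append(message)
        self._sessions[session_id] = (messages, time.time())

    def get(self, session_id):
        """Return the session history, or None if missing/expired."""
        entry = self._sessions.get(session_id)
        if entry is None:
            return None
        messages, last_active = entry
        if time.time() - last_active > SESSION_TTL_SECONDS:
            del self._sessions[session_id]  # expired: free the memory
            return None
        return messages

store = SessionStore()
store.touch("my-session", {"role": "user", "content": "hi"})
print(store.get("my-session"))
```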
See examples/session_continuity.py for comprehensive Python examples and examples/session_curl_example.sh for curl examples.
- `POST /v1/chat/completions` - OpenAI-compatible chat completions (supports `session_id` and `tools`)
- `GET /v1/models` - List available models
- `GET /v1/tools` - List available tools/functions
- `GET /v1/auth/status` - Check authentication status and configuration
- `GET /health` - Health check endpoint
- `GET /v1/sessions` - List all active sessions
- `GET /v1/sessions/{session_id}` - Get detailed session information
- `DELETE /v1/sessions/{session_id}` - Delete a specific session
- `GET /v1/sessions/stats` - Get session manager statistics
Access the interactive Swagger UI at: http://localhost:8000/docs (or http://192.168.1.11:8000/docs for Docker)
Create a chat completion (OpenAI-compatible)
Request Body:
{
"model": "claude-3-5-sonnet-20241022", // Required
"messages": [ // Required
{
"role": "system|user|assistant", // Required
"content": "string", // Required
"name": "string" // Optional
}
],
"temperature": 0.7, // Optional (0-2, default: 1.0)
"top_p": 1.0, // Optional (0-1, default: 1.0)
"n": 1, // Optional (must be 1)
"stream": false, // Optional (default: false)
"stop": ["string"], // Optional
"max_tokens": null, // Optional (not supported by Claude Code)
"presence_penalty": 0, // Optional (-2 to 2, default: 0)
"frequency_penalty": 0, // Optional (-2 to 2, default: 0)
"logit_bias": {}, // Optional (not supported)
"user": "string", // Optional
"session_id": "string", // Optional (for conversation continuity)
"enable_tools": false // Optional (enable Claude Code tools)
}

Response (non-streaming):
{
"id": "chatcmpl-123",
"object": "chat.completion",
"created": 1677652288,
"model": "claude-3-5-sonnet-20241022",
"choices": [{
"index": 0,
"message": {
"role": "assistant",
"content": "Hello! How can I help you?",
"name": null
},
"finish_reason": "stop"
}],
"usage": {
"prompt_tokens": 10,
"completion_tokens": 20,
"total_tokens": 30
},
"system_fingerprint": null
}

Response (streaming):
data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"claude-3-5-sonnet-20241022","choices":[{"index":0,"delta":{"content":"Hello"},"finish_reason":null}]}
data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"claude-3-5-sonnet-20241022","choices":[{"index":0,"delta":{"content":" there!"},"finish_reason":null}]}
data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"claude-3-5-sonnet-20241022","choices":[{"index":0,"delta":{},"finish_reason":"stop"}]}
data: [DONE]
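A client consuming this raw stream can reconstruct the text by parsing each `data:` line. A minimal sketch (in practice, prefer the OpenAI SDK's built-in streaming support):

```python
import json

def parse_sse_chunks(lines):
    """Yield content deltas from OpenAI-style SSE `data:` lines."""
    for line in lines:
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

raw = [
    'data: {"choices":[{"index":0,"delta":{"content":"Hello"}}]}',
    'data: {"choices":[{"index":0,"delta":{"content":" there!"}}]}',
    'data: {"choices":[{"index":0,"delta":{},"finish_reason":"stop"}]}',
    "data: [DONE]",
]
print("".join(parse_sse_chunks(raw)))  # Hello there!
```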
List available models
Response:
{
"object": "list",
"data": [
{
"id": "claude-sonnet-4-20250514",
"object": "model",
"owned_by": "anthropic"
},
{
"id": "claude-opus-4-20250514",
"object": "model",
"owned_by": "anthropic"
},
{
"id": "claude-3-5-sonnet-20241022",
"object": "model",
"owned_by": "anthropic"
}
]
}

Health check endpoint
Response:
{
"status": "healthy",
"service": "claude-code-openai-wrapper"
}

Check authentication status
Response:
{
"claude_code_auth": {
"method": "browser|api_key|bedrock|vertex|claude_cli",
"status": {
"valid": true,
"errors": [],
"config": {
"method": "Browser authentication",
"note": "Authentication completed via browser"
}
},
"environment_variables": ["DOCKER_CONTAINER"]
},
"server_info": {
"api_key_required": false,
"api_key_source": "none",
"version": "1.0.0"
}
}

List all active sessions
Response:
{
"sessions": [
{
"session_id": "my-session-123",
"created_at": "2024-03-20T10:30:00Z",
"last_active": "2024-03-20T11:45:00Z",
"message_count": 5,
"expires_at": "2024-03-20T12:45:00Z"
}
],
"count": 1
}

Get session details
Response:
{
"session_id": "my-session-123",
"conversation": {
"messages": [
{
"role": "user",
"content": "Hello"
},
{
"role": "assistant",
"content": "Hi! How can I help?"
}
]
},
"metadata": {
"created_at": "2024-03-20T10:30:00Z",
"last_active": "2024-03-20T11:45:00Z",
"message_count": 2,
"expires_at": "2024-03-20T12:45:00Z"
}
}

Delete a session
Response:
{
"message": "Session deleted successfully",
"session_id": "my-session-123"
}

Get session statistics
Response:
{
"active_sessions": 3,
"total_messages": 45,
"memory_usage_mb": 2.5,
"oldest_session": "2024-03-20T09:00:00Z",
"newest_session": "2024-03-20T11:30:00Z"
}

If API key protection is enabled (via the API_KEY environment variable), include the API key in the Authorization header:
curl -H "Authorization: Bearer your-api-key" http://localhost:8000/v1/models

- claude-sonnet-4-20250514 - Latest and most capable
- claude-opus-4-20250514 - Most intelligent model
- claude-3-7-sonnet-20250219 - Extended context window
- claude-3-5-sonnet-20241022 - Fast and capable
- claude-3-5-haiku-20241022 - Fastest response times
Add "enable_tools": true to use Claude Code's file operations:
{
"model": "claude-3-5-sonnet-20241022",
"messages": [{"role": "user", "content": "List files in current directory"}],
"enable_tools": true
}

Add "session_id": "your-session-id" to maintain conversation context:
{
"model": "claude-3-5-sonnet-20241022",
"messages": [{"role": "user", "content": "Continue our discussion"}],
"session_id": "my-conversation-123"
}

- Images in messages are converted to text placeholders
- OpenAI parameters not yet mapped: `temperature`, `top_p`, `max_tokens`, `logit_bias`, `presence_penalty`, `frequency_penalty`
- Multiple responses (`n > 1`) not supported
- Tool configuration - allowed/disallowed tools endpoints
- OpenAI parameter mapping - temperature, top_p, max_tokens support
- Enhanced streaming - better chunk handling
- MCP integration - Model Context Protocol server support
- ✅ Function Calling: Full OpenAI function calling support with all Claude tools!
- ✅ SDK Integration: Official Python SDK replaces subprocess calls
- ✅ Real Metadata: Accurate costs and token counts from SDK
- ✅ Multi-auth: Support for CLI, API key, Bedrock, and Vertex AI authentication
- ✅ Session IDs: Proper session tracking and management
- ✅ System Prompts: Full support via SDK options
- ✅ Session Continuity: Conversation history across requests with session management
- Claude CLI not found:
  # Check Claude is in PATH
  which claude
  # Update CLAUDE_CLI_PATH in .env if needed
- Authentication errors:
  # Test authentication with fastest model
  claude --print --model claude-3-5-haiku-20241022 "Hello"
  # If this fails, re-authenticate
- Timeout errors:
  - Increase `MAX_TIMEOUT` in `.env`
  - Note: Claude Code can take time for complex requests
Test all endpoints with a simple script:
# Make sure server is running first
poetry run python test_endpoints.py

Run the comprehensive test suite:
# Make sure server is running first
poetry run python test_basic.py
# With API key protection enabled, set TEST_API_KEY:
TEST_API_KEY=your-generated-key poetry run python test_basic.py

The test suite automatically detects whether API key protection is enabled and provides helpful guidance for providing the necessary authentication.
Check authentication status:
curl http://localhost:8000/v1/auth/status | python -m json.tool

# Install development dependencies
poetry install --with dev
# Format code
poetry run black .
# Run full tests (when implemented)
poetry run pytest tests/

All tests should show:
- 4/4 endpoint tests passing
- 4/4 basic tests passing
- Authentication method detected (claude_cli, anthropic, bedrock, or vertex)
- Real cost tracking (e.g., $0.001-0.005 per test call)
- Accurate token counts from SDK metadata
The wrapper includes full Docker support with Ubuntu GUI for browser-based Claude authentication:
# Clone the repository
git clone https://github.com/jorge123255/claude-code-openai-wrapper
cd claude-code-openai-wrapper
# IMPORTANT: Set up security first
cp .env.example .env
# Edit .env to set a secure VNC_PASSWORD (required!)
# Build and run with docker-compose
docker-compose up -d
# Access the services:
# - API: http://localhost:8000
# - Desktop GUI: http://localhost:6080 (requires VNC password)

Node.js Version: The container requires Node.js 20+ for Claude CLI. The Dockerfile automatically installs the correct version.
First Run Authentication:
- Access the desktop at http://localhost:6080 (enter your VNC password from .env)
- The authentication handler will automatically open Firefox if needed
- Complete the Claude login in the browser
- The API server starts automatically once authenticated
Security Notes:
- VNC password is required - no default password for security
- Set `API_KEY` in .env to protect API endpoints
- The container runs without API key prompts in Docker mode
- Ubuntu Desktop Environment: XFCE desktop accessible via web browser
- Automatic Authentication: Browser opens automatically when auth is needed
- Persistent Storage: Authentication tokens survive container restarts
- Multiple Auth Methods: Browser, API key, Bedrock, or Vertex AI
- noVNC Web Access: No VNC client needed - access desktop through browser
- Access the Desktop: Navigate to http://localhost:6080
- Complete Authentication:
- Firefox will open automatically if authentication is needed
- Log in to Claude when prompted
- Authentication is saved persistently
- Use the API: Once authenticated, the API is available at http://localhost:8000
# SECURITY: VNC password is REQUIRED - generate a secure one:
# openssl rand -base64 12
VNC_PASSWORD=your-secure-password-here
# Display settings
RESOLUTION=1920x1080x24 # Desktop resolution
# Authentication method
AUTH_METHOD=browser # Options: browser, api_key, bedrock, vertex
ANTHROPIC_API_KEY= # For api_key method
# API protection (recommended for remote access)
API_KEY=  # Protect wrapper endpoints

The container uses these persistent volumes:
- `/config/claude`: Authentication data
- `/config/api`: API configuration
- `/data`: User projects/data
- `/var/log/supervisor`: Logs
# Build the image
docker build -f docker/Dockerfile -t claude-code-wrapper .
# Run with custom settings
docker run -d \
-p 8000:8000 \
-p 6080:6080 \
-e VNC_PASSWORD=mysecurepassword \
-v claude_auth:/config/claude \
claude-code-wrapper

For Unraid users, a Community App template is included:
- Add Template Repository (if not using Community Apps):
  https://github.com/jorge123255/claude-code-openai-wrapper/tree/main/docker/unraid
- Network Configuration:
- Use bridge network with custom IP (e.g., 192.168.1.11)
- Or use the `br0` network type for direct LAN access
- Install from Community Apps:
- Search for "Claude Code Wrapper"
- Configure paths and passwords
- Start the container
- Default Paths:
  - Config: /mnt/user/appdata/claude-code-wrapper/
  - Ports: 8000 (API), 6080 (GUI)
- VNC Password (REQUIRED):
  - Generate a strong password: `openssl rand -base64 12`
  - Never use default passwords
  - Container will refuse to start without VNC_PASSWORD set
- API Protection:
  - Set `API_KEY` in .env for remote access
  - Use an HTTPS reverse proxy for production
  - Consider IP whitelisting for sensitive deployments
- Network Security:
- Use Docker networks to isolate containers
- Limit port exposure to localhost only if possible
- Consider VPN access for remote management
- Authentication:
- Browser auth tokens are stored in persistent volumes
- Use API key method for headless deployments
- Rotate credentials regularly
- Authentication Issues:
  # Check auth handler logs
  docker-compose logs -f claude-code-wrapper | grep auth-handler
  # Manually trigger auth check
  docker exec claude-code-wrapper touch /tmp/need_auth
- Desktop Not Loading:
  # Check VNC password was set correctly
  docker-compose exec claude-code-wrapper cat /root/.vnc/passwd
  # Restart desktop services
  docker-compose exec claude-code-wrapper supervisorctl restart desktop:*
- Container Health:
  # Check container status
  docker-compose ps
  # View all logs
  docker-compose logs -f
MIT License
Contributions are welcome! Please open an issue or submit a pull request.