feat: add MiniMax as first-class LLM provider #188

Open

octo-patch wants to merge 1 commit into Intelligent-Internet:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax as a first-class LLM provider, enabling ii-agent to use MiniMax models (MiniMax-M2.7 and MiniMax-M2.7-highspeed) via MiniMax's OpenAI-compatible API.

Changes

  • src/ii_agent/core/config/llm_config.py: Add MINIMAX = "minimax" to APITypes enum
  • src/ii_agent/utils/constants.py: Add MiniMax API base URL, default model, and is_minimax_family() helper
  • src/ii_agent/llm/minimax.py (new): MiniMaxDirectClient extending OpenAIDirectClient with:
    • Default API base URL (https://api.minimax.io/v1)
    • Temperature clamping to (0.01, 1.0] range (MiniMax rejects 0.0)
    • Automatic stripping of <think>...</think> tags from model responses
  • src/ii_agent/llm/__init__.py: Register MiniMax in get_client() factory routing
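The two MiniMax-specific behaviors listed above (temperature clamping and think-tag stripping) could be sketched roughly as follows. This is an illustrative sketch, not the PR's actual code; the function names and the regex-based implementation are assumptions.

```python
import re

# Assumed pattern for the reasoning blocks MiniMax models emit.
THINK_TAG_RE = re.compile(r"<think>.*?</think>", re.DOTALL)

def clamp_temperature(temperature: float) -> float:
    """Clamp temperature into (0.01, 1.0]; MiniMax rejects 0.0."""
    return min(max(temperature, 0.01), 1.0)

def strip_think_tags(text: str) -> str:
    """Remove <think>...</think> reasoning blocks from a model response."""
    return THINK_TAG_RE.sub("", text).strip()
```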

Tests

  • 29 unit tests covering temperature clamping, think-tag stripping, client initialization, factory routing, and constants
  • 3 integration tests (require MINIMAX_API_KEY) covering text generation, tool calling, and highspeed model
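Among the constants covered by the unit tests is the `is_minimax_family()` helper. A minimal sketch of what such a helper might check (the case-insensitive prefix match is an assumed implementation, not necessarily the PR's):

```python
def is_minimax_family(model: str) -> bool:
    """Return True for MiniMax model names, e.g. MiniMax-M2.7 or
    MiniMax-M2.7-highspeed (prefix check is an assumption)."""
    return model.lower().startswith("minimax")
```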

Usage

```python
# SecretStr comes from pydantic, which llm_config builds on;
# get_client is the factory registered in src/ii_agent/llm/__init__.py.
from pydantic import SecretStr

from ii_agent.core.config.llm_config import APITypes, LLMConfig
from ii_agent.llm import get_client

config = LLMConfig(
    model="MiniMax-M2.7",
    api_key=SecretStr("your-minimax-api-key"),
    api_type=APITypes.MINIMAX,
)
client = get_client(config)
```

Test plan

  • All 29 unit tests pass
  • All 3 integration tests pass against live MiniMax API
  • Verify no regressions in existing provider tests

