Switch from Bedrock to Anthropic endpoint as default. Include support for gpt-5.5 #118
Co-authored-by: OpenAI Codex <codex@openai.com>

Summary
When running `ml-intern` with the default config, the CLI was using `bedrock/us.anthropic.claude-opus-4-6-v1`. On machines without permission to invoke that Bedrock inference profile, even a trivial prompt failed with a `litellm.APIConnectionError` wrapping a Bedrock authorization error for `bedrock:InvokeModelWithResponseStream`.

This PR switches the default path away from Bedrock, adds direct OpenAI GPT-5 model support to the suggested model list and effort validation, and fixes a probe edge case that showed up while testing GPT-5.5.
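The effort-validation change described above can be sketched as follows. This is a minimal illustration, not the PR's actual code: the function name, the model set, and the accepted effort values are all hypothetical stand-ins for whatever the CLI really uses.

```python
# Hypothetical sketch: validate a reasoning-effort string and translate
# "max" into "xhigh" for GPT-5.x models, which accept the latter spelling.
GPT5_MODELS = {"openai/gpt-5.4", "openai/gpt-5.5"}          # assumed model IDs
VALID_EFFORTS = {"low", "medium", "high", "xhigh", "max"}    # assumed effort set

def resolve_reasoning_effort(model: str, effort: str) -> str:
    """Reject unknown efforts; map 'max' to 'xhigh' for GPT-5.x models."""
    if effort not in VALID_EFFORTS:
        raise ValueError(f"unknown reasoning effort: {effort!r}")
    if model in GPT5_MODELS and effort == "max":
        return "xhigh"  # GPT-5.4 / GPT-5.5 take 'xhigh' rather than 'max'
    return effort
```

A caller would run this once, before building the litellm request, so an invalid effort fails fast instead of surfacing as a provider-side API error.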
What changed

- Default model is now the direct Anthropic endpoint (`anthropic/claude-opus-4-6`)
- `/model` suggestions updated to surface direct Anthropic and OpenAI options, including `openai/gpt-5.4` and `openai/gpt-5.5`
- `xhigh` is accepted as a reasoning effort for GPT-5.4 / GPT-5.5
- Documented `OPENAI_API_KEY` usage in the README

Testing
Live smoke tests were run against:
- `anthropic/claude-opus-4-6`
- `anthropic/claude-opus-4-7`
- `openai/gpt-5.4`
- `openai/gpt-5.5`

Results:
- The Anthropic models accepted `max` reasoning effort and returned a successful response in streaming and non-streaming modes
- The GPT models mapped `max` to `xhigh` and returned a successful response in streaming and non-streaming modes
- Unit tests pass: `uv run pytest tests/unit/test_llm_params.py`

cc @jagwar for viz
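The default-endpoint switch at the heart of this PR can be sketched as a small resolution helper. Everything here is illustrative: the `ML_INTERN_MODEL` environment variable and the function name are assumptions, not names taken from the repository; only the model strings come from the PR itself.

```python
import os

# Hypothetical sketch of the new default: prefer the direct Anthropic
# endpoint unless the user explicitly picks a model.
DEFAULT_MODEL = "anthropic/claude-opus-4-6"  # was bedrock/us.anthropic.claude-opus-4-6-v1

def resolve_model(cli_arg=None):
    """Pick the model: CLI flag first, then an env override, then the default."""
    return cli_arg or os.environ.get("ML_INTERN_MODEL") or DEFAULT_MODEL
```

With this shape, users who still want the Bedrock inference profile can opt back in explicitly, while fresh installs no longer require Bedrock invoke permissions.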