This repository was archived by the owner on Feb 1, 2026. It is now read-only.

Implement Claude provider, LLM repair, and improved streaming parity#3

Open
improdead wants to merge 1 commit into copperlaneai:main from improdead:feature/claude-provider-llm-repair

Conversation


@improdead improdead commented Jan 13, 2026

This PR completes the implementation of:

  • Claude/Anthropic Provider: New provider with PromptPayload coercion (including system aggregation), ToolSpec→Claude tools mapping, tool_use parsing into ToolCall, and streaming event handling.
  • OpenAI Parity: Tightened tool_choice normalization and streaming tool-call delta parity.
  • LLM Repair/Validation: Core logic in src/coevolved/core/llm_repair.py including LLMRepairPolicy, RepairContext, and validated_llm_call.
  • Prebuilt Retry Helpers: llm_retry and agent_retry in src/coevolved/prebuilt/llm_retry.py.
  • Tests: Unit tests for Claude provider, LLM repair, and retry logic.
  • Documentation: New docs for LLM validation and retry.
  • Exports: Wired exports at the core and prebuilt levels.
  • Extras: Added the anthropic optional dependency to pyproject.toml.
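The repair/validation flow can be sketched roughly as below. This is a minimal illustration only: the names `LLMRepairPolicy` and `validated_llm_call` mirror the PR description, but the actual signatures and fields live in src/coevolved/core/llm_repair.py and may differ; the `call`/`validate`/`repair` callables here are hypothetical stand-ins.

```python
from dataclasses import dataclass
from typing import Callable, Optional

# Hypothetical sketch of the repair loop; real definitions are in
# src/coevolved/core/llm_repair.py and may have different fields.

@dataclass
class LLMRepairPolicy:
    max_attempts: int = 3  # total tries, including the first call

def validated_llm_call(
    call: Callable[[str], str],
    validate: Callable[[str], Optional[str]],  # returns an error message, or None if valid
    repair: Callable[[str, str], str],         # builds feedback from (output, error)
    policy: LLMRepairPolicy = LLMRepairPolicy(),
) -> str:
    """Call the model, validate its output, and re-prompt with the error on failure."""
    feedback = ""
    for _ in range(policy.max_attempts):
        output = call(feedback)
        error = validate(output)
        if error is None:
            return output
        # Feed the validation error back so the model can repair its answer.
        feedback = repair(output, error)
    raise ValueError("output failed validation after all repair attempts")
```

In this shape, `validate` doubles as both a gate and the source of the repair prompt, which is the usual pattern for LLM self-repair loops.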

If you like my work, please contact me at dekai.me.

- Claude/Anthropic provider with prompt/tool/stream handling
- OpenAI tool_choice normalization and streaming tool-call delta parity
- LLMRepairPolicy, RepairContext, and validated_llm_call core logic
- llm_retry and agent_retry high-level prebuilt helpers
- Documentation and unit tests
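The ToolSpec→Claude tools mapping above can be sketched as follows. The `ToolSpec` fields shown are assumptions about the project's internal type (the real dataclass lives in coevolved); the output shape, however, follows Anthropic's Messages API, which expects a flat `{"name", "description", "input_schema"}` dict rather than OpenAI's nested `{"type": "function", "function": {...}}` wrapper.

```python
from dataclasses import dataclass

# Assumed ToolSpec shape for illustration; the real definition is in coevolved.
@dataclass
class ToolSpec:
    name: str
    description: str
    parameters: dict  # JSON Schema for the tool's arguments

def to_claude_tool(spec: ToolSpec) -> dict:
    # Claude's tools array uses "input_schema" for the argument schema,
    # so the mapping is mostly a field rename plus unwrapping.
    return {
        "name": spec.name,
        "description": spec.description,
        "input_schema": spec.parameters,
    }
```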
