fix(anthropic): pass through top-level cache_control for automatic caching #1593

Open

bhuvansingla wants to merge 2 commits into Portkey-AI:main from bhuvansingla:fix/anthropic-automatic-cache-control

Conversation

@bhuvansingla

Summary

  • Anthropic supports automatic prompt caching via a top-level cache_control field on the request body, which automatically applies the cache breakpoint to the last cacheable block
  • This field was not present in AnthropicChatCompleteConfig, so it was silently dropped during OpenAI→Anthropic request transformation
  • Added cache_control as a passthrough parameter in the config
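The mechanics of the fix can be sketched as follows. This is an illustrative model of the gateway's `{ param, required }` provider-config pattern, not the actual Portkey source; `transformRequest` and the trimmed-down `AnthropicChatCompleteConfig` literal here are hypothetical, written only to show why a missing config entry silently drops a field:

```typescript
// Param-config shape modeled on the gateway's provider configs
// (an assumption for illustration, not the actual Portkey source).
type ParamConfig = { param: string; required?: boolean };
type ProviderConfig = Record<string, ParamConfig>;

const AnthropicChatCompleteConfig: ProviderConfig = {
  thinking: { param: 'thinking', required: false },
  // The fix: a passthrough entry so the top-level cache_control
  // field survives the OpenAI-to-Anthropic transformation.
  cache_control: { param: 'cache_control', required: false },
};

// Hypothetical transform that copies only the keys the config knows
// about, which is why an absent entry silently drops the field.
function transformRequest(
  config: ProviderConfig,
  body: Record<string, unknown>
): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, { param }] of Object.entries(config)) {
    if (key in body) out[param] = body[key];
  }
  return out;
}

const transformed = transformRequest(AnthropicChatCompleteConfig, {
  cache_control: { type: 'ephemeral' },
  thinking: { type: 'enabled', budget_tokens: 1024 },
});
console.log(transformed);
```

Without the `cache_control` entry in the config, the loop above never copies that key, which matches the silent-drop behavior described in the summary.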

  param: 'thinking',
  required: false,
},
cache_control: {
  param: 'cache_control',
  required: false,
},

hey, can you also add this to src/providers/anthropic-base/messages.ts?


bhuvansingla and others added 2 commits on April 15, 2026 at 20:45:

1. …ching

   Anthropic supports automatic prompt caching via a top-level cache_control
   field on the request body. This was not being forwarded to the provider.

   Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

2. …ic caching

   Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
bhuvansingla force-pushed the fix/anthropic-automatic-cache-control branch from 0c911aa to 24086e7 on April 15, 2026 at 15:15.