feat: add stream_include_usage option for OpenAI-compatible providers#94

Open
hitsmaxft wants to merge 2 commits into SaladDay:main from hitsmaxft:feature/stream_include_usage

Conversation

@hitsmaxft hitsmaxft commented Apr 8, 2026

  • Inject stream_options.include_usage=true into streaming requests when stream_include_usage is enabled in the provider settings config (on by default for openai_chat providers)
  • Defer message_delta emission until the [DONE] event so the trailing usage chunk (choices:[], usage:{...}) is captured before the event is built
  • Cache the trailing usage chunk so message_delta can surface real input/output token counts instead of null
  • Update README and README_ZH with configuration instructions
    Fixes [feature request] OpenAI-protocol providers need an option to enable streaming token usage in responses #93
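The injection step above can be sketched in Rust with simplified request types; `ChatRequest`, `StreamOptions`, and `inject_stream_usage` are illustrative names, not the project's actual types:

```rust
// Minimal sketch: when the request streams and the provider enables
// stream_include_usage, set stream_options.include_usage = true on the
// outgoing body. serde attributes and the real request model are omitted.

#[derive(Debug, Default, PartialEq)]
struct StreamOptions {
    include_usage: bool,
}

#[derive(Debug, Default, PartialEq)]
struct ChatRequest {
    stream: bool,
    stream_options: Option<StreamOptions>,
}

fn inject_stream_usage(req: &mut ChatRequest, provider_wants_usage: bool) {
    if req.stream && provider_wants_usage {
        // Ask the OpenAI-compatible endpoint for the trailing usage chunk.
        req.stream_options = Some(StreamOptions { include_usage: true });
    }
}

fn main() {
    let mut req = ChatRequest { stream: true, ..Default::default() };
    inject_stream_usage(&mut req, true);
    assert_eq!(
        req.stream_options,
        Some(StreamOptions { include_usage: true })
    );
}
```

Non-streaming requests are left untouched, since stream_options is only meaningful when stream=true.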

The OpenAI endpoint's streaming output is ordered as:

```
data: stop_reason=stop

data: token usage

[DONE]
```

The conversion therefore buffers the stop-reason event and, once the completion event arrives, emits it together with the token usage.
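The buffering described above might look like the following minimal Rust sketch; `DeltaBuffer`, `Usage`, and the handler names are illustrative, not the project's actual types:

```rust
// Sketch: hold the stop-reason delta instead of emitting it immediately,
// cache the trailing usage chunk when it arrives, and only build the
// combined message_delta once [DONE] is seen.

#[derive(Debug, Default, Clone, PartialEq)]
struct Usage {
    input_tokens: u64,
    output_tokens: u64,
}

#[derive(Default)]
struct DeltaBuffer {
    stop_reason: Option<String>,
    usage: Option<Usage>,
}

impl DeltaBuffer {
    /// Record the stop reason; emission is deferred.
    fn on_stop_reason(&mut self, reason: &str) {
        self.stop_reason = Some(reason.to_string());
    }

    /// Cache the trailing usage chunk (choices: [], usage: {...}).
    fn on_usage(&mut self, usage: Usage) {
        self.usage = Some(usage);
    }

    /// On [DONE], build the deferred message_delta with real token counts.
    /// Falls back to zeroed usage if the endpoint never sent the chunk.
    fn on_done(&mut self) -> Option<(String, Usage)> {
        let reason = self.stop_reason.take()?;
        Some((reason, self.usage.take().unwrap_or_default()))
    }
}

fn main() {
    let mut buf = DeltaBuffer::default();
    buf.on_stop_reason("stop");                                  // data: stop_reason=stop
    buf.on_usage(Usage { input_tokens: 12, output_tokens: 34 }); // data: token usage
    let (reason, usage) = buf.on_done().expect("deferred delta"); // [DONE]
    assert_eq!(reason, "stop");
    println!("message_delta: stop_reason={reason}, usage={usage:?}");
}
```

The key design point is that on_done is the only place a message_delta is constructed, so the usage chunk can never arrive too late to be included.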


hitsmaxft commented Apr 8, 2026

Tested against the dashscope API.

Personally, I think this option could be enabled by default for all openai_chat providers; openai/dashscope/azure_openai all support it.

…de_usage()

- Add Provider::stream_include_usage() method that reads explicit config
  and falls back to api_format-based defaults (true for openai-compatible)
- Replace all scattered reads in proxy provider_router, claude adapter,
  and stream_check request builders with the centralized method
- OpenAI chat / responses compatible providers default to true;
  anthropic native defaults to false
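Based on the bullets above, the centralized accessor might look roughly like this Rust sketch (field and variant names are assumptions, not the project's actual definitions):

```rust
// Sketch: an explicit per-provider stream_include_usage setting wins;
// otherwise the default follows the provider's api_format.

#[derive(Clone, Copy)]
enum ApiFormat {
    OpenAiChat,
    OpenAiResponses,
    AnthropicNative,
}

struct Provider {
    api_format: ApiFormat,
    /// Explicit override from the provider settings config, if any.
    stream_include_usage: Option<bool>,
}

impl Provider {
    /// Single source of truth replacing the scattered reads in the
    /// router, adapter, and request builders.
    fn stream_include_usage(&self) -> bool {
        self.stream_include_usage.unwrap_or(match self.api_format {
            // OpenAI-compatible formats support stream_options.include_usage.
            ApiFormat::OpenAiChat | ApiFormat::OpenAiResponses => true,
            // Anthropic's native stream reports usage on its own.
            ApiFormat::AnthropicNative => false,
        })
    }
}

fn main() {
    let p = Provider {
        api_format: ApiFormat::OpenAiChat,
        stream_include_usage: None,
    };
    assert!(p.stream_include_usage()); // openai_chat defaults to true
}
```

Centralizing the fallback means changing the default for a format is a one-line edit instead of a hunt through every call site.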
