** Please make sure you read the contribution guide and file the issues in the right place. **
Contribution guide.
Describe the bug
When using streaming mode, the Gemini model does not return the usageMetadata in the LlmResponse. When stream=true, the usageMetadata should be returned in the final aggregated response.
To Reproduce
Steps to reproduce the behavior:
- Create an LlmAgent with the Gemini model.
- In the RunConfig, set the streaming mode to RunConfig.StreamingMode.SSE.
- Observe the LlmResponse in the afterModelCallback of a plugin or agent.
- See that for both partial and final LLM responses, usageMetadata is Optional.empty().
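For reference, the steps above can be sketched roughly as follows. This is a minimal sketch, not a verified reproduction: the builder and callback method names (LlmAgent.builder(), afterModelCallback, RunConfig.builder(), setStreamingMode) are assumptions based on the identifiers mentioned in this report, and may not match the 0.5.0 API exactly.

```java
// Sketch only — class and builder names are assumptions based on the
// identifiers in this report, not verified against ADK 0.5.0.
import com.google.adk.agents.LlmAgent;
import com.google.adk.agents.RunConfig;
import java.util.Optional;

public class UsageMetadataRepro {
  public static void main(String[] args) {
    LlmAgent agent =
        LlmAgent.builder()
            .name("repro-agent")
            .model("gemini-2.0-flash") // any Gemini model id
            .afterModelCallback(
                (callbackContext, llmResponse) -> {
                  // Expected: usageMetadata present on the final aggregated response.
                  // Observed: empty for both partial and final responses.
                  System.out.println("usageMetadata: " + llmResponse.usageMetadata());
                  return Optional.empty(); // do not replace the response
                })
            .build();

    // Enable SSE streaming, then run the agent with this config and send any prompt.
    RunConfig runConfig =
        RunConfig.builder()
            .setStreamingMode(RunConfig.StreamingMode.SSE)
            .build();
  }
}
```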
Expected behavior
The final (non-partial) LlmResponse should include the usageMetadata.
Desktop (please complete the following information):
- OS: macOS
- Java version:
- ADK version (see Maven dependency): 0.5.0
Additional context
The Python version of the ADK returns the usageMetadata correctly.
The usageMetadata is returned from the Gemini API in the last chunk.