Skip to content

fix(model): auto-generate ChatResponse.id when LLM doesn't provide it#721

Open
JGoP-L wants to merge 1 commit intoagentscope-ai:mainfrom
JGoP-L:fix(model)-auto-generate-ChatResponse.id-when-LLM-doesn't-provide-it-(#708)
Open

fix(model): auto-generate ChatResponse.id when LLM doesn't provide it#721
JGoP-L wants to merge 1 commit intoagentscope-ai:mainfrom
JGoP-L:fix(model)-auto-generate-ChatResponse.id-when-LLM-doesn't-provide-it-(#708)

Conversation

@JGoP-L
Copy link
Contributor

@JGoP-L JGoP-L commented Feb 4, 2026

AgentScope-Java Version

1.0.9-SNAPSHOT

Description

This PR fixes Issue #708 where OllamaChatModel fails with the AGUI protocol
due to missing id field in Ollama API responses.

Problem

Ollama API doesn't return an id field in its chat completion responses,
unlike other LLM providers (OpenAI, Anthropic, DashScope). This causes the
AGUI protocol and other components that depend on ChatResponse.getId() to
fail with NullPointerException: "messageId cannot be null".

Root Cause Chain:
Ollama API → no id field

OllamaResponseParser → ChatResponse.id = null

Msg.id = null

Event.getMessage().getId() = null

AguiEvent.TextMessageStart → NullPointerException

Solution

Modified ChatResponse.Builder.build() to automatically generate a UUID when
id is null or empty. This provides a unified solution that:

  1. Fixes the Ollama issue - Ollama responses now get auto-generated IDs
  2. Benefits all parsers - Any future provider without an id field is handled
    automatically
  3. Maintains backward compatibility - Providers that return an id keep their
    original value
  4. Follows single responsibility principle - The Builder ensures valid object
    state

Changes

Modified Files:

  • ChatResponse.java - Modified build() method to auto-generate UUID when id
    is null or empty
  • AnthropicResponseParserTest.java - Updated test assertion to verify
    auto-generated IDs

Code Change:
public ChatResponse build() {
// Auto-generate id if not set or empty (for providers like Ollama)
String responseId = this.id;
if (responseId == null || responseId.isEmpty()) {
responseId = UUID.randomUUID().toString();
}
return new ChatResponse(responseId, content, usage, metadata,
finishReason);
}

Testing

Manual Testing:

  • Created reproduction test using OllamaChatModel with AGUI protocol
  • Verified that all events now contain valid messageId values
  • Tested with Ollama model qwen2.5:0.5b running locally

Automated Testing:

  • All existing tests pass (3289 tests, 0 failures, 118 skipped)
  • Updated AnthropicResponseParserTest to reflect new behavior

Checklist

  • Code has been formatted with mvn spotless:apply
  • All tests are passing (mvn test)
  • Javadoc comments are complete and follow project conventions
  • Related documentation has been updated (e.g. links, examples, etc.)
  • Code is ready for review

@JGoP-L JGoP-L requested review from a team and Copilot February 4, 2026 10:48
@gemini-code-assist
Copy link

Summary of Changes

Hello @JGoP-L, I'm Gemini Code Assist1! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request resolves an issue where the ChatResponse object would have a null id when interacting with LLMs that do not provide this field in their API responses, such as Ollama. This null id led to NullPointerExceptions in downstream components like the AGUI protocol. The solution implements a robust mechanism within the ChatResponse.Builder to automatically generate a UUID for the id if it's missing, ensuring all ChatResponse instances always have a valid identifier. This change improves system stability and compatibility with a wider range of LLM providers.

Highlights

  • Problem Resolution: Addresses NullPointerExceptions caused by LLMs (like Ollama) not providing an id field in their chat completion responses, which affected the AGUI protocol.
  • ID Auto-generation: The ChatResponse.Builder.build() method now automatically generates a UUID for the id field if it is null or empty.
  • Enhanced Compatibility: Ensures that the system is compatible with all LLM providers, even those that do not return an id field, without breaking existing integrations.
  • Test Updates: The AnthropicResponseParserTest.java was updated to verify the new auto-generation behavior for ChatResponse IDs.
Changelog
  • agentscope-core/src/main/java/io/agentscope/core/model/ChatResponse.java
    • Added java.util.UUID import.
    • Modified the build() method in ChatResponse.Builder to check if the id is null or empty, and if so, assign a newly generated UUID.randomUUID().toString() to it.
    • Updated Javadoc for the build() method to document the new auto-generation behavior.
  • agentscope-core/src/test/java/io/agentscope/core/formatter/anthropic/AnthropicResponseParserTest.java
    • Added org.junit.jupiter.api.Assertions.assertFalse import.
    • Updated the testParseStreamEventUnknownType method to assert that response.getId() is no longer null and is not empty, confirming the auto-generation of IDs.
Activity
  • Manual testing was performed using OllamaChatModel with the AGUI protocol, verifying valid messageId values and testing with a local Ollama model.
  • Automated tests were run, with all 3289 existing tests passing (118 skipped).
  • The AnthropicResponseParserTest was updated to reflect the new behavior.
  • Code formatting (mvn spotless:apply), test passing (mvn test), Javadoc completion, and documentation updates were confirmed as per the checklist.
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page, here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.

Feature Command Description
Code Review /gemini review Performs a code review for the current pull request in its current state.
Pull Request Summary /gemini summary Provides a summary of the current pull request in its current state.
Comment @gemini-code-assist Responds in comments when explicitly tagged, both in pull request comments and review comments.
Help /gemini help Displays a list of available commands.

Customization

To customize Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by creating and adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counter productive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving your feedback about your experience with Gemini Code Assist for Github and other Google products, sign up here.

You can also get AI-powered code generation, chat, as well as code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.

Copy link

@gemini-code-assist gemini-code-assist bot left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Code Review

This pull request effectively addresses a NullPointerException that occurs when an LLM provider, such as Ollama, does not return an ID in its response. The solution of auto-generating a UUID in the ChatResponse.Builder is a clean and centralized approach that improves the robustness of the system for all model integrations. The accompanying test update correctly verifies this new behavior. I have one minor suggestion to further strengthen the ID validation.

Comment on lines 190 to 192
if (responseId == null || responseId.isEmpty()) {
responseId = UUID.randomUUID().toString();
}

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

medium

The current check for id allows for strings containing only whitespace to be considered valid. This could lead to issues if downstream components expect a non-blank identifier. Using trim().isEmpty() would be more robust by ensuring that whitespace-only strings are also treated as invalid and trigger the UUID generation.

Suggested change
if (responseId == null || responseId.isEmpty()) {
responseId = UUID.randomUUID().toString();
}
if (responseId == null || responseId.trim().isEmpty()) {
responseId = UUID.randomUUID().toString();
}

Copy link
Contributor

Copilot AI left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Pull request overview

This PR fixes issue #708 where the AGUI protocol fails when using OllamaChatModel because Ollama's API doesn't return an id field in chat completion responses, causing NullPointerException in AGUI event constructors that require non-null message IDs.

Changes:

  • Modified ChatResponse.Builder.build() to auto-generate a UUID when the id field is null or empty
  • Updated AnthropicResponseParserTest to verify auto-generated IDs instead of expecting null

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated no comments.

File Description
agentscope-core/src/main/java/io/agentscope/core/model/ChatResponse.java Added UUID auto-generation logic in the Builder's build() method to ensure all ChatResponse instances have valid IDs
agentscope-core/src/test/java/io/agentscope/core/formatter/anthropic/AnthropicResponseParserTest.java Updated test to verify that IDs are auto-generated when not provided by the parser

@codecov
Copy link

codecov bot commented Feb 4, 2026

Codecov Report

❌ Patch coverage is 75.00000% with 1 line in your changes missing coverage. Please review.

Files with missing lines Patch % Lines
...in/java/io/agentscope/core/model/ChatResponse.java 75.00% 0 Missing and 1 partial ⚠️

📢 Thoughts on this report? Let us know!

…agentscope-ai#708)

Ollama API doesn't return an id field in its chat completion responses,
unlike other LLM providers (OpenAI, Anthropic, DashScope). This causes the
AGUI protocol and other components that depend on ChatResponse.getId() to
fail with "messageId cannot be null".

Solution:
- Modified ChatResponse.Builder.build() to auto-generate a UUID when id
  is null, empty, or blank (whitespace-only)
- Updated AnthropicResponseParserTest to verify auto-generated IDs

Per review feedback:
- Copilot: Added isEmpty() check for empty strings
- gemini-code-assist: Added trim().isEmpty() for blank strings

This ensures compatibility across all LLM providers while maintaining
backward compatibility with providers that return an id.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
@JGoP-L JGoP-L force-pushed the fix(model)-auto-generate-ChatResponse.id-when-LLM-doesn't-provide-it-(#708) branch from 7d93f73 to 9a23bcb Compare February 4, 2026 11:16
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

1 participant