
Main (#3) #13057

Closed
kristentr wants to merge 2 commits into Comfy-Org:master from kristentr:master

Conversation

@kristentr kristentr commented Mar 18, 2026

  • Add prompt_id to progress_text binary WS messages

Add supports_progress_text_metadata feature flag and extend send_progress_text() to accept optional prompt_id param. When prompt_id is provided and the client supports the new format, the binary wire format includes a length-prefixed prompt_id field:

[4B event_type][4B prompt_id_len][prompt_id][4B node_id_len][node_id][text]

Legacy format preserved for clients without the flag.
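For illustration, the framing above can be exercised with a small round-trip sketch. The helper names are hypothetical (not from the PR), and `event_type` stands in for the 4-byte `BinaryEventTypes` prefix that the server prepends:

```python
import struct
from typing import Optional, Tuple

def encode_frame(event_type: int, prompt_id: Optional[str], node_id: str, text: str) -> bytes:
    """Build [4B event_type][4B prompt_id_len][prompt_id][4B node_id_len][node_id][text]."""
    pid = (prompt_id or "").encode("utf-8")  # None is encoded as a zero-length field
    nid = node_id.encode("utf-8")
    return (
        struct.pack(">I", event_type)
        + struct.pack(">I", len(pid)) + pid
        + struct.pack(">I", len(nid)) + nid
        + text.encode("utf-8")
    )

def decode_frame(frame: bytes) -> Tuple[int, Optional[str], str, str]:
    """Invert encode_frame; the trailing text carries no length prefix."""
    event_type, pid_len = struct.unpack_from(">II", frame)
    off = 8
    prompt_id = frame[off:off + pid_len].decode("utf-8") or None
    off += pid_len
    (nid_len,) = struct.unpack_from(">I", frame, off)
    off += 4
    node_id = frame[off:off + nid_len].decode("utf-8")
    off += nid_len
    return event_type, prompt_id, node_id, frame[off:].decode("utf-8")
```

Because every variable-length field except the trailing text is length-prefixed in big-endian, a client can parse the frame in one forward pass with no delimiters.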

Both callers (nodes_images.py, client.py) updated to pass prompt_id from get_executing_context().

Part of COM-12671: parallel workflow execution support.

Amp-Thread-ID: https://ampcode.com/threads/T-019c79f7-f19b-70d9-b662-0687cc206282

  • refactor: add prompt_id as hidden type, fix imports, add docstrings
  • Add PROMPT_ID as a new hidden type in the Hidden enum, HiddenHolder, HiddenInputTypeDict, and execution engine resolution (both V3 and legacy)
  • Refactor GetImageSize to use cls.hidden.prompt_id instead of manually calling get_executing_context() — addresses reviewer feedback
  • Remove lazy import of get_executing_context from nodes_images.py
  • Add docstrings to send_progress_text, _display_text, HiddenHolder, and HiddenHolder.from_dict

Amp-Thread-ID: https://ampcode.com/threads/T-019ca1cb-0150-7549-8b1b-6713060d3408

  • fix: send_progress_text unicasts to client_id instead of broadcasting
  • Default sid to self.client_id when not explicitly provided, matching every other WS message dispatch (executing, executed, progress_state, etc.)
  • Previously sid=None caused broadcast to all connected clients
  • Format signature per ruff, remove redundant comments
  • Add unit tests for routing, legacy format, and new prompt_id format

Amp-Thread-ID: https://ampcode.com/threads/T-019ca3ce-c530-75dd-8d68-349e745a022e

  • remove send_progress_text stub tests

Copy-paste stub tests don't verify the real implementation and add maintenance burden without meaningful coverage.

Amp-Thread-ID: https://ampcode.com/threads/T-019ca3ce-c530-75dd-8d68-349e745a022e

  • fix: always send new binary format when client supports feature flag

When prompt_id is None, encode as zero-length string instead of falling back to old format. Prevents binary parse corruption on the frontend.

Addresses review feedback:
#12540 (comment)
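Why the old fallback corrupted parsing can be sketched as follows (helper names are hypothetical; the assumption is that a flag-aware client parses every binary body as the new format):

```python
import struct

def legacy_body(node_id: str, text: str) -> bytes:
    """Old wire format: [4B node_id_len][node_id][text]."""
    nid = node_id.encode("utf-8")
    return struct.pack(">I", len(nid)) + nid + text.encode("utf-8")

def parse_new_body(body: bytes):
    """Flag-aware client: expects [4B prompt_id_len][prompt_id][4B node_id_len][node_id][text]."""
    (pid_len,) = struct.unpack_from(">I", body)
    off = 4
    prompt_id = body[off:off + pid_len]
    off += pid_len
    (nid_len,) = struct.unpack_from(">I", body, off)
    off += 4
    node_id = body[off:off + nid_len]
    return prompt_id, node_id, body[off + nid_len:]
```

Feeding a legacy body to `parse_new_body` misreads the node_id length prefix as a prompt_id length, so the node_id bytes are consumed as the prompt_id and part of the text is consumed as a length field. Always prefixing a (possibly zero-length) prompt_id keeps the framing parseable.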


API Node PR Checklist

Scope

  • Is API Node Change

Pricing & Billing

  • Need pricing update
  • No pricing update

If Need pricing update:

  • Metronome rate cards updated
  • Auto‑billing tests updated and passing

QA

  • QA done
  • QA not required

Comms

  • Informed Kosinkadink

Co-authored-by: bymyself <cbyrne@comfy.org>

coderabbitai bot commented Mar 18, 2026

📝 Walkthrough

These changes introduce prompt_id tracking across the execution pipeline. Hidden input types are extended to optionally carry a prompt identifier. A new server feature flag "supports_progress_text_metadata" indicates client capability. The execution flow now threads prompt_id through input data processing. Progress text messaging is enhanced with optional metadata that includes prompt_id, using feature-flag-based protocol framing—new format with metadata when supported, fallback to original format otherwise.

🚥 Pre-merge checks | ✅ 1 | ❌ 2

❌ Failed checks (1 warning, 1 inconclusive)

| Check name | Status | Explanation | Resolution |
| --- | --- | --- | --- |
| Docstring Coverage | ⚠️ Warning | Docstring coverage is 36.36%, below the required threshold of 80.00%. | Write docstrings for the functions missing them to satisfy the coverage threshold. |
| Title check | ❓ Inconclusive | The title 'Main (#3)' is vague and does not convey meaningful information about the changeset. | Use a descriptive title that summarizes the main change, such as 'Add prompt_id support to progress_text WebSocket messages' or 'Support parallel workflow execution with prompt_id tracking'. |

✅ Passed checks (1 passed)

| Check name | Status | Explanation |
| --- | --- | --- |
| Description check | ✅ Passed | The description is comprehensive and clearly related to the changeset, covering prompt_id additions, feature flags, WebSocket messaging, refactoring, and test updates. |

✏️ Tip: You can configure your own custom pre-merge checks in the settings.



@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@server.py`:
- Around line 1269-1313: The code currently uses self.client_id as the default
target_sid which races between concurrent prompts; change send_progress_text so
when sid is None it first tries a prompt-scoped lookup (e.g. target_sid =
self.prompt_client_map.get(prompt_id) when prompt_id is provided) and only if
that returns None fall back to self.client_id, and ensure callers that expect
broadcast must pass sid explicitly; update send_progress_text (and
create/maintain self.prompt_client_map if not present) to use this prompt-scoped
resolution before using self.client_id and avoid relying on a single shared
client_id for per-prompt routing.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 25415065-301b-4164-b512-68b413d7ab4f

📥 Commits

Reviewing files that changed from the base of the PR and between 56ff88f and e4dbc95.

📒 Files selected for processing (7)
  • comfy/comfy_types/node_typing.py
  • comfy_api/feature_flags.py
  • comfy_api/latest/_io.py
  • comfy_api_nodes/util/client.py
  • comfy_extras/nodes_images.py
  • execution.py
  • server.py

Comment on lines 1269 to +1313
```diff
     def send_progress_text(
-        self, text: Union[bytes, bytearray, str], node_id: str, sid=None
+        self,
+        text: Union[bytes, bytearray, str],
+        node_id: str,
+        prompt_id: Optional[str] = None,
+        sid=None,
     ):
+        """Send a progress text message to the client via WebSocket.
+
+        Encodes the text as a binary message with length-prefixed node_id. When
+        the client supports the ``supports_progress_text_metadata`` feature flag,
+        the prompt_id is always prepended as a length-prefixed field (empty string
+        when None) to ensure consistent binary framing.
+
+        Args:
+            text: The progress text content to send.
+            node_id: The unique identifier of the node sending the progress.
+            prompt_id: Optional prompt/job identifier to associate the message with.
+            sid: Optional session ID to target a specific client.
+        """
         if isinstance(text, str):
             text = text.encode("utf-8")
         node_id_bytes = str(node_id).encode("utf-8")
 
-        # Pack the node_id length as a 4-byte unsigned integer, followed by the node_id bytes
-        message = struct.pack(">I", len(node_id_bytes)) + node_id_bytes + text
+        # Auto-resolve sid to the currently executing client
+        target_sid = sid if sid is not None else self.client_id
+
+        # When client supports the new format, always send
+        # [prompt_id_len][prompt_id][node_id_len][node_id][text]
+        # even when prompt_id is None (encoded as zero-length string)
+        if feature_flags.supports_feature(
+            self.sockets_metadata, target_sid, "supports_progress_text_metadata"
+        ):
+            prompt_id_bytes = (prompt_id or "").encode("utf-8")
+            message = (
+                struct.pack(">I", len(prompt_id_bytes))
+                + prompt_id_bytes
+                + struct.pack(">I", len(node_id_bytes))
+                + node_id_bytes
+                + text
+            )
+        else:
+            message = struct.pack(">I", len(node_id_bytes)) + node_id_bytes + text
 
-        self.send_sync(BinaryEventTypes.TEXT, message, sid)
+        self.send_sync(BinaryEventTypes.TEXT, message, target_sid)
```

⚠️ Potential issue | 🟠 Major

Make the default socket lookup prompt-scoped before parallel execution.

Line 1294 still resolves target_sid from the single shared self.client_id. Once two prompts execute concurrently, whichever prompt updates that field last wins, so progress text for another prompt can be delivered to the wrong client; if it is unset, Line 1313 also falls back to the broadcast path. The new prompt_id metadata only helps if the frame reaches the correct socket, so this fallback should resolve from a prompt-scoped client mapping or require callers to pass sid explicitly.

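One way the suggested prompt-scoped fallback could look, as a minimal sketch. Note that `prompt_client_map` and `resolve_sid` are the reviewer's hypothetical names, not existing ComfyUI attributes:

```python
from typing import Dict, Optional

class ProgressRouter:
    """Sketch of the reviewer's suggestion: resolve the target socket per
    prompt before falling back to the shared client_id."""

    def __init__(self, client_id: Optional[str] = None):
        self.client_id = client_id
        # Populated when a prompt is queued: prompt_id -> owning client's sid.
        self.prompt_client_map: Dict[str, str] = {}

    def resolve_sid(self, prompt_id: Optional[str], sid: Optional[str] = None) -> Optional[str]:
        if sid is not None:            # an explicit target always wins
            return sid
        if prompt_id is not None:      # prompt-scoped lookup comes first
            mapped = self.prompt_client_map.get(prompt_id)
            if mapped is not None:
                return mapped
        return self.client_id          # last-resort shared fallback
```

With this resolution order, two concurrently executing prompts owned by different clients each route their progress text to their own socket, and the shared `client_id` only matters for prompts that were never registered in the map.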
