ComfyUI native API integration with ComfyStream #59
Closed
BuffMcBigHuge wants to merge 14 commits into livepeer:main from
Conversation
…ced unnecessary base64 input frame operations, prep for multi-instance, cleanup.
…dded config for server management.
…ame size handling, commented out some logging.
Collaborator
For reference, I had to qualify the For the workers,
eliteprox reviewed on Apr 1, 2025
Co-authored-by: John | Elite Encoder <john@eliteencoder.net>
Author
Fixed!
…the ui, cleanup of tensor code.
…ediate step, moved prompt execution strategy to `execution_start` event, moved buffer to self variable to avoid reinitialization.
Author
This work will now continue in ComfyUI native API integration with Spawn #130.
Introduction:
One of the primary limitations of building workflows within ComfyStream is its dependence on the Hidden Switch fork of ComfyUI.
Many difficulties arise when particular node packs do not play well with the EmbeddedComfyClient; careful testing and modification of existing nodes are usually required to enable full functionality. Dependence on the fork also brings other issues, such as delays in (or outright lack of) support for newer ComfyUI features, including native performance optimizations.
The biggest issue, however, is the handling of multiple Comfy instances to brute-force frame generation.
Objective:
I set out to replace the EmbeddedComfyClient with direct communication with running ComfyUI instances, using the native ComfyUI API and WebSocket connection.
Method:
Sending Data: All data is sent via a RESTful `POST /prompt` through the Comfy API. Custom nodes were added to support sending the input image as a base64 string in the prompt.
Receiving Data: Message events from the WebSocket are parsed, and data is received via the native `send_image` handler and pushed as WebRTC frames. This approach was inspired by comfyui-tooling-nodes, which use a Blob format with a prefix similar to how Comfy sends previews to the UI. Once the Blob is successfully captured, the prompt for the next frame can be queued.
Limitations:
This process is clearly less efficient than the Hidden Switch method of communicating with the ComfyStream tensor_cache directly. However, it opens up new opportunities for parallelization, through multiple inference instances per GPU as well as multi-GPU scaling, an avenue I'm investigating as a performance increase.
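As a rough sketch of the send side described under Method, the frame can be base64-encoded into the workflow JSON and queued over ComfyUI's native `POST /prompt` endpoint. The node id and `LoadImageBase64` class name below are hypothetical stand-ins for the custom input nodes this PR adds; only the `/prompt` route and its `prompt_id` response field are standard ComfyUI behavior.

```python
import base64
import json
import urllib.request


def build_prompt_payload(frame_bytes: bytes, workflow: dict, input_node_id: str) -> dict:
    """Deep-copy the workflow and inject the frame, base64-encoded, into the
    image input of the base64 loader node (node id/class are illustrative)."""
    wf = json.loads(json.dumps(workflow))  # cheap deep copy of JSON-safe data
    wf[input_node_id]["inputs"]["image"] = base64.b64encode(frame_bytes).decode("ascii")
    return {"prompt": wf}


def queue_prompt(payload: dict, host: str = "127.0.0.1", port: int = 8188) -> str:
    """POST the payload to ComfyUI's native /prompt endpoint and return the
    prompt_id ComfyUI assigns to the queued job."""
    req = urllib.request.Request(
        f"http://{host}:{port}/prompt",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["prompt_id"]
```

Keeping payload construction separate from the HTTP call also makes it easy to fan the same frame (or successive frames) out to several instances on different ports.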
Note that this is a very early DRAFT; the proof of concept has only just been demonstrated as functional. More work is to be done.
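The receive side can be sketched as a parser for ComfyUI's binary WebSocket messages. The layout assumed here, two big-endian uint32s (event type, then image format) followed by the encoded image bytes, matches how ComfyUI's `send_image` path prefixes preview blobs, but treat the constants as assumptions to verify against the server version in use.

```python
import struct

PREVIEW_IMAGE = 1  # assumed value of ComfyUI's BinaryEventTypes.PREVIEW_IMAGE
JPEG, PNG = 1, 2   # assumed image-format codes used by send_image


def parse_binary_message(data: bytes):
    """Split a binary WebSocket frame into (event_type, image_format, payload).
    The first 8 bytes are two big-endian uint32 headers; the rest is the image."""
    event, image_format = struct.unpack(">II", data[:8])
    return event, image_format, data[8:]


def handle_message(data: bytes, push_frame) -> bool:
    """If the blob is a preview image, hand the encoded bytes to the WebRTC
    pipeline via push_frame (a hypothetical callback) and report success so the
    caller knows it can queue the prompt for the next frame."""
    event, fmt, payload = parse_binary_message(data)
    if event == PREVIEW_IMAGE and fmt in (JPEG, PNG):
        push_frame(payload)
        return True
    return False
```

Returning a flag from `handle_message` mirrors the flow described above: capture the Blob first, then trigger the next `POST /prompt`.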
TODOs:
ai-runner
Getting it Running:
Use the `app_api.py` file instead of `app.py`.
Visual example of multi-Comfy instance processing:
Screen.Recording.2025-03-25.183747.mp4
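The multi-instance "brute force" scheduling shown in the recording can be sketched as a simple round-robin pool over several running ComfyUI instances. The port numbers are illustrative; each would correspond to a separately launched ComfyUI server.

```python
import itertools


class InstancePool:
    """Round-robin over the ports of multiple running ComfyUI instances,
    so successive frames are dispatched to alternating servers.
    A minimal sketch; real scheduling would track per-instance readiness."""

    def __init__(self, ports):
        self._cycle = itertools.cycle(ports)

    def next_instance(self) -> int:
        """Return the port of the instance that should take the next frame."""
        return next(self._cycle)
```

Each queued prompt would then target `next_instance()` rather than a fixed port, interleaving frame generation across GPUs or processes.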