Support for async execution functions
This commit adds support for node execution functions defined as async. When a node's execution function is defined as async, we can continue executing other nodes while it is processing.
Standard uses of await should "just work", but people will still have
to be careful if they spawn actual threads. Because torch doesn't really
have async/await versions of functions, this won't particularly help
with most locally-executing nodes, but it does work for e.g. web
requests to other machines.
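A minimal sketch of what such an async node could look like. The class name and node body are illustrative, not from this commit; the `asyncio.sleep` stands in for a real awaitable call such as an HTTP request to another machine, and the class follows the usual custom-node conventions (`INPUT_TYPES` / `RETURN_TYPES` / `FUNCTION`):

```python
import asyncio

class RemoteTextNode:
    # Hypothetical example node; only the async def on the execution
    # function is the feature this PR adds.
    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {"prompt": ("STRING", {"default": ""})}}

    RETURN_TYPES = ("STRING",)
    FUNCTION = "generate"
    CATEGORY = "example"

    async def generate(self, prompt):
        # Stand-in for real async I/O (e.g. a web request). While this
        # await is pending, the executor is free to run other nodes.
        await asyncio.sleep(0.01)
        return (prompt.upper(),)

# Outside ComfyUI the coroutine can be driven directly for testing:
result = asyncio.run(RemoteTextNode().generate("hello"))
```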
In addition to the execute function, the VALIDATE_INPUTS and
check_lazy_status functions can also be defined as async, though we'll
only resolve one node at a time right now for those.
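A hedged sketch of those two hooks defined as async. The class and the bodies are hypothetical; the signatures follow the standard `VALIDATE_INPUTS` / `check_lazy_status` conventions, with `asyncio.sleep` again standing in for real async work:

```python
import asyncio

class AsyncValidatedNode:
    # Hypothetical node showing async VALIDATE_INPUTS and
    # check_lazy_status; per the PR, these are currently resolved
    # one node at a time.
    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {"a": ("STRING", {"lazy": True}),
                             "b": ("STRING", {"lazy": True})}}

    RETURN_TYPES = ("STRING",)
    FUNCTION = "run"
    CATEGORY = "example"

    @classmethod
    async def VALIDATE_INPUTS(cls, a, b):
        # e.g. await a remote probe before accepting the inputs
        await asyncio.sleep(0.01)
        return True  # or an error string on failure

    async def check_lazy_status(self, a, b):
        # Decide which lazy inputs are actually needed; in this toy
        # example only "a" is ever required, so "b" is never evaluated.
        await asyncio.sleep(0.01)
        return ["a"]

    def run(self, a, b=None):
        return (a,)

# Driving the coroutines directly, outside the executor:
needed = asyncio.run(AsyncValidatedNode().check_lazy_status(a=None, b=None))
valid = asyncio.run(AsyncValidatedNode.VALIDATE_INPUTS(a="x", b="y"))
```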
Currently the plan is to merge this after Subgraph is released.
Closing in favor of this rebased PR: https://github.com/comfyanonymous/ComfyUI/pull/8830
Hi, I am a small filmmaker who wants to use ComfyUI to generate a few shots of up to 15 seconds. I have very basic Python knowledge. My laptop has a Core i7, 24 GB RAM, and an RTX 4060 with 8 GB VRAM. So far I have not been able to get ComfyUI fully functional due to incompatibilities between various Python dependencies. Can anyone guide me through setting up ComfyUI so I can try generating a video from text or an image?