ComfyUI
[Question] [FeatureRequestApproval] Best approach to ComfyUI scalability, aka simultaneous generations?
Is there a way to tweak ComfyUI to allow for simultaneous processing of the job queue?
I would like to build a scalable ComfyUI API. By default, prompts are simply left on the queue and processed one by one. I'd like to scale to parallel processing without necessarily hosting more than one ComfyUI instance. Is that possible?
Thanks in advance for any tips or advice.
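For what it's worth, one way to get parallel generations today is to run several ComfyUI instances (one per GPU) and fan prompts out from a small client-side dispatcher. A minimal sketch, assuming each instance exposes the standard `POST /prompt` endpoint; the backend URLs and class name are illustrative, not part of ComfyUI:

```python
import itertools
import json
import urllib.request

class ComfyDispatcher:
    """Round-robin workflows across several ComfyUI instances.

    Sketch only: assumes each backend is a normal ComfyUI server
    accepting POST /prompt with a {"prompt": workflow} JSON body.
    """

    def __init__(self, backends):
        # Cycle endlessly through the configured backend URLs.
        self._cycle = itertools.cycle(backends)

    def next_backend(self):
        return next(self._cycle)

    def submit(self, workflow):
        base = self.next_backend()
        body = json.dumps({"prompt": workflow}).encode("utf-8")
        req = urllib.request.Request(
            f"{base}/prompt",
            data=body,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())
```

Each instance still processes its own queue serially, but N instances give you N generations in flight.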
https://github.com/comfyanonymous/ComfyUI/blob/ee2c5fa72d4fc1714576fac7ba64aa5d607303d0/main.py#L216
Could it work if I simply added more workers to the thread? (will try tomorrow)
Would you accept a PR to have the amount of concurrent workers as a param perhaps?
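To spell out what "more workers" would mean in code: the linked line starts a single worker thread that drains the prompt queue, and draining one queue with N threads is straightforward in itself. A generic sketch (queue, handler, and sentinel convention are all illustrative, not ComfyUI's API); note that extra threads alone likely won't speed things up, since the workers would contend for the same model and GPU, so each worker realistically needs its own GPU or model copy:

```python
import queue
import threading

def run_workers(job_queue, handle, num_workers):
    """Start num_workers threads that each pull jobs off job_queue.

    A None item is used as a shutdown sentinel (one per worker).
    """
    def worker():
        while True:
            job = job_queue.get()
            if job is None:  # sentinel: stop this worker
                job_queue.task_done()
                return
            handle(job)
            job_queue.task_done()

    threads = [
        threading.Thread(target=worker, daemon=True)
        for _ in range(num_workers)
    ]
    for t in threads:
        t.start()
    return threads
```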
Any updates on this? This is needed 😇
I'm looking for a solution where different computers and GPUs can join the queue and process jobs.
Have you tried adding more workers to the thread, as suggested above? Did it work?
I'm confused about this as well. Any update?
Any updates, everyone?