
Multiple GPUs

Open SuperComboGamer opened this issue 2 years ago • 15 comments

Is there any way to use multiple GPUs for the same image, or to spread the load of large batches across multiple GPUs?

SuperComboGamer avatar Mar 19 '23 16:03 SuperComboGamer

It looks like it uses accelerate, so you could try

accelerate config

in the venv or environment and setup multi-gpu from there
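Assuming ComfyUI's dependencies pull in Hugging Face Accelerate, the setup described above would look roughly like this (a sketch; the venv path is an assumption, and the interactive questions vary by Accelerate version):

```shell
# Activate the virtual environment ComfyUI was installed into
# (path is an assumption -- adjust to your install)
source venv/bin/activate

# Walk through the interactive prompts and choose multi-GPU
accelerate config
```

As noted in the next comment, Accelerate is currently only wired up in --lowvram mode, so this may have no effect on a normal run.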

78Alpha avatar Mar 19 '23 17:03 78Alpha

Right now accelerate is only enabled in --lowvram mode.

The plan is to add an option to set the GPU comfyui will run on.
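Until such an option lands, a common workaround for pinning a process to one GPU is the `CUDA_VISIBLE_DEVICES` environment variable. This is a general CUDA/PyTorch mechanism, not a ComfyUI feature, so treat it as a stopgap:

```shell
# Expose only the second GPU (index 1) to this process;
# inside the process it will appear as cuda:0
CUDA_VISIBLE_DEVICES=1 python main.py
```

You can launch one ComfyUI instance per GPU this way, each on a different port.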

This is going to be further in the future but I'm planning on eventually adding support for connecting the UI to multiple comfyui backends at the same time so you can queue prompts on multiple GPUs/machines over the network.
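Pending that feature, queueing across several backends can already be approximated client-side, since ComfyUI accepts prompts over HTTP. A minimal round-robin dispatcher might look like this (the `/prompt` endpoint and port 8188 match ComfyUI's defaults, but treat the request details as assumptions; the second backend URL is hypothetical):

```python
import itertools
import json
import urllib.request

def make_dispatcher(backends):
    """Return a function that sends each workflow to the next backend in turn."""
    ring = itertools.cycle(backends)

    def dispatch(workflow, dry_run=False):
        backend = next(ring)
        url = f"{backend}/prompt"
        if dry_run:  # for testing: just report where the prompt would go
            return url
        data = json.dumps({"prompt": workflow}).encode("utf-8")
        req = urllib.request.Request(
            url, data=data, headers={"Content-Type": "application/json"}
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    return dispatch

# Two hypothetical backends, e.g. one ComfyUI instance per GPU or per machine
dispatch = make_dispatcher(["http://127.0.0.1:8188", "http://127.0.0.1:8189"])
```

This only spreads whole prompts across GPUs; it does not split a single image across cards.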

comfyanonymous avatar Mar 19 '23 17:03 comfyanonymous

I figured out how to run separate images on multiple GPUs in a different UI, but I want to be able to use 2 GPUs for a single image at a time.

SuperComboGamer avatar Mar 19 '23 17:03 SuperComboGamer

> Right now accelerate is only enabled in --lowvram mode.
>
> The plan is to add an option to set the GPU comfyui will run on.
>
> This is going to be further in the future but I'm planning on eventually adding support for connecting the UI to multiple comfyui backends at the same time so you can queue prompts on multiple GPUs/machines over the network.

Would be amazing for running Comfy on farms, and remoting it in for jobs.

WASasquatch avatar Mar 19 '23 17:03 WASasquatch

> Right now accelerate is only enabled in --lowvram mode.
>
> The plan is to add an option to set the GPU comfyui will run on.
>
> This is going to be further in the future but I'm planning on eventually adding support for connecting the UI to multiple comfyui backends at the same time so you can queue prompts on multiple GPUs/machines over the network.

Curious how far in the future GPU selection will land, because I'm trying my best to get it running on my system's built-in GPU. My inner noob crashed while doing so...

I honestly can't wait, and again: THANK YOU FOR THIS GREAT PIECE OF WORK.

s-marcelle avatar Mar 19 '23 18:03 s-marcelle

> Right now accelerate is only enabled in --lowvram mode. The plan is to add an option to set the GPU comfyui will run on. This is going to be further in the future but I'm planning on eventually adding support for connecting the UI to multiple comfyui backends at the same time so you can queue prompts on multiple GPUs/machines over the network.

> Curious how far in the future GPU selection will land, because I'm trying my best to get it running on my system's built-in GPU. My inner noob crashed while doing so...
>
> I honestly can't wait, and again: THANK YOU FOR THIS GREAT PIECE OF WORK.

If you use Easy Diffusion, it will let you use more than one GPU for different images at a time, but not 2 GPUs for one image at the same time. I have gone through around 100 UIs for Stable Diffusion already, and I've found that ComfyUI is the fastest one. So you could use Easy Diffusion to create a huge batch, then go to ComfyUI to run a lot of steps on a single image.

SuperComboGamer avatar Mar 19 '23 18:03 SuperComboGamer

Did I add multi-GPU support to Easy Diffusion? I can't even remember anymore.

> Right now accelerate is only enabled in --lowvram mode. The plan is to add an option to set the GPU comfyui will run on. This is going to be further in the future but I'm planning on eventually adding support for connecting the UI to multiple comfyui backends at the same time so you can queue prompts on multiple GPUs/machines over the network.

> Curious how far in the future GPU selection will land, because I'm trying my best to get it running on my system's built-in GPU. My inner noob crashed while doing so... I honestly can't wait, and again: THANK YOU FOR THIS GREAT PIECE OF WORK.

> If you use Easy Diffusion, it will let you use more than one GPU for different images at a time, but not 2 GPUs for one image at the same time. I have gone through around 100 UIs for Stable Diffusion already, and I've found that ComfyUI is the fastest one. So you could use Easy Diffusion to create a huge batch, then go to ComfyUI to run a lot of steps on a single image.

WASasquatch avatar Mar 20 '23 00:03 WASasquatch

Can someone clarify whether it's possible to "send" workflows defined in ComfyUI into Easy Diffusion to leverage the multi-GPU capability?

unphased avatar Jun 21 '23 08:06 unphased

I hope someone can provide guidance on how to develop this feature.

kxbin avatar Nov 04 '23 07:11 kxbin

Being able to use multiple GPUs would really help in the future with stable diffusion video and whatever comes later. SVD uses dramatically more memory.

dnalbach avatar Mar 16 '24 00:03 dnalbach

try this: https://github.com/city96/ComfyUI_NetDist

rrfaria avatar Apr 10 '24 02:04 rrfaria

Have you guys tried using Swarm to achieve this? https://github.com/mcmonkeyprojects/SwarmUI

robinjhuang avatar Jul 03 '24 19:07 robinjhuang

HF Diffusers can use multiple GPUs in parallel via DistriFuser or PipeFusion: https://github.com/mit-han-lab/distrifuser https://github.com/PipeFusion/PipeFusion

I have tested DistriFuser, and the result was quite good. (I used `run_sdxl.py --mode benchmark`, which I believe generates one image with 50 steps.)

| Setup | Time |
| --- | --- |
| 1x 3090 | 13.88824 s |
| 2x 3090 (PCIe x8/x8) | 7.93942 s |
| 1x 4090 | 6.82159 s |
| 4090 + 3090 (PCIe x16/x4) | 8.04754 s |
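For what it's worth, those numbers work out to roughly a 1.75x speedup for the second 3090 (about 87% of ideal linear scaling), while pairing the 3090 with the 4090 is actually slower than the 4090 alone, presumably because the pipeline is gated by the slower card:

```python
# Scaling from the benchmark numbers above
t_1x3090 = 13.88824
t_2x3090 = 7.93942
t_4090 = 6.82159
t_4090_3090 = 8.04754

speedup = t_1x3090 / t_2x3090          # ~1.75x from the second 3090
efficiency = speedup / 2               # ~0.87 of ideal linear scaling
mixed_penalty = t_4090_3090 / t_4090   # ~1.18x slower than the 4090 alone

print(f"2x 3090 speedup:          {speedup:.2f}x")
print(f"scaling efficiency:       {efficiency:.0%}")
print(f"4090+3090 vs 4090 alone:  {mixed_penalty:.2f}x slower")
```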

Is there a plan to support something like this?

bedovyy avatar Jul 20 '24 21:07 bedovyy

Now, with Flux being massive, I fear that larger models will become more common. My 3090 can't handle Flux alone; it has to offload into system RAM or disk. It would be nice to be able to split the workflow off onto my P40 so the model isn't constantly being loaded and unloaded from the main 3090. The P40 will slow down the 3090, but not nearly as much as system RAM or swap space does, and it can at least process something while swap would just sit there waiting.
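A manual version of that split is possible in plain PyTorch today: keep the denoising model on the fast card and park the other components on the slower one. This is a sketch of the idea, not ComfyUI's actual offload logic, and the commented-out module names (`unet`, `text_encoder`, `vae`) are placeholders:

```python
import torch

def pick_device(index):
    """Use the requested CUDA device if present, else fall back to CPU."""
    if torch.cuda.is_available() and index < torch.cuda.device_count():
        return torch.device(f"cuda:{index}")
    return torch.device("cpu")

# Hypothetical split: denoising model on the fast card (device 0),
# text encoder / VAE on the slower card (device 1).
main_dev = pick_device(0)
aux_dev = pick_device(1)

# e.g. unet.to(main_dev); text_encoder.to(aux_dev); vae.to(aux_dev)
# Tensors must live on a module's device before calling it:
latent = torch.randn(1, 4, 8, 8, device=main_dev)
```

The catch is that intermediate tensors have to be copied between the cards at each hand-off, which is why a slow second GPU still beats system RAM but never matches a single big card.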

yggdrasil75 avatar Aug 06 '24 12:08 yggdrasil75