ComfyUI gets stuck after clicking Queue Prompt
Your question
I am trying to run FLUX locally in ComfyUI, using the flux.1 dev model and the t5xxl_fp16 clip.
My GPU is an RTX 3080 with 10 GB of VRAM.
I am using the portable version of ComfyUI, and I downloaded the latest build, posted 10 hours before writing this.
Logs
Starting server
To see the GUI go to: http://127.0.0.1:8188
got prompt
model weight dtype torch.bfloat16, manual cast: None
model_type FLUX
Other
No response
Same problem here.
I tried debugging and saw that the program terminates without any error message at layers.py, line 141:

    self.txt_mlp = nn.Sequential(
        operations.Linear(hidden_size, mlp_hidden_dim, bias=True, dtype=dtype, device=device),
        nn.GELU(approximate="tanh"),
        operations.Linear(mlp_hidden_dim, hidden_size, bias=True, dtype=dtype, device=device),
    )
Wrapping the code in a try-except didn't help. It just terminates without reaching the except block.
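For context, a minimal sketch (not ComfyUI code) of why the except block never fires: a Python try/except can only catch exceptions raised at the Python level. If torch's native code segfaults or aborts, or the OS kills the process for running out of memory, the interpreter exits without unwinding the stack, so no except block runs. Here os._exit stands in for such a native crash:

```python
import subprocess
import sys

# Child script: os._exit() ends the interpreter immediately, without raising
# a Python exception, so the except block is never reached.
child = """
try:
    import os
    os._exit(1)          # stands in for a native crash / OOM kill
except BaseException:
    print("except ran")  # never printed
"""

proc = subprocess.run([sys.executable, "-c", child],
                      capture_output=True, text=True)
print("exit code:", proc.returncode)  # 1
print("stdout:", repr(proc.stdout))   # '' -- the except block never executed
```

This matches the symptom described above: the process simply terminates, with nothing for Python-level error handling to catch.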
Same problem. When I first downloaded FLUX it worked well, but after updating ComfyUI it takes forever to compute: loading the diffusion model takes forever and gets stuck there.
This could be an issue in torch's native code. Try downgrading it.
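A hypothetical helper for deciding whether a downgrade is worth trying. The version strings below are placeholders for illustration, not known-good releases:

```python
# Hypothetical helper: compare the installed torch version string against a
# known-good one before downgrading. Handles local build suffixes like "+cu121".
def needs_downgrade(installed: str, known_good: str) -> bool:
    """Return True if `installed` is strictly newer than `known_good`."""
    def parts(v: str) -> tuple:
        # "2.4.0+cu121" -> (2, 4, 0); drop the local build suffix
        return tuple(int(x) for x in v.split("+")[0].split("."))
    return parts(installed) > parts(known_good)

print(needs_downgrade("2.4.0+cu121", "2.3.1"))  # True  -> try downgrading
print(needs_downgrade("2.3.1+cu121", "2.3.1"))  # False
```

With the portable build, the downgrade itself would be done against the embedded interpreter, e.g. something like `.\python_embeded\python.exe -m pip install torch==<older version> --index-url https://download.pytorch.org/whl/cu121` (adjust the version and the CUDA index URL to your setup).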
Have you tried running the fp8 versions of the model and clip?
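A back-of-envelope estimate of why fp8 matters on a 10 GB card. The parameter counts below are approximate public figures (flux.1 dev ~12B, t5xxl ~4.7B), assumed here for illustration:

```python
# Rough weight-memory footprint: params * bytes-per-param.
# fp16/bf16 = 2 bytes per parameter, fp8 = 1 byte per parameter.
def weight_gb(params: float, bytes_per_param: int) -> float:
    return params * bytes_per_param / 1024**3

flux_fp16 = weight_gb(12e9, 2)   # ~22.4 GB -- far above 10 GB of VRAM
flux_fp8  = weight_gb(12e9, 1)   # ~11.2 GB -- still tight, but much closer
t5_fp16   = weight_gb(4.7e9, 2)  # ~8.8 GB for the text encoder alone
print(f"flux fp16: {flux_fp16:.1f} GB, flux fp8: {flux_fp8:.1f} GB, "
      f"t5xxl fp16: {t5_fp16:.1f} GB")
```

At fp16, the weights alone far exceed 10 GB, so ComfyUI has to offload and swap between RAM and VRAM, which can look like an indefinite hang; fp8 roughly halves the footprint.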
My rig: RTX 2060 with 6 GB VRAM and 64 GB RAM. It works, but it takes about an hour to load the model. This behavior started after updating ComfyUI; before, it took 20 to 30 minutes to load. I'm using Kijai's fp8 model.
This issue is being marked stale because it has not had any activity for 30 days. Reply below within 7 days if your issue still isn't solved, and it will be left open. Otherwise, the issue will be closed automatically.