
CLIPTextEncodeFlux very slow

Open mno1337 opened this issue 1 year ago • 25 comments

Your question

CLIPTextEncodeFlux is taking around 80-90 seconds; this started a few days ago. Before that it didn't take so long. No VRAM issues or anything, latest version of Comfy, etc...

Logs

No response

Other

No response

mno1337 avatar Sep 02 '24 10:09 mno1337

OK, it got fixed by updating PyTorch to 2.5.0, but then the sampler got twice as slow instead... from 1.1 s/it to 2.2 s/it...

mno1337 avatar Sep 02 '24 12:09 mno1337

Same issue here, CLIPTextEncodeFlux has been taking painfully long in the last few days. I didn't update PyTorch to 2.5.0...

Zdeto avatar Sep 06 '24 16:09 Zdeto

What is your workflow and what are your hardware specs?

ltdrdata avatar Sep 07 '24 02:09 ltdrdata

4ltData.json comfyui.log

All right, I've attached the workflow and the ComfyUI log. The first run of the workflow took 444s, the next run only ~250s, because the CLIP text was already encoded. This workflow has a FaceDetailer in it, but the prompt I ran generated an image without discernible faces, so that step was basically skipped. But I used to run this workflow (including the FaceDetailer steps) in under 300s!

My hardware specs: i7-10700K, 32 GB RAM, RTX 3060 with 12 GB VRAM, Windows 11 Pro; Comfy and the models are all on SSDs. Thank you! :)

Zdeto avatar Sep 07 '24 09:09 Zdeto

I got it fixed. I updated torch to 2.5.0, then downgraded again to 2.4.1, and now my text encode takes about 6 seconds.

mno1337 avatar Sep 07 '24 10:09 mno1337

I got it fixed. I updated torch to 2.5.0, then downgraded again to 2.4.1, and now my text encode takes about 6 seconds.

OK, how exactly did you do that? Because I get an error when I try to update torch to 2.5.0: "ERROR: No matching distribution found for torch==2.5.0"

Zdeto avatar Sep 07 '24 10:09 Zdeto

https://pytorch.org/ Choose Preview (Nightly)

mno1337 avatar Sep 07 '24 11:09 mno1337

https://pytorch.org/ Choose Preview (Nightly)

Please bear with me, I must be doing something wrong... So, I ran this command and just got a bunch of "Requirement already satisfied" messages:

e:\ComfyUI_windows_portable>python_embeded\python.exe -s -m pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu121
Looking in indexes: https://download.pytorch.org/whl/nightly/cu121
Requirement already satisfied: torch in e:\comfyui_windows_portable\python_embeded\lib\site-packages (2.4.0+cu121)
Requirement already satisfied: torchvision in e:\comfyui_windows_portable\python_embeded\lib\site-packages (0.19.0+cu121)
Requirement already satisfied: torchaudio in e:\comfyui_windows_portable\python_embeded\lib\site-packages (2.4.0+cu121)
Requirement already satisfied: filelock in e:\comfyui_windows_portable\python_embeded\lib\site-packages (from torch) (3.15.4)
Requirement already satisfied: typing-extensions>=4.8.0 in e:\comfyui_windows_portable\python_embeded\lib\site-packages (from torch) (4.12.2)
Requirement already satisfied: sympy in e:\comfyui_windows_portable\python_embeded\lib\site-packages (from torch) (1.12.1)
Requirement already satisfied: networkx in e:\comfyui_windows_portable\python_embeded\lib\site-packages (from torch) (3.3)
Requirement already satisfied: jinja2 in e:\comfyui_windows_portable\python_embeded\lib\site-packages (from torch) (3.1.4)
Requirement already satisfied: fsspec in e:\comfyui_windows_portable\python_embeded\lib\site-packages (from torch) (2024.5.0)
Requirement already satisfied: numpy in e:\comfyui_windows_portable\python_embeded\lib\site-packages (from torchvision) (1.26.4)
Requirement already satisfied: pillow!=8.3.*,>=5.3.0 in e:\comfyui_windows_portable\python_embeded\lib\site-packages (from torchvision) (10.4.0)
Requirement already satisfied: MarkupSafe>=2.0 in e:\comfyui_windows_portable\python_embeded\lib\site-packages (from jinja2->torch) (2.1.5)
Requirement already satisfied: mpmath<1.4.0,>=1.1.0 in e:\comfyui_windows_portable\python_embeded\lib\site-packages (from sympy->torch) (1.3.0)

Zdeto avatar Sep 07 '24 14:09 Zdeto

All right, I had to uninstall first... stupid me :))) So, I uninstalled:
python_embeded\python.exe -s -m pip uninstall torch torchvision torchaudio
and then ran the install again:
python_embeded\python.exe -s -m pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu121

After that, I fired up Comfy... got some errors, of course, but it started. Ran the workflow and bam, ~3s for CLIPTextEncodeFlux. But, as happened to you, it doubled my sampler's s/it. So I uninstalled again and reinstalled torch 2.4.1. Ran Comfy again, reloaded the workflow, and now CLIPTextEncodeFlux takes ~20s and the sampler is back to its previous inference time. Still a bit long, but at least I'm back under 300s for this workflow :)

Zdeto avatar Sep 07 '24 14:09 Zdeto
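
For reference, a minimal sketch of the downgrade-and-pin step described above, assuming the standard ComfyUI portable layout and the CUDA 12.1 wheel index (the torchvision/torchaudio version pins are my assumption for the builds matching torch 2.4.1). Run from the ComfyUI_windows_portable folder with ComfyUI shut down:

rem remove whichever torch build is currently installed
python_embeded\python.exe -s -m pip uninstall torch torchvision torchaudio
rem reinstall the stable 2.4.1 build from the cu121 index
python_embeded\python.exe -s -m pip install torch==2.4.1 torchvision==0.19.1 torchaudio==2.4.1 --index-url https://download.pytorch.org/whl/cu121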

Hehe, yeah, same here, the sampler got a lot slower with 2.5.0!

mno1337 avatar Sep 07 '24 14:09 mno1337

I updated Comfy and now it's damn slow again... 80-90s

mno1337 avatar Sep 11 '24 08:09 mno1337
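
As a quick sanity check after any Comfy update, one thing worth confirming is which torch build the portable Python is actually running (a small sketch, assuming the standard portable layout):

rem print the installed torch version inside the portable environment
python_embeded\python.exe -s -m pip show torch

If the update pulled in a different torch build, that would explain the encode times changing again.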

Yeah, made the same mistake :( (screenshot attached)

Zdeto avatar Sep 13 '24 14:09 Zdeto

"#112 [CLIPTextEncodeFlux]: 143.94s".... Getting sick of it haha

mno1337 avatar Sep 13 '24 16:09 mno1337

Eventually I solved it as before, but it's annoying...

Zdeto avatar Sep 13 '24 18:09 Zdeto

Yeah, but it will come back ;)

mno1337 avatar Sep 13 '24 18:09 mno1337

I can't believe we're the only lucky ones to encounter this. Nobody else has this issue?! :))

Zdeto avatar Sep 14 '24 08:09 Zdeto

Yeah, I know, it's unbelievable... you'd expect to be reading about it everywhere! What hardware do you have?

mno1337 avatar Sep 14 '24 08:09 mno1337

Look above, at the first part of this thread. I listed my hardware and attached the comfyui.log (there are hardware specs in it too).

Zdeto avatar Sep 14 '24 13:09 Zdeto

Look above, at the first part of this thread. I listed my hardware and attached the comfyui.log (there are hardware specs in it too).

Ah, I see... not the same as mine, so it has nothing to do with that. Nobody I've asked has ever had a problem with it, so I don't know what's going on. Annoying as hell anyway :D

mno1337 avatar Sep 14 '24 13:09 mno1337

I have the problem too. It takes the same amount of time as the sampler does.

RaySteve312 avatar Sep 21 '24 17:09 RaySteve312

For those experiencing this issue, please check the following:

  1. Does the same phenomenon occur when using a normal CLIP model instead of a GGUF one?
  2. When checking memory usage in Task Manager, can you confirm that swapping is not occurring?

ltdrdata avatar Sep 22 '24 00:09 ltdrdata

Got the same issue, and I found the GPU is not being used while CLIPTextEncodeFlux is running.

swim2sun avatar Oct 11 '24 19:10 swim2sun
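
One simple way to confirm that the GPU really is sitting idle during the encode (a general sketch, assuming an NVIDIA card with the standard driver tools installed) is to watch utilization from a second terminal while the prompt runs:

rem refresh GPU utilization and memory usage every second
nvidia-smi -l 1

If GPU utilization stays near 0% while CLIPTextEncodeFlux is running, the text encoder is most likely being executed on the CPU.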

I found my problem. It happened when I restarted Comfy from the Manager; it then doesn't start with my args.

mno1337 avatar Oct 11 '24 19:10 mno1337

Got the same issue, and I found the GPU is not being used while CLIPTextEncodeFlux is running.

I fixed it by following these steps. I have no idea which one did the trick, so please try the 3rd step first:

  1. run update_comfyui_and_python_dependencies.bat
  2. install pytorch 2.4.0 manually
  3. remove the --lowvram in run_nvidia_gpu.bat

swim2sun avatar Oct 12 '24 02:10 swim2sun
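
A rough sketch of steps 2 and 3 for the portable build (the exact launch line in run_nvidia_gpu.bat varies between releases, so treat the one below as an illustrative assumption):

rem step 2: pin torch 2.4.0 and the matching torchvision/torchaudio from the cu121 index
python_embeded\python.exe -s -m pip install torch==2.4.0 torchvision==0.19.0 torchaudio==2.4.0 --index-url https://download.pytorch.org/whl/cu121

rem step 3: in run_nvidia_gpu.bat, drop --lowvram from the launch line, e.g.
.\python_embeded\python.exe -s ComfyUI\main.py --windows-standalone-build

Without --lowvram, ComfyUI manages model placement automatically, which may avoid the slow offloaded text-encode path.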

I found my problem. It happened when I restarted Comfy from the Manager; it then doesn't start with my args.

Oh... I need to check that. Edit: As I tested it, when rebooting through the Manager, all args are passed correctly, without any omissions.

ltdrdata avatar Oct 13 '24 04:10 ltdrdata

This issue is being marked stale because it has not had any activity for 30 days. Reply below within 7 days if your issue still isn't solved, and it will be left open. Otherwise, the issue will be closed automatically.

github-actions[bot] avatar Feb 11 '25 11:02 github-actions[bot]