
"new" warnings on a CLEAN install (portable)

stephantual opened this issue 1 year ago

ComfyUI Revision: 1965 [f44225fd] | Released on '2024-02-09'

Just got a new Win 11 box, so I installed ComfyUI on a completely unadulterated machine. There are NO 3rd-party nodes installed yet.

I get the following warnings when queuing anything:

Number 1

model_type EPS
adm 0
Using pytorch attention in VAE
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
Using pytorch attention in VAE
missing {'cond_stage_model.clip_l.logit_scale', 'cond_stage_model.clip_l.text_projection'}
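The "missing {...}" line is a state-dict load report: the checkpoint lacks keys the CLIP text encoder expects, and the loader prints the difference (these particular keys can typically be filled with defaults). A minimal sketch of how such a report is computed; the helper name and the third key are illustrative, not ComfyUI's actual loading code:

```python
# Hypothetical sketch of a missing-keys report, not ComfyUI's real loader.
def report_missing_keys(expected_keys, checkpoint_keys):
    """Return the keys the model expects that the checkpoint does not provide."""
    return set(expected_keys) - set(checkpoint_keys)

# The first two key names are taken from the warning above; the third is made up.
expected = {
    "cond_stage_model.clip_l.logit_scale",
    "cond_stage_model.clip_l.text_projection",
    "cond_stage_model.clip_l.transformer.text_model.final_layer_norm.weight",
}
in_checkpoint = {
    "cond_stage_model.clip_l.transformer.text_model.final_layer_norm.weight",
}
print("missing", report_missing_keys(expected, in_checkpoint))
```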

Number 2

C:\AI\ComfyUI_windows_portable\ComfyUI\comfy\ldm\modules\attention.py:325: UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at ..\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:263.)
  out = torch.nn.functional.scaled_dot_product_attention(q, k, v, attn_mask=mask, dropout_p=0.0, is_causal=False)
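This one is a plain Python UserWarning (the stray "1" in "1Torch" is a typo in PyTorch's own warning string): the installed wheel was built without the flash-attention kernel, so scaled_dot_product_attention falls back to a different backend and generation still works. If the repetition bothers you, the standard warnings module can filter it; a sketch that only hides the message and does not change which kernel runs (torch is not required to demonstrate the filter):

```python
import warnings

# Silence just this message; the regex targets the text shown above.
# This hides the warning, it does not enable flash attention.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    warnings.filterwarnings(
        "ignore",
        message=r".*Torch was not compiled with flash attention.*",
        category=UserWarning,
    )
    # Stand-in warnings: the first mimics PyTorch's message, the second does not.
    warnings.warn("1Torch was not compiled with flash attention.", UserWarning)
    warnings.warn("an unrelated warning", UserWarning)

print(len(caught))  # → 1: only the unrelated warning is recorded
```

To use it for real you would register the same `warnings.filterwarnings(...)` call once before models load; where exactly to put it in your install is up to you.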

I've been reading about these on 3rd party forums and couldn't figure out their origins. Can they be ignored safely?

Thank you.

stephantual avatar Feb 11 '24 20:02 stephantual

This!!! Started using ComfyUI yesterday and this is bugging me out!

carlosalberto2000 avatar Feb 17 '24 16:02 carlosalberto2000

same here.

wangfeng35 avatar Mar 11 '24 14:03 wangfeng35

same

adrianoamalfi avatar Mar 11 '24 18:03 adrianoamalfi

It does not seem to affect the output, but a little insight into this would be amazing...

carlosalberto2000 avatar Mar 13 '24 15:03 carlosalberto2000

It's fixed now.

stephantual avatar Mar 20 '24 15:03 stephantual

> It's fixed now.

I can tell it's not. Installed a fresh windows_portable yesterday and I'm getting all the same errors.

jonecky816739 avatar Apr 02 '24 13:04 jonecky816739

Mmm, sorry to hear that, but on ComfyUI Revision: 2096 [6c6a3925] | Released on '2024-04-02', with 95 node sets installed, I don't see it anymore. In any case, it's harmless.

stephantual avatar Apr 04 '24 22:04 stephantual

> Mmm, sorry to hear that, but on ComfyUI Revision: 2096 [6c6a392] | Released on '2024-04-02', with 95 node sets installed, I don't see it anymore. In any case, it's harmless.

I did a fresh installation and encountered the same issue. I've already spent three days trying to resolve it. So far, none of the methods I've tried have worked, and I also feel like the speed when using the SDXL model is not as fast as before (this might be my perception). To address this warning, I switched the system's CUDA version to 12.1 and tried different versions of Torch, but the warning persists. Does this have any negative impact on my use of ComfyUI?

wibur0620 avatar Apr 14 '24 20:04 wibur0620

> Mmm, sorry to hear that, but on ComfyUI Revision: 2096 [6c6a392] | Released on '2024-04-02', with 95 node sets installed, I don't see it anymore. In any case, it's harmless.

> I did a fresh installation and encountered the same issue. I've already spent three days trying to resolve it. So far, none of the methods I've tried have worked, and I also feel like the speed when using the SDXL model is not as fast as before (this might be my perception). To address this warning, I switched the system's CUDA version to 12.1 and tried different versions of Torch, but the warning persists. Does this have any negative impact on my use of ComfyUI?

Same here. I am new to ComfyUI, and I've spent the last few days trying to solve this with no luck, no matter which version of CUDA or Torch I install. Many nodes are not working, such as showing meshes in TripoSR.

powerwarlord avatar Apr 14 '24 21:04 powerwarlord

> Mmm, sorry to hear that, but on ComfyUI Revision: 2096 [6c6a392] | Released on '2024-04-02', with 95 node sets installed, I don't see it anymore. In any case, it's harmless.

> I did a fresh installation and encountered the same issue. I've already spent three days trying to resolve it. So far, none of the methods I've tried have worked, and I also feel like the speed when using the SDXL model is not as fast as before (this might be my perception). To address this warning, I switched the system's CUDA version to 12.1 and tried different versions of Torch, but the warning persists. Does this have any negative impact on my use of ComfyUI?

> Same here. I am new to ComfyUI, and I've spent the last few days trying to solve this with no luck, no matter which version of CUDA or Torch I install. Many nodes are not working, such as showing meshes in TripoSR.

After reinstalling and installing 50 plugins, I noticed that it takes a long time to load the model when using the ComfyUI_VLM_nodes node. Then I saw this warning, so I reinstalled ComfyUI without installing any nodes and found that the warning still exists. To resolve this warning, I even reinstalled the entire Windows system, but the warning persists.

wibur0620 avatar Apr 14 '24 22:04 wibur0620

> After reinstalling and installing 50 plugins, I noticed that it takes a long time to load the model when using the ComfyUI_VLM_nodes node. Then I saw this warning, so I reinstalled ComfyUI without installing any nodes and found that the warning still exists. To resolve this warning, I even reinstalled the entire Windows system, but the warning persists.

Friend, you're overthinking this. It's harmless. It's literally a warning message. It has no impact on performance whatsoever. Even if it DID have an impact, it would be so small it would be dwarfed by the ComfyUI general recursion issues and whatnot, which happen even on a totally clean install (hence the rgthree fix).

I have >150 node sets installed, updated every 3 days or so, and I've learned to accept the warnings. If you ABSOLUTELY want the best perf (client side) possible, just install 3 or 4 versions of Comfy with 3 or 4 different node sets. It's portable, so it's like a venv but better :)
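The several-portable-copies idea above amounts to unpacking the portable build into sibling folders, each with its own custom_nodes directory, so one install's node sets can't break another's. A sketch of the layout; the folder names are purely illustrative:

```python
from pathlib import Path

# Illustrative only: two independent portable installs side by side,
# each carrying its own custom_nodes folder.
for install in ("ComfyUI_general", "ComfyUI_video"):
    (Path(install) / "custom_nodes").mkdir(parents=True, exist_ok=True)

# Each copy is self-contained, like a venv per workflow.
print(sorted(p.name for p in Path(".").glob("ComfyUI_*")))
```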

If you're seeing crashes / major perf issues with a clean build, given I'm not part of the CA team, I would recommend opening a new issue, because no one but you and me is reading this thread at this point :)

stephantual avatar Apr 15 '24 11:04 stephantual