ComfyUI-LTXVideo

Flash Attention failed, using default SDPA

Ngardos opened this issue 10 months ago · 2 comments

I have no issues with Flash Attention on my ComfyUI install; it loads and works fine with other workflows. But while frames are being generated I get the error: Flash Attention failed, using default SDPA. Does anyone have a solution for this? Thank you.

```
Checkpoint files will always be loaded safely.
Total VRAM 24576 MB, total RAM 32457 MB
pytorch version: 2.6.0+cu126
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 3090 : cudaMallocAsync
Using Flash Attention
Python version: 3.12.7 (tags/v3.12.7:0b05ead, Oct  1 2024, 03:06:41) [MSC v.1941 64 bit (AMD64)]
ComfyUI version: 0.3.29
ComfyUI frontend version: 1.17.11
```

— Ngardos, Apr 24 '25 09:04

I'm having the same issue on my system.

```
Checkpoint files will always be loaded safely.
Total VRAM 24560 MB, total RAM 31867 MB
pytorch version: 2.5.1+rocm6.2
AMD arch: gfx1100
Set vram state to: NORMAL_VRAM
Device: cuda:0 AMD Radeon RX 7900 XTX : native
Using Flash Attention
Python version: 3.9.21 (main, Jan 7 2025, 18:39:12) [GCC 14.2.1 20240910]
ComfyUI version: 0.3.30
ComfyUI frontend version: 1.17.11
```

— Noobkrusher3000, Apr 29 '25 15:04

Yeah, I'm having this problem as well.

Got an Nvidia 4090 with 24 GB of VRAM.

The error goes away when I stop using it, but I'm having other issues.

Decidedly unfun.

— Brie-Wensleydale, May 10 '25 10:05
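For context on the message itself: ComfyUI attempts the flash-attention kernel per attention call and, if that call raises, falls back to PyTorch's default scaled-dot-product attention (SDPA), printing this warning. Flash attention typically only accepts inputs that satisfy the kernel's constraints (roughly: a CUDA tensor, half-precision dtype, and a bounded head dimension), so a model whose attention shapes or dtypes fall outside those limits will trigger the fallback even when flash attention works elsewhere. The helper below is a hypothetical sketch of such eligibility checks, not ComfyUI's or PyTorch's actual dispatch logic, which varies by version:

```python
# Hypothetical sketch of the usual flash-attention eligibility checks.
# The real dispatch logic lives inside PyTorch / the flash-attn package
# and differs across versions; this only illustrates why a single model
# can fall back to SDPA while others run flash attention fine.

def flash_attention_usable(dtype: str, head_dim: int, on_cuda: bool) -> bool:
    """Return True if the attention inputs look eligible for a flash kernel."""
    if not on_cuda:
        # Flash attention is a CUDA (or ROCm) kernel; CPU tensors fall back.
        return False
    if dtype not in ("float16", "bfloat16"):
        # The kernel requires half precision; fp32 inputs fall back to SDPA.
        return False
    if head_dim > 256:
        # FlashAttention-2 supports head dimensions only up to 256.
        return False
    return True

# Typical eligible case vs. an fp32 call that would trigger the fallback:
print(flash_attention_usable("float16", 64, True))
print(flash_attention_usable("float32", 64, True))
```

The practical takeaway matches the reports above: the warning is per-workload, not per-install, so it appears only for models whose attention calls the kernel rejects, and generation still proceeds on the slower SDPA path.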