stable-diffusion-webui-amdgpu-forge

Errors when using FLUX models

Open Stinkmorchel1 opened this issue 9 months ago • 5 comments

When I try to generate an image from a FLUX model, I get an error message and the UI hangs. I used ZLUDA on a Windows 11 PC.

[Unload] Trying to free all memory for cuda:0 with 0 models keep loaded ... Done.
StateDict Keys: {'transformer': 1722, 'vae': 244, 'text_encoder': 198, 'text_encoder_2': 220, 'ignore': 0}
Using Detected T5 Data Type: torch.float8_e4m3fn
Using Detected UNet Type: nf4
Using pre-quant state dict!
Working with z of shape (1, 16, 32, 32) = 16384 dimensions.
K-Model Created: {'storage_dtype': 'nf4', 'computation_dtype': torch.bfloat16}
Model loaded in 0.6s (unload existing model: 0.2s, forge model load: 0.4s).
Skipping unconditional conditioning when CFG = 1. Negative Prompts are ignored.
[Unload] Trying to free 7725.00 MB for cuda:0 with 0 models keep loaded ... Done.
[Memory Management] Target: JointTextEncoder, Free GPU: 13814.45 MB, Model Require: 5154.62 MB, Previously Loaded: 0.00 MB, Inference Require: 1024.00 MB, Remaining: 7635.84 MB, All loaded to GPU.
Moving model(s) has taken 2.90 seconds
Distilled CFG Scale: 3.5
[Unload] Trying to free 9411.13 MB for cuda:0 with 0 models keep loaded ... Current free memory is 8545.02 MB ... Unload model JointTextEncoder Done.
[Memory Management] Target: KModel, Free GPU: 13773.38 MB, Model Require: 6246.84 MB, Previously Loaded: 0.00 MB, Inference Require: 1024.00 MB, Remaining: 6502.53 MB, All loaded to GPU.
Moving model(s) has taken 6.07 seconds
0%| | 0/20 [00:00<?, ?it/s]Error named symbol not found at line 90 in file D:\a\bitsandbytes\bitsandbytes\csrc\ops.cu

Is there a solution for this? What have I done wrong?

Stinkmorchel1 avatar Mar 08 '25 15:03 Stinkmorchel1

#16 bitsandbytes is not available for now.

lshqqytiger avatar Mar 08 '25 15:03 lshqqytiger
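For context, the "named symbol not found" error in ops.cu means bitsandbytes imports fine but its 4-bit CUDA kernels cannot be launched, which is exactly what happens under ZLUDA. A minimal, hypothetical probe (not part of the WebUI; function name is my own) that checks whether the NF4 kernels actually work before loading an NF4-quantized FLUX checkpoint might look like this:

```python
def bnb_nf4_available() -> bool:
    """Return True only if a tiny NF4 quantization round-trip succeeds on the GPU.

    Hypothetical helper: probes the bitsandbytes 4-bit kernels at runtime,
    since under ZLUDA the library can import cleanly but still fail at
    kernel launch with "named symbol not found".
    """
    try:
        import torch
        import bitsandbytes.functional as bnb_f
    except Exception:
        # bitsandbytes/torch not installed, or the native binary failed to load
        return False
    if not torch.cuda.is_available():
        return False
    try:
        x = torch.randn(64, 64, device="cuda", dtype=torch.float16)
        quantized, state = bnb_f.quantize_4bit(x, quant_type="nf4")
        _ = bnb_f.dequantize_4bit(quantized, state)
        return True
    except Exception:
        # Kernel launch failed, e.g. ZLUDA without the required support
        return False


if __name__ == "__main__":
    print("NF4 kernels usable:", bnb_nf4_available())
```

If this returns False, a non-quantized (fp16/bf16 or fp8) FLUX checkpoint that does not depend on bitsandbytes is the practical workaround.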

Ok, Thank You

Stinkmorchel1 avatar Mar 08 '25 15:03 Stinkmorchel1

#16 bitsandbytes is not available for now.

When will it be available?

MaelHan avatar Mar 16 '25 05:03 MaelHan

It will be available when WMMA is implemented in ZLUDA. In June or July? I don't know.

lshqqytiger avatar Mar 16 '25 06:03 lshqqytiger

If you need FLUX model support now, you can switch to ZLUDA with ComfyUI; my 7900 XTX runs FLUX models smoothly under ComfyUI. Before that I used FLUX models in Stable Diffusion WebUI, which also often failed with errors and generated slowly.

[Image attachment]

bigwinboy avatar Mar 24 '25 07:03 bigwinboy