"TypeError: tuple indices must be integers or slices, not float" when attempting to run flux models

Open · aaronjolson opened this issue 5 months ago · 2 comments

I wanted to try running a Flux model in Forge. I deleted and reinstalled my venv via webui.bat, and confirmed that I can still run SDXL models. I am running with these settings:

[screenshot of the selected UI settings]

Attempting to generate an image results in the following error:

Model selected: {'checkpoint_info': {'filename': 'E:\\stable-diffusion-webui-forge\\models\\Stable-diffusion\\nepotismFUXDEVSCHNELL_nepotismFUXV3Dit.safetensors', 'hash': '76323669'}, 'additional_modules': ['E:\\stable-diffusion-webui-forge\\models\\text_encoder\\t5xxl_fp16.safetensors'], 'unet_storage_dtype': None}
Using online LoRAs in FP16: False
Model selected: {'checkpoint_info': {'filename': 'E:\\stable-diffusion-webui-forge\\models\\Stable-diffusion\\nepotismFUXDEVSCHNELL_nepotismFUXV3Dit.safetensors', 'hash': '76323669'}, 'additional_modules': ['E:\\stable-diffusion-webui-forge\\models\\text_encoder\\t5xxl_fp16.safetensors', 'E:\\stable-diffusion-webui-forge\\models\\text_encoder\\clip_l.safetensors'], 'unet_storage_dtype': None}
Using online LoRAs in FP16: False
Model selected: {'checkpoint_info': {'filename': 'E:\\stable-diffusion-webui-forge\\models\\Stable-diffusion\\nepotismFUXDEVSCHNELL_nepotismFUXV3Dit.safetensors', 'hash': '76323669'}, 'additional_modules': ['E:\\stable-diffusion-webui-forge\\models\\text_encoder\\t5xxl_fp16.safetensors', 'E:\\stable-diffusion-webui-forge\\models\\text_encoder\\clip_l.safetensors', 'E:\\stable-diffusion-webui-forge\\models\\VAE\\ae.safetensors'], 'unet_storage_dtype': None}
Using online LoRAs in FP16: False
Loading Model: {'checkpoint_info': {'filename': 'E:\\stable-diffusion-webui-forge\\models\\Stable-diffusion\\nepotismFUXDEVSCHNELL_nepotismFUXV3Dit.safetensors', 'hash': '76323669'}, 'additional_modules': ['E:\\stable-diffusion-webui-forge\\models\\text_encoder\\t5xxl_fp16.safetensors', 'E:\\stable-diffusion-webui-forge\\models\\text_encoder\\clip_l.safetensors', 'E:\\stable-diffusion-webui-forge\\models\\VAE\\ae.safetensors'], 'unet_storage_dtype': None}
[Unload] Trying to free all memory for cuda:0 with 0 models keep loaded ... Current free memory is 13247.48 MB ... Unload model JointTextEncoder Done.
StateDict Keys: {'transformer': 780, 'vae': 244, 'text_encoder': 196, 'text_encoder_2': 220, 'ignore': 0}
Using Default T5 Data Type: torch.float16
Using Detected UNet Type: torch.float8_e4m3fn
Working with z of shape (1, 16, 32, 32) = 16384 dimensions.
K-Model Created: {'storage_dtype': torch.float8_e4m3fn, 'computation_dtype': torch.bfloat16}
Calculating sha256 for E:\stable-diffusion-webui-forge\models\Stable-diffusion\nepotismFUXDEVSCHNELL_nepotismFUXV3Dit.safetensors: a41eaa1b3d3d6cf8f9dec5835aa0537c684b93c9ee660cb9cd06b652fb70eb3b
Model loaded in 6.2s (unload existing model: 5.0s, forge model load: 1.2s).
Skipping unconditional conditioning when CFG = 1. Negative Prompts are ignored.
[Unload] Trying to free 21095.34 MB for cuda:0 with 0 models keep loaded ... Done.
[Memory Management] Target: JointTextEncoder, Free GPU: 22898.09 MB, Model Require: 9569.49 MB, Previously Loaded: 0.00 MB, Inference Require: 8655.00 MB, Remaining: 4673.60 MB, All loaded to GPU.
Moving model(s) has taken 7.27 seconds
Traceback (most recent call last):
  File "E:\stable-diffusion-webui-forge\modules_forge\main_thread.py", line 30, in work
    self.result = self.func(*self.args, **self.kwargs)
  File "E:\stable-diffusion-webui-forge\modules\txt2img.py", line 121, in txt2img_function
    processed = processing.process_images(p)
  File "E:\stable-diffusion-webui-forge\modules\processing.py", line 816, in process_images
    res = process_images_inner(p)
  File "E:\stable-diffusion-webui-forge\modules\processing.py", line 929, in process_images_inner
    p.setup_conds()
  File "E:\stable-diffusion-webui-forge\modules\processing.py", line 1519, in setup_conds
    super().setup_conds()
  File "E:\stable-diffusion-webui-forge\modules\processing.py", line 501, in setup_conds
    self.c = self.get_conds_with_caching(prompt_parser.get_multicond_learned_conditioning, prompts, total_steps, [self.cached_c], self.extra_network_data)
  File "E:\stable-diffusion-webui-forge\modules\processing.py", line 470, in get_conds_with_caching
    cache[1] = function(shared.sd_model, required_prompts, steps, hires_steps, shared.opts.use_old_scheduling)
  File "E:\stable-diffusion-webui-forge\modules\prompt_parser.py", line 262, in get_multicond_learned_conditioning
    learned_conditioning = get_learned_conditioning(model, prompt_flat_list, steps, hires_steps, use_old_scheduling)
  File "E:\stable-diffusion-webui-forge\modules\prompt_parser.py", line 189, in get_learned_conditioning
    conds = model.get_learned_conditioning(texts)
  File "E:\stable-diffusion-webui-forge\venv\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "E:\stable-diffusion-webui-forge\backend\diffusion_engine\flux.py", line 78, in get_learned_conditioning
    cond_l, pooled_l = self.text_processing_engine_l(prompt)
  File "E:\stable-diffusion-webui-forge\backend\text_processing\classic_engine.py", line 268, in __call__
    z = self.process_tokens(tokens, multipliers)
  File "E:\stable-diffusion-webui-forge\backend\text_processing\classic_engine.py", line 301, in process_tokens
    z = self.encode_with_transformers(tokens)
  File "E:\stable-diffusion-webui-forge\backend\text_processing\classic_engine.py", line 134, in encode_with_transformers
    z = outputs.hidden_states[layer_id]
TypeError: tuple indices must be integers or slices, not float
tuple indices must be integers or slices, not float
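
For what it's worth, the failing line indexes outputs.hidden_states (a tuple) with layer_id, and the error message suggests layer_id is arriving as a float rather than an int, possibly coming from a setting like CLIP skip / stop-at-last-layers. A minimal sketch of the failure mode and the kind of cast that avoids it (this is my assumption about the cause, not a verified fix for classic_engine.py):

# Minimal reproduction of the failure mode (assumption: layer_id reaches
# encode_with_transformers as a float, e.g. from a slider/option value).
hidden_states = ("h0", "h1", "h2", "h3")  # stand-in for outputs.hidden_states

layer_id = -2.0  # a float index triggers the same TypeError
try:
    z = hidden_states[layer_id]
except TypeError as e:
    print(e)  # tuple indices must be integers or slices, not float

# Casting to int before indexing avoids the error; whether this is the
# right place to patch it in Forge is a guess on my part.
z = hidden_states[int(layer_id)]
print(z)  # h2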

Any suggestions for things to try to work around this issue?

aaronjolson · Sep 09 '24 16:09