
"lora key not loaded": when I use my Z-Image lora trained on Musubi Tuner

peepeepeepoopoopoo opened this issue 6 days ago · 2 comments

I trained the LoRA with Musubi Tuner. When I test it in ComfyUI, the log shows "lora key not loaded" messages; other LoRAs from Civitai don't have that problem.
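A quick way to see whether this is a key-naming mismatch is to dump the tensor key names of both files and compare them side by side. This is a minimal sketch; the file paths are placeholders:

```python
# Minimal sketch: print the first few tensor key names from two LoRA files
# so the naming schemes can be compared. File paths are placeholders.
from safetensors import safe_open

def dump_keys(path, limit=8):
    with safe_open(path, framework="pt", device="cpu") as f:
        keys = sorted(f.keys())
    print(f"{path}: {len(keys)} keys")
    for k in keys[:limit]:
        print("  ", k)

dump_keys("egirl_feat_tali.safetensors")            # the Musubi Tuner output
dump_keys("some_working_civitai_lora.safetensors")  # a LoRA that loads without warnings
```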

launch:

```bash
cd /workspace/musubi-tuner
source venv/bin/activate

python src/musubi_tuner/zimage_cache_latents.py \
  --dataset_config /workspace/musubi-tuner/dataset/dataset.toml \
  --vae /workspace/musubi-tuner/models/vae/chroma_flux_z_image/ae.safetensors

python src/musubi_tuner/zimage_cache_text_encoder_outputs.py \
  --dataset_config /workspace/musubi-tuner/dataset/dataset.toml \
  --text_encoder /workspace/musubi-tuner/models/clip/z_image/qwen_3_4b.safetensors

mkdir -p /workspace/musubi-tuner/output && accelerate launch \
  --mixed_precision bf16 --dynamo_backend inductor \
  --num_processes 1 --num_machines 1 --num_cpu_threads_per_process 8 \
  src/musubi_tuner/zimage_train_network.py \
  --dit /workspace/ComfyUI/models/diffusion_models/z_image/z_image_de_turbo_v1_bf16.safetensors \
  --vae /workspace/musubi-tuner/models/vae/chroma_flux_z_image/ae.safetensors \
  --text_encoder /workspace/musubi-tuner/models/text_encoders/z_image/qwen_3_4b.safetensors \
  --dataset_config /workspace/musubi-tuner/dataset/dataset.toml \
  --flash_attn --timestep_sampling shift --weighting_scheme none \
  --discrete_flow_shift 2.0 --optimizer_type adamw --learning_rate 3e-4 \
  --max_data_loader_n_workers 48 --persistent_data_loader_workers \
  --network_module networks.lora_zimage --network_dim 128 \
  --max_train_epochs 200 --save_every_n_epochs 25 --seed 42 \
  --output_dir /workspace/musubi-tuner/output --output_name egirl_feat_tali
```

I also tried the default settings from https://github.com/kohya-ss/musubi-tuner/pull/757#issuecomment-3621396779.

Both show the same problem.
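For reference, every failing key follows the sd-scripts naming scheme (`lora_unet_<flattened_module_path>` with `.alpha` / `.lora_down.weight` / `.lora_up.weight`), so one plausible workaround is renaming the keys to the dotted module paths the loader resolves against. The sketch below is hedged: the `diffusion_model.` prefix and the dotted layout are assumptions, not a confirmed spec, so compare the result against a LoRA that loads cleanly before relying on it.

```python
# Hypothetical rename sketch: map sd-scripts-style LoRA keys, e.g.
#   lora_unet_layers_0_attention_to_k.lora_down.weight
# onto dotted module paths, e.g.
#   diffusion_model.layers.0.attention.to_k.lora_down.weight
# The target naming is an ASSUMPTION -- verify against a working LoRA.
import re
from safetensors.torch import load_file, save_file

SRC = "egirl_feat_tali.safetensors"        # Musubi Tuner output (placeholder)
DST = "egirl_feat_tali_comfy.safetensors"  # converted copy (placeholder)

pattern = re.compile(r"^lora_unet_layers_(\d+)_attention_(to_q|to_k|to_v|to_out_0)\.(.+)$")

state = load_file(SRC)
renamed = {}
for key, tensor in state.items():
    m = pattern.match(key)
    if m:
        layer, module, suffix = m.groups()
        renamed[f"diffusion_model.layers.{layer}.attention.{module}.{suffix}"] = tensor
    else:
        renamed[key] = tensor  # leave anything unrecognized untouched

save_file(renamed, DST)
print(f"wrote {len(renamed)} tensors to {DST}")
```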

log:

```
source venv/bin/activate
python main.py --listen 0.0.0.0 --port 9999 --cuda-device 0 --use-sage-attention --preview-method none --auto-launch --cuda-malloc
Set cuda device to: 0
[ComfyUI-Manager] Using uv as Python module for pip operations.
Using Python 3.12.3 environment at: venv
[START] Security scan
[DONE] Security scan

ComfyUI-Manager: installing dependencies done.

** ComfyUI startup time: 2025-12-24 05:41:55.526
** Platform: Linux
** Python version: 3.12.3 (main, Aug 14 2025, 17:47:21) [GCC 13.3.0]
** Python executable: /workspace/ComfyUI/venv/bin/python
** ComfyUI Path: /workspace/ComfyUI
** ComfyUI Base Folder Path: /workspace/ComfyUI
** User directory: /workspace/ComfyUI/user
** ComfyUI-Manager config path: /workspace/ComfyUI/user/__manager/config.ini
** Log path: /workspace/ComfyUI/user/comfyui.log
Using Python 3.12.3 environment at: venv
Using Python 3.12.3 environment at: venv

Prestartup times for custom nodes:
   0.0 seconds: /workspace/ComfyUI/custom_nodes/rgthree-comfy
   3.4 seconds: /workspace/ComfyUI/custom_nodes/ComfyUI-Manager

Checkpoint files will always be loaded safely.
Total VRAM 32120 MB, total RAM 190812 MB
pytorch version: 2.9.1+cu128
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 5090 : cudaMallocAsync
Using async weight offloading with 2 streams
Enabled pinned memory 181271.0
working around nvidia conv3d memory bug.
Using sage attention
Python version: 3.12.3 (main, Aug 14 2025, 17:47:21) [GCC 13.3.0]
ComfyUI version: 0.6.0
ComfyUI frontend version: 1.34.9
[Prompt Server] web root: /workspace/ComfyUI/venv/lib/python3.12/site-packages/comfyui_frontend_package/static
Total VRAM 32120 MB, total RAM 190812 MB
pytorch version: 2.9.1+cu128
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 5090 : cudaMallocAsync
Using async weight offloading with 2 streams
Enabled pinned memory 181271.0
Using sage attention
(RES4LYF) Init
(RES4LYF) Importing beta samplers.
(RES4LYF) Importing legacy samplers.

Loading: ComfyUI-Manager (V3.39)
[ComfyUI-Manager] network_mode: public
[ComfyUI-Manager] ComfyUI per-queue preview override detected (PR #11261). Manager's preview method feature is disabled. Use ComfyUI's --preview-method CLI option or 'Settings > Execution > Live preview method'.

ComfyUI Revision: 4431 [e4c61d75] *DETACHED | Released on '2025-12-23'

[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
[ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json

Loading: ComfyUI-Impact-Subpack (V1.3.5)
[Impact Pack/Subpack] Using folder_paths to determine whitelist path: /workspace/ComfyUI/user/default/ComfyUI-Impact-Subpack/model-whitelist.txt
[Impact Pack/Subpack] Ensured whitelist directory exists: /workspace/ComfyUI/user/default/ComfyUI-Impact-Subpack
[Impact Pack/Subpack] Loaded 0 model(s) from whitelist: /workspace/ComfyUI/user/default/ComfyUI-Impact-Subpack/model-whitelist.txt
[Impact Subpack] ultralytics_bbox: /workspace/ComfyUI/models/ultralytics/bbox
[Impact Subpack] ultralytics_segm: /workspace/ComfyUI/models/ultralytics/segm

Loading: ComfyUI-Impact-Pack (V8.28)
FETCH ComfyRegistry Data: 5/115
[Impact Pack] Wildcard total size (0.00 MB) is within cache limit (50.00 MB). Using full cache mode.

Loading: ComfyUI-Inspire-Pack (V1.23)
[Impact Pack] Wildcards loading done.
⚡ SeedVR2 optimizations check: SageAttention ✅ | Flash Attention ✅ | Triton ✅
🔧 Conv3d workaround active: PyTorch 2.9.1, cuDNN 91002 (fixing VAE 3x memory bug)
📊 Initial CUDA memory: 30.76GB free / 31.37GB total

[rgthree-comfy] Loaded 48 magnificent nodes. 🎉
[rgthree-comfy] ComfyUI's new Node 2.0 rendering may be incompatible with some rgthree-comfy nodes and features, breaking some rendering as well as losing the ability to access a node's properties (a vital part of many nodes). It also appears to run MUCH more slowly spiking CPU usage and causing jankiness and unresponsiveness, especially with large workflows. Personally I am not planning to use the new Nodes 2.0 and, unfortunately, am not able to invest the time to investigate and overhaul rgthree-comfy where needed. If you have issues when Nodes 2.0 is enabled, I'd urge you to switch it off as well and join me in hoping ComfyUI is not planning to deprecate the existing, stable canvas rendering all together.

FETCH ComfyRegistry Data: 10/115

Import times for custom nodes:
   0.0 seconds: /workspace/ComfyUI/custom_nodes/websocket_image_save.py
   0.0 seconds: /workspace/ComfyUI/custom_nodes/ComfyUI-MagCache
   0.0 seconds: /workspace/ComfyUI/custom_nodes/comfyui-cache-dit
   0.0 seconds: /workspace/ComfyUI/custom_nodes/ComfyUI-Inpaint-CropAndStitch
   0.0 seconds: /workspace/ComfyUI/custom_nodes/LanPaint
   0.1 seconds: /workspace/ComfyUI/custom_nodes/ComfyUI-WanAnimatePreprocess
   0.1 seconds: /workspace/ComfyUI/custom_nodes/ComfyUI_essentials
   0.1 seconds: /workspace/ComfyUI/custom_nodes/comfyui-various
   0.1 seconds: /workspace/ComfyUI/custom_nodes/ComfyUI-KJNodes
   0.1 seconds: /workspace/ComfyUI/custom_nodes/ComfyUI-NAG
   0.1 seconds: /workspace/ComfyUI/custom_nodes/ComfyUI-Custom-Scripts
   0.1 seconds: /workspace/ComfyUI/custom_nodes/ComfyUI-Inspire-Pack
   0.2 seconds: /workspace/ComfyUI/custom_nodes/ComfyUI-segment-anything-2
   0.3 seconds: /workspace/ComfyUI/custom_nodes/rgthree-comfy
   0.3 seconds: /workspace/ComfyUI/custom_nodes/ComfyUI-VideoHelperSuite
   0.6 seconds: /workspace/ComfyUI/custom_nodes/ComfyUI-Manager
   0.7 seconds: /workspace/ComfyUI/custom_nodes/ComfyUI-Impact-Subpack
   1.2 seconds: /workspace/ComfyUI/custom_nodes/ComfyUI-Impact-Pack
   1.8 seconds: /workspace/ComfyUI/custom_nodes/comfyui-tensorops
   1.9 seconds: /workspace/ComfyUI/custom_nodes/ComfyUI-WanVideoWrapper
   2.8 seconds: /workspace/ComfyUI/custom_nodes/ComfyUI-SeedVR2_VideoUpscaler
   5.5 seconds: /workspace/ComfyUI/custom_nodes/RES4LYF
  10.4 seconds: /workspace/ComfyUI/custom_nodes/ComfyUI-TeaCache

Context impl SQLiteImpl.
Will assume non-transactional DDL.
No target revision found.
Starting server

To see the GUI go to: http://0.0.0.0:9999
FETCH ComfyRegistry Data: 15/115
[... FETCH ComfyRegistry Data progress lines 20/115 through 100/115 omitted ...]
[DEPRECATION WARNING] Detected import of deprecated legacy API: /scripts/ui.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
[DEPRECATION WARNING] Detected import of deprecated legacy API: /extensions/core/groupNode.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
[DEPRECATION WARNING] Detected import of deprecated legacy API: /extensions/core/clipspace.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
[DEPRECATION WARNING] Detected import of deprecated legacy API: /extensions/core/widgetInputs.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
FETCH ComfyRegistry Data: 105/115
[Inspire Pack] IPAdapterPlus is not installed.
[DEPRECATION WARNING] Detected import of deprecated legacy API: /scripts/ui/components/button.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
[DEPRECATION WARNING] Detected import of deprecated legacy API: /scripts/ui/components/buttonGroup.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
FETCH ComfyRegistry Data: 110/115
got prompt
Using pytorch attention in VAE
Using pytorch attention in VAE
FETCH ComfyRegistry Data: 115/115
FETCH ComfyRegistry Data [DONE]
[ComfyUI-Manager] default cache updated: https://api.comfy.org/nodes
VAE load device: cuda:0, offload device: cpu, dtype: torch.bfloat16
FETCH DATA from: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
model weight dtype torch.bfloat16, manual cast: None
model_type FLOW
[DONE]
[ComfyUI-Manager] All startup tasks have been completed.
unet missing: ['norm_final.weight']
lora key not loaded: lora_unet_layers_0_attention_to_k.alpha
lora key not loaded: lora_unet_layers_0_attention_to_k.lora_down.weight
lora key not loaded: lora_unet_layers_0_attention_to_k.lora_up.weight
lora key not loaded: lora_unet_layers_0_attention_to_out_0.alpha
lora key not loaded: lora_unet_layers_0_attention_to_out_0.lora_down.weight
lora key not loaded: lora_unet_layers_0_attention_to_out_0.lora_up.weight
lora key not loaded: lora_unet_layers_0_attention_to_q.alpha
lora key not loaded: lora_unet_layers_0_attention_to_q.lora_down.weight
lora key not loaded: lora_unet_layers_0_attention_to_q.lora_up.weight
lora key not loaded: lora_unet_layers_0_attention_to_v.alpha
lora key not loaded: lora_unet_layers_0_attention_to_v.lora_down.weight
lora key not loaded: lora_unet_layers_0_attention_to_v.lora_up.weight
[... same pattern (.alpha, .lora_down.weight, .lora_up.weight for attention to_q/to_k/to_v/to_out_0) repeats for every layer 1 through 29 ...]
lora key not loaded: lora_unet_layers_9_attention_to_v.lora_up.weight
CLIP/text encoder model load device: cuda:0, offload device: cpu, current: cpu, dtype: torch.float16
Requested to load ZImageTEModel_
loaded completely; 30304.86 MB usable, 7672.25 MB loaded, full load: True
Requested to load Lumina2
loaded completely; 22520.49 MB usable, 11739.55 MB loaded, full load: True
100%|██████████| 16/16 [00:07<00:00, 2.26it/s]
Requested to load AutoencodingEngine
loaded completely; 7768.60 MB usable, 159.87 MB loaded, full load: True
Prompt executed in 69.71 seconds
```
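Since only the attention `to_q`/`to_k`/`to_v`/`to_out_0` keys fail, one hypothesis is that the loader can't find matching modules in the base model (for example, if q/k/v are stored as a fused projection there). A small sketch to check what the checkpoint actually contains is below; the `layers.0` filter is an assumption about its naming and may need loosening if nothing matches:

```python
# Diagnostic sketch: list the attention parameter names of the first layer
# in the base checkpoint to see which module paths actually exist
# (e.g. separate to_q/to_k/to_v vs. a fused qkv). The "layers.0" filter
# is an assumption about the checkpoint's key naming.
from safetensors import safe_open

CKPT = "/workspace/ComfyUI/models/diffusion_models/z_image/z_image_de_turbo_v1_bf16.safetensors"

with safe_open(CKPT, framework="pt", device="cpu") as f:
    keys = [k for k in f.keys() if "layers.0." in k and "attention" in k]

for k in sorted(keys):
    print(k)
```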

workflow:

ComfyUI_00626_ (2).json
