Given groups=1, weight of size [4, 4, 1, 1], expected input[1, 16, 64, 64] to have 4 channels, but got 16 channels instead
### Your question
How do I fix this error? :')
### Logs
# ComfyUI Error Report
## Error Details
- **Node Type:** VAEDecode
- **Exception Type:** RuntimeError
- **Exception Message:** Given groups=1, weight of size [4, 4, 1, 1], expected input[1, 16, 64, 64] to have 4 channels, but got 16 channels instead
## Stack Trace
File "D:\ComfyUI_windows_portable\ComfyUI\execution.py", line 323, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\execution.py", line 198, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\execution.py", line 169, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\ComfyUI_windows_portable\ComfyUI\execution.py", line 158, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\nodes.py", line 284, in decode
images = vae.decode(samples["samples"])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\comfy\sd.py", line 347, in decode
out = self.process_output(self.first_stage_model.decode(samples).to(self.output_device).float())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\comfy\ldm\models\autoencoder.py", line 199, in decode
dec = self.post_quant_conv(z)
^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\comfy\ops.py", line 98, in forward
return super().forward(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\conv.py", line 554, in forward
return self._conv_forward(input, self.weight, self.bias)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\conv.py", line 549, in _conv_forward
return F.conv2d(
^^^^^^^^^
File "D:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\_tensor.py", line 1512, in __torch_function__
ret = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
```
## System Information
- ComfyUI Version: v0.2.7
- Arguments: ComfyUI\main.py --cpu --windows-standalone-build
- OS: nt
- Python Version: 3.12.7 (tags/v3.12.7:0b05ead, Oct 1 2024, 03:06:41) [MSC v.1941 64 bit (AMD64)]
- Embedded Python: true
- PyTorch Version: 2.5.1+cu124
## Devices
- Name: cpu
- Type: cpu
- VRAM Total: 16901771264
- VRAM Free: 1937580032
- Torch VRAM Total: 16901771264
- Torch VRAM Free: 1937580032
## Logs
```
2024-11-15 11:24:22,029 - root - INFO - Total VRAM 16119 MB, total RAM 16119 MB
2024-11-15 11:24:22,029 - root - INFO - pytorch version: 2.5.1+cu124
2024-11-15 11:24:22,029 - root - INFO - Set vram state to: DISABLED
2024-11-15 11:24:22,029 - root - INFO - Device: cpu
2024-11-15 11:24:32,396 - root - INFO - Using sub quadratic optimization for cross attention, if you have memory or speed issues try using: --use-split-cross-attention
2024-11-15 11:25:43,955 - root - INFO - [Prompt Server] web root: D:\ComfyUI_windows_portable\ComfyUI\web
2024-11-15 11:25:58,337 - root - INFO -
Import times for custom nodes:
2024-11-15 11:25:58,337 - root - INFO - 0.0 seconds: D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\websocket_image_save.py
2024-11-15 11:25:58,337 - root - INFO - 0.1 seconds: D:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-GGUF
2024-11-15 11:25:58,337 - root - INFO -
2024-11-15 11:25:58,346 - root - INFO - Starting server
2024-11-15 11:25:58,346 - root - INFO - To see the GUI go to: http://127.0.0.1:8188
2024-11-15 11:41:27,815 - root - INFO - got prompt
2024-11-15 11:41:27,830 - root - ERROR - Failed to validate prompt for output 9:
2024-11-15 11:41:27,830 - root - ERROR - * DualCLIPLoader 11:
2024-11-15 11:41:27,830 - root - ERROR - - Value not in list: clip_name1: 'clip-vit-large-patch14 .safetensors' not in ['model.safetensors', 't5xxl_fp8_e4m3fn_scaled.safetensors']
2024-11-15 11:41:27,830 - root - ERROR - - Value not in list: clip_name2: 't5xxl_fp8_e4m3fn.safetensors' not in ['model.safetensors', 't5xxl_fp8_e4m3fn_scaled.safetensors']
2024-11-15 11:41:27,834 - root - ERROR - * LoraLoader 29:
2024-11-15 11:41:27,834 - root - ERROR - - Value not in list: lora_name: '{'content': 'flux_realism_lora.safetensors', 'image': None, 'title': 'flux_realism_lora.safetensors'}' not in ['lora.safetensors']
2024-11-15 11:41:27,834 - root - ERROR - Output will be ignored
2024-11-15 11:41:27,835 - root - WARNING - invalid prompt: {'type': 'prompt_outputs_failed_validation', 'message': 'Prompt outputs failed validation', 'details': '', 'extra_info': {}}
2024-11-15 11:42:57,954 - root - INFO - got prompt
2024-11-15 11:42:57,966 - root - ERROR - Failed to validate prompt for output 9:
2024-11-15 11:42:57,966 - root - ERROR - * DualCLIPLoader 11:
2024-11-15 11:42:57,966 - root - ERROR - - Value not in list: clip_name1: 'clip-vit-large-patch14 .safetensors' not in ['model.safetensors', 't5xxl_fp8_e4m3fn_scaled.safetensors']
2024-11-15 11:42:57,966 - root - ERROR - - Value not in list: clip_name2: 't5xxl_fp8_e4m3fn.safetensors' not in ['model.safetensors', 't5xxl_fp8_e4m3fn_scaled.safetensors']
2024-11-15 11:42:57,966 - root - ERROR - Output will be ignored
2024-11-15 11:42:57,966 - root - WARNING - invalid prompt: {'type': 'prompt_outputs_failed_validation', 'message': 'Prompt outputs failed validation', 'details': '', 'extra_info': {}}
2024-11-15 11:43:17,593 - root - INFO - got prompt
2024-11-15 11:43:17,724 - root - INFO - Using split attention in VAE
2024-11-15 11:43:17,733 - root - INFO - Using split attention in VAE
2024-11-15 11:43:19,334 - root - INFO - Requested to load FluxClipModel_
2024-11-15 11:43:19,334 - root - INFO - Loading 1 new model
2024-11-15 11:43:19,365 - root - INFO - loaded completely 0.0 4903.231597900391 True
2024-11-15 11:43:44,347 - root - INFO - model weight dtype torch.float32, manual cast: None
2024-11-15 11:43:44,404 - root - INFO - model_type FLUX
2024-11-15 11:43:44,789 - root - WARNING - lora key not loaded: double_blocks.0.processor.proj_lora1.down.weight
2024-11-15 11:43:44,789 - root - WARNING - lora key not loaded: double_blocks.0.processor.proj_lora1.up.weight
2024-11-15 11:43:44,789 - root - WARNING - lora key not loaded: double_blocks.0.processor.proj_lora2.down.weight
2024-11-15 11:43:44,789 - root - WARNING - lora key not loaded: double_blocks.0.processor.proj_lora2.up.weight
2024-11-15 11:43:44,789 - root - WARNING - lora key not loaded: double_blocks.0.processor.qkv_lora1.down.weight
2024-11-15 11:43:44,789 - root - WARNING - lora key not loaded: double_blocks.0.processor.qkv_lora1.up.weight
2024-11-15 11:43:44,789 - root - WARNING - lora key not loaded: double_blocks.0.processor.qkv_lora2.down.weight
2024-11-15 11:43:44,789 - root - WARNING - lora key not loaded: double_blocks.0.processor.qkv_lora2.up.weight
[... the same "lora key not loaded" warnings repeat here for double_blocks.1 through double_blocks.18, covering proj_lora1/2 and qkv_lora1/2, down and up weights ...]
2024-11-15 11:43:44,854 - root - INFO - Requested to load FluxClipModel_
2024-11-15 11:43:44,854 - root - INFO - Loading 1 new model
2024-11-15 11:44:45,985 - root - INFO - got prompt
2024-11-15 11:45:16,543 - root - INFO - Processing interrupted
2024-11-15 11:45:16,549 - root - INFO - Prompt executed in 118.94 seconds
2024-11-15 11:53:47,017 - root - INFO - got prompt
2024-11-15 11:53:47,104 - root - INFO - Requested to load Flux
2024-11-15 11:53:47,104 - root - INFO - Loading 1 new model
2024-11-15 11:53:47,250 - root - INFO - loaded completely 0.0 6476.4727783203125 True
2024-11-15 12:18:34,935 - root - INFO - Requested to load AutoencoderKL
2024-11-15 12:18:34,937 - root - INFO - Loading 1 new model
2024-11-15 12:18:35,010 - root - INFO - loaded completely 0.0 319.11416244506836 True
2024-11-15 12:18:35,253 - root - ERROR - !!! Exception during processing !!! Given groups=1, weight of size [4, 4, 1, 1], expected input[1, 16, 64, 64] to have 4 channels, but got 16 channels instead
2024-11-15 12:18:35,326 - root - ERROR - Traceback (most recent call last):
File "D:\ComfyUI_windows_portable\ComfyUI\execution.py", line 323, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\execution.py", line 198, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\execution.py", line 169, in _map_node_over_list
process_inputs(input_dict, i)
File "D:\ComfyUI_windows_portable\ComfyUI\execution.py", line 158, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\nodes.py", line 284, in decode
images = vae.decode(samples["samples"])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\comfy\sd.py", line 347, in decode
out = self.process_output(self.first_stage_model.decode(samples).to(self.output_device).float())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\comfy\ldm\models\autoencoder.py", line 199, in decode
dec = self.post_quant_conv(z)
^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\ComfyUI\comfy\ops.py", line 98, in forward
return super().forward(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\conv.py", line 554, in forward
return self._conv_forward(input, self.weight, self.bias)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\conv.py", line 549, in _conv_forward
return F.conv2d(
^^^^^^^^^
File "D:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\_tensor.py", line 1512, in __torch_function__
ret = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
RuntimeError: Given groups=1, weight of size [4, 4, 1, 1], expected input[1, 16, 64, 64] to have 4 channels, but got 16 channels instead
2024-11-15 12:18:35,342 - root - INFO - Prompt executed in 1488.30 seconds
```
## Attached Workflow
Please make sure that workflow does not contain any sensitive information such as API keys or passwords.
{"last_node_id":31,"last_link_id":60,"nodes":[{"id":5,"type":"EmptyLatentImage","pos":{"0":467,"1":692},"size":{"0":315,"1":106},"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[{"name":"LATENT","type":"LATENT","links":[53],"slot_index":0}],"properties":{"Node name for S&R":"EmptyLatentImage"},"widgets_values":[512,512,1]},{"id":8,"type":"VAEDecode","pos":{"0":1248,"1":192},"size":{"0":210,"1":46},"flags":{},"order":8,"mode":0,"inputs":[{"name":"samples","type":"LATENT","link":58},{"name":"vae","type":"VAE","link":47}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[9],"slot_index":0}],"properties":{"Node name for S&R":"VAEDecode"},"widgets_values":[]},{"id":9,"type":"SaveImage","pos":{"0":1249,"1":319},"size":{"0":985.3012084960938,"1":1060.3828125},"flags":{},"order":9,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":9}],"outputs":[],"properties":{},"widgets_values":["ComfyUI"]},{"id":31,"type":"CLIPTextEncode","pos":{"0":379,"1":458},"size":{"0":422.84503173828125,"1":164.31304931640625},"flags":{},"order":6,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":55}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[56],"slot_index":0}],"properties":{"Node name for S&R":"Negative Prompt"},"widgets_values":[""]},{"id":27,"type":"UnetLoaderGGUF","pos":{"0":-28,"1":280},"size":{"0":315,"1":58},"flags":{},"order":1,"mode":0,"inputs":[],"outputs":[{"name":"MODEL","type":"MODEL","links":[59],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"UnetLoaderGGUF"},"widgets_values":["flux1-dev-Q4_0.gguf"]},{"id":10,"type":"VAELoader","pos":{"0":865,"1":621},"size":{"0":342.8424072265625,"1":58},"flags":{},"order":2,"mode":0,"inputs":[],"outputs":[{"name":"VAE","type":"VAE","links":[47],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"VAELoader"},"widgets_values":["diffusion_pytorch_model.safetensors"]},{"id":30,"type":"KSampler","pos":{"0":871,"1":259},"size":{"0":315,"1":262},"flags":{},"order":7,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":60},{"name":"positive","type":"CONDITIONING","link":54},{"name":"negative","type":"CONDITIONING","link":56},{"name":"latent_image","type":"LATENT","link":53}],"outputs":[{"name":"LATENT","type":"LATENT","links":[58],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"KSampler"},"widgets_values":[186090833170789,"randomize",4,1,"euler","simple",1]},{"id":29,"type":"LoraLoader","pos":{"0":-20,"1":620},"size":{"0":352.6841735839844,"1":190.88589477539062},"flags":{},"order":4,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":59},{"name":"clip","type":"CLIP","link":49}],"outputs":[{"name":"MODEL","type":"MODEL","links":[60],"slot_index":0,"shape":3},{"name":"CLIP","type":"CLIP","links":[51,55],"slot_index":1,"shape":3},{"name":"STRING","type":"STRING","links":null,"slot_index":2,"shape":3}],"properties":{"Node name for S&R":"LoraLoader|pysssss"},"widgets_values":["lora.safetensors",1,1]},{"id":11,"type":"DualCLIPLoader","pos":{"0":-9,"1":445},"size":{"0":315,"1":106},"flags":{},"order":3,"mode":0,"inputs":[],"outputs":[{"name":"CLIP","type":"CLIP","links":[49],"slot_index":0,"shape":3}],"properties":{"Node name for 
S&R":"DualCLIPLoader"},"widgets_values":["model.safetensors","t5xxl_fp8_e4m3fn_scaled.safetensors","flux"]},{"id":6,"type":"CLIPTextEncode","pos":{"0":363,"1":231},"size":{"0":422.84503173828125,"1":164.31304931640625},"flags":{},"order":5,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":51}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[54],"slot_index":0}],"properties":{"Node name for S&R":"CLIPTextEncode"},"widgets_values":["Astronaut in a jungle, cold color palette, muted colors, very detailed, sharp focus"]}],"links":[[9,8,0,9,0,"IMAGE"],[47,10,0,8,1,"VAE"],[49,11,0,29,1,"CLIP"],[51,29,1,6,0,"CLIP"],[53,5,0,30,3,"LATENT"],[54,6,0,30,1,"CONDITIONING"],[55,29,1,31,0,"CLIP"],[56,31,0,30,2,"CONDITIONING"],[58,30,0,8,0,"LATENT"],[59,27,0,29,0,"MODEL"],[60,29,0,30,0,"MODEL"]],"groups":[],"config":{},"extra":{"ds":{"scale":0.6934334949441332,"offset":[197.4006043750865,-22.58295267413774]}},"version":0.4}
## Additional Context
(Please add any additional context or steps to reproduce the error here)
### Other
_No response_
One of the models is probably wrong; my guess is the VAE, since your VAELoader points at a file named 'diffusion_pytorch_model.safetensors'. (Suggestion: rename your model files to something descriptive so it is obvious which model each one actually is.)
Try this VAE instead: https://huggingface.co/black-forest-labs/FLUX.1-schnell/blob/main/ae.safetensors (put it in ComfyUI/models/vae and select it in the VAELoader node).
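If it helps to see what is actually going wrong, here is a minimal standalone PyTorch sketch (my own illustration, not ComfyUI's code) of the channel mismatch in the error message: Flux produces 16-channel latents, while the SD-style VAE that got loaded starts decoding with a conv layer built for 4 channels.

```python
import torch
import torch.nn as nn

# SD-style VAE: post_quant_conv expects 4 latent channels.
# This is the "weight of size [4, 4, 1, 1]" from the error message.
sd_post_quant_conv = nn.Conv2d(4, 4, kernel_size=1)

# A Flux sampler outputs 16-channel latents (512x512 image -> 64x64 latent).
flux_latent = torch.randn(1, 16, 64, 64)

try:
    sd_post_quant_conv(flux_latent)
except RuntimeError as e:
    print(e)  # ... expected input[1, 16, 64, 64] to have 4 channels, but got 16 ...

# A Flux-compatible VAE is built around 16 latent channels, so the shapes
# line up and decoding can proceed:
flux_conv = nn.Conv2d(16, 16, kernel_size=1)
print(flux_conv(flux_latent).shape)  # torch.Size([1, 16, 64, 64])
```

So the fix is not in the workflow wiring: you just need a VAE whose latent channel count matches the model that produced the latents.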
And here are the CLIP-L and T5-XXL text encoders if you need them: https://huggingface.co/comfyanonymous/flux_text_encoders/tree/main
Anyway, the LoRA is also not compatible; that is why you get all those "WARNING - lora key not loaded" lines.
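Those key names (double_blocks.N.processor.qkv_lora1...) look like a diffusers-style Flux LoRA that this loader apparently cannot map onto the model, so the LoRA silently does nothing. If you want to inspect a LoRA file yourself, here is a small sketch using the safetensors library (the file name is the one from your log; adjust the path to wherever the file actually lives):

```python
# Print a sample of the tensor keys stored in a LoRA file. Keys like
# "double_blocks.0.processor.qkv_lora1.down.weight" indicate a
# diffusers-style Flux LoRA, matching the warnings in the log above.
from safetensors import safe_open

with safe_open("flux_realism_lora.safetensors", framework="pt", device="cpu") as f:
    for key in sorted(f.keys())[:10]:
        print(key)
```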
For an accurate diagnosis, it is essential to specify exactly where each model you are using came from (the CLIP, VAE, and LoRA).
Comment from a noob (July 2025), hoping to get better. What I find with errors like this is that they depend on the model inputs: your workflow can probably work if you feed it different ones. I suggest trying different models to see which combinations work. I once hit an error message just like this one, only to find my workflow ran fine with a different checkpoint and VAE.