
Can't load Qwen Image Edit (Nunchaku) after I updated to the new ComfyUI

Open trollver9000 opened this issue 5 days ago • 4 comments

Custom Node Testing

ERROR--

ComfyUI Error Report

Error Details

  • Node ID: 3
  • Node Type: KSampler
  • Exception Type: AttributeError
  • Exception Message: 'list' object has no attribute 'dtype'

Stack Trace

  File "M:\ComfyUI_windows_portable\ComfyUI\execution.py", line 516, in execute
    output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
                                                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\ComfyUI\execution.py", line 330, in get_output_data
    return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\ComfyUI\execution.py", line 304, in _async_map_node_over_list
    await process_inputs(input_dict, i)

  File "M:\ComfyUI_windows_portable\ComfyUI\execution.py", line 292, in process_inputs
    result = f(**inputs)

  File "M:\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1538, in sample
    return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)

  File "M:\ComfyUI_windows_portable\ComfyUI\nodes.py", line 1505, in common_ksampler
    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
                                  denoise=denoise, disable_noise=disable_noise, start_step=start_step, last_step=last_step,
                                  force_full_denoise=force_full_denoise, noise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)

  File "M:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet\adv_control\sampling.py", line 116, in acn_sample
    return orig_comfy_sample(model, *args, **kwargs)

  File "M:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet\adv_control\utils.py", line 117, in uncond_multiplier_check_cn_sample
    return orig_comfy_sample(model, *args, **kwargs)

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\sample.py", line 60, in sample
    samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)

  File "M:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-TiledDiffusion\utils.py", line 51, in KSampler_sample
    return orig_fn(*args, **kwargs)

  File "M:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 104, in KSampler_sample
    return orig_fn(*args, **kwargs)

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 1178, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)

  File "M:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 122, in sample
    return orig_fn(*args, **kwargs)

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 1068, in sample
    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
           ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 1050, in sample
    output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed, latent_shapes=latent_shapes)

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 994, in outer_sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed, latent_shapes=latent_shapes)

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 980, in inner_sample
    samples = executor.execute(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-TiledDiffusion\utils.py", line 34, in KSAMPLER_sample
    return orig_fn(*args, **kwargs)

  File "M:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 87, in KSAMPLER_sample
    return orig_fn(*args, **kwargs)

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 752, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)

  File "M:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 120, in decorate_context
    return func(*args, **kwargs)

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\k_diffusion\sampling.py", line 199, in sample_euler
    denoised = model(x, sigma_hat * s_in, **extra_args)

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 401, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 953, in __call__
    return self.outer_predict_noise(*args, **kwargs)
           ~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 960, in outer_predict_noise
    ).execute(x, timestep, model_options, seed)
      ~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 963, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)

  File "M:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 162, in sampling_function
    out = orig_fn(*args, **kwargs)

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 381, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 206, in calc_cond_batch
    return _calc_cond_batch_outer(model, conds, x_in, timestep, model_options)

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 214, in _calc_cond_batch_outer
    return executor.execute(model, conds, x_in, timestep, model_options)
           ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 326, in _calc_cond_batch
    output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
             ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Advanced-ControlNet\adv_control\utils.py", line 69, in apply_model_uncond_cleanup_wrapper
    return orig_apply_model(self, *args, **kwargs)

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\model_base.py", line 162, in apply_model
    return comfy.patcher_extension.WrapperExecutor.new_class_executor(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    ...<2 lines>...
        comfy.patcher_extension.get_all_wrappers(comfy.patcher_extension.WrappersMP.APPLY_MODEL, transformer_options)
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    ).execute(x, t, c_concat, c_crossattn, control, transformer_options, **kwargs)
    ~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\model_base.py", line 204, in _apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds)

  File "M:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1773, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1784, in _call_impl
    return forward_call(*args, **kwargs)

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\ldm\qwen_image\model.py", line 411, in forward
    return comfy.patcher_extension.WrapperExecutor.new_class_executor(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    ...<2 lines>...
        comfy.patcher_extension.get_all_wrappers(comfy.patcher_extension.WrappersMP.DIFFUSION_MODEL, transformer_options)
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    ).execute(x, timestep, context, attention_mask, ref_latents, additional_t_cond, transformer_options, **kwargs)
    ~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-nunchaku\models\qwenimage.py", line 726, in _forward
    else self.time_text_embed(timestep, guidance, hidden_states)
         ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1773, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "M:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1784, in _call_impl
    return forward_call(*args, **kwargs)

  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\ldm\qwen_image\model.py", line 81, in forward
    timesteps_emb = self.timestep_embedder(timesteps_proj.to(dtype=hidden_states.dtype))
                                                                   ^^^^^^^^^^^^^^^^^^^
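For readers skimming the trace: the failure at the bottom is a plain type mismatch. By the time the Nunchaku Qwen wrapper calls the timestep embedder, hidden_states arrives as a Python list (presumably of tensors) rather than a single tensor, and a list has no .dtype. The following is a minimal sketch only — the function names are illustrative, not the actual ComfyUI/nunchaku code — that reproduces the error and shows one hypothetical defensive unwrap:

  import torch

  def timestep_embed(timesteps_proj, hidden_states):
      # Pattern from the traceback: fails when hidden_states is a list,
      # because Python lists have no .dtype attribute.
      return timesteps_proj.to(dtype=hidden_states.dtype)

  def timestep_embed_guarded(timesteps_proj, hidden_states):
      # Hypothetical guard: take the dtype from the first tensor if a
      # list/tuple arrives instead of a tensor.
      if isinstance(hidden_states, (list, tuple)):
          target_dtype = hidden_states[0].dtype
      else:
          target_dtype = hidden_states.dtype
      return timesteps_proj.to(dtype=target_dtype)

  proj = torch.randn(1, 256)
  latents = [torch.randn(1, 16, 8, 8, dtype=torch.bfloat16)]

  try:
      timestep_embed(proj, latents)
  except AttributeError as err:
      print(err)  # 'list' object has no attribute 'dtype'

  print(timestep_embed_guarded(proj, latents).dtype)  # torch.bfloat16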

System Information

  • ComfyUI Version: 0.6.0
  • Arguments: ComfyUI\main.py --windows-standalone-build
  • OS: win32
  • Python Version: 3.13.6 (tags/v3.13.6:4e66535, Aug 6 2025, 14:36:00) [MSC v.1944 64 bit (AMD64)]
  • Embedded Python: true
  • PyTorch Version: 2.8.0+cu129

Devices

  • Name: cuda:0 NVIDIA GeForce RTX 3090 : cudaMallocAsync
    • Type: cuda
    • VRAM Total: 25769279488
    • VRAM Free: 24162675312
    • Torch VRAM Total: 335544320
    • Torch VRAM Free: 116730480

What is going on? Should I revert, or did the devs break something fundamental?

Steps to Reproduce

Load Qwen Image Edit in the Nunchaku node.

Debug Logs

~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\model_management.py", line 509, in model_load
    self.model_use_more_vram(use_more_vram, force_patch_weights=force_patch_weights)
    ~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\model_management.py", line 539, in model_use_more_vram
    return self.model.partially_load(self.device, extra_memory, force_patch_weights=force_patch_weights)
           ~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "M:\ComfyUI_windows_portable\ComfyUI\comfy\model_patcher.py", line 980, in partially_load
    self.detach()
    ~~~~~~~~~~~^^
  File "M:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-nunchaku\model_patcher.py", line 41, in detach
    self.model.diffusion_model.to_safely(self.offload_device)
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^
  File "M:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-QwenImageLoraLoader\wrappers\qwenimage.py", line 65, in to_safely
    self.model.to(device)
    ^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'to'
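
For context, the last frame fails because self.model is already None when to_safely tries to move it: the wrapped diffusion model has been released (or garbage collected) before the device move. A minimal sketch — a stub class, not the actual ComfyUI-QwenImageLoraLoader wrapper — reproduces the error and shows a hypothetical None guard:

  class WrapperSketch:
      # Stand-in for the wrapper in the traceback; illustrative only.
      def __init__(self, model):
          self.model = model  # may become None after unload / garbage collection

      def to_safely(self, device):
          # Pattern from the traceback: no None check before .to()
          self.model.to(device)

      def to_safely_guarded(self, device):
          # Hypothetical defensive variant: skip the move if the model is gone.
          if self.model is not None:
              self.model.to(device)

  w = WrapperSketch(model=None)
  try:
      w.to_safely("cpu")
  except AttributeError as err:
      print(err)  # 'NoneType' object has no attribute 'to'
  w.to_safely_guarded("cpu")  # no error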

Other

No response

trollver9000 avatar Dec 25 '25 15:12 trollver9000

https://github.com/nunchaku-tech/ComfyUI-nunchaku/issues/734

trollver9000 avatar Dec 25 '25 16:12 trollver9000

Your message has been received... I will reply later...

CGgangzi avatar Dec 25 '25 16:12 CGgangzi

Issue #25: ComfyUI 0.4.0 Model Management Errors

Status: ⚠️ Environment Dependent - May require ComfyUI core fixes

Issue: After the ComfyUI 0.4.0 update, multiple nodes (including this one in some environments) experienced errors such as TypeError: 'NoneType' object is not callable and AttributeError: 'NoneType' object has no attribute. In our environment, we resolved these errors by modifying ComfyUI's core model_management.py. Note that in our environment these errors did not occur with this node (ComfyUI-QwenImageLoraLoader). The Nunchaku library and ComfyUI-Nunchaku nodes should use the latest versions. If errors persist even after applying the latest version of this node (ComfyUI-QwenImageLoraLoader), modification of ComfyUI's core model_management.py may be necessary.

Root Cause: In ComfyUI 0.4.0, the core model_management.py lacks sufficient None checks, causing TypeError and AttributeError when accessing methods or attributes on objects that became None after models were unloaded or garbage collected. This is not a bug in individual nodes, but a structural issue in ComfyUI 0.4.0's model management (model_management.py).

GC Changes in ComfyUI 0.4.0: Compared to ComfyUI 0.3.x, automatic model unloading occurs earlier, making the following flow more likely: ModelPatcher → GC → weakref(None). This also explains why the occurrence of the issue varies by user environment.

Technical Basis:

  • Multiple locations with missing None checks - This is not a bug in individual nodes; the core logic crashes when accessing attributes on None. Fixes such as if model is None: continue are defensive code that ComfyUI core should have in all paths.
  • Post-weak-reference GC behavior not considered - The introduction of LoadedModel._model = weakref.ref(ModelPatcher) in ComfyUI 0.4.0 is a breaking change. When the weak-reference target is garbage collected it returns None, but this case is not handled; the post-processing for the breaking update is incomplete.
  • Multiple nodes affected in a chain reaction - This is not a problem with the nodes themselves: multiple nodes were affected in a chain reaction due to core behavior changes. Model loading/unloading, memory calculation, GPU/CPU offloading, and the ModelPatcher lifecycle are all controlled by ComfyUI core.
  • All fix locations are core responsibility areas - The locations fixed (model_memory, model_offloaded_memory, load_models_gpu, free_memory, model_unload, is_dead checks, etc.) are all ComfyUI core functions; these are not areas node developers should touch. The fact that all fix locations are core logic leaves no explanation other than a core defect.
  • Result of applying fixes - After applying None-check fixes to ComfyUI's core model_management.py in our environment, the errors were resolved. This demonstrates that the problem can be solved by adding defensive code that the core should have.

Model Lifecycle and ModelPatcher Initialization Relationship:

  • Fact 1: Relationship between LoadedModel and ModelPatcher - The LoadedModel class (lines 502-524 in ComfyUI's model_management.py) holds a weak reference to ModelPatcher:

      def _set_model(self, model):
          self._model = weakref.ref(model)  # Weak reference to ModelPatcher

      @property
      def model(self):
          return self._model()  # Returns None when garbage collected

  • Fact 2: ModelPatcher initialization - In __init__ of the ModelPatcher class (lines 215-237 in model_patcher.py), the pinned attribute is initialized:

      def __init__(self, model, load_device, offload_device, size=0, weight_inplace_update=False):
          # ...
          self.pinned = set()  # Line 237: initialized

  • Fact 3: Fix content in ComfyUI core's model_management.py - The fix now skips LoadedModel instances whose model is None: in load_models_gpu(), LoadedModel instances where model is None are skipped (lines 712, 727, 743); in free_memory(), LoadedModel instances where model is None are excluded (line 646). (A standalone sketch of this pattern appears after this list.)
  • Fact 4: Problem before the fix - LoadedModel holds a weak reference to ModelPatcher. When the ModelPatcher is garbage collected, LoadedModel.model returns None. Before the fix, methods were called on LoadedModel instances whose model was None, causing errors.
  • Fact 5: Behavior after the fix - By skipping LoadedModel instances whose model is None, the errors no longer occur and processing continues normally.
  • Fact 6: Why copy.deepcopy fails - copy.deepcopy fails because references to GC'd ModelPatcher instances remain in the dictionary being deep-copied. When these references are accessed they return None, causing deepcopy to stop.
  • Fact 7: Confirmation items - After applying the fix, the copy.deepcopy and pinned-attribute errors do not occur in our environment. The Nunchaku library and ComfyUI-Nunchaku nodes should use the latest versions, but this may still be insufficient. While these errors did not occur with this node (ComfyUI-QwenImageLoraLoader) in our environment, the fix to ComfyUI's core model_management.py may have indirectly affected it, making errors less likely to occur.

Important Note: Not a Problem with the Nunchaku Library - This problem is not caused by the Nunchaku library's implementation. Nunchaku's model_config and ModelPatcher are themselves normal. The problem is upstream, in ComfyUI core's model_management.py GC processing.

Speculation (items that may be environment-dependent): The fix allows ModelPatcher initialization to complete normally. As a result, the pinned attribute is also properly initialized, and accessing self.pinned in __del__ does not cause errors.

Recommendations:

  • Update the Nunchaku library and ComfyUI-Nunchaku nodes to the latest version (addresses model_config issues).
  • Consider applying None-check fixes to ComfyUI's core model_management.py (may address the root cause).

Note: This is a first support measure. I have published the technical details of the fixes I applied to ComfyUI's core model_management.py in my environment; see COMFYUI_0.4.0_UPDATE_ERROR_FIXES.md for details. These fixes were applied in my specific environment and may not work universally in all environments. They may also resolve the copy.deepcopy and pinned-attribute errors.
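
The weak-reference lifecycle and the None-check skip described in the facts above can be reproduced in isolation. The following is a minimal sketch using stub classes — not ComfyUI's actual model_management.py — showing why LoadedModel.model turns into None after garbage collection and how the defensive skip avoids the AttributeError:

  import gc
  import weakref

  class ModelPatcherStub:
      # Stand-in for ComfyUI's ModelPatcher; illustrative only.
      pass

  class LoadedModelStub:
      # Mirrors the weak-reference pattern quoted above.
      def __init__(self, model):
          self._model = weakref.ref(model)  # does not keep the patcher alive

      @property
      def model(self):
          return self._model()  # None once the patcher is collected

  patcher = ModelPatcherStub()
  loaded = [LoadedModelStub(patcher)]

  del patcher  # simulate the earlier unloading / GC described above
  gc.collect()

  for lm in loaded:
      model = lm.model
      if model is None:  # the defensive skip described above
          continue
      # ... work with the live model here ...

  print("dead LoadedModel entries skipped without AttributeError")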
Related Issues:

  • https://github.com/ussoewwin/ComfyUI-QwenImageLoraLoader/issues/25 - AttributeError: 'NunchakuModelPatcher' object has no attribute 'pinned' and deepcopy errors with model_config
  • https://github.com/ussoewwin/ComfyUI-QwenImageLoraLoader/issues/33 - AttributeError: 'NoneType' object has no attribute 'to' in to_safely method (fixed in v2.1.0)
  • https://github.com/comfyanonymous/ComfyUI/issues/6590 - 'NoneType' object has no attribute 'shape'
  • https://github.com/comfyanonymous/ComfyUI/issues/6600 - 'NoneType' object is not callable (loader-related)
  • https://github.com/comfyanonymous/ComfyUI/issues/6532 - Crash when referencing models after model unload

Issue #30: TypeError: got multiple values for argument 'guidance' (v2.0+)

Issue Link: https://github.com/ussoewwin/ComfyUI-QwenImageLoraLoader/issues/30

Status: ⚠️ May Still Occur in Some Environments - even with the v2.0.8 fixes

Issue: The TypeError: got multiple values for argument 'guidance' error may still occur in some user environments when using v2.0+ versions with diffsynth ControlNet support, despite the multiple fixes applied from v2.0.2 to v2.0.8.

Root Cause: v2.0+ versions include diffsynth ControlNet support, which requires complex argument handling between ComfyUI's scheduler patches, external patches (e.g. ComfyUI-EulerDiscreteScheduler), and the QwenImageTransformer2DModel.forward signature. Even with multiple layers of defense (exclusion logic in both the forward and _execute_model methods), some edge cases in certain environments may still cause argument duplication.

Solution for Affected Users: If you continue to see TypeError: got multiple values for argument 'guidance' with v2.0+ versions even after updating to v2.0.8, please use v1.72 instead, which does not include diffsynth ControlNet support and therefore avoids these argument-passing complexities.

v1.72 Release Note: v1.72 is the latest v1.x release before v2.0+ diffsynth ControlNet support was added. If you don't need diffsynth ControlNet functionality, v1.72 provides stable LoRA loading without the argument-passing complexities introduced in v2.0+.

Related Issues:

  • https://github.com/ussoewwin/ComfyUI-QwenImageLoraLoader/issues/32 - TypeError: got multiple values for argument 'guidance' when using LoRA with KSampler

Known Limitations
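
As a side note on Issue #30 above, the "got multiple values for argument" TypeError is what Python raises when one patch layer forwards guidance positionally while another layer injects it again as a keyword. A minimal sketch with illustrative names only (not the real QwenImageTransformer2DModel code):

  def forward(x, timestep, guidance=None, **kwargs):
      # Stand-in for a transformer forward(); illustrative only.
      return x

  def patched_call(*args, **kwargs):
      # Hypothetical patch layer that re-injects guidance as a keyword argument.
      kwargs["guidance"] = 3.5
      return forward(*args, **kwargs)

  try:
      # guidance is already supplied positionally (third argument) by another layer.
      patched_call("latents", 0.5, 3.5)
  except TypeError as err:
      print(err)  # forward() got multiple values for argument 'guidance'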

trollver9000 avatar Dec 25 '25 16:12 trollver9000

I use my ComfyUI setup in a venv configuration, so I activated my venv and then reverted to 0.4.0 in the console using

git checkout 791e30f

then restarted ComfyUI. After that, both Nunchaku Qwen and the LoRAs worked fine.

For the portable build, go to ...\ComfyUI_windows_portable\comfyui in Explorer, type cmd in the address bar (and press Enter), and in the console run:

  git fetch --all
  git checkout v0.4.0

This should work for ComfyUI_windows_portable. Hope this helps.

thefirstangel3d avatar Dec 26 '25 05:12 thefirstangel3d

Now try to install the new Nunchaku to get the new Z-Image loader node to show up. Good luck, I can't do it, and I'm on 0.4.0.

trollver9000 avatar Dec 28 '25 01:12 trollver9000

It's just a temporary fix that worked for me. I saw it as an okay-ish trade-off, since Z-Image is a small, fast model that runs fine for me without Nunchaku, and the new manager is just a beta I can live without for a few days. However, as I mentioned in my other post, I'm sure the devs are already working on a solution; the other solutions I tried haven't worked for me yet, and I too hope everything will be back to fully working soon, including the recent update. If you find a better solution that also resolves the compatibility/timestep issue with the most recent ComfyUI update, I'm more than happy to take it.

thefirstangel3d avatar Dec 29 '25 18:12 thefirstangel3d