
Cannot import name 'triton_key' from 'triton.compiler.compiler'

Open · MrBuubles007 opened this issue 4 months ago · 2 comments

Custom Node Testing

Your question

I tried to install ComfyUI (portable version) with Triton and SageAttention, but after six hours I still kept getting this error (see log below). I'm using Windows 11 Pro with an RTX 4090. My virtual environment runs on Miniconda3 with Python 3.12.12, and I've installed torch 2.9.0 with CUDA 13.0. The workflow I'm testing is a WAN 2.2 workflow. The error comes up when the "KSampler (Advanced)" node executes; the sampler is res_2s and the scheduler is beta57.
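As a quick sanity check before reinstalling anything, you can probe the exact import that fails in the log below. The `check_triton_key` helper here is just an illustrative sketch, not part of ComfyUI or Triton:

```python
import importlib.util


def check_triton_key() -> str:
    """Probe the exact import that torch-inductor attempts in the traceback below."""
    if importlib.util.find_spec("triton") is None:
        return "triton is not installed in this environment"
    try:
        # This is the precise import inductor's codecache performs.
        from triton.compiler.compiler import triton_key  # noqa: F401
        return "triton_key importable: triton matches what this torch build expects"
    except ImportError:
        return "triton_key missing: the installed triton does not match this torch build"


print(check_triton_key())
```

Run it with the embedded interpreter (`python_embeded\python.exe`) so it inspects the same `site-packages` directory the error message points at; if it reports a mismatch, the fix is usually to install the Triton build that matches your exact torch version rather than the latest one.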

Logs

# ComfyUI Error Report
## Error Details
- **Node ID:** 35
- **Node Type:** KSamplerAdvanced
- **Exception Type:** torch._dynamo.exc.BackendCompilerFailed
- **Exception Message:** backend='inductor' raised:
ImportError: cannot import name 'triton_key' from 'triton.compiler.compiler' (E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\triton\compiler\compiler.py)

Set TORCHDYNAMO_VERBOSE=1 for the internal stack trace (please do this especially if you're reporting a bug to PyTorch). For even more developer context, set TORCH_LOGS="+dynamo"
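The two variables named in the message can be set before ComfyUI launches to capture the full internal stack trace. A minimal sketch (they can equally be set in the shell before starting `main.py`):

```python
# Set the diagnostic variables the error message suggests, before torch is imported,
# e.g. at the top of a launcher script.
import os

os.environ["TORCHDYNAMO_VERBOSE"] = "1"  # full internal dynamo stack trace
os.environ["TORCH_LOGS"] = "+dynamo"     # extra developer-level dynamo logging
```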


## Stack Trace

  File "E:\AIProjects\ComfyUI\ComfyUI\execution.py", line 496, in execute
    output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, hidden_inputs=hidden_inputs)
                                                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\execution.py", line 315, in get_output_data
    return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, hidden_inputs=hidden_inputs)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\execution.py", line 289, in _async_map_node_over_list
    await process_inputs(input_dict, i)

  File "E:\AIProjects\ComfyUI\ComfyUI\execution.py", line 277, in process_inputs
    result = f(**inputs)

  File "E:\AIProjects\ComfyUI\ComfyUI\nodes.py", line 1559, in sample
    return common_ksampler(model, noise_seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise, disable_noise=disable_noise, start_step=start_at_step, last_step=end_at_step, force_full_denoise=force_full_denoise)

  File "E:\AIProjects\ComfyUI\ComfyUI\nodes.py", line 1492, in common_ksampler
    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
                                  denoise=denoise, disable_noise=disable_noise, start_step=start_step, last_step=last_step,
                                  force_full_denoise=force_full_denoise, noise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\sample.py", line 45, in sample
    samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 1154, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 1044, in sample
    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
           ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 1029, in sample
    output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 997, in outer_sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 980, in inner_sample
    samples = executor.execute(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 752, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)

  File "E:\AIProjects\ComfyUI\ComfyUI\custom_nodes\RES4LYF\beta\__init__.py", line 167, in sample_res_2s
    return rk_sampler_beta.sample_rk_beta(model, x, sigmas, None, extra_args, callback, disable, rk_type="res_2s",)
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 120, in decorate_context
    return func(*args, **kwargs)

  File "E:\AIProjects\ComfyUI\ComfyUI\custom_nodes\RES4LYF\beta\rk_sampler_beta.py", line 1665, in sample_rk_beta
    eps_[row], data_[row] = RK(x_tmp, s_tmp, x_0, sigma, transformer_options={'row': row, 'x_tmp': x_tmp, 'sigma_next': sigma_next})
                            ~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\custom_nodes\RES4LYF\beta\rk_method_beta.py", line 901, in __call__
    denoised = self.model_denoised(x.to(self.model_device), sub_sigma.to(self.model_device), **self.extra_args).to(sigma.device)
               ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\custom_nodes\RES4LYF\beta\rk_method_beta.py", line 241, in model_denoised
    denoised = self.model(x, sigma * s_in, **extra_args)

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 401, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 953, in __call__
    return self.outer_predict_noise(*args, **kwargs)
           ~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 960, in outer_predict_noise
    ).execute(x, timestep, model_options, seed)
      ~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 963, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 381, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 206, in calc_cond_batch
    return _calc_cond_batch_outer(model, conds, x_in, timestep, model_options)

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 214, in _calc_cond_batch_outer
    return executor.execute(model, conds, x_in, timestep, model_options)
           ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 326, in _calc_cond_batch
    output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
             ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\model_base.py", line 161, in apply_model
    return comfy.patcher_extension.WrapperExecutor.new_class_executor(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    ...<2 lines>...
        comfy.patcher_extension.get_all_wrappers(comfy.patcher_extension.WrappersMP.APPLY_MODEL, transformer_options)
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    ).execute(x, t, c_concat, c_crossattn, control, transformer_options, **kwargs)
    ~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 113, in execute
    return self.wrappers[self.idx](self, *args, **kwargs)
           ~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy_api\torch_helpers\torch_compile.py", line 26, in apply_torch_compile_wrapper
    return executor(*args, **kwargs)

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 105, in __call__
    return new_executor.execute(*args, **kwargs)
           ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\model_base.py", line 200, in _apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
                   ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1773, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1784, in _call_impl
    return forward_call(*args, **kwargs)

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\ldm\wan\model.py", line 614, in forward
    return comfy.patcher_extension.WrapperExecutor.new_class_executor(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    ...<2 lines>...
        comfy.patcher_extension.get_all_wrappers(comfy.patcher_extension.WrappersMP.DIFFUSION_MODEL, transformer_options)
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    ).execute(x, timestep, context, clip_fea, time_dim_concat, transformer_options, **kwargs)
    ~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\ldm\wan\model.py", line 634, in _forward
    return self.forward_orig(x, timestep, context, clip_fea=clip_fea, freqs=freqs, transformer_options=transformer_options, **kwargs)[:, :, :t, :h, :w]
           ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\ldm\wan\model.py", line 579, in forward_orig
    x = block(x, e=e0, freqs=freqs, context=context, context_img_len=context_img_len, transformer_options=transformer_options)

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_dynamo\eval_frame.py", line 375, in __call__
    return super().__call__(*args, **kwargs)
           ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1773, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1784, in _call_impl
    return forward_call(*args, **kwargs)

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_dynamo\eval_frame.py", line 749, in compile_wrapper
    raise e.remove_dynamo_frames() from None  # see TORCHDYNAMO_VERBOSE=1
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_dynamo\output_graph.py", line 1871, in _call_user_compiler
    raise BackendCompilerFailed(
        self.compiler_fn, e, inspect.currentframe()
    ).with_traceback(e.__traceback__) from None

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_dynamo\output_graph.py", line 1846, in _call_user_compiler
    compiled_fn = compiler_fn(gm, example_inputs)

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_dynamo\repro\after_dynamo.py", line 150, in __call__
    compiled_gm = compiler_fn(gm, example_inputs)

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\__init__.py", line 2380, in __call__
    return compile_fx(model_, inputs_, config_patches=self.config)

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_inductor\compile_fx.py", line 2418, in compile_fx
    return aot_autograd(
           ~~~~~~~~~~~~~
    ...<8 lines>...
        ignore_shape_env=ignore_shape_env,
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    )(model_, example_inputs_)
    ~^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_dynamo\backends\common.py", line 109, in __call__
    cg = aot_module_simplified(gm, example_inputs, **self.kwargs)

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_functorch\aot_autograd.py", line 1199, in aot_module_simplified
    compiled_fn = AOTAutogradCache.load(
        dispatch_and_compile,
    ...<6 lines>...
        remote,
    )

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_functorch\_aot_autograd\autograd_cache.py", line 1140, in load
    compiled_fn = dispatch_and_compile()

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_functorch\aot_autograd.py", line 1184, in dispatch_and_compile
    compiled_fn, _ = create_aot_dispatcher_function(
                     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
        functional_call,
        ^^^^^^^^^^^^^^^^
    ...<3 lines>...
        shape_env,
        ^^^^^^^^^^
    )
    ^

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_functorch\aot_autograd.py", line 576, in create_aot_dispatcher_function
    return _create_aot_dispatcher_function(
        flat_fn, fake_flat_args, aot_config, fake_mode, shape_env
    )

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_functorch\aot_autograd.py", line 836, in _create_aot_dispatcher_function
    compiled_fn, fw_metadata = compiler_fn(
                               ~~~~~~~~~~~^
        flat_fn,
        ^^^^^^^^
    ...<2 lines>...
        fw_metadata=fw_metadata,
        ^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_functorch\_aot_autograd\jit_compile_runtime_wrappers.py", line 245, in aot_dispatch_base
    compiled_fw = compiler(fw_module, updated_flat_args)

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_functorch\aot_autograd.py", line 483, in __call__
    return self.compiler_fn(gm, example_inputs)
           ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_inductor\compile_fx.py", line 2250, in fw_compiler_base
    return inner_compile(
        gm,
    ...<5 lines>...
        boxed_forward_device_index=forward_device,
    )

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_inductor\compile_fx.py", line 745, in compile_fx_inner
    return wrap_compiler_debug(_compile_fx_inner, compiler_name="inductor")(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
        gm,
        ^^^
        example_inputs,
        ^^^^^^^^^^^^^^^
        **kwargs,
        ^^^^^^^^^
    )
    ^

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_dynamo\repro\after_aot.py", line 124, in debug_wrapper
    inner_compiled_fn = compiler_fn(gm, example_inputs)

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_inductor\compile_fx.py", line 860, in _compile_fx_inner
    (key_info, cache_info) = FxGraphCache.prepare_key(
                             ~~~~~~~~~~~~~~~~~~~~~~~~^
        gm, example_inputs, graph_kwargs, inputs_to_check, remote
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_inductor\codecache.py", line 1474, in prepare_key
    key, debug_lines = compiled_fx_graph_hash(
                       ~~~~~~~~~~~~~~~~~~~~~~^
        gm, example_inputs, fx_kwargs, inputs_to_check
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_inductor\codecache.py", line 960, in compiled_fx_graph_hash
    details = FxGraphHashDetails(gm, example_inputs, fx_kwargs, inputs_to_check)

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_inductor\codecache.py", line 896, in __init__
    self.system_info = CacheBase.get_system()
                       ~~~~~~~~~~~~~~~~~~~~^^

  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_inductor\codecache.py", line 205, in get_system
    from triton.compiler.compiler import triton_key


## System Information
- **ComfyUI Version:** 0.3.66
- **Arguments:** ComfyUI\main.py --use-sage-attention
- **OS:** nt
- **Python Version:** 3.13.6 (tags/v3.13.6:4e66535, Aug  6 2025, 14:36:00) [MSC v.1944 64 bit (AMD64)]
- **Embedded Python:** true
- **PyTorch Version:** 2.8.0+cu129
## Devices

- **Name:** cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
  - **Type:** cuda
  - **VRAM Total:** 25756696576
  - **VRAM Free:** 24108859392
  - **Torch VRAM Total:** 0
  - **Torch VRAM Free:** 0

## Logs

2025-10-24T21:14:23.021788 - [START] Security scan
2025-10-24T21:14:23.902710 - [DONE] Security scan
2025-10-24T21:14:23.992509 - ## ComfyUI-Manager: installing dependencies done.
2025-10-24T21:14:23.992695 - ** ComfyUI startup time: 2025-10-24 21:14:23.992
2025-10-24T21:14:23.992949 - ** Platform: Windows
2025-10-24T21:14:23.993132 - ** Python version: 3.13.6 (tags/v3.13.6:4e66535, Aug  6 2025, 14:36:00) [MSC v.1944 64 bit (AMD64)]
2025-10-24T21:14:23.993344 - ** Python executable: E:\AIProjects\ComfyUI\python_embeded\python.exe
2025-10-24T21:14:23.993559 - ** ComfyUI Path: E:\AIProjects\ComfyUI\ComfyUI
2025-10-24T21:14:23.993801 - ** ComfyUI Base Folder Path: E:\AIProjects\ComfyUI\ComfyUI
2025-10-24T21:14:23.993998 - ** User directory: E:\AIProjects\ComfyUI\ComfyUI\user
2025-10-24T21:14:23.994174 - ** ComfyUI-Manager config path: E:\AIProjects\ComfyUI\ComfyUI\user\default\ComfyUI-Manager\config.ini
2025-10-24T21:14:23.997886 - ** Log path: E:\AIProjects\ComfyUI\ComfyUI\user\comfyui.log
2025-10-24T21:14:24.781865 - 
Prestartup times for custom nodes:
2025-10-24T21:14:24.782018 -    2.2 seconds: E:\AIProjects\ComfyUI\ComfyUI\custom_nodes\comfyui-manager
2025-10-24T21:14:24.782129 - 
2025-10-24T21:14:26.404298 - Checkpoint files will always be loaded safely.
2025-10-24T21:14:26.542224 - Total VRAM 24564 MB, total RAM 31887 MB
2025-10-24T21:14:26.542331 - pytorch version: 2.8.0+cu129
2025-10-24T21:14:26.542813 - Set vram state to: NORMAL_VRAM
2025-10-24T21:14:26.543087 - Device: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
2025-10-24T21:14:27.730120 - Using sage attention
2025-10-24T21:14:29.621864 - Python version: 3.13.6 (tags/v3.13.6:4e66535, Aug  6 2025, 14:36:00) [MSC v.1944 64 bit (AMD64)]
2025-10-24T21:14:29.622010 - ComfyUI version: 0.3.66
2025-10-24T21:14:29.668585 - ComfyUI frontend version: 1.28.7
2025-10-24T21:14:29.670190 - [Prompt Server] web root: E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\comfyui_frontend_package\static
2025-10-24T21:14:30.689224 - ComfyUI-GGUF: Allowing full torch compile
2025-10-24T21:14:31.042120 - ### Loading: ComfyUI-Manager (V3.37)
2025-10-24T21:14:31.042603 - [ComfyUI-Manager] network_mode: public
2025-10-24T21:14:31.118709 - ### ComfyUI Revision: 150 [560b1bdf] *DETACHED | Released on '2025-10-21'
2025-10-24T21:14:31.253467 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
2025-10-24T21:14:31.265767 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
2025-10-24T21:14:31.311905 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
2025-10-24T21:14:31.383169 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
2025-10-24T21:14:31.418831 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
2025-10-24T21:14:31.907477 - Using sage attention
2025-10-24T21:14:32.103461 - E:\AIProjects\ComfyUI\ComfyUI\custom_nodes\RES4LYF\helper_sigma_preview_image_preproc.py:850: SyntaxWarning: invalid escape sequence '\h'
  labels.append("$Δ \hat{t}$")
2025-10-24T21:14:32.532822 - (RES4LYF) Init
2025-10-24T21:14:32.533260 - (RES4LYF) Importing beta samplers.
2025-10-24T21:14:32.754885 - (RES4LYF) Importing legacy samplers.
2025-10-24T21:14:32.766696 - 
Import times for custom nodes:
2025-10-24T21:14:32.766810 -    0.0 seconds: E:\AIProjects\ComfyUI\ComfyUI\custom_nodes\websocket_image_save.py
2025-10-24T21:14:32.767049 -    0.0 seconds: E:\AIProjects\ComfyUI\ComfyUI\custom_nodes\comfy-image-saver
2025-10-24T21:14:32.767138 -    0.1 seconds: E:\AIProjects\ComfyUI\ComfyUI\custom_nodes\ComfyUI-GGUF
2025-10-24T21:14:32.767213 -    0.1 seconds: E:\AIProjects\ComfyUI\ComfyUI\custom_nodes\comfyui-manager
2025-10-24T21:14:32.767274 -    0.3 seconds: E:\AIProjects\ComfyUI\ComfyUI\custom_nodes\ComfyUI-KJNodes
2025-10-24T21:14:32.767331 -    1.6 seconds: E:\AIProjects\ComfyUI\ComfyUI\custom_nodes\RES4LYF
2025-10-24T21:14:32.767388 - 
2025-10-24T21:14:33.006289 - Context impl SQLiteImpl.
2025-10-24T21:14:33.006415 - Will assume non-transactional DDL.
2025-10-24T21:14:33.007180 - No target revision found.
2025-10-24T21:14:33.026124 - Starting server

2025-10-24T21:14:33.026545 - To see the GUI go to: http://127.0.0.1:8188
2025-10-24T21:14:34.921920 - FETCH ComfyRegistry Data: 5/102
2025-10-24T21:14:38.829114 - FETCH ComfyRegistry Data: 10/102
2025-10-24T21:14:42.629593 - FETCH ComfyRegistry Data: 15/102
2025-10-24T21:14:45.052198 - [DEPRECATION WARNING] Detected import of deprecated legacy API: /scripts/ui.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
2025-10-24T21:14:45.053340 - [DEPRECATION WARNING] Detected import of deprecated legacy API: /extensions/core/groupNode.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
2025-10-24T21:14:45.296530 - [DEPRECATION WARNING] Detected import of deprecated legacy API: /scripts/ui/components/buttonGroup.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
2025-10-24T21:14:45.302192 - [DEPRECATION WARNING] Detected import of deprecated legacy API: /scripts/ui/components/button.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
2025-10-24T21:14:47.194661 - FETCH ComfyRegistry Data: 20/102
2025-10-24T21:14:51.335340 - FETCH ComfyRegistry Data: 25/102
2025-10-24T21:14:55.138375 - FETCH ComfyRegistry Data: 30/102
2025-10-24T21:14:59.118301 - FETCH ComfyRegistry Data: 35/102
2025-10-24T21:15:03.212589 - FETCH ComfyRegistry Data: 40/102
2025-10-24T21:15:07.252799 - FETCH ComfyRegistry Data: 45/102
2025-10-24T21:15:11.344854 - FETCH ComfyRegistry Data: 50/102
2025-10-24T21:15:15.963503 - FETCH ComfyRegistry Data: 55/102
2025-10-24T21:15:20.001816 - FETCH ComfyRegistry Data: 60/102
2025-10-24T21:15:24.306976 - FETCH ComfyRegistry Data: 65/102
2025-10-24T21:15:28.252447 - FETCH ComfyRegistry Data: 70/102
2025-10-24T21:15:33.286458 - FETCH ComfyRegistry Data: 75/102
2025-10-24T21:15:37.583627 - FETCH ComfyRegistry Data: 80/102
2025-10-24T21:15:41.817365 - FETCH ComfyRegistry Data: 85/102
2025-10-24T21:15:45.966859 - FETCH ComfyRegistry Data: 90/102
2025-10-24T21:15:49.998054 - FETCH ComfyRegistry Data: 95/102
2025-10-24T21:15:54.479583 - FETCH ComfyRegistry Data: 100/102
2025-10-24T21:15:56.685162 - FETCH ComfyRegistry Data [DONE]
2025-10-24T21:15:56.811154 - [ComfyUI-Manager] default cache updated: https://api.comfy.org/nodes
2025-10-24T21:15:56.829603 - FETCH DATA from: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json [DONE]
2025-10-24T21:15:57.115456 - [ComfyUI-Manager] All startup tasks have been completed.
2025-10-24T21:26:25.557901 - got prompt
2025-10-24T21:26:25.559938 - Failed to validate prompt for output 10:
2025-10-24T21:26:25.560045 - * VAELoader 8:
2025-10-24T21:26:25.560117 -   - Value not in list: vae_name: 'wan_2.1_vae.safetensors' not in ['Wan2.1_VAE.safetensors', 'taesd', 'taesdxl', 'taesd3', 'taef1', 'pixel_space']
2025-10-24T21:26:25.560177 - Output will be ignored
2025-10-24T21:26:25.560242 - invalid prompt: {'type': 'prompt_outputs_failed_validation', 'message': 'Prompt outputs failed validation', 'details': '', 'extra_info': {}}
2025-10-24T21:26:54.704044 - got prompt
2025-10-24T21:26:54.750224 - Using pytorch attention in VAE
2025-10-24T21:26:54.751308 - Using pytorch attention in VAE
2025-10-24T21:26:54.927796 - VAE load device: cuda:0, offload device: cpu, dtype: torch.bfloat16
2025-10-24T21:26:54.975862 - Using scaled fp8: fp8 matrix mult: False, scale input: False
2025-10-24T21:26:55.505871 - Requested to load WanTEModel
2025-10-24T21:26:55.511936 - loaded completely 9.5367431640625e+25 6419.477203369141 True
2025-10-24T21:26:55.521513 - CLIP/text encoder model load device: cuda:0, offload device: cpu, current: cuda:0, dtype: torch.float16
2025-10-24T21:26:58.240652 - gguf qtypes: F16 (694), Q8_0 (400), F32 (1)
2025-10-24T21:26:58.264700 - model weight dtype torch.float16, manual cast: None
2025-10-24T21:26:58.266859 - model_type FLOW
2025-10-24T21:26:58.498559 - Requested to load WanTEModel
2025-10-24T21:27:00.575631 - loaded completely 21398.8 6419.477203369141 True
2025-10-24T21:27:02.382441 - Requested to load WAN21
2025-10-24T21:27:13.981327 - loaded completely 17468.72358651423 14823.906372070312 True
2025-10-24T21:27:14.057820 - (RES4LYF) rk_type: res_2s
2025-10-24T21:27:14.096474 -
  0%|                                                                                            | 0/4 [00:00<?, ?it/s]
2025-10-24T21:27:24.358934 - !!! Exception during processing !!! backend='inductor' raised:
ImportError: cannot import name 'triton_key' from 'triton.compiler.compiler' (E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\triton\compiler\compiler.py)

Set TORCHDYNAMO_VERBOSE=1 for the internal stack trace (please do this especially if you're reporting a bug to PyTorch). For even more developer context, set TORCH_LOGS="+dynamo"

2025-10-24T21:27:24.368229 - Traceback (most recent call last):
  File "E:\AIProjects\ComfyUI\ComfyUI\execution.py", line 496, in execute
    output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, hidden_inputs=hidden_inputs)
                                                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\execution.py", line 315, in get_output_data
    return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, hidden_inputs=hidden_inputs)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\execution.py", line 289, in _async_map_node_over_list
    await process_inputs(input_dict, i)
  File "E:\AIProjects\ComfyUI\ComfyUI\execution.py", line 277, in process_inputs
    result = f(**inputs)
  File "E:\AIProjects\ComfyUI\ComfyUI\nodes.py", line 1559, in sample
    return common_ksampler(model, noise_seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise, disable_noise=disable_noise, start_step=start_at_step, last_step=end_at_step, force_full_denoise=force_full_denoise)
  File "E:\AIProjects\ComfyUI\ComfyUI\nodes.py", line 1492, in common_ksampler
    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
                                  denoise=denoise, disable_noise=disable_noise, start_step=start_step, last_step=last_step,
                                  force_full_denoise=force_full_denoise, noise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\sample.py", line 45, in sample
    samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 1154, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 1044, in sample
    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
           ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 1029, in sample
    output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 997, in outer_sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 980, in inner_sample
    samples = executor.execute(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 752, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
  File "E:\AIProjects\ComfyUI\ComfyUI\custom_nodes\RES4LYF\beta\__init__.py", line 167, in sample_res_2s
    return rk_sampler_beta.sample_rk_beta(model, x, sigmas, None, extra_args, callback, disable, rk_type="res_2s",)
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 120, in decorate_context
    return func(*args, **kwargs)
  File "E:\AIProjects\ComfyUI\ComfyUI\custom_nodes\RES4LYF\beta\rk_sampler_beta.py", line 1665, in sample_rk_beta
    eps_[row], data_[row] = RK(x_tmp, s_tmp, x_0, sigma, transformer_options={'row': row, 'x_tmp': x_tmp, 'sigma_next': sigma_next})
                            ~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\custom_nodes\RES4LYF\beta\rk_method_beta.py", line 901, in __call__
    denoised = self.model_denoised(x.to(self.model_device), sub_sigma.to(self.model_device), **self.extra_args).to(sigma.device)
               ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\custom_nodes\RES4LYF\beta\rk_method_beta.py", line 241, in model_denoised
    denoised = self.model(x, sigma * s_in, **extra_args)
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 401, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 953, in __call__
    return self.outer_predict_noise(*args, **kwargs)
           ~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 960, in outer_predict_noise
    ).execute(x, timestep, model_options, seed)
      ~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 963, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 381, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 206, in calc_cond_batch
    return _calc_cond_batch_outer(model, conds, x_in, timestep, model_options)
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 214, in _calc_cond_batch_outer
    return executor.execute(model, conds, x_in, timestep, model_options)
           ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\samplers.py", line 326, in _calc_cond_batch
    output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
             ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\model_base.py", line 161, in apply_model
    return comfy.patcher_extension.WrapperExecutor.new_class_executor(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    ...<2 lines>...
        comfy.patcher_extension.get_all_wrappers(comfy.patcher_extension.WrappersMP.APPLY_MODEL, transformer_options)
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    ).execute(x, t, c_concat, c_crossattn, control, transformer_options, **kwargs)
    ~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 113, in execute
    return self.wrappers[self.idx](self, *args, **kwargs)
           ~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy_api\torch_helpers\torch_compile.py", line 26, in apply_torch_compile_wrapper
    return executor(*args, **kwargs)
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 105, in __call__
    return new_executor.execute(*args, **kwargs)
           ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\model_base.py", line 200, in _apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
                   ~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1773, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1784, in _call_impl
    return forward_call(*args, **kwargs)
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\ldm\wan\model.py", line 614, in forward
    return comfy.patcher_extension.WrapperExecutor.new_class_executor(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    ...<2 lines>...
        comfy.patcher_extension.get_all_wrappers(comfy.patcher_extension.WrappersMP.DIFFUSION_MODEL, transformer_options)
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    ).execute(x, timestep, context, clip_fea, time_dim_concat, transformer_options, **kwargs)
    ~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\ldm\wan\model.py", line 634, in _forward
    return self.forward_orig(x, timestep, context, clip_fea=clip_fea, freqs=freqs, transformer_options=transformer_options, **kwargs)[:, :, :t, :h, :w]
           ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\ComfyUI\comfy\ldm\wan\model.py", line 579, in forward_orig
    x = block(x, e=e0, freqs=freqs, context=context, context_img_len=context_img_len, transformer_options=transformer_options)
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_dynamo\eval_frame.py", line 375, in __call__
    return super().__call__(*args, **kwargs)
           ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1773, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1784, in _call_impl
    return forward_call(*args, **kwargs)
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_dynamo\eval_frame.py", line 749, in compile_wrapper
    raise e.remove_dynamo_frames() from None  # see TORCHDYNAMO_VERBOSE=1
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_dynamo\output_graph.py", line 1871, in _call_user_compiler
    raise BackendCompilerFailed(
        self.compiler_fn, e, inspect.currentframe()
    ).with_traceback(e.__traceback__) from None
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_dynamo\output_graph.py", line 1846, in _call_user_compiler
    compiled_fn = compiler_fn(gm, example_inputs)
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_dynamo\repro\after_dynamo.py", line 150, in __call__
    compiled_gm = compiler_fn(gm, example_inputs)
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\__init__.py", line 2380, in __call__
    return compile_fx(model_, inputs_, config_patches=self.config)
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_inductor\compile_fx.py", line 2418, in compile_fx
    return aot_autograd(
           ~~~~~~~~~~~~~
    ...<8 lines>...
        ignore_shape_env=ignore_shape_env,
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    )(model_, example_inputs_)
    ~^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_dynamo\backends\common.py", line 109, in __call__
    cg = aot_module_simplified(gm, example_inputs, **self.kwargs)
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_functorch\aot_autograd.py", line 1199, in aot_module_simplified
    compiled_fn = AOTAutogradCache.load(
        dispatch_and_compile,
    ...<6 lines>...
        remote,
    )
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_functorch\_aot_autograd\autograd_cache.py", line 1140, in load
    compiled_fn = dispatch_and_compile()
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_functorch\aot_autograd.py", line 1184, in dispatch_and_compile
    compiled_fn, _ = create_aot_dispatcher_function(
                     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
        functional_call,
        ^^^^^^^^^^^^^^^^
    ...<3 lines>...
        shape_env,
        ^^^^^^^^^^
    )
    ^
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_functorch\aot_autograd.py", line 576, in create_aot_dispatcher_function
    return _create_aot_dispatcher_function(
        flat_fn, fake_flat_args, aot_config, fake_mode, shape_env
    )
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_functorch\aot_autograd.py", line 836, in _create_aot_dispatcher_function
    compiled_fn, fw_metadata = compiler_fn(
                               ~~~~~~~~~~~^
        flat_fn,
        ^^^^^^^^
    ...<2 lines>...
        fw_metadata=fw_metadata,
        ^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_functorch\_aot_autograd\jit_compile_runtime_wrappers.py", line 245, in aot_dispatch_base
    compiled_fw = compiler(fw_module, updated_flat_args)
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_functorch\aot_autograd.py", line 483, in __call__
    return self.compiler_fn(gm, example_inputs)
           ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_inductor\compile_fx.py", line 2250, in fw_compiler_base
    return inner_compile(
        gm,
    ...<5 lines>...
        boxed_forward_device_index=forward_device,
    )
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_inductor\compile_fx.py", line 745, in compile_fx_inner
    return wrap_compiler_debug(_compile_fx_inner, compiler_name="inductor")(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
        gm,
        ^^^
        example_inputs,
        ^^^^^^^^^^^^^^^
        **kwargs,
        ^^^^^^^^^
    )
    ^
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_dynamo\repro\after_aot.py", line 124, in debug_wrapper
    inner_compiled_fn = compiler_fn(gm, example_inputs)
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_inductor\compile_fx.py", line 860, in _compile_fx_inner
    (key_info, cache_info) = FxGraphCache.prepare_key(
                             ~~~~~~~~~~~~~~~~~~~~~~~~^
        gm, example_inputs, graph_kwargs, inputs_to_check, remote
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_inductor\codecache.py", line 1474, in prepare_key
    key, debug_lines = compiled_fx_graph_hash(
                       ~~~~~~~~~~~~~~~~~~~~~~^
        gm, example_inputs, fx_kwargs, inputs_to_check
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_inductor\codecache.py", line 960, in compiled_fx_graph_hash
    details = FxGraphHashDetails(gm, example_inputs, fx_kwargs, inputs_to_check)
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_inductor\codecache.py", line 896, in __init__
    self.system_info = CacheBase.get_system()
                       ~~~~~~~~~~~~~~~~~~~~^^
  File "E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\torch\_inductor\codecache.py", line 205, in get_system
    from triton.compiler.compiler import triton_key
torch._dynamo.exc.BackendCompilerFailed: backend='inductor' raised:
ImportError: cannot import name 'triton_key' from 'triton.compiler.compiler' (E:\AIProjects\ComfyUI\python_embeded\Lib\site-packages\triton\compiler\compiler.py)

Set TORCHDYNAMO_VERBOSE=1 for the internal stack trace (please do this especially if you're reporting a bug to PyTorch). For even more developer context, set TORCH_LOGS="+dynamo"


2025-10-24T21:27:24.377349 - Prompt executed in 29.67 seconds
2025-10-24T21:27:24.597073 - 
  0%|                                                                                            | 0/4 [00:10<?, ?it/s]2025-10-24T21:27:24.597193 - 


## Attached Workflow
Please make sure that workflow does not contain any sensitive information such as API keys or passwords.

Workflow too large. Please manually upload the workflow from local file system.


## Additional Context
(Please add any additional context or steps to reproduce the error here)
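For anyone hitting the same error, here is a minimal standalone check that reproduces the failing import outside of ComfyUI. This is a diagnostic sketch, not a fix: the traceback shows PyTorch's inductor trying `from triton.compiler.compiler import triton_key`, so if this prints `False`, the installed Triton build does not expose `triton_key` where this PyTorch build expects it (i.e. a torch/Triton version mismatch).

```python
# Minimal standalone reproduction of the failing import from the traceback.
# Prints False if the installed Triton does not expose triton_key where
# this PyTorch build expects it (a torch/Triton version mismatch), or if
# Triton is not installed at all.
import importlib.util


def has_triton_key() -> bool:
    if importlib.util.find_spec("triton") is None:
        return False  # Triton is not installed at all
    try:
        from triton.compiler.compiler import triton_key  # noqa: F401
        return True
    except ImportError:
        return False  # Triton is present but triton_key is missing/moved


print("triton_key importable:", has_triton_key())
```

Run this with the same embedded interpreter ComfyUI uses (e.g. `python_embeded\python.exe`), otherwise it may report on a different environment than the one that is failing.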

MrBuubles007 avatar Oct 24 '25 19:10 MrBuubles007

Try Torch 2.8 and CUDA 12.8. Uninstall the Triton you have, `pip install triton-windows` instead, and build your SageAttention wheel against those versions. Should work; it did for me.

MoeMonsuta avatar Oct 26 '25 19:10 MoeMonsuta

Try Torch 2.8 and CUDA 12.8. Uninstall the Triton you have, `pip install triton-windows` instead, and build your SageAttention wheel against those versions. Should work; it did for me.

Did this; it didn't work. I made sure all installed packages/wheels matched the PyTorch and CUDA environment (2.8, 12.8).
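When checking whether everything matches, it helps to list the actual installed versions from the same interpreter ComfyUI runs, rather than relying on what was requested at install time. A small sketch (the package names queried here are illustrative; adjust them to what your environment actually uses):

```python
# Print installed versions of the relevant packages so torch/Triton/CUDA
# mismatches are visible at a glance. Package names are illustrative.
import importlib.metadata as metadata


def report(packages):
    lines = []
    for pkg in packages:
        try:
            lines.append(f"{pkg}: {metadata.version(pkg)}")
        except metadata.PackageNotFoundError:
            lines.append(f"{pkg}: not installed")
    return lines


for line in report(("torch", "triton", "triton-windows", "sageattention")):
    print(line)
```

If both `triton` and `triton-windows` show up as installed, that in itself can be the problem: the two can shadow each other, so uninstalling one of them before retrying is worth a shot.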

RobertAgee avatar Nov 14 '25 07:11 RobertAgee