
!!! Exception during processing!!! Trying to convert Float8_e5m2 to the MPS backend but it does not have support for that dtype.

luohui1102 opened this issue 1 year ago · 4 comments

Your question

"Hello! I'm using an Apple M1 chip, and I'm encountering MPS issues when running many nodes. Are there any solutions?"

Logs

!!! Exception during processing!!! Trying to convert Float8_e5m2 to the MPS backend but it does not have support for that dtype.
Traceback (most recent call last):
  File "/Users/llh/pinokio/api/comfyui.git/app/execution.py", line 152, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "/Users/llh/pinokio/api/comfyui.git/app/execution.py", line 82, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "/Users/llh/pinokio/api/comfyui.git/app/execution.py", line 75, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "/Users/llh/pinokio/api/comfyui.git/app/comfy_extras/nodes_custom_sampler.py", line 612, in sample
    samples = guider.sample(noise.generate_noise(latent), latent_image, sampler, sigmas, denoise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=noise.seed)
  File "/Users/llh/pinokio/api/comfyui.git/app/comfy/samplers.py", line 716, in sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "/Users/llh/pinokio/api/comfyui.git/app/comfy/samplers.py", line 695, in inner_sample
    samples = sampler.sample(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
  File "/Users/llh/pinokio/api/comfyui.git/app/comfy/samplers.py", line 600, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
  File "/Users/llh/pinokio/api/comfyui.git/app/env/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/Users/llh/pinokio/api/comfyui.git/app/comfy/k_diffusion/sampling.py", line 143, in sample_euler
    denoised = model(x, sigma_hat * s_in, **extra_args)
  File "/Users/llh/pinokio/api/comfyui.git/app/comfy/samplers.py", line 299, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
  File "/Users/llh/pinokio/api/comfyui.git/app/comfy/samplers.py", line 682, in __call__
    return self.predict_noise(*args, **kwargs)
  File "/Users/llh/pinokio/api/comfyui.git/app/comfy/samplers.py", line 685, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
  File "/Users/llh/pinokio/api/comfyui.git/app/comfy/samplers.py", line 279, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
  File "/Users/llh/pinokio/api/comfyui.git/app/comfy/samplers.py", line 228, in calc_cond_batch
    output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
  File "/Users/llh/pinokio/api/comfyui.git/app/comfy/model_base.py", line 122, in apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
  File "/Users/llh/pinokio/api/comfyui.git/app/env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/Users/llh/pinokio/api/comfyui.git/app/env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "/Users/llh/pinokio/api/comfyui.git/app/comfy/ldm/flux/model.py", line 143, in forward
    out = self.forward_orig(img, img_ids, context, txt_ids, timestep, y, guidance)
  File "/Users/llh/pinokio/api/comfyui.git/app/comfy/ldm/flux/model.py", line 101, in forward_orig
    img = self.img_in(img)
  File "/Users/llh/pinokio/api/comfyui.git/app/env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/Users/llh/pinokio/api/comfyui.git/app/env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "/Users/llh/pinokio/api/comfyui.git/app/comfy/ops.py", line 63, in forward
    return self.forward_comfy_cast_weights(*args, **kwargs)
  File "/Users/llh/pinokio/api/comfyui.git/app/comfy/ops.py", line 58, in forward_comfy_cast_weights
    weight, bias = cast_bias_weight(self, input)
  File "/Users/llh/pinokio/api/comfyui.git/app/comfy/ops.py", line 39, in cast_bias_weight
    bias = cast_to(s.bias, dtype, device, non_blocking=non_blocking)
  File "/Users/llh/pinokio/api/comfyui.git/app/comfy/ops.py", line 24, in cast_to
    return weight.to(device=device, dtype=dtype, non_blocking=non_blocking)
TypeError: Trying to convert Float8_e5m2 to the MPS backend but it does not have support for that dtype.

Other

(WeChat screenshot attached)

luohui1102 avatar Aug 06 '24 14:08 luohui1102
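
For anyone triaging this: the last frame of the traceback is a plain tensor cast (weight.to(device=device, dtype=dtype, ...) in comfy/ops.py), so the failure can be reproduced outside ComfyUI. A minimal sketch, assuming an Apple Silicon Mac and a torch build without fp8 support on MPS:

import torch

# float8_e5m2 tensors can be created on CPU, but the MPS backend in the
# torch builds discussed here cannot hold that dtype, so the cast raises
# the same TypeError shown in the log above.
w = torch.zeros(4, dtype=torch.float8_e5m2)
try:
    w.to(device="mps")  # mirrors cast_to() in comfy/ops.py
except TypeError as e:
    print(e)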

In the Load Diffusion Model node, try changing the weight_dtype to default (16-bit).

salinas707 avatar Aug 06 '24 14:08 salinas707
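
This works because with weight_dtype left on default, the weights stay in fp16/bf16, which MPS supports, instead of being cast to an fp8 dtype it cannot represent. A hypothetical guard along the same lines (an illustrative sketch, not ComfyUI's actual code; safe_weight_dtype is a made-up name):

import torch

def safe_weight_dtype(requested: torch.dtype, device: torch.device) -> torch.dtype:
    # Fall back to float16 when an fp8 dtype is requested on MPS,
    # which has no fp8 support in these torch builds.
    fp8_dtypes = {torch.float8_e5m2, torch.float8_e4m3fn}
    if device.type == "mps" and requested in fp8_dtypes:
        return torch.float16
    return requested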

Thank you very much, it has been resolved.

luohui1102 avatar Aug 07 '24 06:08 luohui1102

Thank you very much, it has been resolved.

How much unified memory do you have? Thanks

bharattrader avatar Aug 07 '24 08:08 bharattrader

In the Load Diffusion Model node, try changing the weight_dtype to default (16-bit).

I chose default instead of fp8_e4m3fn and got this (see screenshot below). I'm on an M2 Mac with 128 GB of RAM.

(screenshot attached: first flux test_00001_)

rachelcenter avatar Aug 25 '24 19:08 rachelcenter

@rachelcenter Update to the newest torch nightlies. There was a bug that crept in after torch 2.3.1.

Adreitz avatar Sep 18 '24 02:09 Adreitz
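
To check whether your install predates the fix (the exact nightly that resolved it isn't stated in this thread), a quick sanity check:

import torch

# Per the comment above, the bug crept in after torch 2.3.1, so a nightly
# (a "2.x.y.devYYYYMMDD" version string) or a newer stable release is what
# you want here.
print(torch.__version__)
print(torch.backends.mps.is_available())  # True on Apple Silicon with MPS enabled

Nightlies for macOS can be installed with the --pre index from pytorch.org (check the site for the current command).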

Thank you very much, it has been resolved.

How did you solve it? I changed it to default and still get the error.

bieya2024 avatar Feb 24 '25 06:02 bieya2024

This issue is being marked stale because it has not had any activity for 30 days. Reply below within 7 days if your issue still isn't solved, and it will be left open. Otherwise, the issue will be closed automatically.

github-actions[bot] avatar Apr 01 '25 11:04 github-actions[bot]