ComfyUI-DiffusersStableCascade
Issue on Mac M3: BFloat16 is not supported on MPS
ERROR:root:!!! Exception during processing !!!
ERROR:root:Traceback (most recent call last):
File "/Users/dfl/sd/ComfyUI/execution.py", line 152, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/dfl/sd/ComfyUI/execution.py", line 82, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/dfl/sd/ComfyUI/execution.py", line 75, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/dfl/sd/ComfyUI/custom_nodes/ComfyUI-DiffusersStableCascade/nodes.py", line 44, in process
self.prior = StableCascadePriorPipeline.from_pretrained("stabilityai/stable-cascade-prior", torch_dtype=torch.bfloat16).to(device)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/dfl/sd/ComfyUI/venv/src/diffusers/src/diffusers/pipelines/pipeline_utils.py", line 862, in to
module.to(device, dtype)
File "/Users/dfl/sd/ComfyUI/venv/lib/python3.11/site-packages/transformers/modeling_utils.py", line 2595, in to
return super().to(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/dfl/sd/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1160, in to
return self._apply(convert)
^^^^^^^^^^^^^^^^^^^^
File "/Users/dfl/sd/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 810, in _apply
module._apply(fn)
File "/Users/dfl/sd/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 810, in _apply
module._apply(fn)
File "/Users/dfl/sd/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 810, in _apply
module._apply(fn)
File "/Users/dfl/sd/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 833, in _apply
param_applied = fn(param)
^^^^^^^^^
File "/Users/dfl/sd/ComfyUI/venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1158, in convert
return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: BFloat16 is not supported on MPS
I start ComfyUI with this script:
PYTORCH_ENABLE_MPS_FALLBACK=1
./venv/bin/python main.py --force-fp16
I have tried various permutations of these flags, to no effect.
Same on an M2 Ultra.
Same here: https://github.com/kijai/ComfyUI-DiffusersStableCascade/issues/6#issuecomment-1947101857
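
A note on a possible workaround, as a sketch rather than a confirmed fix: PYTORCH_ENABLE_MPS_FALLBACK only falls back to CPU for operators missing on MPS, and --force-fp16 only affects ComfyUI's own model loading, so neither changes the torch_dtype=torch.bfloat16 that nodes.py passes to from_pretrained. Since MPS rejects bfloat16 outright, picking the dtype per device (float16 on MPS) should avoid the TypeError; whether Stable Cascade then produces good results in fp16 on MPS is a separate question. The device/dtype selection below is hypothetical, not the node's actual code:

import torch
from diffusers import StableCascadePriorPipeline

# Choose bfloat16 where it is supported, float16 on MPS (which raises
# "BFloat16 is not supported on MPS"), and float32 on plain CPU.
if torch.cuda.is_available():
    device, dtype = "cuda", torch.bfloat16
elif torch.backends.mps.is_available():
    device, dtype = "mps", torch.float16
else:
    device, dtype = "cpu", torch.float32

prior = StableCascadePriorPipeline.from_pretrained(
    "stabilityai/stable-cascade-prior", torch_dtype=dtype
).to(device)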