ImageBlur node breaks at blur_radius above 6
Running it with no custom nodes, just a simple blur workflow.
It works up to a blur_radius of 6.
At 7, ComfyUI simply crashes without an error message.
At 8 and above, execution gets stuck at the node for minutes, then errors out:
ERROR:root:!!! Exception during processing !!!
ERROR:root:Traceback (most recent call last):
File "D:\ComfyUI_windows_portable_nightly_pytorch\ComfyUI\execution.py", line 152, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable_nightly_pytorch\ComfyUI\execution.py", line 82, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable_nightly_pytorch\ComfyUI\execution.py", line 75, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ComfyUI_windows_portable_nightly_pytorch\ComfyUI\comfy_extras\nodes_post_processing.py", line 112, in blur
blurred = F.conv2d(padded_image, kernel, padding=kernel_size // 2, groups=channels)[:,:,blur_radius:-blur_radius, blur_radius:-blur_radius]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: vector too long
Afterwards, ImageBlur stays broken at any value until a restart. Strangely, the node works perfectly with the output from the "Convert Mask to Image" node.
What is the image resolution? When I take a 1024x1024 image, upscale it 4x with nearest-exact to 4096x4096, and blur it, the whole run takes 5.56 seconds. If I upscale to 8192x8192 instead, it takes 21.69 seconds to complete. It pretty much only used my 13600KF's E-cores for the operation, not even the P-cores (I actually find it kind of weird that the threading would prefer them over the real cores). And that's with the blur radius maxed out at 31.
What are your PC specs? There are a ton of variables in the mix, but it sounds like something system-related.
EDIT: I pushed it even further: I took that 8K image, doubled it to 16K, and ran the same radius-31 blur on it; it took 86.43 seconds. However, python.exe in Task Manager did peak at over 15 GB of RAM while doing it.
[rgthree] Using rgthree's optimized recursive execution.
Prompt executed in 5.65 seconds
got prompt
[rgthree] Using rgthree's optimized recursive execution.
Prompt executed in 21.69 seconds
got prompt
[rgthree] Using rgthree's optimized recursive execution.
Prompt executed in 86.43 seconds
It's 1024x1024, but it happens at 512, 768, any resolution. The memory usage also doesn't spike, so I don't think it's a performance problem. But if it works for you, then I assume it's a me problem. It looks like a problem with conv2d from torch. I don't know how to even start troubleshooting this.
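One way to start narrowing it down might be to run the same conv2d pattern outside ComfyUI with the portable install's own Python. This is only a sketch that mimics what the node does (gaussian kernel, reflect padding, depthwise conv2d, same crop), not the actual node code: try_blur is my own helper and the kernel construction just approximates the node's gaussian_kernel. If this also fails at blur_radius 7 and above, the problem is in the torch build rather than in ComfyUI itself.

import torch
import torch.nn.functional as F

def try_blur(height=1024, width=1024, channels=3, blur_radius=7, sigma=1.0):
    # Build a normalized gaussian kernel, one copy per channel for a depthwise conv.
    kernel_size = blur_radius * 2 + 1
    x = torch.arange(kernel_size, dtype=torch.float32) - kernel_size // 2
    g = torch.exp(-0.5 * (x / sigma) ** 2)
    kernel_2d = torch.outer(g, g) / torch.outer(g, g).sum()
    kernel = kernel_2d.repeat(channels, 1, 1).unsqueeze(1)  # (C, 1, k, k)

    # Same call shape as the failing line: reflect-pad, depthwise conv2d, crop the border back off.
    image = torch.rand(1, channels, height, width)
    padded = F.pad(image, (blur_radius,) * 4, mode='reflect')
    out = F.conv2d(padded, kernel, padding=kernel_size // 2, groups=channels)[:, :, blur_radius:-blur_radius, blur_radius:-blur_radius]
    print(f"blur_radius={blur_radius}: ok, output {tuple(out.shape)}")

for r in range(5, 10):
    try_blur(blur_radius=r)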
You might need to try a full manual update or something. Maybe try running update_comfyui_and_python_dependencies.bat in the ComfyUI_windows_portable\update folder. You might need to edit it in Notepad and add --force-reinstall on the line where you see --upgrade (put it right after, like --upgrade --force-reinstall) to make sure it removes/replaces any potentially broken files.
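For reference, the exact packages and paths in that line vary by portable build, so this is just an illustration of where the flag goes (<packages> is a placeholder, not the real contents). The pip line in the .bat looks something like

..\python_embeded\python.exe -s -m pip install --upgrade <packages> -r ../ComfyUI/requirements.txt

and the edit just turns --upgrade into --upgrade --force-reinstall on that same line.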
Also experiencing this as of today. Did a clean install of ComfyUI standalone, added some custom nodes to support a workflow, and a blur radius of 10 is causing it to hang.
Did a little test, and getting the same thing. A radius of 7 is crashing with no error, and 8+ is giving the "vector too long" error.
Did a completely clean install of comfyui, no custom nodes, and it's the same thing.
I seem to have fixed it for my install.
In the file comfy_extras/nodes_post_processing.py, change the order of the dimensions in the permute calls on both the image and blurred tensors:
line 110 from: image = image.permute(0, 3, 1, 2)
to: image = image.permute(0, 3, 2, 1)
and line 113 from: blurred = blurred.permute(0, 2, 3, 1)
to: blurred = blurred.permute(0, 3, 2, 1)
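For anyone else trying this, here is a quick standalone sanity check (shapes only, not the actual node code; the 512x768 size is just an example I picked) showing that the edited order still round-trips back to ComfyUI's (batch, height, width, channels) image layout. The practical difference is that conv2d now sees the tensor as (B, C, W, H) instead of (B, C, H, W).

import torch

image = torch.zeros(1, 512, 768, 3)   # ComfyUI IMAGE layout: (B, H, W, C)

# Original order: conv2d gets (B, C, H, W), the second permute restores (B, H, W, C).
x = image.permute(0, 3, 1, 2)          # (1, 3, 512, 768)
print(x.permute(0, 2, 3, 1).shape)     # torch.Size([1, 512, 768, 3])

# Edited order: conv2d gets (B, C, W, H), the second permute still restores (B, H, W, C).
y = image.permute(0, 3, 2, 1)          # (1, 3, 768, 512)
print(y.permute(0, 3, 2, 1).shape)     # torch.Size([1, 512, 768, 3])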
It's also worth noting that the sharpen function is crashing the same way the blur was, at values of 7 and higher. I haven't tried fixing that one yet, but looking at the numbers, I'm guessing it's a similar issue.