
Missing Models and Custom Nodes in ComfyUI, including IP-Adapters (I would like to contribute and try to fix this)

Open · MiladZarour opened this issue 1 year ago · 11 comments

When using ComfyUI and running run_with_gpu.bat, importing a JSON workflow file may result in missing nodes. This is easily fixed by opening the manager and clicking "Install Missing Nodes," which checks for and installs the required nodes.

However, this functionality does not extend to missing models or IP-adapters.

To address this, I suggest implementing a feature that allows the installation of missing models and other components directly from the manager tab when importing a JSON file. This would streamline the process and ensure all necessary components are installed seamlessly.

I would like to contribute and try fixing this issue.
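
As a rough illustration of the idea (not an existing ComfyUI-Manager API), a first step could be to scan the exported workflow JSON for loader-node widget values that look like model filenames and compare them against what is on disk. The node-type-to-folder mapping below is an assumption for this sketch; a real implementation would use ComfyUI's folder_paths registry:

```python
import json
from pathlib import Path

# Hypothetical mapping from loader node types to the model folder they read from;
# a real implementation would derive this from ComfyUI's folder_paths registry.
LOADER_FOLDERS = {
    "CheckpointLoaderSimple": "checkpoints",
    "LoraLoader": "loras",
    "VAELoader": "vae",
    "IPAdapterModelLoader": "ipadapter",
}

MODEL_EXTENSIONS = (".safetensors", ".ckpt", ".pt", ".bin")

def find_missing_models(workflow_path: str, models_root: str = "models") -> list[str]:
    """Return model files referenced by the workflow that are not present on disk."""
    workflow = json.loads(Path(workflow_path).read_text(encoding="utf-8"))
    missing = []
    for node in workflow.get("nodes", []):
        folder = LOADER_FOLDERS.get(node.get("type"))
        if folder is None:
            continue
        # Loader nodes keep the selected filename among their widgets_values.
        for value in node.get("widgets_values", []):
            if isinstance(value, str) and value.endswith(MODEL_EXTENSIONS):
                if not (Path(models_root) / folder / value).exists():
                    missing.append(f"{folder}/{value}")
    return missing

if __name__ == "__main__":
    for entry in find_missing_models("workflow.json"):
        print("missing:", entry)
```

The manager tab could then surface this list next to "Install Missing Nodes" and offer to download each file.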

MiladZarour avatar May 26 '24 23:05 MiladZarour

Write the code, open a pull request.

shawnington avatar May 27 '24 01:05 shawnington

How do I start the main.py script? I am getting this error when I run it:

  File "D:\Fix_Comfyui\ComfyUI\comfy\model_management.py", line 89, in get_torch_device
    return torch.device(torch.cuda.current_device())
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Fix_Comfyui\ComfyUI\venv\Lib\site-packages\torch\cuda\__init__.py", line 778, in current_device
    _lazy_init()
  File "D:\Fix_Comfyui\ComfyUI\venv\Lib\site-packages\torch\cuda\__init__.py", line 284, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled

Process finished with exit code 1

I want to test my code; is there another way to run it?
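
That AssertionError generally means the torch wheel installed in the venv is a CPU-only build, not a problem in ComfyUI itself. A minimal check, as a sketch:

```python
import torch

# A CPU-only wheel reports a version like "2.3.0+cpu" and no CUDA runtime.
print("torch version:", torch.__version__)
print("built with CUDA:", torch.version.cuda)        # None on CPU-only builds
print("CUDA available:", torch.cuda.is_available())  # False until a CUDA-enabled wheel and driver are present
```

If the last line prints False, reinstalling torch from the CUDA wheel index on pytorch.org (e.g. the cu118 builds) is the usual fix.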

MiladZarour avatar May 27 '24 18:05 MiladZarour

Oh, CUDA wasn't installed. I think it should work now.

MiladZarour avatar May 27 '24 18:05 MiladZarour

To run main.py, I think I have to use the same command as run_nvidia_gpu.bat: python.exe -s ComfyUI\main.py --windows-standalone-build

MiladZarour avatar May 27 '24 18:05 MiladZarour

I installed CUDA, installed everything from requirements.txt, and ran main.py with D:\Fix_Comfyui\ComfyUI\venv\Scripts\python.exe -s D:\Fix_Comfyui\ComfyUI\main.py --windows-standalone-build, but I'm getting this error:

Traceback (most recent call last):
  File "D:\Fix_Comfyui\ComfyUI\main.py", line 79, in <module>
    import execution
  File "D:\Fix_Comfyui\ComfyUI\execution.py", line 11, in <module>
    import nodes
  File "D:\Fix_Comfyui\ComfyUI\nodes.py", line 21, in <module>
    import comfy.diffusers_load
  File "D:\Fix_Comfyui\ComfyUI\comfy\diffusers_load.py", line 3, in <module>
    import comfy.sd
  File "D:\Fix_Comfyui\ComfyUI\comfy\sd.py", line 5, in <module>
    from comfy import model_management
  File "D:\Fix_Comfyui\ComfyUI\comfy\model_management.py", line 120, in <module>
    total_vram = get_total_memory(get_torch_device()) / (1024 * 1024)
                                  ^^^^^^^^^^^^^^^^^^
  File "D:\Fix_Comfyui\ComfyUI\comfy\model_management.py", line 89, in get_torch_device
    return torch.device(torch.cuda.current_device())
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Fix_Comfyui\ComfyUI\venv\Lib\site-packages\torch\cuda\__init__.py", line 778, in current_device
    _lazy_init()
  File "D:\Fix_Comfyui\ComfyUI\venv\Lib\site-packages\torch\cuda\__init__.py", line 284, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled

and when running:

Milad@Milad MINGW64 /d/Fix_Comfyui/ComfyUI (master)
$ D:\\Fix_Comfyui\\ComfyUI\\venv\\Scripts\\python.exe D:\\Fix_Comfyui\\ComfyUI\\main.py
Traceback (most recent call last):
  File "D:\Fix_Comfyui\ComfyUI\main.py", line 79, in <module>
    import execution
  File "D:\Fix_Comfyui\ComfyUI\execution.py", line 11, in <module>
    import nodes
  File "D:\Fix_Comfyui\ComfyUI\nodes.py", line 21, in <module>
    import comfy.diffusers_load
  File "D:\Fix_Comfyui\ComfyUI\comfy\diffusers_load.py", line 3, in <module>
    import comfy.sd
  File "D:\Fix_Comfyui\ComfyUI\comfy\sd.py", line 5, in <module>
    from comfy import model_management
  File "D:\Fix_Comfyui\ComfyUI\comfy\model_management.py", line 120, in <module>
    total_vram = get_total_memory(get_torch_device()) / (1024 * 1024)
                                  ^^^^^^^^^^^^^^^^^^
  File "D:\Fix_Comfyui\ComfyUI\comfy\model_management.py", line 89, in get_torch_device
    return torch.device(torch.cuda.current_device())
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Fix_Comfyui\ComfyUI\venv\Lib\site-packages\torch\cuda\__init__.py", line 778, in current_device
    _lazy_init()
  File "D:\Fix_Comfyui\ComfyUI\venv\Lib\site-packages\torch\cuda\__init__.py", line 284, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled

MiladZarour avatar May 27 '24 18:05 MiladZarour

It works now!

D:\Fix_Comfyui\ComfyUI\venv\Scripts\python.exe D:\Fix_Comfyui\ComfyUI\main.py 
Total VRAM 8192 MB, total RAM 65365 MB
pytorch version: 2.3.0+cu118
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 3060 Ti : cudaMallocAsync
VAE dtype: torch.bfloat16
Using pytorch cross attention
D:\Fix_Comfyui\ComfyUI\comfy\extra_samplers\uni_pc.py:19: SyntaxWarning: invalid escape sequence '\h'
  """Create a wrapper class for the forward SDE (VP type).
****** User settings have been changed to be stored on the server instead of browser storage. ******
****** For multi-user setups add the --multi-user CLI argument to enable multiple user profiles. ******

Import times for custom nodes:
   0.0 seconds: D:\Fix_Comfyui\ComfyUI\custom_nodes\websocket_image_save.py

Starting server

To see the GUI go to: http://127.0.0.1:8188

Now I will start!

MiladZarour avatar May 27 '24 18:05 MiladZarour

And I realised that I actually forked the wrong ComfyUI repo 😁

I should have forked https://github.com/ltdrdata/ComfyUI-Manager instead, because I believe that is where the changes need to be made.

MiladZarour avatar May 27 '24 18:05 MiladZarour

I get the same error. How did you fix it?

haraeza avatar Jun 01 '24 14:06 haraeza

Same error here.

VantomPayne avatar Jun 01 '24 15:06 VantomPayne

Take a look at https://github.com/hiddenswitch/ComfyUI?tab=readme-ov-file#installing for better support for automatic model downloading.

doctorpangloss avatar Jun 09 '24 22:06 doctorpangloss

@MiladZarour We have some plans for supporting missing models. What do you think about this: https://github.com/comfyanonymous/ComfyUI/discussions/3717

robinjhuang avatar Jul 03 '24 20:07 robinjhuang

Closing due to inactivity. This should be a PR to ComfyUI-Manager.

robinjhuang avatar Jul 30 '24 00:07 robinjhuang

FYI, handling the installation of dependent models for nodes is also planned. However, this is not an issue that can be resolved simply by upgrading ComfyUI-Manager's functionality. Improvements are needed on the node side as well, such as custom nodes providing their own specifications for the models they depend on.
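
As a purely hypothetical illustration of such a specification (the field names below are not an existing ComfyUI-Manager convention), a custom node package could declare its model dependencies so that a manager can check and fetch them:

```python
from pathlib import Path

# Hypothetical per-node model dependency declaration; field names and URL are illustrative only.
MODEL_DEPENDENCIES = [
    {
        "name": "ip-adapter_sd15.safetensors",   # filename the node expects to load
        "save_to": "models/ipadapter",           # folder relative to the ComfyUI root
        "url": "https://example.com/ip-adapter_sd15.safetensors",  # placeholder download URL
        "sha256": "<expected-checksum>",         # optional integrity check
    },
]

def missing_dependencies(comfy_root: str) -> list[dict]:
    """Return the declared models that are not yet present under the ComfyUI root."""
    return [
        dep for dep in MODEL_DEPENDENCIES
        if not (Path(comfy_root) / dep["save_to"] / dep["name"]).exists()
    ]
```

A manager-side tool could then read these declarations from each installed custom node and offer an "Install Missing Models" action analogous to "Install Missing Nodes".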

ltdrdata avatar Jul 30 '24 08:07 ltdrdata