
Can't install VLM nodes in ComfyUI

Status: Open · RamonGuthrie opened this issue 10 months ago · 15 comments

This seems to be the error message I'm getting. I hope it makes sense:

```
All packages from requirements.txt are installed and up to date.
llama-cpp installed
Missing or outdated packages: llama-cpp-agent, mkdocs, mkdocs-material, mkdocstrings[python], docstring-parser
Installing/Updating missing packages...
Collecting llama-cpp-agent
  Using cached llama_cpp_agent-0.0.24-py3-none-any.whl.metadata (22 kB)
Collecting mkdocs
  Using cached mkdocs-1.5.3-py3-none-any.whl.metadata (6.2 kB)
Collecting mkdocs-material
  Using cached mkdocs_material-9.5.17-py3-none-any.whl.metadata (17 kB)
Collecting docstring-parser
  Using cached docstring_parser-0.16-py3-none-any.whl.metadata (3.0 kB)
Collecting mkdocstrings[python]
  Using cached mkdocstrings-0.24.3-py3-none-any.whl.metadata (7.6 kB)
Collecting llama-cpp-python>=0.2.60 (from llama-cpp-agent)
  Using cached llama_cpp_python-0.2.61.tar.gz (37.4 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
ERROR: Exception:
Traceback (most recent call last):
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\cli\base_command.py", line 180, in exc_logging_wrapper
    status = run_func(*args)
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\cli\req_command.py", line 245, in wrapper
    return func(self, options, args)
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\commands\install.py", line 377, in run
    requirement_set = resolver.resolve(
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\resolution\resolvelib\resolver.py", line 95, in resolve
    result = self._result = resolver.resolve(
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 546, in resolve
    state = resolution.resolve(requirements, max_rounds=max_rounds)
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 427, in resolve
    failure_causes = self._attempt_to_pin_criterion(name)
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 239, in _attempt_to_pin_criterion
    criteria = self._get_updated_criteria(candidate)
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 230, in _get_updated_criteria
    self._add_to_criteria(criteria, requirement, parent=candidate)
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 173, in _add_to_criteria
    if not criterion.candidates:
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_vendor\resolvelib\structs.py", line 156, in __bool__
    return bool(self._sequence)
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\resolution\resolvelib\found_candidates.py", line 155, in __bool__
    return any(self)
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\resolution\resolvelib\found_candidates.py", line 143, in <genexpr>
    return (c for c in iterator if id(c) not in self._incompatible_ids)
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\resolution\resolvelib\found_candidates.py", line 47, in _iter_built
    candidate = func()
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\resolution\resolvelib\factory.py", line 182, in _make_candidate_from_link
    base: Optional[BaseCandidate] = self._make_base_candidate_from_link(
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\resolution\resolvelib\factory.py", line 228, in _make_base_candidate_from_link
    self._link_candidate_cache[link] = LinkCandidate(
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 290, in __init__
    super().__init__(
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 156, in __init__
    self.dist = self._prepare()
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 222, in _prepare
    dist = self._prepare_distribution()
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 301, in _prepare_distribution
    return preparer.prepare_linked_requirement(self._ireq, parallel_builds=True)
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\operations\prepare.py", line 525, in prepare_linked_requirement
    return self._prepare_linked_requirement(req, parallel_builds)
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\operations\prepare.py", line 640, in _prepare_linked_requirement
    dist = _get_prepared_distribution(
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\operations\prepare.py", line 71, in _get_prepared_distribution
    abstract_dist.prepare_distribution_metadata(
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\distributions\sdist.py", line 54, in prepare_distribution_metadata
    self._install_build_reqs(finder)
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\distributions\sdist.py", line 124, in _install_build_reqs
    build_reqs = self._get_build_requires_wheel()
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\distributions\sdist.py", line 101, in _get_build_requires_wheel
    return backend.get_requires_for_build_wheel()
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_internal\utils\misc.py", line 745, in get_requires_for_build_wheel
    return super().get_requires_for_build_wheel(config_settings=cs)
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_vendor\pyproject_hooks\_impl.py", line 166, in get_requires_for_build_wheel
    return self._call_hook('get_requires_for_build_wheel', {
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_vendor\pyproject_hooks\_impl.py", line 321, in _call_hook
    raise BackendUnavailable(data.get('traceback', ''))
pip._vendor.pyproject_hooks._impl.BackendUnavailable: Traceback (most recent call last):
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\python_embeded\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 77, in _build_backend
    obj = import_module(mod_path)
  File "importlib\__init__.py", line 126, in import_module
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1126, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1140, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'scikit_build_core'
```

```
Traceback (most recent call last):
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\ComfyUI\nodes.py", line 1864, in load_custom_node
    module_spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\ComfyUI\custom_nodes\ComfyUI_VLM_nodes\__init__.py", line 46, in <module>
    check_requirements_installed(llama_cpp_agent_path)
  File "D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\ComfyUI\custom_nodes\ComfyUI_VLM_nodes\__init__.py", line 35, in check_requirements_installed
    subprocess.check_call([sys.executable, '-s', '-m', 'pip', 'install', *missing_packages])
  File "subprocess.py", line 413, in check_call
subprocess.CalledProcessError: Command '['D:\\Stable_Diffusion\\ComfyUI_windows_portable_nightly_pytorch\\python_embeded\\python.exe', '-s', '-m', 'pip', 'install', 'llama-cpp-agent', 'mkdocs', 'mkdocs-material', 'mkdocstrings[python]', 'docstring-parser']' returned non-zero exit status 2.

Cannot import D:\Stable_Diffusion\ComfyUI_windows_portable_nightly_pytorch\ComfyUI\custom_nodes\ComfyUI_VLM_nodes module for custom nodes: Command '['D:\\Stable_Diffusion\\ComfyUI_windows_portable_nightly_pytorch\\python_embeded\\python.exe', '-s', '-m', 'pip', 'install', 'llama-cpp-agent', 'mkdocs', 'mkdocs-material', 'mkdocstrings[python]', 'docstring-parser']' returned non-zero exit status 2.
```
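The root failure in that log is the last line of the pip traceback: pip fell back to building llama-cpp-python 0.2.61 from source, and the `scikit_build_core` build backend it wanted wasn't importable in the embedded Python. A minimal sketch of one possible workaround, assuming the same portable layout as above (the prebuilt-wheel route suggested below avoids the source build, and therefore this problem, entirely):

```
:: run from the ComfyUI portable root; installs the build backend
:: that llama-cpp-python's source build expects
python_embeded\python.exe -s -m pip install scikit-build-core
```

Note that even with the backend installed, a source build of llama-cpp-python still needs a working C/C++ toolchain on Windows.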

RamonGuthrie · Apr 10 '24 11:04

I have the same issue.

MattMosquito · Apr 10 '24 12:04

Install llama_cpp_python using a .whl file from https://github.com/abetlen/llama-cpp-python/releases. Choose the prebuilt wheel that matches the CUDA version of your ComfyUI install.
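For example, with the portable build's embedded Python (a sketch only; the URL below is the CPU-only wheel referenced later in this thread, so pick the file that matches your Python and CUDA version from the releases page):

```
python_embeded\python.exe -m pip install https://github.com/abetlen/llama-cpp-python/releases/download/v0.2.61/llama_cpp_python-0.2.61-cp311-cp311-win_amd64.whl
```

Run it from the ComfyUI portable folder so the relative `python_embeded` path resolves.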

zmwv823 · Apr 10 '24 12:04

How do I find out my CUDA version?

omar92 · Apr 17 '24 03:04

Look at the console log when you start ComfyUI. *(screenshot)*

zmwv823 · Apr 17 '24 04:04

It says this: *(screenshot)*

omar92 · Apr 17 '24 04:04

My UI is different for some reason.

omar92 · Apr 17 '24 04:04

That's the error I keep getting: *(screenshot)*

omar92 · Apr 17 '24 04:04

Check this folder. *(screenshot)*

My ComfyUI env:

  • torch 2.2.2 + CUDA 12.1

zmwv823 · Apr 17 '24 09:04

> That's the error I keep getting: *(screenshot)*

Or try installing with this .whl file (without CUDA):

  • https://github.com/abetlen/llama-cpp-python/releases/download/v0.2.61/llama_cpp_python-0.2.61-cp311-cp311-win_amd64.whl
  • Command (from my laptop; adjust the paths for your machine, and see the verification check below): `D:\AI\ComfyUI_windows_portable\python_embeded\python.exe -m pip install D:\AI\环境\CU121\llama_cpp_python-0.2.61-cp311-cp311-win_amd64.whl`
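After installing, it may be worth confirming the package landed in the embedded environment rather than a system Python. A hedged check, reusing the same example path:

```
D:\AI\ComfyUI_windows_portable\python_embeded\python.exe -m pip show llama-cpp-python
```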

zmwv823 · Apr 17 '24 09:04

Thanks so much!

omar92 · Apr 17 '24 22:04

I had this problem and had to make sure the Visual Studio C++ build tools were installed in Windows; then it worked.

GalaxyTimeMachine · Apr 19 '24 09:04

> How do I find out my CUDA version?

Type this command into your command prompt (Windows CLI):

```
nvcc --version
```
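If `nvcc` isn't found (the CUDA toolkit isn't always installed even when the GPU driver is), `nvidia-smi` prints the highest CUDA version the installed driver supports in its header:

```
nvidia-smi
```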

ace2duce · May 08 '24 04:05

Seems like I have a similar problem after updating to the nightly Comfy portable. I can find the .whl file and download it, but where do I put it? Or is it a different problem? Here is my output from the terminal:

```
Installing llama-cpp-python...
Looking in indexes: https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/AVX/cu124
ERROR: Could not find a version that satisfies the requirement llama-cpp-python (from versions: none)
ERROR: No matching distribution found for llama-cpp-python
Traceback (most recent call last):
  File "C:\AI\01_Comfy_nightly\ComfyUI\nodes.py", line 1879, in load_custom_node
    module_spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 995, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "C:\AI\01_Comfy_nightly\ComfyUI\custom_nodes\ComfyUI_VLM_nodes\__init__.py", line 44, in <module>
    install_llama(system_info)
  File "C:\AI\01_Comfy_nightly\ComfyUI\custom_nodes\ComfyUI_VLM_nodes\install_init.py", line 111, in install_llama
    install_package("llama-cpp-python", custom_command=custom_command)
  File "C:\AI\01_Comfy_nightly\ComfyUI\custom_nodes\ComfyUI_VLM_nodes\install_init.py", line 91, in install_package
    subprocess.check_call(command)
  File "subprocess.py", line 413, in check_call
subprocess.CalledProcessError: Command '['C:\\AI\\01_Comfy_nightly\\python_embeded\\python.exe', '-m', 'pip', 'install', 'llama-cpp-python', '--no-cache-dir', '--force-reinstall', '--no-deps', '--index-url=https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/AVX/cu124']' returned non-zero exit status 1.

Cannot import C:\AI\01_Comfy_nightly\ComfyUI\custom_nodes\ComfyUI_VLM_nodes module for custom nodes: Command '['C:\\AI\\01_Comfy_nightly\\python_embeded\\python.exe', '-m', 'pip', 'install', 'llama-cpp-python', '--no-cache-dir', '--force-reinstall', '--no-deps', '--index-url=https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/AVX/cu124']' returned non-zero exit status 1.
```

Pedroman1 · May 22 '24 05:05

> Seems like I have a similar problem after updating to the nightly Comfy portable. I can find the .whl file and download it, but where do I put it? [snip]

Don't compile it yourself; use the precompiled .whl file. Two ways (examples; adjust the paths for your PC):

  • `C:\AI\01_Comfy_nightly\python_embeded\python.exe -m pip install C:/llama_cpp_python-0.2.75-cp311-cp311-win_amd64.whl` (download the wheel from the llama-cpp-python releases page linked above; see the version check below)
  • Extract all the files inside the .whl into C:\AI\01_Comfy_nightly\python_embeded\Lib\site-packages (not recommended!!! the wheel's requirements won't be installed automatically)

*(screenshot)*
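One more check before picking a wheel: the `cp311` tag in the filename has to match the embedded interpreter (cp311 means CPython 3.11). Using the example path from above:

```
C:\AI\01_Comfy_nightly\python_embeded\python.exe --version
```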

zmwv823 · May 22 '24 06:05

> Install llama_cpp_python using a .whl file from https://github.com/abetlen/llama-cpp-python/releases. Choose the prebuilt wheel that matches the CUDA version of your ComfyUI install.

That was very easy to do and got it installed. Thank you.

gseth · Aug 16 '24 21:08