
Comfy UI - RuntimeError: Failed to import transformers.configuration_utils

gateway opened this issue 1 year ago · 11 comments

Seems like a new issue?

Fired up a Paperspace server on an RTX 5000 and am now getting an error; detailed logs are below. This has been working for months, minus some tweaks for a missing URL.

```
## ComfyUI-Manager: installing dependencies done.
** ComfyUI startup time: 2025-01-15 18:42:19.906506
** Platform: Linux
** Python version: 3.11.7 (main, Dec  8 2023, 18:56:58) [GCC 11.4.0]
** Python executable: /usr/local/bin/python
** ComfyUI Path: /notebooks/ComfyUI
** Log path: /notebooks/comfyui.log

Prestartup times for custom nodes:
   0.0 seconds: /notebooks/ComfyUI/custom_nodes/rgthree-comfy
   3.7 seconds: /notebooks/ComfyUI/custom_nodes/ComfyUI-Manager

Total VRAM 16117 MB, total RAM 30068 MB
pytorch version: 2.1.1+cu121
xformers version: 0.0.23
Set vram state to: NORMAL_VRAM
Device: cuda:0 Quadro RTX 5000 : cudaMallocAsync
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1817, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/usr/local/lib/python3.11/dist-packages/transformers/integrations/ggml.py", line 24, in <module>
    from tokenizers import Tokenizer, decoders, normalizers, pre_tokenizers, processors
  File "/usr/local/lib/python3.11/dist-packages/tokenizers/__init__.py", line 78, in <module>
    from .tokenizers import (
ImportError: libssl-cd1d6220.so.1.0.2k: cannot open shared object file: No such file or directory

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1817, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/usr/local/lib/python3.11/dist-packages/transformers/configuration_utils.py", line 29, in <module>
    from .modeling_gguf_pytorch_utils import load_gguf_checkpoint
  File "/usr/local/lib/python3.11/dist-packages/transformers/modeling_gguf_pytorch_utils.py", line 23, in <module>
    from .integrations import (
  File "<frozen importlib._bootstrap>", line 1229, in _handle_fromlist
  File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1805, in __getattr__
    module = self._get_module(self._class_to_module[name])
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1819, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.integrations.ggml because of the following error (look up to see its traceback):
libssl-cd1d6220.so.1.0.2k: cannot open shared object file: No such file or directory

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/notebooks/ComfyUI/main.py", line 136, in <module>
    import execution
  File "/notebooks/ComfyUI/execution.py", line 13, in <module>
    import nodes
  File "/notebooks/ComfyUI/nodes.py", line 22, in <module>
    import comfy.diffusers_load
  File "/notebooks/ComfyUI/comfy/diffusers_load.py", line 3, in <module>
    import comfy.sd
  File "/notebooks/ComfyUI/comfy/sd.py", line 10, in <module>
    from .ldm.cascade.stage_c_coder import StageC_coder
  File "/notebooks/ComfyUI/comfy/ldm/cascade/stage_c_coder.py", line 19, in <module>
    import torchvision
  File "/usr/local/lib/python3.11/dist-packages/torchvision/__init__.py", line 6, in <module>
    from torchvision import _meta_registrations, datasets, io, models, ops, transforms, utils
  File "/usr/local/lib/python3.11/dist-packages/torchvision/models/__init__.py", line 2, in <module>
    from .convnext import *
  File "/usr/local/lib/python3.11/dist-packages/torchvision/models/convnext.py", line 8, in <module>
    from ..ops.misc import Conv2dNormActivation, Permute
  File "/usr/local/lib/python3.11/dist-packages/torchvision/ops/__init__.py", line 23, in <module>
    from .poolers import MultiScaleRoIAlign
  File "/usr/local/lib/python3.11/dist-packages/torchvision/ops/poolers.py", line 10, in <module>
    from .roi_align import roi_align
  File "/usr/local/lib/python3.11/dist-packages/torchvision/ops/roi_align.py", line 4, in <module>
    import torch._dynamo
  File "/usr/local/lib/python3.11/dist-packages/torch/_dynamo/__init__.py", line 2, in <module>
    from . import allowed_functions, convert_frame, eval_frame, resume_execution
  File "/usr/local/lib/python3.11/dist-packages/torch/_dynamo/convert_frame.py", line 46, in <module>
    from .output_graph import OutputGraph
  File "/usr/local/lib/python3.11/dist-packages/torch/_dynamo/output_graph.py", line 35, in <module>
    from . import config, logging as torchdynamo_logging, variables
  File "/usr/local/lib/python3.11/dist-packages/torch/_dynamo/variables/__init__.py", line 53, in <module>
    from .torch import TorchVariable
  File "/usr/local/lib/python3.11/dist-packages/torch/_dynamo/variables/torch.py", line 131, in <module>
    transformers.configuration_utils.PretrainedConfig.__eq__ = (
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1808, in __getattr__
    value = self._get_module(name)
            ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1819, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.configuration_utils because of the following error (look up to see its traceback):
Failed to import transformers.integrations.ggml because of the following error (look up to see its traceback):
libssl-cd1d6220.so.1.0.2k: cannot open shared object file: No such file or directory
paperspace/gradient-base:pt211-tf215-cudatk120-py311-20240202
```

gateway · Jan 15 '25 19:01

Does the issue persist?

TheLastBen · Jan 16 '25 20:01

> Does the issue persist?

Yeah, I just tried it again. Not sure what's going on.

gateway · Jan 16 '25 20:01

Try a fresh notebook.

TheLastBen · Jan 17 '25 03:01

Chiming in with the same issue; a fresh notebook didn't solve it. It only happens when I install Hunyuan.

willmurdoch · Jan 17 '25 20:01

Yeah, didn't work for me either. I don't have Hunyuan installed, just FYI.

gateway · Jan 17 '25 21:01

Anyone figure this out?

gateway · Jan 18 '25 21:01

Did you try a different machine?

TheLastBen · Jan 19 '25 13:01

> Did you try a different machine?

Yup, tried an A6000 and an RTX 5000.

willmurdoch · Jan 19 '25 15:01

Uninstalling tokenizers and reinstalling transformers seems to solve the issue:

```shell
pip uninstall tokenizers
pip install transformers -U
```

gayafhannah · Jan 20 '25 20:01

> Uninstalling tokenizers and reinstalling transformers seems to solve the issue:
>
> ```shell
> pip uninstall tokenizers
> pip install transformers -U
> ```

I can confirm this works for me. This is what I have in the install/update ComfyUI code block:

```shell
!pip uninstall tokenizers -y
!pip install transformers
```

The `-y` flag just answers yes to the uninstall confirmation prompt automatically.

gateway · Jan 20 '25 20:01

Worked for me as well!

willmurdoch · Jan 21 '25 04:01