[SOLVED] split_torch_state_dict_into_shards
bash -x play.sh
+ '[' '!' -f runtime/envs/koboldai/bin/python ']'
+ bin/micromamba run -r runtime -n koboldai python aiserver.py
Traceback (most recent call last):
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1076, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 843, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/transformers/modeling_utils.py", line 78, in <module>
    from accelerate import __version__ as accelerate_version
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/accelerate/__init__.py", line 16, in <module>
    from .accelerator import Accelerator
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/accelerate/accelerator.py", line 34, in <module>
    from huggingface_hub import split_torch_state_dict_into_shards
ImportError: cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub' (/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/huggingface_hub/__init__.py)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "aiserver.py", line 58, in <module>
    from utils import debounce
  File "/opt/koboldai-client/utils.py", line 12, in <module>
    from transformers import PreTrainedModel
  File "<frozen importlib._bootstrap>", line 1039, in _handle_fromlist
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1066, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1078, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.modeling_utils because of the following error (look up to see its traceback):
cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub' (/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/huggingface_hub/__init__.py)
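The root cause is a version mismatch: the installed accelerate release imports split_torch_state_dict_into_shards, which only exists in newer huggingface_hub releases, so the pinned older copy in the runtime env fails the import. A naive sketch of that version gate (the 0.23.0 threshold here is my assumption for illustration, not something this thread confirms):

```python
def version_at_least(installed: str, required: str) -> bool:
    """Naive dotted-version comparison (no pre-release handling)."""
    parse = lambda v: tuple(int(p) for p in v.split("."))
    return parse(installed) >= parse(required)

# Hypothetical threshold - the thread only confirms that *upgrading* fixes it.
print(version_at_least("0.16.4", "0.23.0"))  # an old pin like this fails the check
```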
FIX:
./bin/micromamba run -r runtime -n koboldai pip install --upgrade huggingface_hub
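If you want to confirm the upgrade actually took effect before relaunching, a quick check like this (run with the same env's python) reports whether the installed huggingface_hub now exposes the symbol named in the traceback:

```python
import importlib.util

def has_shard_helper() -> bool:
    """Return True if the installed huggingface_hub exposes
    split_torch_state_dict_into_shards (missing on older releases)."""
    if importlib.util.find_spec("huggingface_hub") is None:
        return False  # package not installed at all
    import huggingface_hub
    return hasattr(huggingface_hub, "split_torch_state_dict_into_shards")

print(has_shard_helper())
```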
Where should this command be run?
I'm not sure about the command he mentioned. I went down a similar path.
- Open command prompt
- Navigate to the directory with KoboldAI installed via CD (e.g. CD C:\Program Files (x86)\KoboldAI)
- Run miniconda3\condabin\activate - that will run command prompt with the miniconda context
- Type pip install --upgrade huggingface_hub
This fixed the issue for me
Or you can try this :
- Launch commandline.bat or commandline.sh (depending on your OS)
- Execute pip install --upgrade huggingface_hub
- Relaunch KoboldAI
I tried this one and it worked for me, because I don't have a condabin folder in my miniconda3 folder.
For AMD GPUs:
Fixed by running ./bin/micromamba run -r runtime -n koboldai-rocm pip install --upgrade huggingface_hub
I do recommend checking out KoboldCpp, which is a much newer project with much more modern capabilities. It supports AMD GPUs well.