
[BUG/Help] ImportError: cannot import name 'GENERATION_CONFIG_NAME' from 'transformers.utils'

Open · cloudhzc opened this issue 2 years ago

Is there an existing issue for this?

  • [X] I have searched the existing issues

Current Behavior

model = AutoModel.from_pretrained("THUDM/chatglm-6b-int4", trust_remote_code=True).float()

Explicitly passing a revision is encouraged when loading a configuration with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a revision is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\mina_\Anaconda3\envs\ChatGLM-6B\lib\site-packages\transformers\models\auto\auto_factory.py", line 456, in from_pretrained
    logger.warning(
  File "C:\Users\mina_\Anaconda3\envs\ChatGLM-6B\lib\site-packages\transformers\dynamic_module_utils.py", line 374, in get_class_from_dynamic_module
  File "C:\Users\mina_\Anaconda3\envs\ChatGLM-6B\lib\site-packages\transformers\dynamic_module_utils.py", line 147, in get_class_in_module
    def get_class_in_module(class_name, module_path):
  File "C:\Users\mina_\Anaconda3\envs\ChatGLM-6B\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "C:\Users\mina_/.cache\huggingface\modules\transformers_modules\THUDM\chatglm-6b-int4\dac03c3ac833dab2845a569a9b7f6ac4e8c5dc9b\modeling_chatglm.py", line 30, in <module>
    from transformers.generation.utils import LogitsProcessorList, StoppingCriteriaList, GenerationConfig
  File "C:\Users\mina_\Anaconda3\envs\ChatGLM-6B\lib\site-packages\transformers\generation\utils.py", line 39, in <module>
    from .configuration_utils import GenerationConfig
  File "C:\Users\mina_\Anaconda3\envs\ChatGLM-6B\lib\site-packages\transformers\generation\configuration_utils.py", line 24, in <module>
    from ..utils import (
ImportError: cannot import name 'GENERATION_CONFIG_NAME' from 'transformers.utils' (C:\Users\mina_\Anaconda3\envs\ChatGLM-6B\lib\site-packages\transformers\utils\__init__.py)

Expected Behavior

No response

Steps To Reproduce

  1. conda activate chatglm-6b
  2. from transformers import AutoTokenizer, AutoModel
  3. tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
  4. model = AutoModel.from_pretrained("THUDM/chatglm-6b-int4", trust_remote_code=True).float()
  5. The ImportError above is raised (a consolidated script is shown below).
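
The same steps as a single script, assuming the chatglm-6b conda environment is active:

from transformers import AutoTokenizer, AutoModel

# Loading the tokenizer succeeds; the ImportError is raised while the remote
# modeling_chatglm.py is imported to resolve the model class.
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b-int4", trust_remote_code=True).float()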

Environment

- OS: Windows 10
- Python: 3.7.5
- Transformers:
- PyTorch:
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) : False

Anything else?

No response

cloudhzc avatar Mar 24 '23 10:03 cloudhzc

What's your transformers version?
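
For reference, it can be printed with a one-liner:

python -c "import transformers; print(transformers.__version__)"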

duzx16 avatar Mar 26 '23 03:03 duzx16

I had the same problem.

- transformers: 4.26.1
- protobuf: 3.20.0
- icetk: 0.0.4
- cpm-kernels: 1.0.11
- torch: 2.0.0
- gradio: 3.23.0

helloapple1 avatar Mar 27 '23 09:03 helloapple1

I encountered the same bug. The relevant part of the traceback:

/home/mist/.cache/huggingface/modules/transformers_modules/THUDM/chatglm-6b/4a9b711e61d62b64ae8a07d763553a98a984d281/modeling_chatglm.py:31 in <module>

   28 from transformers.modeling_utils import PreTrainedModel
   29 from transformers.utils import logging
   30 from transformers.generation.logits_process import LogitsProcessor
 ❱ 31 from transformers.generation.utils import LogitsProcessorList, StoppingCriteriaList, GenerationConfig
   32
   33 from .configuration_chatglm import ChatGLMConfig
   34

ImportError: cannot import name 'GenerationConfig' from 'transformers.generation.utils' (/mistgpu/site-packages/transformers/generation/utils.py)

kandeng avatar Mar 27 '23 21:03 kandeng

This error does not depend on the ChatGLM implementation; it is strictly a transformers library issue, and it also appears for versions > 4.27.0. You can stub out the missing names in transformers.utils with None so that the model loads properly:

import transformers
from transformers import AutoModel

# Stub the names this transformers version lacks so the remote code can import them:
transformers.utils.GENERATION_CONFIG_NAME = None
transformers.utils.cached_file = None
transformers.utils.download_url = None
transformers.utils.extract_commit_hash = None
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
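
This works because the failing statement is a from-import (`from ..utils import (...)` in transformers/generation/configuration_utils.py, per the traceback above), and `from X import Y` only requires that attribute Y exist on module X. The None stubs are never touched on the plain inference path, though anything that actually calls them, e.g. resolving a remote generation config file, would presumably fail.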

s3nh avatar Mar 29 '23 10:03 s3nh

OSError: Can't load the model for 'THUDM/chatglm-6b-int4'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'THUDM/chatglm-6b-int4' is the correct path to a directory containing a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.

cloudhzc avatar Mar 30 '23 10:03 cloudhzc

The requirements file (https://github.com/THUDM/ChatGLM-6B/blob/main/requirements.txt) shows that ChatGLM needs transformers==4.27.1,

so run: pip install transformers==4.27.1
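
To verify the pinned version exposes the names the tracebacks complained about (assuming 4.27.x keeps GENERATION_CONFIG_NAME defined as "generation_config.json"):

python -c "from transformers.utils import GENERATION_CONFIG_NAME; from transformers.generation.utils import GenerationConfig; print(GENERATION_CONFIG_NAME)"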

lqfeng avatar Aug 01 '23 03:08 lqfeng