[BUG/Help] ImportError: cannot import name 'GENERATION_CONFIG_NAME' from 'transformers.utils'
Is there an existing issue for this?
- [X] I have searched the existing issues
Current Behavior
```
model = AutoModel.from_pretrained("THUDM/chatglm-6b-int4", trust_remote_code=True).float()

Explicitly passing a revision is encouraged when loading a configuration with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a revision is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\mina_\Anaconda3\envs\ChatGLM-6B\lib\site-packages\transformers\models\auto\auto_factory.py", line 456, in from_pretrained
    logger.warning(
  File "C:\Users\mina_\Anaconda3\envs\ChatGLM-6B\lib\site-packages\transformers\dynamic_module_utils.py", line 374, in get_class_from_dynamic_module
  File "C:\Users\mina_\Anaconda3\envs\ChatGLM-6B\lib\site-packages\transformers\dynamic_module_utils.py", line 147, in get_class_in_module
    def get_class_in_module(class_name, module_path):
  File "C:\Users\mina_\Anaconda3\envs\ChatGLM-6B\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "
```
Expected Behavior
No response
Steps To Reproduce
```
conda activate chatglm-6b
```
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b-int4", trust_remote_code=True).float()
```
- See this issue.
Environment
- OS: Windows 10
- Python: 3.7.5
- Transformers:
- PyTorch:
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) : False
Anything else?
No response
What's your transformers version?
I had the same problem.
transformers: 4.26.1, protobuf: 3.20.0, icetk: 0.0.4, cpm-kernels: 1.0.11, torch: 2.0.0, gradio: 3.23.0
I encountered the same bug here. The failing frame in the traceback is:
```
/home/mist/.cache/huggingface/modules/transformers_modules/THUDM/chatglm-6b/4a9b711e61d62b64ae8a07d763553a98a984d281/modeling_chatglm.py:31 in <module>
```
This error does not depend on the ChatGLM implementation. It is strictly a transformers library issue, and it also appears for versions > 4.27.0. As a workaround, you can mock the missing names on transformers.utils, setting their values to None, so the model can load properly:
```python
import transformers
from transformers import AutoModel

# Stub the names that the remote modeling code tries to import from transformers.utils:
transformers.utils.GENERATION_CONFIG_NAME = None
transformers.utils.cached_file = None
transformers.utils.download_url = None
transformers.utils.extract_commit_hash = None

model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
```
OSError: Can't load the model for 'THUDM/chatglm-6b-int4'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'THUDM/chatglm-6b-int4' is the correct path to a directory containing a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.
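If the download itself is failing, one possible workaround (a sketch, not from this thread; the local path is just an example) is to fetch the weights manually and point `from_pretrained` at a local directory:
```python
from transformers import AutoModel

# Assumes the model repo was cloned locally first, e.g.:
#   git lfs install
#   git clone https://huggingface.co/THUDM/chatglm-6b-int4
# so that pytorch_model.bin ends up inside ./chatglm-6b-int4.
model = AutoModel.from_pretrained("./chatglm-6b-int4", trust_remote_code=True).float()
```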
The requirements file (https://github.com/THUDM/ChatGLM-6B/blob/main/requirements.txt) pins transformers==4.27.1, which ChatGLM needs, so run `pip install transformers==4.27.1`.
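After reinstalling, a quick sanity check (my own sketch) confirms that the previously missing name is importable before you try to load the model:
```python
import transformers
from transformers.utils import GENERATION_CONFIG_NAME  # raised ImportError before

print(transformers.__version__)   # expect 4.27.1
print(GENERATION_CONFIG_NAME)     # "generation_config.json" in 4.27.x
```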