
[BUG/Help] KeyError: 'chatglm'

Open Lukrin-tech opened this issue 2 years ago • 3 comments

Is there an existing issue for this?

  • [X] I have searched the existing issues

Current Behavior

Traceback (most recent call last):
  File "test.py", line 2, in <module>
    tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
  File "/usr/local/lib/python3.6/dist-packages/transformers/models/auto/tokenization_auto.py", line 390, in from_pretrained
    config = AutoConfig.from_pretrained(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/transformers/models/auto/configuration_auto.py", line 400, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
KeyError: 'chatglm'

The path is not wrong, so why do I keep getting this error? I also compared config.json against the official one and it matches.
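One thing worth verifying before comparing config files: which transformers installation the traceback is actually hitting. The paths above point at a Python 3.6 dist-packages directory, so the imported version may not be the one the reporter believes is installed. A small diagnostic sketch (an editorial illustration, not part of the original report):

```python
import importlib


def locate(pkg_name: str):
    """Return (version, file path) for an importable package, or None if absent."""
    try:
        mod = importlib.import_module(pkg_name)
    except ImportError:
        return None
    return (getattr(mod, "__version__", "unknown"),
            getattr(mod, "__file__", "unknown"))


# e.g. locate("transformers") -> ("4.28.1", "/usr/local/lib/.../transformers/__init__.py")
# If the printed path or version differs from what `pip show transformers` reports,
# two installations are shadowing each other.
print(locate("transformers"))
```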

Expected Behavior

No response

Steps To Reproduce

11

Environment

- OS: Ubuntu
- Python: 3.6
- Transformers: 4.28.1
- PyTorch: 1.9.0
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) : True

Anything else?

No response

Lukrin-tech avatar May 04 '23 08:05 Lukrin-tech

Are you sure you are using transformers 4.28.1?

duzx16 avatar May 06 '23 08:05 duzx16

So how did you solve it? I ran into the same problem and have been scratching my head over why.

Bin-xgms avatar May 25 '23 08:05 Bin-xgms

So how did you solve it? I ran into the same problem and have been scratching my head over why.

Making the environment consistent, i.e. updating transformers and the related packages, made the problem go away.
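The fix above boils down to ensuring the installed transformers is new enough to register or fetch the custom `chatglm` config class instead of failing on the `CONFIG_MAPPING` lookup. A minimal version-gate sketch (the `required="4.27.1"` floor is a hypothetical value for illustration; check the THUDM/chatglm-6b model card for the actual requirement):

```python
def parse_version(v: str) -> tuple:
    """Turn '4.28.1' into (4, 28, 1) for a simple tuple comparison.

    Ignores pre-release suffixes; good enough for a sanity check.
    """
    return tuple(int(part) for part in v.split(".")[:3] if part.isdigit())


def transformers_is_new_enough(installed: str, required: str = "4.27.1") -> bool:
    # required="4.27.1" is an assumed minimum, not a documented one
    return parse_version(installed) >= parse_version(required)


# The reporter's claimed 4.28.1 would pass; a stale install would not.
assert transformers_is_new_enough("4.28.1")
assert not transformers_is_new_enough("4.26.0")
```

Running this check (against `transformers.__version__`) before calling `AutoTokenizer.from_pretrained` turns the opaque KeyError into an actionable "please upgrade" message.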

Lukrin-tech avatar Jun 19 '23 06:06 Lukrin-tech