ChatGLM-6B
[Help] Changing the model download location raises "does not appear to have a file named config.json"
Is there an existing issue for this?
- [X] I have searched the existing issues
Current Behavior
First-time setup. Before running python web_demo.py, I followed https://github.com/THUDM/ChatGLM-6B/issues/50#issuecomment-1469862918 to change the model download location:
path = "/home/3090-server/personal/bsc/ChatGLM/ChatGLM-6B/models"
tokenizer = AutoTokenizer.from_pretrained(path, trust_remote_code=True)
model = AutoModel.from_pretrained(path, trust_remote_code=True).half().cuda()
Original code:
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
Error message:
Traceback (most recent call last):
  File "/home/3090-server/personal/bsc/ChatGLM/ChatGLM-6B/web_demo.py", line 5, in <module>
    tokenizer = AutoTokenizer.from_pretrained(mypath, trust_remote_code=True)
  File "/opt/miniconda3/envs/chatglm/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 613, in from_pretrained
    config = AutoConfig.from_pretrained(
  File "/opt/miniconda3/envs/chatglm/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 852, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/opt/miniconda3/envs/chatglm/lib/python3.10/site-packages/transformers/configuration_utils.py", line 565, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/opt/miniconda3/envs/chatglm/lib/python3.10/site-packages/transformers/configuration_utils.py", line 620, in _get_config_dict
    resolved_config_file = cached_file(
  File "/opt/miniconda3/envs/chatglm/lib/python3.10/site-packages/transformers/utils/hub.py", line 380, in cached_file
    raise EnvironmentError(
OSError: /home/3090-server/personal/bsc/ChatGLM/ChatGLM-6B/models does not appear to have a file named config.json. Checkout 'https://huggingface.co//home/3090-server/personal/bsc/ChatGLM/ChatGLM-6B/models/None' for available files.
If I leave the code unchanged, it runs fine. Do I have to run it once with the default location first and only then change it manually?
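For context on why the traceback looks like a failed Hub lookup: from_pretrained only accepts a local directory when it actually contains config.json, and the error message reuses the Hub URL template, which is why the odd https://huggingface.co//home/... link appears. A minimal stdlib check, sketched with the path from the report (substitute your own):

```python
import os

def looks_like_local_model_dir(path: str) -> bool:
    """True only if `path` is a directory containing config.json, which is
    what AutoTokenizer/AutoModel.from_pretrained expect of a local model dir."""
    return os.path.isdir(path) and os.path.isfile(os.path.join(path, "config.json"))

# Path from the report above; substitute your own model directory.
path = "/home/3090-server/personal/bsc/ChatGLM/ChatGLM-6B/models"
if not looks_like_local_model_dir(path):
    print(f"{path} has no config.json; from_pretrained cannot load a model from it.")
```

If this prints, the directory you are pointing at is not a complete model checkout, which matches the OSError above.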
Expected Behavior
No response
Steps To Reproduce
No.
Environment
- OS: Ubuntu 18.04
- Python: 3.10
- Transformers: 4.26.1
- PyTorch: 1.13.1+cu117
- CUDA Support: True
Anything else?
No response
The log says there is no config.json under /home/3090-server/personal/bsc/ChatGLM/ChatGLM-6B/models. You can try cloning the Hugging Face repository, pointing the code at the directory containing the clone, and making sure config.json is present in that directory.
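The suggestion above can be sketched concretely. The clone location below is illustrative, not from the thread, and the transformers calls are kept in comments so the helper itself stays stdlib-only:

```python
import os

# In a shell, clone the *model* repo from Hugging Face (git-lfs is needed
# for the weight files), e.g.:
#   git lfs install
#   git clone https://huggingface.co/THUDM/chatglm-6b /home/you/models/chatglm-6b

def resolve_model_dir(path: str) -> str:
    """Return `path` if it contains config.json; raise a readable error otherwise."""
    if not os.path.isfile(os.path.join(path, "config.json")):
        raise FileNotFoundError(
            f"{path} does not contain config.json -- clone the Hugging Face "
            "model repo there first, or pass the clone's directory instead.")
    return path

# Usage (hypothetical path):
#   model_dir = resolve_model_dir("/home/you/models/chatglm-6b")
#   tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
#   model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).half().cuda()
```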
That solved it, thanks.
Hi, how did you solve it? I ran into the same error:
Traceback (most recent call last):
  File "web_demo.py", line 7, in <module>
It may be that the path you passed in is the program directory downloaded from GitHub, not the model directory downloaded from Hugging Face.
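That mix-up is easy to detect mechanically: the GitHub checkout contains the demo scripts, while the Hugging Face clone contains config.json and the weights. A rough stdlib heuristic (the marker filenames are assumptions based on this repo's layout):

```python
import os

def classify_dir(path: str) -> str:
    """Rough heuristic: 'model' for a Hugging Face model clone, 'code' for
    the ChatGLM-6B GitHub code checkout, 'unknown' otherwise."""
    if os.path.isfile(os.path.join(path, "config.json")):
        return "model"   # pass THIS directory to from_pretrained
    if os.path.isfile(os.path.join(path, "web_demo.py")):
        return "code"    # GitHub checkout: from_pretrained will fail here
    return "unknown"
```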
Hi, how did you solve it? I also got an error on first-time setup when running demo.py:
  File "/data/zhangxueyi/jmx/ChatGLM2-6B/web_demo.py", line 6, in <module>
> The log says there is no config.json under /home/3090-server/personal/bsc/ChatGLM/ChatGLM-6B/models. You can try cloning the Hugging Face repository, pointing the code at the directory containing the clone, and making sure config.json is present in that directory.

Hi, how exactly do you point the code at the repo's directory?