ChatGLM-6B
[BUG/Help] <Cannot run after updating; model loading appears to fail>
Is there an existing issue for this?
- [X] I have searched the existing issues
Current Behavior
D:\web\chatGLM\ChatGLM-6B> python.exe .\web_demo.py
Explicitly passing a revision is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a revision is encouraged when loading a configuration with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a revision is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Loading checkpoint shards: 25%|██████████████▎ | 2/8 [00:02<00:06, 1.14s/it]
Traceback (most recent call last):
File "C:\Users\zh\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\modeling_utils.py", line 415, in load_state_dict
return torch.load(checkpoint_file, map_location="cpu")
File "C:\Users\zh\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\serialization.py", line 797, in load
with _open_zipfile_reader(opened_file) as opened_zipfile:
File "C:\Users\zh\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\serialization.py", line 283, in __init__
super().__init__(torch._C.PyTorchFileReader(name_or_buffer))
RuntimeError: PytorchStreamReader failed reading zip archive: failed finding central directory
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Users\zh\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\modeling_utils.py", line 419, in load_state_dict
if f.read(7) == "version":
UnicodeDecodeError: 'gbk' codec can't decode byte 0x80 in position 64: illegal multibyte sequence
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\web\chatGLM\ChatGLM-6B\web_demo.py", line 6, in
Expected Behavior
No response
Steps To Reproduce
python.exe .\web_demo.py
Environment
- OS: Windows 10
- Python: 3.10.6
- Transformers:
- PyTorch:
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) : True
Anything else?
No response
I'm running into the same error. Is there a fix?
Same error here, +1
+1
You can try https://github.com/THUDM/ChatGLM-6B#%E4%BB%8E%E6%9C%AC%E5%9C%B0%E5%8A%A0%E8%BD%BD%E6%A8%A1%E5%9E%8B (the "Load the Model Locally" section of the README).
Hit this too, and re-downloading didn't help either.
Make sure the model was downloaded completely. In my case the download had failed partway, which made loading fail; re-downloading fixed it.
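One way to check this: files written by `torch.save` are ZIP archives, so the "failed finding central directory" error almost always means a shard was truncated mid-download. A minimal sketch (a hypothetical helper, not part of this repo) that reports which shards are not valid ZIP files:

```python
# Quick integrity check for sharded PyTorch checkpoints.
# torch.save writes a ZIP archive, so a shard that fails
# zipfile.is_zipfile was likely truncated during download.
import zipfile
from pathlib import Path

def find_corrupt_shards(model_dir):
    """Return the names of .bin shards that are not valid ZIP archives."""
    return sorted(
        p.name
        for p in Path(model_dir).glob("*.bin")
        if not zipfile.is_zipfile(p)
    )
```

Run it against the model directory and re-download only the shards it reports, instead of pulling all eight again.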
A model fetched with `git lfs pull` works fine, but downloading the model manually from Tsinghua Cloud produces this error.
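If you do download shards manually into a cloned repo, you can sanity-check them against the Git LFS pointer files, which record the expected byte size (and sha256 oid) of each shard. A hedged sketch, assuming the clone still contains the pointer text for each un-pulled file; the helper names are hypothetical, but the pointer field layout follows the Git LFS spec:

```python
# Compare a manually downloaded shard against its Git LFS pointer.
# Pointer files look like:
#   version https://git-lfs.github.com/spec/v1
#   oid sha256:<64 hex chars>
#   size <bytes>
import os

def parse_lfs_pointer(text):
    """Return the oid and expected size recorded in an LFS pointer."""
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    return {
        "oid": fields["oid"].split(":", 1)[1],
        "size": int(fields["size"]),
    }

def shard_size_matches(pointer_text, shard_path):
    """True if the downloaded shard has the size the pointer expects."""
    return os.path.getsize(shard_path) == parse_lfs_pointer(pointer_text)["size"]
```

A size mismatch immediately identifies a truncated browser download; for a stronger check, hash the shard and compare it to the sha256 oid.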