ChatGLM-6B

Running on Ubuntu with python3

Open haha517 opened this issue 2 years ago • 3 comments

Is there an existing issue for this?

  • [X] I have searched the existing issues

Current Behavior

Explicitly passing a revision is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a revision is encouraged when loading a configuration with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a revision is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Loading checkpoint shards:  62%|██████████████████████████████ | 5/8 [00:10<00:06, 2.22s/it]Killed

Expected Behavior

No response

Steps To Reproduce

python3 web_demo.py
Explicitly passing a revision is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a revision is encouraged when loading a configuration with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a revision is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Loading checkpoint shards:  62%|██████████████████████████████ | 5/8 [00:10<00:06, 2.22s/it]Killed

Environment

- OS:
- Python:
- Transformers:
- PyTorch:
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) :

Anything else?

No response

haha517 · Mar 16 '23 10:03

Using a 1080 Ti GPU and 16 GB of memory.

haha517 · Mar 16 '23 10:03

Please quantize the model before loading it. Without quantization it initially occupies about 13 GB (the total size of the *.bin checkpoint files) and can later grow to around 18 GB of GPU memory, which is clearly more than your card has. See #39. A sketch of quantized loading is shown below.
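A minimal sketch of loading with on-the-fly quantization, following the pattern from the project's web_demo.py and README; the THUDM/chatglm-6b model id and the quantize() helper come from that repo's custom code, so adjust if your local copy differs:

from transformers import AutoModel, AutoTokenizer

# Pinning a revision is advisable with trust_remote_code, as the warning in the log notes.
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)

# 4-bit quantization keeps the GPU footprint around 6 GB, which fits an 11 GB 1080 Ti;
# use quantize(8) for int8 if you have more headroom.
model = (
    AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
    .quantize(4)
    .half()
    .cuda()
)
model = model.eval()

If the process is being killed by the system while loading shards (i.e. CPU RAM pressure rather than VRAM), the pre-quantized THUDM/chatglm-6b-int4 checkpoint is a smaller download and should also use less memory during loading.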

yaleimeng · Mar 17 '23 01:03

Has your issue been resolved? Could you leave a contact so we can discuss it?

makc-321 · Mar 25 '23 14:03