ChatGLM-6B
[BUG/Help] AttributeError: module transformers has no attribute TFChatGLMForConditionalGeneration
Is there an existing issue for this?
- [X] I have searched the existing issues
Current Behavior
I changed web_demo.py to:

```python
tokenizer = AutoTokenizer.from_pretrained("chatglm-6b_pre", trust_remote_code=True, from_tf=True)
model = AutoModel.from_pretrained("chatglm-6b_pre", trust_remote_code=True, from_tf=True).half().cuda()
```
Running web_demo.py then produced:

```
Traceback (most recent call last):
  File "/home/jovyan/web_demo.py", line 6, in
```
Expected Behavior
No response
Steps To Reproduce
None
Environment
- OS: Linux
- Python: 3.9.16
- Transformers: 4.27.1
- PyTorch: 2.0.0
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`): True
Anything else?
no
Are you sure the files in chatglm-6b_pre are complete? Compare them against https://huggingface.co/THUDM/chatglm-6b
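Separately, the `from_tf=True` argument is likely the direct cause of the error: it tells transformers to load TensorFlow weights, which makes it look up a `TF`-prefixed model class (`TFChatGLMForConditionalGeneration`) that does not exist, since ChatGLM-6B ships PyTorch checkpoints only. A minimal sketch of the corrected loading code, assuming `chatglm-6b_pre` is a local directory containing the full PyTorch checkpoint:

```python
from transformers import AutoTokenizer, AutoModel

def load_chatglm(path="chatglm-6b_pre"):
    """Load ChatGLM-6B from a local directory of PyTorch weights.

    Note: no from_tf=True here. ChatGLM-6B has no TensorFlow
    implementation, so requesting TF weights makes transformers look
    for a TFChatGLMForConditionalGeneration class that does not exist.
    """
    tokenizer = AutoTokenizer.from_pretrained(path, trust_remote_code=True)
    model = AutoModel.from_pretrained(path, trust_remote_code=True).half().cuda()
    return tokenizer, model
```

If the directory is incomplete (missing weight shards or the custom modeling files), `from_pretrained` will still fail, which is why comparing against the Hugging Face repo above is the first thing to check.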