
[BUG/Help] After attaching LoRA weights with PeftModel, the model no longer generates answers. Is there another way to merge the weights?

Open heccxixi opened this issue 1 year ago • 0 comments

Is there an existing issue for this?

  • [X] I have searched the existing issues

Current Behavior

With the code below, the model fails to generate a reply once the LoRA weights are attached via PeftModel:

from transformers import AutoTokenizer, AutoModel
from peft import PeftModel
import torch

base_model = '/home/hexinyu/.cache/huggingface/hub/models--THUDM--chatglm-6b/snapshots/thu_chatglm'
tokenizer = AutoTokenizer.from_pretrained(base_model, trust_remote_code=True)

LORA_WEIGHTS = './action_item/'

if torch.cuda.is_available():
    device = "cuda"

if device == "cuda":
    # Load the base ChatGLM-6B model in 8-bit on GPU 0
    model = AutoModel.from_pretrained(
        base_model,
        load_in_8bit=True,
        torch_dtype=torch.float16,
        device_map={'': 0},
        trust_remote_code=True,
    )
    # Attach the LoRA weights on top of the quantized base model
    model = PeftModel.from_pretrained(
        model,
        LORA_WEIGHTS,
        torch_dtype=torch.float16,
        device_map={'': 0},
    )

model = model.eval()
response, history = model.chat(tokenizer, "判断他这个参加那个京东的那个活动嘛。是什么意思", history=[])
print(response)
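For comparison, a workaround often reported for this combination is to skip 8-bit loading and run the base model in fp16 before attaching the adapter. This is only a minimal sketch, assuming the same local base_model and LORA_WEIGHTS paths as above and a GPU with enough memory for fp16; it is not a confirmed fix:

from transformers import AutoTokenizer, AutoModel
from peft import PeftModel
import torch

base_model = '/home/hexinyu/.cache/huggingface/hub/models--THUDM--chatglm-6b/snapshots/thu_chatglm'
LORA_WEIGHTS = './action_item/'

tokenizer = AutoTokenizer.from_pretrained(base_model, trust_remote_code=True)

# Load the base model in fp16 on the GPU instead of load_in_8bit=True
model = AutoModel.from_pretrained(base_model, trust_remote_code=True).half().cuda()

# Attach the LoRA adapter; chat() is forwarded to the underlying ChatGLM model
model = PeftModel.from_pretrained(model, LORA_WEIGHTS, torch_dtype=torch.float16)
model = model.eval()

response, history = model.chat(tokenizer, "判断他这个参加那个京东的那个活动嘛。是什么意思", history=[])
print(response)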

Expected Behavior

No response

Steps To Reproduce

If I remove the merge step model = PeftModel.from_pretrained(model, LORA_WEIGHTS, torch_dtype=torch.float16, device_map={'': 0},), everything runs normally. Does anyone know another way to merge the LoRA weights?
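On the question of other ways to merge the weights: recent peft versions provide merge_and_unload(), which folds the LoRA deltas into the base weights and returns a plain model that can be saved and reloaded without peft. The sketch below is an assumption-laden example, not a verified solution: it assumes a peft version that exposes merge_and_unload(), a base model loaded in fp16 rather than 8-bit (merging generally does not work on an int8 model), and "merged_chatglm_lora" is a hypothetical output directory.

from transformers import AutoTokenizer, AutoModel
from peft import PeftModel
import torch

base_model = '/home/hexinyu/.cache/huggingface/hub/models--THUDM--chatglm-6b/snapshots/thu_chatglm'
LORA_WEIGHTS = './action_item/'

tokenizer = AutoTokenizer.from_pretrained(base_model, trust_remote_code=True)

# Load the base model in fp16 (not 8-bit) so the LoRA deltas can be merged in place
model = AutoModel.from_pretrained(base_model, trust_remote_code=True, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(model, LORA_WEIGHTS, torch_dtype=torch.float16)

# Fold the LoRA weights into the base weights and drop the peft wrapper
merged = model.merge_and_unload()

# Save the merged checkpoint; reloading it still needs trust_remote_code=True,
# and the custom ChatGLM modeling files may have to be copied alongside it
merged.save_pretrained("merged_chatglm_lora")
tokenizer.save_pretrained("merged_chatglm_lora")

The merged directory can then be loaded with AutoModel.from_pretrained like the original checkpoint, without involving PeftModel at inference time.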

Environment

- OS: Linux
- Python: 3.10.9
- Transformers: 4.28.0.dev0
- PyTorch: 2.0.0+cu118
- CUDA Support

Anything else?

No response

heccxixi · Apr 19 '23 05:04