Baichuan-13B
How do I pass parameters such as max_length when calling model.chat?
https://huggingface.co/baichuan-inc/Baichuan-13B-Chat/discussions/8
How do I change these settings? Do I edit the generation_config JSON file, or is there an interface in the code?
from transformers.generation.utils import GenerationConfig

# Replace the model's default generation settings in code; max_new_tokens
# is what caps the reply length (the role of max_length here).
model.generation_config = GenerationConfig(**{
    "assistant_token_id": 196,
    "bos_token_id": 1,
    "do_sample": True,
    "eos_token_id": 2,
    "max_new_tokens": 4,
    "pad_token_id": 0,
    "repetition_penalty": 1.1,
    "temperature": 0.3,
    "top_k": 5,
    "top_p": 0.85,
    "transformers_version": "4.30.2",
    "user_token_id": 195
})
Add this when initializing the model.
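If you only want to tweak a few values rather than rebuild the whole config or edit the JSON on disk, a minimal sketch (assuming model.chat reads from model.generation_config, as the snippet above implies; the max_new_tokens value is just an example) is:

from transformers.generation.utils import GenerationConfig

# Load the generation_config.json that ships with the checkpoint,
# then override only the fields you care about.
gen_config = GenerationConfig.from_pretrained("baichuan-inc/Baichuan-13B-Chat")
gen_config.max_new_tokens = 512   # illustrative value; caps the reply length
gen_config.temperature = 0.3
model.generation_config = gen_config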
The model has no chat attribute. How did you get the following call to work? model.chat(tokenizer, messages, stream=True)
The model does have a chat method; perhaps your model-loading arguments aren't set correctly?
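For reference, chat() lives in the custom modeling code shipped with the checkpoint, so it only shows up on the model when the remote code is loaded. A minimal loading sketch following the pattern from the model card (dtype and device_map are illustrative):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation.utils import GenerationConfig

# trust_remote_code=True pulls in the repo's own modeling/tokenization code,
# which is where the chat() method is defined.
tokenizer = AutoTokenizer.from_pretrained(
    "baichuan-inc/Baichuan-13B-Chat", use_fast=False, trust_remote_code=True
)
model = AutoModelForCausalLM.from_pretrained(
    "baichuan-inc/Baichuan-13B-Chat", torch_dtype=torch.float16,
    device_map="auto", trust_remote_code=True
)
model.generation_config = GenerationConfig.from_pretrained(
    "baichuan-inc/Baichuan-13B-Chat"
)

messages = [{"role": "user", "content": "Hello"}]
response = model.chat(tokenizer, messages)
print(response)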