zzzzzzk
Running chatglm succeeded, but testing baichuan2-7b-chat fails. Test code:

```python
import sys
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation.utils import GenerationConfig

model_path = '/data/zhoukai/open_models/Baichuan/Baichuan2-7B-Chat'
model = AutoModelForCausalLM.from_pretrained(model_path, device_map='cpu', trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
model.generation_config = GenerationConfig.from_pretrained(model_path)
from build.tools.fastllm_pytools...
```