Baichuan-13B
The model 'BaichuanForCausalLM' is not supported for text-generation.
import torch
from langchain import PromptTemplate, LLMChain
from langchain.llms import HuggingFacePipeline
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

model_id = "baichuan-inc/Baichuan-13B-Chat"
tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype=torch.float16, trust_remote_code=True
)
# model = model.quantize(8).cuda()  # optional int8 quantization

pipe = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    max_length=1000,
)

local_llm = HuggingFacePipeline(pipeline=pipe)
print(local_llm('What is the capital of France? '))

template = """Question: {question}
Answer: Let's think step by step."""
prompt = PromptTemplate(template=template, input_variables=["question"])

llm_chain = LLMChain(prompt=prompt, llm=local_llm)
question = "What is the capital of England?"
print(llm_chain.run(question))

This reports the error: The model 'BaichuanForCausalLM' is not supported for text-generation
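As far as I can tell, this is only a warning logged by the pipeline's model-type check: the check compares the model class against the causal-LM classes that ship with transformers, and anything loaded through trust_remote_code (like BaichuanForCausalLM) is not in that list, so the message is logged but the pipeline is still built and generation still runs. A minimal sketch, assuming a recent transformers 4.x release, to confirm this and to silence the message:

from transformers.models.auto.modeling_auto import MODEL_FOR_CAUSAL_LM_MAPPING_NAMES
from transformers.utils import logging as hf_logging

# The pipeline's "supported models" list is built from this auto-mapping,
# which only contains classes bundled with transformers itself.
print("BaichuanForCausalLM" in MODEL_FOR_CAUSAL_LM_MAPPING_NAMES.values())  # False

# The check logs through transformers' own logger, so raising the verbosity
# threshold hides the message without changing pipeline behaviour.
hf_logging.set_verbosity(hf_logging.CRITICAL)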
Did you solve this? I would also like to know how to use baichuan-13b-chat with langchain.
The code I posted is exactly that: running baichuan-13b-chat with langchain. The error does not affect execution.
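If you would rather not see the warning at all, one alternative (just a sketch, assuming the langchain 0.0.x LLM interface; the BaichuanChatLLM name is made up) is to skip the transformers pipeline and wrap Baichuan's own chat() helper in a custom LangChain LLM:

from typing import Any, List, Optional
from langchain.llms.base import LLM

class BaichuanChatLLM(LLM):
    # Hypothetical wrapper class; reuses the model/tokenizer loaded above.
    model: Any = None
    tokenizer: Any = None

    @property
    def _llm_type(self) -> str:
        return "baichuan-13b-chat"

    def _call(self, prompt: str, stop: Optional[List[str]] = None, **kwargs: Any) -> str:
        # Baichuan-13B-Chat's remote code exposes chat(tokenizer, messages),
        # which takes a list of role/content dicts and returns the reply text.
        messages = [{"role": "user", "content": prompt}]
        return self.model.chat(self.tokenizer, messages)

local_llm = BaichuanChatLLM(model=model, tokenizer=tokenizer)
llm_chain = LLMChain(prompt=prompt, llm=local_llm)
print(llm_chain.run("What is the capital of England?"))

Since no transformers pipeline is created here, the model-type check never runs.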
I have faced a similar issue.
The model 'BaiChuanForCausalLM' is not supported for text-generation. Supported models are ['BartForCausalLM', 'BertLMHeadModel', 'BertGenerationDecoder', 'BigBirdForCausalLM', 'BigBirdPegasusForCausalLM', 'BioGptForCausalLM', 'BlenderbotForCausalLM', 'BlenderbotSmallForCausalLM', 'BloomForCausalLM', 'CamembertForCausalLM', 'CodeGenForCausalLM', 'CpmAntForCausalLM', 'CTRLLMHeadModel', 'Data2VecTextForCausalLM', 'ElectraForCausalLM', 'ErnieForCausalLM', 'GitForCausalLM', 'GPT2LMHeadModel', 'GPT2LMHeadModel', 'GPTBigCodeForCausalLM', 'GPTNeoForCausalLM', 'GPTNeoXForCausalLM', 'GPTNeoXJapaneseForCausalLM', 'GPTJForCausalLM', 'LlamaForCausalLM', 'MarianForCausalLM', 'MBartForCausalLM', 'MegaForCausalLM', 'MegatronBertForCausalLM', 'MvpForCausalLM', 'OpenLlamaForCausalLM', 'OpenAIGPTLMHeadModel', 'OPTForCausalLM', 'PegasusForCausalLM', 'PLBartForCausalLM', 'ProphetNetForCausalLM', 'QDQBertLMHeadModel', 'ReformerModelWithLMHead', 'RemBertForCausalLM', 'RobertaForCausalLM', 'RobertaPreLayerNormForCausalLM', 'RoCBertForCausalLM', 'RoFormerForCausalLM', 'RwkvForCausalLM', 'Speech2Text2ForCausalLM', 'TransfoXLLMHeadModel', 'TrOCRForCausalLM', 'XGLMForCausalLM', 'XLMWithLMHeadModel', 'XLMProphetNetForCausalLM', 'XLMRobertaForCausalLM', 'XLMRobertaXLForCausalLM', 'XLNetLMHeadModel', 'XmodForCausalLM']
Has this issue been resolved? I used the model "csdc-atl/baichuan-7B-chat".