GLM
How can I run inference with glm-10b-chinese using a model downloaded offline from Hugging Face?
Hello, I downloaded pytorch_model.bin and the other files for the 10b-cn model from Hugging Face, then loaded the model with the following code:
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "../models/glm/glm_10b_cn"
tokenizer = AutoTokenizer.from_pretrained(checkpoint, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(checkpoint, trust_remote_code=True)
This raises the following error:
ValueError: Unrecognized configuration class <class 'transformers_modules.local.configuration_glm.GLMConfig'> for this kind of AutoModel: AutoModelForCausalLM.
Model type should be one of BartConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BlenderbotConfig, BlenderbotSmallConfig, BloomConfig, CamembertConfig, CodeGenConfig, CTRLConfig, Data2VecTextConfig, ElectraConfig, ErnieConfig, GPT2Config, GPTNeoConfig, GPTNeoXConfig, GPTNeoXJapaneseConfig, GPTJConfig, MarianConfig, MBartConfig, MegatronBertConfig, MvpConfig, OpenAIGPTConfig, OPTConfig, PegasusConfig, PLBartConfig, ProphetNetConfig, QDQBertConfig, ReformerConfig, RemBertConfig, RobertaConfig, RoCBertConfig, RoFormerConfig, Speech2Text2Config, TransfoXLConfig, TrOCRConfig, XGLMConfig, XLMConfig, XLMProphetNetConfig, XLMRobertaConfig, XLMRobertaXLConfig, XLNetConfig.
My transformers version is 4.25.1. Does the offline model support inference? What do I need to change? Thanks.
The model should be loaded with AutoModelForSeq2SeqLM rather than AutoModelForCausalLM; see https://github.com/THUDM/GLM#generation.
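A minimal sketch of the corrected loading code, keeping your local checkpoint path (the comment reflects how GLM's remote code registers its model class):

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

checkpoint = "../models/glm/glm_10b_cn"
tokenizer = AutoTokenizer.from_pretrained(checkpoint, trust_remote_code=True)
# GLM's custom GLMConfig is mapped to its model class through the seq2seq
# auto-mapping, not the causal-LM one, so only AutoModelForSeq2SeqLM can
# resolve it with trust_remote_code=True.
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint, trust_remote_code=True)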
> The model should be loaded with AutoModelForSeq2SeqLM rather than AutoModelForCausalLM; see https://github.com/THUDM/GLM#generation.

What is the difference between the two?
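Roughly: AutoModelForCausalLM only resolves configurations in its built-in decoder-only mapping (the list in the error above), whereas GLM's remote code maps GLMConfig to a conditional-generation class exposed through AutoModelForSeq2SeqLM, which matches GLM's blank-infilling style of generation. A sketch adapted from the generation example in the linked README (the Chinese prompt is illustrative; build_inputs_for_generation and eop_token_id come from GLM's custom tokenizer code):

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

checkpoint = "../models/glm/glm_10b_cn"
tokenizer = AutoTokenizer.from_pretrained(checkpoint, trust_remote_code=True)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint, trust_remote_code=True)
model = model.half().cuda()  # fp16 to fit the 10B model on a single GPU
model.eval()

# GLM fills in the [MASK] span instead of continuing the text.
# The prompt means "Tsinghua University is located in [MASK]."
inputs = tokenizer("清华大学位于[MASK]。", return_tensors="pt")
inputs = tokenizer.build_inputs_for_generation(inputs, max_gen_length=64)
inputs = inputs.to("cuda")
outputs = model.generate(**inputs, max_length=64,
                         eos_token_id=tokenizer.eop_token_id)
print(tokenizer.decode(outputs[0].tolist()))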