
How to run inference with glm-10b-chinese downloaded offline from Hugging Face?

Open vicwer opened this issue 2 years ago • 2 comments

Hello, I downloaded pytorch_model.bin and the other files for the 10b-cn model from Hugging Face, and then loaded the model with the following code:

from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers import AutoModel

checkpoint = "../models/glm/glm_10b_cn"
tokenizer = AutoTokenizer.from_pretrained(checkpoint, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(checkpoint, trust_remote_code=True)

This raises the following error:

ValueError: Unrecognized configuration class <class 'transformers_modules.local.configuration_glm.GLMConfig'> for this kind of AutoModel: AutoModelForCausalLM.
Model type should be one of BartConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BlenderbotConfig, BlenderbotSmallConfig, BloomConfig, CamembertConfig, CodeGenConfig, CTRLConfig, Data2VecTextConfig, ElectraConfig, ErnieConfig, GPT2Config, GPTNeoConfig, GPTNeoXConfig, GPTNeoXJapaneseConfig, GPTJConfig, MarianConfig, MBartConfig, MegatronBertConfig, MvpConfig, OpenAIGPTConfig, OPTConfig, PegasusConfig, PLBartConfig, ProphetNetConfig, QDQBertConfig, ReformerConfig, RemBertConfig, RobertaConfig, RoCBertConfig, RoFormerConfig, Speech2Text2Config, TransfoXLConfig, TrOCRConfig, XGLMConfig, XLMConfig, XLMProphetNetConfig, XLMRobertaConfig, XLMRobertaXLConfig, XLNetConfig.

I am on transformers==4.25.1. Can the offline model be used for inference? What do I need to change? Thanks.

vicwer avatar Feb 23 '23 08:02 vicwer

The model should be loaded with AutoModelForSeq2SeqLM rather than AutoModelForCausalLM: https://github.com/THUDM/GLM#generation

duzx16 avatar Feb 24 '23 02:02 duzx16
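
For reference, a minimal sketch of offline loading and generation with AutoModelForSeq2SeqLM, adapted from the linked GLM README generation example; the checkpoint path is the one from the question, and the Chinese prompt text is only an illustration:

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Local directory containing config.json, pytorch_model.bin, tokenizer files, etc.
checkpoint = "../models/glm/glm_10b_cn"
tokenizer = AutoTokenizer.from_pretrained(checkpoint, trust_remote_code=True)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint, trust_remote_code=True)
model = model.half().cuda()  # fp16 on GPU; adjust to your hardware
model.eval()

# Blank-infilling style prompt: GLM generates text for the [MASK] span.
inputs = tokenizer("清华大学位于[MASK]。", return_tensors="pt")
inputs = tokenizer.build_inputs_for_generation(inputs, max_gen_length=64)
inputs = inputs.to("cuda")
outputs = model.generate(**inputs, max_length=64, eos_token_id=tokenizer.eop_token_id)
print(tokenizer.decode(outputs[0].tolist()))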

The model should be loaded with AutoModelForSeq2SeqLM rather than AutoModelForCausalLM: https://github.com/THUDM/GLM#generation

What is the difference between the two?

Martin-WMM avatar Apr 07 '23 03:04 Martin-WMM