
Pipeline 'text-generation' support when?

gururise opened this issue on Mar 26, 2023

Any plans for adding pipeline support?

from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model=model,  # model is PeftModel.from_pretrained(...)
    tokenizer=tokenizer,
    max_length=256,
    temperature=0.6,
    top_p=0.95,
    repetition_penalty=1.2,
)

The model 'PeftModelForCausalLM' is not supported for text-generation. Supported models are ['BartForCausalLM', 'BertLMHeadModel', 'BertGenerationDecoder', 'BigBirdForCausalLM', 'BigBirdPegasusForCausalLM', 'BioGptForCausalLM', 'BlenderbotForCausalLM', 'BlenderbotSmallForCausalLM', 'BloomForCausalLM', 'CamembertForCausalLM', 'CodeGenForCausalLM', 'CTRLLMHeadModel', 'Data2VecTextForCausalLM', 'ElectraForCausalLM', 'ErnieForCausalLM', 'GitForCausalLM', 'GPT2LMHeadModel', 'GPT2LMHeadModel', 'GPTNeoForCausalLM', 'GPTNeoXForCausalLM', 'GPTNeoXJapaneseForCausalLM', 'GPTJForCausalLM', 'LlamaForCausalLM', 'MarianForCausalLM', 'MBartForCausalLM', 'MegatronBertForCausalLM', 'MvpForCausalLM', 'OpenAIGPTLMHeadModel', 'OPTForCausalLM', 'PegasusForCausalLM', 'PLBartForCausalLM', 'ProphetNetForCausalLM', 'QDQBertLMHeadModel', 'ReformerModelWithLMHead', 'RemBertForCausalLM', 'RobertaForCausalLM', 'RobertaPreLayerNormForCausalLM', 'RoCBertForCausalLM', 'RoFormerForCausalLM', 'Speech2Text2ForCausalLM', 'TransfoXLLMHeadModel', 'TrOCRForCausalLM', 'XGLMForCausalLM', 'XLMWithLMHeadModel', 'XLMProphetNetForCausalLM', 'XLMRobertaForCausalLM', 'XLMRobertaXLForCausalLM', 'XLNetLMHeadModel', 'XmodForCausalLM'].

gururise avatar Mar 26 '23 18:03 gururise

As a workaround:

pipe = pipeline(model=base_model)
pipe.model = PeftModel.from_pretrained(model=base_model, ...)

... seems to work.
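Spelled out with imports, the workaround looks roughly like this (a sketch; the base-model and adapter paths below are placeholders):

from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
from peft import PeftModel

# Placeholder identifiers -- substitute your own base model and LoRA adapter.
base_model_id = "path/to/base-model"
adapter_id = "path/to/lora-adapter"

base_model = AutoModelForCausalLM.from_pretrained(base_model_id)
tokenizer = AutoTokenizer.from_pretrained(base_model_id)

# Build the pipeline with the plain base model so the supported-model check passes...
pipe = pipeline("text-generation", model=base_model, tokenizer=tokenizer)

# ...then swap in the PEFT-wrapped model afterwards.
pipe.model = PeftModel.from_pretrained(base_model, adapter_id)

print(pipe("Hello, my name is", max_length=64))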

katossky avatar Apr 18 '23 09:04 katossky

Hello @katossky, thank you for the neat way of getting this to work; however, that would only work for the LoRA method.

Pipelines should already work, but instead of the pipeline function, you need to call the corresponding pipeline class. See the example here using the ASR Pipeline: https://github.com/huggingface/peft/blob/main/examples/int8_training/peft_bnb_whisper_large_v2_training.ipynb

(Screenshot from the linked notebook showing the ASR pipeline class used with the PEFT model.)
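For text generation, the equivalent would be roughly the following (a sketch; the base-model and adapter paths below are placeholders):

from transformers import AutoModelForCausalLM, AutoTokenizer, TextGenerationPipeline
from peft import PeftModel

# Placeholder identifiers -- substitute your own base model and LoRA adapter.
base_model_id = "path/to/base-model"
adapter_id = "path/to/lora-adapter"

base_model = AutoModelForCausalLM.from_pretrained(base_model_id)
tokenizer = AutoTokenizer.from_pretrained(base_model_id)
model = PeftModel.from_pretrained(base_model, adapter_id)

# Instantiate the pipeline class directly instead of calling pipeline(),
# which warns that PeftModelForCausalLM is not a supported model type.
pipe = TextGenerationPipeline(model=model, tokenizer=tokenizer)

print(pipe("Hello, my name is", max_length=64, do_sample=True, temperature=0.6, top_p=0.95))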

pacman100 avatar Apr 18 '23 10:04 pacman100

As a workaround:

pipe = pipeline(model=base_model)
pipe.model = PeftModel.from_pretrained(model=base_model, ...)

... seems to work.

This works for me.

tuanio avatar May 02 '23 13:05 tuanio

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

github-actions[bot] avatar May 26 '23 15:05 github-actions[bot]

@katossky @tuanio Hi, can you provide a sample of this solution? I have the original LLaMA model and a PEFT-LoRA fine-tuned LLaMA model, and I still get the "not supported for text-generation" error when using pipeline.

waterluck avatar Jul 06 '23 07:07 waterluck