gen_model_answer.py fails on a PEFT adapter model

Clemente-H opened this issue 2 years ago · 6 comments

I have a PEFT adapter for a fine-tuned Falcon-7B model. When using gen_model_answer.py, I get this error:

TypeError: PeftModelForCausalLM.generate() takes 1 positional argument but 2 were given

python gen_model_answer.py --model-path /falcon_finetuned/trained_model/ --model-id falcon_finetuned

Clemente-H · Jul 31 '23
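A minimal demonstration of the Python mechanics behind this error (the class below is a hypothetical stand-in, not real peft code; it assumes the installed peft version defined generate() with keyword-only parameters):

class FakePeftModelForCausalLM:
    def generate(self, **kwargs):
        return kwargs

m = FakePeftModelForCausalLM()
m.generate(input_ids=[1, 2, 3])  # OK: keyword argument
m.generate([1, 2, 3])            # TypeError: generate() takes 1 positional argument but 2 were given

Python counts self as the single allowed positional argument, so passing the input tensor positionally triggers the error.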

Were you able to fix it?

mayur11235 · Aug 06 '23

Not yet. I don't know why inference works fine but this doesn't. Are you having the same problem?

Clemente-H · Aug 07 '23

Yes, but I am running Llama 2.


mayur11235 · Aug 07 '23

Found this: https://github.com/huggingface/peft/issues/708. It helped fix this issue.

Clemente-H · Aug 07 '23
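A sketch of the keyword-argument workaround discussed in that peft issue, under the assumption that the failing call passes the input tensor positionally; the base model name and adapter path below are placeholders taken from the report above:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("tiiuae/falcon-7b")  # placeholder base model
model = PeftModel.from_pretrained(base, "/falcon_finetuned/trained_model/")  # adapter path from above
tokenizer = AutoTokenizer.from_pretrained("tiiuae/falcon-7b")

inputs = tokenizer("Hello", return_tensors="pt")

# Fails on older peft versions (positional input_ids):
# output_ids = model.generate(inputs.input_ids, max_new_tokens=16)

# Works: pass input_ids by keyword so PEFT's **kwargs wrapper accepts it.
output_ids = model.generate(input_ids=inputs.input_ids, max_new_tokens=16)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))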

I will try that. Thank you.


mayur11235 · Aug 07 '23

I added the following code at https://github.com/lm-sys/FastChat/blob/main/fastchat/llm_judge/gen_model_answer.py#L97, and it works.

from peft import PeftModelForCausalLM

# Merge the LoRA adapter into the base model before generation.
if isinstance(model, PeftModelForCausalLM):
    model = model.merge_and_unload()
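For context: merge_and_unload() folds the LoRA adapter weights into the base model and returns a plain transformers model, so the later model.generate(...) call no longer goes through PEFT's keyword-only generate() wrapper and the positional input_ids argument is accepted again.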

yifan123 · Apr 09 '24