
PromptForClassification shouldn't change plm

Open guy-dar opened this issue 2 years ago • 0 comments

Hey! When I construct a PromptForClassification with freeze_plm=True and then construct another one over the same plm with freeze_plm=False, the PLM still behaves as though it is frozen. This does not seem like the expected behavior. For example:

from openprompt.plms import load_plm
from openprompt import PromptForClassification

plm, tokenizer, model_config, WrapperClass = load_plm("roberta", "roberta-base")

# First wrap with freeze_plm=True: the PLM's parameters get requires_grad=False.
prompt_model = PromptForClassification(plm=plm, template=promptTemplate, verbalizer=promptVerbalizer,
                                       freeze_plm=True)
# Re-wrapping the same plm with freeze_plm=False does not restore requires_grad=True.
prompt_model = PromptForClassification(plm=plm, template=promptTemplate, verbalizer=promptVerbalizer,
                                       freeze_plm=False)
# Training now fails as though the PLM were still frozen, e.g.:
# "RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn"
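To illustrate the suspected mechanism without pulling in OpenPrompt or PyTorch, here is a minimal pure-Python sketch. The `FakePLM`, `Param`, and `wrap` names are hypothetical stand-ins, not OpenPrompt code; the sketch assumes (based on the symptom above) that `freeze_plm=True` sets `requires_grad=False` on the PLM's parameters, while `freeze_plm=False` simply leaves them untouched, so a previously frozen PLM stays frozen:

```python
class Param:
    """Stand-in for a model parameter with a requires_grad flag."""
    def __init__(self):
        self.requires_grad = True

class FakePLM:
    """Stand-in for a pretrained language model."""
    def __init__(self, n=3):
        self._params = [Param() for _ in range(n)]
    def parameters(self):
        return self._params

def wrap(plm, freeze_plm):
    """Mirrors the suspected behavior: freeze_plm=True disables grads,
    but freeze_plm=False does NOT re-enable them."""
    if freeze_plm:
        for p in plm.parameters():
            p.requires_grad = False
    # note: no else-branch restoring requires_grad=True
    return plm

plm = FakePLM()
wrap(plm, freeze_plm=True)
wrap(plm, freeze_plm=False)
# The PLM is still frozen even though freeze_plm=False was requested:
assert all(not p.requires_grad for p in plm.parameters())

# Workaround: re-enable gradients explicitly after the second wrap.
for p in plm.parameters():
    p.requires_grad = True
assert all(p.requires_grad for p in plm.parameters())
```

With a real PyTorch model the same workaround applies: iterate `plm.parameters()` and set `p.requires_grad = True` before training the `freeze_plm=False` model.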

guy-dar avatar Dec 15 '22 09:12 guy-dar