OpenPrompt
PromptForClassification shouldn't change plm
Hey! If I construct PromptForClassification with freeze_plm=True and then construct a new PromptForClassification around the same plm with freeze_plm=False, the PLM still behaves as though it is frozen. This does not seem like expected behavior. i.e.:
from openprompt.plms import load_plm
from openprompt import PromptForClassification

plm, tokenizer, model_config, WrapperClass = load_plm("roberta", "roberta-base")

# First wrapper freezes the PLM (requires_grad is disabled on its parameters).
prompt_model = PromptForClassification(plm=plm, template=promptTemplate,
                                       verbalizer=promptVerbalizer,
                                       freeze_plm=True)
# Second wrapper around the SAME plm with freeze_plm=False does not
# re-enable gradients, so the PLM stays frozen.
prompt_model = PromptForClassification(plm=plm, template=promptTemplate,
                                       verbalizer=promptVerbalizer,
                                       freeze_plm=False)
# do training.. behaves as though the PLM is frozen, e.g. outputs:
# "RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn"
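If it helps, here is a minimal pure-Python sketch of what I believe is happening (the `Param`, `PLM`, and `Wrapper` names are hypothetical stand-ins for torch parameters and OpenPrompt's wrapper, and this assumes freeze_plm=True mutates the shared PLM's parameters in place while freeze_plm=False is a no-op rather than an un-freeze):

```python
class Param:
    """Stand-in for a torch Parameter with a requires_grad flag."""
    def __init__(self):
        self.requires_grad = True

class PLM:
    """Stand-in for the pretrained language model."""
    def __init__(self):
        self.params = [Param() for _ in range(3)]

class Wrapper:
    """Mimics the suspected freeze_plm handling: freeze_plm=True disables
    grads in place; freeze_plm=False does nothing."""
    def __init__(self, plm, freeze_plm):
        self.plm = plm
        if freeze_plm:
            for p in plm.params:
                p.requires_grad = False
        # note: no else-branch re-enabling grads -- this is the leak

plm = PLM()
Wrapper(plm, freeze_plm=True)
Wrapper(plm, freeze_plm=False)  # shares the same plm; stays frozen
print(all(p.requires_grad for p in plm.params))  # False

# workaround: explicitly un-freeze the shared PLM yourself
for p in plm.params:
    p.requires_grad = True
print(all(p.requires_grad for p in plm.params))  # True
```

So as a workaround, looping over plm.parameters() and setting requires_grad = True before training makes the second model trainable again; but ideally freeze_plm=False would do this (or the freeze would not mutate the shared plm in place).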