
How to pre-train some parameters of deep prompt?

daisy-disc opened this issue · 1 comment

How do you pre-train some parameters of the deep prompt? The paper says: "One is to pre-train deep prompts with a vanilla PLM. Later we initialize a DPT-based retriever using the pre-trained deep prompts and the vanilla PLM. However, experiments in Sec. 4.4 show that it suffers from catastrophic forgetting and exhibits no superior performance to randomly initialized prompts." How do you obtain these "pre-trained deep prompts"?

daisy-disc · Sep 04 '22 06:09

Hi, to pre-train the deep prompts we simply use a prompted dual-encoder to perform the RIP task: throughout this stage the PLM is kept frozen as a vanilla PLM, and only the prompt parameters are updated.
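A minimal sketch of that setup in PyTorch, not the authors' actual code: `PromptedEncoder`, its sizes (`DIM`, `N_LAYERS`, `PROMPT_LEN`), and the way prompts are mixed into each layer are illustrative assumptions; the real DPTDR prepends per-layer prompt key/value states inside a transformer. The point shown is the training recipe from the answer above: freeze every PLM parameter, pass only the prompt tensor to the optimizer, and train with an in-batch contrastive (RIP-style) loss over query/passage pairs.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative sizes, not the paper's hyperparameters.
DIM, N_LAYERS, PROMPT_LEN = 32, 2, 4

class PromptedEncoder(nn.Module):
    """Toy dual-encoder tower: a frozen stand-in PLM plus trainable deep prompts."""
    def __init__(self):
        super().__init__()
        # Stand-in for the vanilla PLM (e.g. a RoBERTa encoder in the paper).
        self.plm = nn.Sequential(*[nn.Linear(DIM, DIM) for _ in range(N_LAYERS)])
        # Deep prompts: one trainable prompt block per layer.
        self.prompts = nn.Parameter(torch.randn(N_LAYERS, PROMPT_LEN, DIM) * 0.02)

    def forward(self, x):
        # Toy mixing of per-layer prompts into the hidden states; a real
        # implementation injects prompts into each layer's attention states.
        for layer, p in zip(self.plm, self.prompts):
            x = torch.tanh(layer(x) + p.mean(dim=0))
        return F.normalize(x, dim=-1)

torch.manual_seed(0)
enc = PromptedEncoder()
for param in enc.plm.parameters():
    param.requires_grad_(False)            # PLM stays a fixed, vanilla PLM

opt = torch.optim.AdamW([enc.prompts], lr=1e-3)   # only prompts are updated

q, d = torch.randn(8, DIM), torch.randn(8, DIM)   # fake query/passage batch
scores = enc(q) @ enc(d).T                        # dual-encoder similarity
loss = F.cross_entropy(scores, torch.arange(8))   # in-batch contrastive loss

plm_before = enc.plm[0].weight.clone()
prompts_before = enc.prompts.detach().clone()
loss.backward()
opt.step()
```

After the step, `enc.prompts` has moved while `enc.plm` is bit-identical to before; the saved `enc.prompts` tensor is what would then initialize the DPT-based retriever.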

tangzhy · Oct 12 '22 14:10