The meaning of the prefix-tuning state_dict parameters that OpenPrompt adds.
Hello, thanks for the great program, it saves me a lot of time and energy. But I have a question about the state_dict when using prefix_tuning_template with T5.
I find that OpenPrompt adds 9 parameters to T5:
Can anyone tell me which parameters are added to the encoder, as shown in the paper Prefix-Tuning: Optimizing Continuous Prompts for Generation:
Thank you very much!
I find that three parts are indispensable: 'prompt_model.template.wte.weight', 'prompt_model.template.control_trans.0.weight', and 'prompt_model.template.control_trans.2.weight'. If any one of them is missing, the prefix-tuning model performs like an untrained model. But I still cannot find which part is added to the encoder and which to the decoder.
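One way I checked which parameters belong to the template (the trainable prefix) versus the frozen T5 backbone is to filter the state_dict keys. Below is a minimal sketch: the key names are the template keys reported above plus one made-up backbone key for contrast; the `.plm.` key name is only an illustrative assumption, not copied from OpenPrompt.

```python
# Sketch: separate prompt-template parameters from backbone parameters
# by their state_dict key names. The template keys are the three reported
# in this issue; the backbone key below is a hypothetical example.
state_dict_keys = [
    "prompt_model.template.wte.weight",
    "prompt_model.template.control_trans.0.weight",
    "prompt_model.template.control_trans.2.weight",
    "prompt_model.plm.encoder.block.0.layer.0.SelfAttention.q.weight",  # assumed name
]

def template_params(keys):
    """Return only the keys that belong to the prompt template."""
    return [k for k in keys if ".template." in k]

for k in template_params(state_dict_keys):
    print(k)
```

With a real model you would pass `prompt_model.state_dict().keys()` instead of the hard-coded list. Note that `control_trans.0` and `control_trans.2` look like the first and last Linear layers of a small reparameterization MLP (index 1 presumably being a parameter-free activation), which is the usual prefix-tuning setup, but I am inferring that from the key names, not from the OpenPrompt source.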