P-tuning-v2

An optimized deep prompt tuning strategy comparable to fine-tuning across scales and tasks

38 P-tuning-v2 issues (sorted by recently updated)

Where in the code does P-tuning v2 add the learnable parameters at every layer? I'm new to NLP, thanks!!
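A minimal sketch of the mechanism the question asks about (illustrative only, not the repo's actual code): P-tuning v2 keeps one trainable prefix table and reshapes it into per-layer key/value pairs that are passed to the model as `past_key_values`, which is how every transformer layer ends up attending to learned prompt vectors. The class name `PrefixEncoder` matches the idea in the paper; the numpy implementation here is an assumption for clarity.

```python
import numpy as np

class PrefixEncoder:
    """Trainable deep prompts, one (key, value) prefix per layer."""

    def __init__(self, prefix_len, n_layers, n_heads, head_dim, seed=0):
        rng = np.random.default_rng(seed)
        hidden = n_heads * head_dim
        # One row per prefix position, covering a key and a value for
        # every layer: shape (prefix_len, n_layers * 2 * hidden).
        self.table = rng.standard_normal((prefix_len, n_layers * 2 * hidden)) * 0.02
        self.prefix_len, self.n_layers = prefix_len, n_layers
        self.n_heads, self.head_dim = n_heads, head_dim

    def past_key_values(self, batch_size):
        # Reshape into the layout HuggingFace models consume: a list of
        # n_layers (key, value) pairs, each (batch, n_heads, prefix_len, head_dim).
        x = np.broadcast_to(self.table, (batch_size,) + self.table.shape)
        x = x.reshape(batch_size, self.prefix_len, self.n_layers * 2,
                      self.n_heads, self.head_dim)
        x = x.transpose(2, 0, 3, 1, 4)
        return [(x[2 * i], x[2 * i + 1]) for i in range(self.n_layers)]
```

Because the prefixes enter through `past_key_values` rather than the input embeddings, only this table is trained while the backbone stays frozen.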

Appendix A says: "For the multi-task setting, we combine the training set of the three datasets for pre-training. We use **different linear classifiers** for each dataset while sharing the continuous prompts." Does this mean the different tasks are trained separately? Do we have to manually swap in a different linear...
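A sketch of the setup Appendix A describes, under the usual reading that training is joint rather than separate: batches drawn from the combined datasets all update one shared continuous prompt, and each example is only routed through its own dataset's linear head, so no manual swapping is needed. The class and method names below (`MultiTaskHeads`, `logits`) are illustrative, not the repo's API.

```python
import numpy as np

class MultiTaskHeads:
    """One shared continuous prompt, one linear classifier per dataset."""

    def __init__(self, hidden, labels_per_task, prefix_len=16, seed=0):
        rng = np.random.default_rng(seed)
        # Shared across all tasks: updated by every batch.
        self.shared_prompt = rng.standard_normal((prefix_len, hidden)) * 0.02
        # Task-specific: each dataset gets its own (hidden, num_labels) head.
        self.heads = {task: rng.standard_normal((hidden, n)) * 0.02
                      for task, n in labels_per_task.items()}

    def logits(self, task, pooled):
        # pooled: (batch, hidden) sentence representations from the frozen
        # backbone; only the head lookup differs between tasks.
        return pooled @ self.heads[task]
```

At training time each mixed batch carries its task name, so `logits(task, pooled)` picks the right classifier while gradients for `shared_prompt` accumulate from all tasks.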

It is not clear how you deal with the ReCoRD dataset - the collator, metrics, and so on. In your final version, do you use the SuperGlueDatasetForRecord or just the...

```
(gh_P-tuning-v2) ub2004@ub2004-B85M-A0:~/llm_dev/P-tuning-v2$ bash run_script/run_rte_roberta.sh
Traceback (most recent call last):
  File "run.py", line 7, in <module>
    import datasets
  File "/home/ub2004/anaconda3/envs/gh_P-tuning-v2/lib/python3.8/site-packages/datasets/__init__.py", line 37, in <module>
    from .builder import ArrowBasedBuilder, BeamBasedBuilder, BuilderConfig, DatasetBuilder, GeneratorBasedBuilder
  File...
```

In the task code, where is the multi-task training for NER implemented? I only see the multi-task training method for SuperGLUE.

```
  File "run.py", line 7, in <module>
    import datasets
  File "/home/appuser/miniconda3/envs/pt2/lib/python3.8/site-packages/datasets/__init__.py", line 37, in <module>
    from .builder import ArrowBasedBuilder, BeamBasedBuilder, BuilderConfig, DatasetBuilder, GeneratorBasedBuilder
  File "/home/appuser/miniconda3/envs/pt2/lib/python3.8/site-packages/datasets/builder.py", line 44, in <module>
    from .data_files import DataFilesDict, _sanitize_patterns...
```

Thank you for your great work! What hyperparameters (number of epochs, learning rate, etc.) did you use for prompt tuning (v1) and fine-tuning?

Roughly how large a dataset is needed, at minimum, to fine-tune for a specific vertical domain?

Hello, I would like to know how you implement prompts with a depth smaller than the model's number of layers. HuggingFace requires the length of `past_key_value` to match the model's config.n_layers, so I think that...
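One plausible workaround for the mismatch this question raises (a sketch, not the repo's confirmed approach): since HuggingFace expects `len(past_key_values)` to equal `config.num_hidden_layers`, layers below the chosen prompt depth can receive zero-length prefixes (`prefix_len == 0`), which contribute nothing to attention. The function name `padded_past_key_values` is hypothetical.

```python
import numpy as np

def padded_past_key_values(prompt_kv, n_layers):
    """Pad a shallow prompt stack up to n_layers entries.

    prompt_kv: list of (key, value) pairs for the top `len(prompt_kv)`
    layers, each of shape (batch, n_heads, prefix_len, head_dim).
    Lower layers get empty (prefix_len == 0) tensors so the list length
    matches what the model's config demands.
    """
    depth = len(prompt_kv)
    batch, heads, _, dim = prompt_kv[0][0].shape
    empty = np.zeros((batch, heads, 0, dim))
    # Lower layers: empty prefixes; top `depth` layers: the real prompts.
    return [(empty, empty)] * (n_layers - depth) + list(prompt_kv)
```

A key of length 0 concatenated before the sequence leaves attention scores unchanged, so shallow layers behave exactly as if no prompt were present.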