LLaMA-Adapter

Proper comparison between adapter-tuning, LoRA-tuning, prompt-tuning, and prefix-tuning?

Open jzhang38 opened this issue 2 years ago • 1 comment

Parameter-efficient fine-tuning methods have been studied extensively in both small and large language models since BERT. Given this abundance of prior work, why does the paper not include a controlled experiment comparing these different methods?
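
For context, a minimal sketch of how such a controlled comparison could be set up with the HuggingFace `peft` library, holding the base model, data, and training schedule fixed while only the parameter-efficient method varies. The base checkpoint, hyperparameters, and library choice here are illustrative assumptions, not something taken from the paper or this issue:

```python
from transformers import AutoModelForCausalLM
from peft import (
    LoraConfig,
    PrefixTuningConfig,
    PromptTuningConfig,
    TaskType,
    get_peft_model,
)

BASE_MODEL = "huggyllama/llama-7b"  # assumed base checkpoint for illustration

# One config per method; everything else (model, data, optimizer, schedule)
# would be kept identical across runs for a controlled comparison.
peft_configs = {
    "lora": LoraConfig(
        task_type=TaskType.CAUSAL_LM,
        r=8,
        lora_alpha=16,
        target_modules=["q_proj", "v_proj"],
    ),
    "prefix": PrefixTuningConfig(
        task_type=TaskType.CAUSAL_LM,
        num_virtual_tokens=30,
    ),
    "prompt": PromptTuningConfig(
        task_type=TaskType.CAUSAL_LM,
        num_virtual_tokens=30,
    ),
    # Classic bottleneck adapter-tuning is not covered by `peft`; it would need
    # a separate library (e.g. `adapters`) configured with the same training setup.
}

for name, config in peft_configs.items():
    model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)
    model = get_peft_model(model, config)
    print(name)
    model.print_trainable_parameters()  # compare trainable-parameter budgets
    # ...train and evaluate each variant on identical data and hyperparameters...
```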

jzhang38 · May 03 '23 11:05