LaMP
What if I want to use a decoder-only model?
In your evaluate_llm.py, you wrote:
```python
opts = parser.parse_args()
model = AutoModelForSeq2SeqLM.from_pretrained(opts.model_addr, cache_dir=opts.cache_dir)
tokenizer = AutoTokenizer.from_pretrained(opts.model_addr, cache_dir=opts.cache_dir)
collator = DataCollatorForSeq2Seq(tokenizer=tokenizer, model=model, max_length=opts.max_length)
```
But what if I want to use a decoder-only model, like Llama or GPT?
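For concreteness, here is a minimal sketch of the kind of change I mean, assuming the model is loaded with `AutoModelForCausalLM` and a causal-LM collator replaces `DataCollatorForSeq2Seq` (these choices are illustrative, not taken from the repo):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, DataCollatorForLanguageModeling

# Load a decoder-only checkpoint instead of a seq2seq one (sketch, reusing opts from above).
model = AutoModelForCausalLM.from_pretrained(opts.model_addr, cache_dir=opts.cache_dir)
tokenizer = AutoTokenizer.from_pretrained(opts.model_addr, cache_dir=opts.cache_dir)

# Llama-style tokenizers often have no pad token, so one usually has to be assigned.
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

# DataCollatorForSeq2Seq assumes encoder-decoder batches; a causal-LM collator
# with mlm=False pads the inputs and uses them as labels instead.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)
```

Would the rest of the evaluation pipeline work with a setup like this, or does it assume an encoder-decoder model elsewhere?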