totaltube

Results: 9 comments of totaltube

Same error when opening a LoRA adapter for additional training. Also with a seq2seq model, flan-t5:

```python
from peft import PeftConfig, PeftModel
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

config = PeftConfig.from_pretrained(peft_model_id)
tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)
model = AutoModelForSeq2SeqLM.from_pretrained(config.base_model_name_or_path)
model = PeftModel.from_pretrained(model, peft_model_id).to(device)
model.print_trainable_parameters()
```

And...

> Hello, `is_trainable=True` is required to load the pretrained adapter and have them in trainable mode. Please pass `PeftModel.from_pretrained(model, peft_model_id, is_trainable=True).to(device)` and let us know if that solves the issue...
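For what it's worth, a minimal sketch of the loader with the suggested `is_trainable=True` applied; the adapter path and device below are placeholders, not values from the original issue:

```python
# Minimal sketch of resuming training on a saved LoRA adapter with PEFT.
# `peft_model_id` and `device` are placeholders for illustration only.
import torch
from peft import PeftConfig, PeftModel
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

peft_model_id = "path/to/lora-adapter"  # placeholder adapter directory
device = "cuda" if torch.cuda.is_available() else "cpu"

config = PeftConfig.from_pretrained(peft_model_id)
tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)
model = AutoModelForSeq2SeqLM.from_pretrained(config.base_model_name_or_path)

# is_trainable=True keeps the adapter weights unfrozen so training can continue.
model = PeftModel.from_pretrained(model, peft_model_id, is_trainable=True).to(device)
model.print_trainable_parameters()  # should now report trainable LoRA parameters
```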

Looks like PEFT works with XGLM.

> No, I tried to use it, but target_modules is missing in LoraConfig. Ah, I tried it with prefix tuning. Looks like it works.
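For context, a rough sketch of the prefix-tuning route with PEFT; the XGLM checkpoint name and the `num_virtual_tokens` value are illustrative assumptions:

```python
# Rough sketch of prefix tuning an XGLM model via PEFT.
# The checkpoint name and num_virtual_tokens are assumptions, not from the issue.
from peft import PrefixTuningConfig, TaskType, get_peft_model
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/xglm-564M"  # assumed XGLM checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Prefix tuning sidesteps the need to name target_modules explicitly,
# which is what LoraConfig would have required for XGLM here.
peft_config = PrefixTuningConfig(task_type=TaskType.CAUSAL_LM, num_virtual_tokens=20)
model = get_peft_model(model, peft_config)
model.print_trainable_parameters()
```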

> What version or codebase of OpenNMT-py did you use? It seems that you have both position_encoding=True and max_relative_position != 0; this combination is now tested and avoided: https://github.com/OpenNMT/OpenNMT-py/blame/master/onmt/utils/parse.py#L302

Master version on...
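A simplified sketch of the kind of consistency check the linked parse.py line performs (not the actual OpenNMT-py code): absolute position encoding and relative positions are mutually exclusive, so the combination is rejected at option-parsing time.

```python
# Simplified illustration of the option check, not the real OpenNMT-py code.
def validate_position_options(position_encoding: bool, max_relative_positions: int) -> None:
    if position_encoding and max_relative_positions != 0:
        raise ValueError(
            "position_encoding=True cannot be combined with "
            f"max_relative_positions={max_relative_positions}; use one or the other."
        )

# This combination is the one flagged in the discussion above:
validate_position_options(position_encoding=True, max_relative_positions=20)  # raises ValueError
```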

With max_relative_position = 20 the conversion goes OK, but with -1 or -2 it fails.

> I see that you are trying to convert an encoder-decoder model (`_get_model_spec_seq2seq` is in the stack trace), but the converter currently does not handle `max_relative_positions: -1` or `max_relative_positions: -2` for...
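One way to spot this before running the converter is to inspect the checkpoint's training options. This sketch assumes the usual OpenNMT-py checkpoint layout, where the options are stored under the "opt" key; the path is a placeholder:

```python
# Check the relative-position setting of an OpenNMT-py checkpoint before
# attempting CTranslate2 conversion. Path and layout are assumptions.
import torch

checkpoint = torch.load("model_step_10000.pt", map_location="cpu")  # placeholder path
opt = checkpoint["opt"]  # training options saved with the checkpoint

mrp = getattr(opt, "max_relative_positions", 0)
if mrp in (-1, -2):
    print(f"max_relative_positions={mrp}: not handled by the seq2seq converter")
else:
    print(f"max_relative_positions={mrp}: conversion should be possible")
```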

> At least, I tested it with the following options: add_ffnbias: false, multiquery: true, add_qkvbias: false. I also added other layers to ensure that the model has the same or...
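For reference, the option combination quoted above, collected in one place; in an actual run these would be keys in the OpenNMT-py training YAML, and this snippet is only an illustration, not a complete or verified config:

```python
# Option combination quoted above, gathered for clarity (illustrative only).
options_tested = {
    "add_ffnbias": False,   # no bias on the feed-forward layers
    "multiquery": True,     # multi-query attention
    "add_qkvbias": False,   # no bias on the Q/K/V projections
}

# Print them in the YAML-style "key: value" form used by OpenNMT-py configs.
print("\n".join(f"{k}: {str(v).lower()}" for k, v in options_tested.items()))
```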