
Resume the training of a LoRA

milyiyo opened this issue 1 year ago · 1 comment

Hi everyone,

Is there any example showing how to resume the training of a LoRA, if it is possible?

milyiyo avatar Mar 29 '23 16:03 milyiyo

Yes! I am also wondering when we will have such a tutorial. It would be of great use.

REIGN12 avatar Apr 14 '23 02:04 REIGN12

Is this still WIP, or was it postponed? :)

achibb avatar May 08 '23 19:05 achibb

Hi everyone, this feature has recently been introduced in the HF trainer here: https://github.com/huggingface/transformers/pull/24274 - you can benefit from it if you install transformers from source. Examples are attached to that PR, but the TL;DR is: call trainer.train(resume_from_checkpoint=True) and make sure you have already trained a model using the HF trainer in the same output folder. Closing the issue; feel free to re-open if you think this has not been addressed! Thanks
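A minimal sketch of what that looks like in practice. The Trainer writes checkpoints as `checkpoint-<step>` folders inside `output_dir`, and `resume_from_checkpoint=True` picks up the latest one. The `find_last_checkpoint` helper below is a simplified stdlib-only re-implementation of that folder lookup (transformers ships its own `get_last_checkpoint` in `transformers.trainer_utils`); the Trainer calls in the comments assume a PEFT-wrapped `model` and a `train_dataset` that are not defined here.

```python
# Sketch: resuming LoRA training with the Hugging Face Trainer, per
# https://github.com/huggingface/transformers/pull/24274.
import os
import re
from typing import Optional


def find_last_checkpoint(output_dir: str) -> Optional[str]:
    """Return the highest-numbered `checkpoint-<step>` folder in output_dir,
    or None if no checkpoint has been written yet. Simplified stand-in for
    transformers.trainer_utils.get_last_checkpoint."""
    pattern = re.compile(r"^checkpoint-(\d+)$")
    best_step, best_path = -1, None
    for name in os.listdir(output_dir):
        match = pattern.match(name)
        path = os.path.join(output_dir, name)
        if match and os.path.isdir(path):
            step = int(match.group(1))
            if step > best_step:
                best_step, best_path = step, path
    return best_path


# Illustrative Trainer usage (assumes `model` is a PEFT/LoRA-wrapped model
# and `train_dataset` exists; not executed here):
#
# from transformers import Trainer, TrainingArguments
#
# args = TrainingArguments(output_dir="lora-out", save_steps=500)
# trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
# # True = resume from the latest checkpoint in output_dir; you can also
# # pass an explicit path such as the one returned by find_last_checkpoint.
# trainer.train(resume_from_checkpoint=True)
```

Note that resuming this way restores the optimizer and scheduler state as well as the adapter weights, so the run continues from the saved step rather than starting over.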

younesbelkada avatar Jun 21 '23 15:06 younesbelkada