
[Question] How to do a 2nd finetune on a finetuned checkpoint with LoRA?

Open · YangQiuEric opened this issue 1 year ago · 2 comments

Question

I used train_mem.py with LoRA for the initial finetuning, similar to the question. Now I want to perform another round of finetuning on a different dataset, starting from the checkpoint of the first finetune. How can I accomplish this?

YangQiuEric · Jan 29 '24

Here's what I tried:

  1. Go to /pvcvolume/LLaVA/llava/train/train.py

  2. Find the TrainingArguments dataclass and add one more argument (see the sketch after this list): lora_path: str = field(default=None, metadata={"help": "Path to the previous lora folder."})

  3. Go back to /pvcvolume/LLaVA/llava/train/train.py around line 840, where the model is wrapped with LoRA, and add:

if training_args.lora_path:
    # Resume from the adapter saved by the first finetune; is_trainable=True
    # keeps the LoRA weights unfrozen so the second finetune can update them.
    # (Needs: from peft import PeftModel)
    model = PeftModel.from_pretrained(model, training_args.lora_path, is_trainable=True)
else:
    model = get_peft_model(model, lora_config)  # stock path: attach a fresh adapter here
  4. Pass --lora_path in your training .sh script (a sample invocation is sketched below).
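For reference, here is a minimal sketch of how steps 2 and 3 fit together, assuming the stock LLaVA train.py layout and the peft library. The elided fields stand for the existing LLaVA LoRA arguments, and wrap_with_lora is only a hypothetical name for the block around line 840, not a function that exists in the repo.

from dataclasses import dataclass, field

import transformers
from peft import LoraConfig, PeftModel, get_peft_model

@dataclass
class TrainingArguments(transformers.TrainingArguments):
    # ... keep the existing fields (lora_enable, lora_r, lora_alpha, ...) ...
    lora_path: str = field(default=None, metadata={"help": "Path to the previous lora folder."})

def wrap_with_lora(model, training_args, lora_config: LoraConfig):
    # Stands in for the block around line 840 where the model is wrapped with LoRA.
    if training_args.lora_path:
        # Second finetune: load the adapter from the first run and keep it trainable.
        model = PeftModel.from_pretrained(model, training_args.lora_path, is_trainable=True)
    else:
        # First finetune: attach a fresh adapter, as the stock code does.
        model = get_peft_model(model, lora_config)
    return model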
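And a sketch of step 4, modeled on the repo's finetune_lora.sh launch script; the checkpoint and dataset paths here are placeholders, and the remaining flags are whatever your first finetune used.

deepspeed llava/train/train_mem.py \
    --lora_enable True \
    --lora_path ./checkpoints/llava-lora-first-finetune \
    --data_path ./path/to/second_dataset.json \
    --output_dir ./checkpoints/llava-lora-second-finetune
    # ...plus the deepspeed config, model_name_or_path, and the rest of the
    # flags from your original finetune script, unchanged.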

LeoLee7 · Jan 29 '24

@LeoLee7 I first finetuned LLaVA-7B on my own dataset and found the resulting model was not good. So I want to perform another round of finetuning on another dataset (same task), starting from the checkpoint of the first finetune. Can I use your method to do this? Thanks!

cherry956 · Feb 22 '24