Kuhn
Same here. I don't know what's wrong with it.
I can give you the Baidu Netdisk link to download it if you message me.
> @basteran @amanysalahfattouh @haotian-liu , have you tried using the flag `--pretrain_mm_mlp_adapter` with the path set to `non_lora_trainables.bin` of your finetuned model? I have been facing the same issue as...
Now I understand it: look at the LLaVA project and you will find the two-stage weight-loading method. If anyone still doesn't get it, contact me.
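For anyone else stuck here, this is a minimal sketch of what the two-stage idea amounts to, with plain dicts standing in for PyTorch state dicts; the key names and the `load_two_stage` helper are illustrative only, not LLaVA's actual API. Stage one loads the base checkpoint, stage two overlays the separately saved trainables (e.g. the mm projector weights kept in `non_lora_trainables.bin`).

```python
def load_two_stage(base_weights, extra_trainables):
    """Overlay extra trainable weights on top of the base checkpoint.

    base_weights: dict standing in for the base model state_dict (stage 1).
    extra_trainables: dict standing in for non_lora_trainables.bin (stage 2).
    """
    merged = dict(base_weights)      # stage 1: start from the base checkpoint
    merged.update(extra_trainables)  # stage 2: overwrite with finetuned extras
    return merged


# Hypothetical keys for illustration only.
base = {"llm.layer0.weight": 1.0, "mm_projector.weight": 0.0}
extras = {"mm_projector.weight": 0.5}  # loaded from non_lora_trainables.bin
merged = load_two_stage(base, extras)
print(merged["mm_projector.weight"])  # the finetuned projector wins
```

With real checkpoints the same pattern is `torch.load` on each file followed by `model.load_state_dict(merged, strict=False)`.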
> hi @dongwhfdyer , > > It seems you already successfully reproduced this project. > > I am still confused about the training procedure. > > * Do we only...