
For pretraining with LoRA, what is the expected output? A LoRA adapter, or the complete pretrained model with the adapter merged in?

Open · amitagh opened this issue 10 months ago · 1 comment

For pretraining with LoRA, what is the expected output: a LoRA adapter, or the complete pretrained model with the adapter merged into it? I am pretraining a 7B-parameter LLM with LoRA.

amitagh · Apr 10 '24 06:04

The direct output is a LoRA adapter. You can then merge it into the base model, as shown in https://github.com/hiyouga/LLaMA-Factory/tree/main/examples/merge_lora, to get a complete model.

codemayq · Apr 10 '24 07:04
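
For reference, here is a minimal sketch of that merge step done directly with the Hugging Face `peft` API (the linked LLaMA-Factory example achieves the same result through its own config files). All paths below are placeholders, not actual LLaMA-Factory defaults:

```python
# Minimal sketch: merge a LoRA adapter into its base model with peft.
# Paths are placeholders; point them at your own checkpoints.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_path = "path/to/base-7b-model"      # original pretrained model
adapter_path = "path/to/lora-adapter"    # the adapter produced by training
output_path = "path/to/merged-model"

# Load the base model and attach the trained LoRA adapter.
base = AutoModelForCausalLM.from_pretrained(base_path, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(base, adapter_path)

# Fold the adapter weights into the base weights and drop the LoRA layers,
# leaving a plain standalone model.
merged = model.merge_and_unload()

# Save the merged model (and tokenizer) as a complete checkpoint.
merged.save_pretrained(output_path)
AutoTokenizer.from_pretrained(base_path).save_pretrained(output_path)
```

The merged checkpoint at `output_path` can then be loaded like any ordinary pretrained model, with no `peft` dependency at inference time.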