LMFlow
Why is the lora_model so large after fine-tuning?
The checkpoints are intermediate full models. You can use the adapter_model.bin once training is completed.
How can I extract the adapter_model.bin file from the checkpoint folder?
Due to an incompatibility between the PEFT package and the transformers Trainer, the checkpoint only contains the full model. We will revise the trainer to support PEFT soon.
This issue has been marked as stale because it has not had recent activity. If you think this still needs to be addressed, please feel free to reopen this issue. Thanks.