MiniCPM-o
fp16 or bf16 in LoRA fine-tuning ?
Thanks for your great work! When LoRA fine-tuning MiniCPM-V-2_6, should I use fp16 or bf16? The default setting in 'finetune_lora.sh' is fp16, but the pretrained weights you provide are in bf16, so which one is better for LoRA fine-tuning?
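For context, this is the kind of change I am asking about. A minimal sketch of switching the precision flags, assuming `finetune_lora.sh` passes HF Trainer-style `--fp16`/`--bf16` arguments (an assumption on my part about the script's contents):

```shell
# Hypothetical excerpt from finetune_lora.sh -- flag names assumed,
# not copied from the actual script.
# Default as shipped (half precision):
#   --fp16 true \
#   --bf16 false \
# Candidate change to match the bf16 pretrained weights:
#   --fp16 false \
#   --bf16 true \
# Note: bf16 generally requires Ampere-class (e.g. A100/RTX 30xx) or newer GPUs.
```

My understanding is that bf16 has a wider exponent range and tends to be more numerically stable in training than fp16, but I would like to confirm which setting you used/recommend.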