MiniCPM-V
I want to know about the GPU usage of MiniCPM-Llama3-V 2.5.
How much GPU memory does this model need for inference and for training?
Inference with MiniCPM-Llama3-V 2.5 needs at least 16 GB of GPU memory in fp16, or 8 GB in int4. Full-parameter training needs 8x A100 80GB GPUs. We will release LoRA fine-tuning code in a few days; please stay tuned.
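For intuition, those figures roughly follow from the parameter count and the precision of the weights: memory for the weights alone is parameters x bits-per-parameter / 8. Here is a minimal back-of-envelope sketch, assuming roughly 8B parameters for the model (an assumption, not an official number); real inference needs additional headroom for activations and the KV cache, which is why the quoted minimums sit at or above these weight-only figures.

```python
def weight_memory_gb(n_params: float, bits_per_param: int) -> float:
    """GB (1 GB = 1e9 bytes) needed to hold just the model weights."""
    return n_params * bits_per_param / 8 / 1e9

# Assumed parameter count for MiniCPM-Llama3-V 2.5 (~8B, an approximation).
params = 8e9

print(weight_memory_gb(params, 16))  # fp16 weights: 16.0 GB
print(weight_memory_gb(params, 4))   # int4 weights: 4.0 GB
```

The gap between the 4 GB of int4 weights and the quoted 8 GB minimum is the runtime overhead (activations, KV cache, and any layers kept at higher precision by the quantization scheme).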
Please add this to the README.
Thank you very much!