
I want to know the GPU memory usage of MiniCPM-Llama3-V 2.5

Open praymich opened this issue 1 year ago • 1 comments

I'd like to ask how much GPU memory is needed for inference and training with this model.

praymich avatar May 21 '24 01:05 praymich

Inference with MiniCPM-Llama3-V 2.5 in fp16 needs at least 16 GB of GPU memory; the int4 quantized version needs 8 GB. Full-parameter training needs 8× A100 80GB GPUs. We will release LoRA fine-tuning code in the next several days, please stay tuned.
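The 16 GB / 8 GB figures follow roughly from the weight sizes alone. A minimal back-of-the-envelope sketch, assuming a total parameter count of about 8.5B (the model builds on Llama3-8B plus a vision encoder; the exact count is an assumption here):

```python
# Rough GPU-memory estimate for holding the model weights.
GIB = 1024 ** 3

def weight_memory_gib(n_params: float, bytes_per_param: float) -> float:
    """Memory needed just for the weights, in GiB."""
    return n_params * bytes_per_param / GIB

n_params = 8.5e9  # assumption: ~8.5B total parameters

fp16 = weight_memory_gib(n_params, 2.0)   # fp16 = 2 bytes per parameter
int4 = weight_memory_gib(n_params, 0.5)   # int4 = 4 bits per parameter

print(f"fp16 weights: {fp16:.1f} GiB")  # ~15.8 GiB, matching the 16 GB figure
print(f"int4 weights: {int4:.1f} GiB")  # ~4 GiB for weights alone
```

For int4, activations and the KV cache add overhead on top of the ~4 GiB of weights, which is why the practical requirement is closer to 8 GB.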

iceflame89 avatar May 21 '24 10:05 iceflame89

We should add this to the README.

Cuiunbo avatar May 23 '24 10:05 Cuiunbo

Thank you very much!

praymich avatar May 24 '24 01:05 praymich