Data size for InternVL 1.5 and 2.0
📚 The doc issue
I did not see any information about the data sizes used in the pre-training and fine-tuning stages of InternVL 1.5 and 2.0. Do you use the same datasets and sizes as for InternVL 1.2 Plus, i.e., 39.3M samples for pre-training and 12M for fine-tuning?
Suggest a potential alternative/fix
No response