GeoMIM
GPU count and training time
Hi! It's a nice job!
It seems that data augmentation is used on the nuScenes dataset, yielding 7724 samples per epoch (total batch size = 16), which is more than the original number of nuScenes samples.
I would like to know how many GPUs you used and how long it took to pretrain the Swin-Base Transformer.
Looking forward to your reply! Thank you!
Hi!
We only use the train set for pretraining. Can you describe in more detail how you obtained "7724 samples"?
The base model is pretrained with 16 GPUs for about 2 days (50 epochs).
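A small sketch of the arithmetic may help pin down where "7724" comes from. It assumes the commonly cited keyframe count of the nuScenes train split (28,130 samples) and the total batch size of 16 stated above; both figures are assumptions here, not values verified against this repo's config.

```python
import math

# Assumed keyframe count of the nuScenes train split (not repo-verified).
NUSCENES_TRAIN_SAMPLES = 28_130
TOTAL_BATCH_SIZE = 16  # stated in the thread

# Iterations per epoch if the raw train split is used without resampling.
iters_per_epoch = math.ceil(NUSCENES_TRAIN_SAMPLES / TOTAL_BATCH_SIZE)
print(iters_per_epoch)  # -> 1759

# Conversely, 7724 iterations per epoch at batch size 16 would imply an
# effective dataset size larger than the raw train split, which usually
# points to a resampling scheme in the dataloader rather than the raw split.
implied_samples = 7724 * TOTAL_BATCH_SIZE
print(implied_samples)  # -> 123584
```

Comparing the implied effective dataset size against the raw split size is a quick way to check whether the dataloader is resampling the training data.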