GeoMIM

GPU count and training time

Open Debrove opened this issue 2 years ago • 2 comments

Hi! It's a nice job!

It seems you use data augmentation for the nuScenes dataset, resulting in 7724 samples (total batch size = 16), which is more than the original number of nuScenes samples.

I would like to know how many GPUs you used and how long it took to pretrain the Swin-Base Transformer.

Looking forward to your reply! Thank you!

Debrove avatar Sep 30 '23 09:09 Debrove

Hi!

We only use the train set for pretraining. Can you describe in more detail how you obtained "7724 samples"?

The base model is pretrained with 16 GPUs for about 2 days (50 epochs).
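For reference, here is a minimal sketch of how the iteration count follows from these numbers. It assumes the standard nuScenes train split of 28,130 samples (this figure is not stated in the thread) together with the total batch size of 16 and 50 epochs mentioned above:

```python
import math

# Assumption: 28,130 is the standard nuScenes train-split sample count
# (not stated in this thread). Batch size and epochs are from the thread.
num_train_samples = 28130
total_batch_size = 16
epochs = 50

# Iterations per epoch when the last partial batch is kept (drop_last=False).
iters_per_epoch = math.ceil(num_train_samples / total_batch_size)
total_iters = iters_per_epoch * epochs

print(iters_per_epoch)  # 1759
print(total_iters)      # 87950
```

If a logger reports iterations rather than samples, the number it prints depends on the effective batch size, which may explain discrepancies with the raw sample count.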

jihaonew avatar Oct 10 '23 16:10 jihaonew