Meta-DETR
How long does it take to train one epoch on a single V100?
Why did training one epoch take about a day (roughly 24 hours) on a single 2080 Ti? Is something wrong with my setup, or is that expected?
I have also changed the backbone from ResNet-101 to ResNet-50, and restricted multiscale training to only the 480 and 512 scales.
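For context, Meta-DETR builds on Deformable DETR, where the multiscale training resolutions are usually a hard-coded `scales` list inside the dataset transforms (e.g., in `datasets/coco.py`). A minimal sketch of restricting the scales to 480 and 512, assuming a Deformable-DETR-style `make_coco_transforms`; the exact file and function names in this repo may differ:

```python
# Hypothetical sketch of a Deformable-DETR-style transform builder;
# Meta-DETR's actual code may organize this differently.
import datasets.transforms as T

def make_coco_transforms(image_set):
    normalize = T.Compose([
        T.ToTensor(),
        T.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
    ])

    # The default multiscale list is typically [480, 512, 544, ..., 800].
    # Trimming it to two scales mainly reduces augmentation diversity;
    # per-image compute is set by the sampled resolution, not list length.
    scales = [480, 512]

    if image_set == 'train':
        return T.Compose([
            T.RandomHorizontalFlip(),
            T.RandomResize(scales, max_size=1333),
            normalize,
        ])
    if image_set == 'val':
        return T.Compose([
            T.RandomResize([800], max_size=1333),
            normalize,
        ])
    raise ValueError(f'unknown image_set {image_set}')
```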
Thanks for your interest. Please check that you are correctly following our training scripts.
On Pascal VOC Split 1, one training epoch takes around 19-20 minutes according to my training logs.
Thanks for your reply. In your paper you use 8 × V100; what batch size did you set? I would also like to know the time per epoch when training on MS-COCO. I set batch_size=4 and train on MS-COCO with a single 2080 Ti, which takes about one day per epoch.
I have the same question. I use one Tesla V100 GPU (32 GB), and even after changing batch_size from 4 to 8, it seems to need about 11 hours to train one epoch.
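As a rough sanity check, epoch time scales with the number of iterations (dataset size divided by effective batch size) times the per-iteration cost, so a single GPU is expected to be roughly 8× slower than the paper's 8 × V100 setup before accounting for per-GPU speed differences. A back-of-the-envelope sketch using COCO 2017 train's ~118k images; the per-iteration timings below are values implied by the numbers reported in this thread, not measurements from the authors:

```python
# Back-of-the-envelope epoch-time estimate; all per-iteration timings
# are assumptions inferred from this thread, not official benchmarks.

def epoch_hours(num_images, batch_size_per_gpu, num_gpus, sec_per_iter):
    """Estimated wall-clock hours for one training epoch.

    num_images:         training-set size (COCO 2017 train: ~118,287 images)
    batch_size_per_gpu: per-GPU batch size
    num_gpus:           data-parallel GPU count (effective batch = per-GPU * num_gpus)
    sec_per_iter:       seconds per optimizer step on the given hardware (assumed)
    """
    iters = num_images / (batch_size_per_gpu * num_gpus)
    return iters * sec_per_iter / 3600

# One 2080 Ti, batch 4: ~29.6k iterations per COCO epoch.
# Matching the reported ~24 h implies roughly 2.9 s per iteration.
print(epoch_hours(118_287, 4, 1, 2.9))   # ~23.8 hours

# One V100 (32 GB), batch 8: ~14.8k iterations per epoch.
# Matching the reported ~11 h implies roughly 2.7 s per iteration.
print(epoch_hours(118_287, 8, 1, 2.7))   # ~11.1 hours
```

Under these assumptions the two single-GPU reports are consistent with each other: the V100 run is faster mainly because doubling the batch size halves the iteration count, not because each step is much cheaper.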
Is training this slow for everyone? The 3060 Ti I am using is too slow. Could I get your contact information so we can discuss related issues?
Hello! Your message has been received; I will deal with it as soon as possible.