BEVFormer
Questions about SyncBatchNorm
I noticed that a batch size of 1 is used on each GPU while SyncBatchNorm is not enabled. Wouldn't this lead to inaccurate statistics and parameter updates in the BN layers? I trained bevformer-small on 4 RTX 3090s with a batch size of 2 per GPU and got NDS = 49.0, which is higher than the reported 47.9. I'm wondering whether this comes from SyncBN. A minimal sketch of what I mean by enabling it is below.
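The sketch below is generic PyTorch, not the repo's actual training code: it converts all `BatchNorm` layers to `SyncBatchNorm` before wrapping the model in DDP, so BN statistics are aggregated across GPUs even with a per-GPU batch size of 1 or 2. The function name and variables are placeholders; in an mmcv/mmdet-style config the equivalent would typically be setting `norm_cfg=dict(type='SyncBN', requires_grad=True)` for the relevant modules.

```python
import torch
from torch.nn.parallel import DistributedDataParallel as DDP

def wrap_with_sync_bn(model: torch.nn.Module, device_id: int) -> torch.nn.Module:
    # Replace every nn.BatchNorm*d layer with nn.SyncBatchNorm in-place.
    model = torch.nn.SyncBatchNorm.convert_sync_batchnorm(model)
    model = model.to(device_id)
    # SyncBatchNorm only synchronizes statistics when a distributed
    # process group has been initialized (e.g. via torchrun).
    return DDP(model, device_ids=[device_id])
```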