MOTRv2
Why does batch_size only support 1 per GPU?
Hi, the original Deformable DETR repo supports different batch sizes.
Why does MOTRv2 only support batch_size == 1?
I think the tracking task needs to learn associations across the image sequence, so batch_size * num_frames_per_batch is the effective batch size during actual training. Using batch_size > 1 would increase code complexity and aggravate GPU memory consumption.
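For concreteness, a minimal sketch of that arithmetic with assumed values (the numbers below are illustrative, not taken from any MOTRv2 config): each sample handed to the model is a clip of consecutive frames, so one forward pass already covers several images even though batch_size per GPU is 1.

```python
batch_size = 1              # clips per GPU (assumed value)
num_frames_per_batch = 5    # frames sampled per clip (assumed value)
effective_batch = batch_size * num_frames_per_batch
print(effective_batch)      # 5 frames processed per training iteration
```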
Implementing batch size > 1 is possible, but it requires some work on handling track queries with variable lengths between batch elements. Batch size 1 already uses all GPU memory on a 2080Ti (with checkpointing) or a V100 (without checkpointing), so we do not train with a larger batch size.
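One way to handle those variable-length track queries would be padding plus a key-padding mask, roughly as sketched below. This is a hypothetical helper under assumed shapes, not part of the MOTRv2 codebase; `pad_track_queries` and its signature are illustrative.

```python
import torch

def pad_track_queries(query_list, embed_dim=256):
    """Pad per-image track query sets to a common length (hypothetical helper).

    query_list: list of tensors, each of shape (num_tracks_i, embed_dim),
    where num_tracks_i differs between batch elements.
    Returns padded queries of shape (B, max_tracks, embed_dim) and a boolean
    mask of shape (B, max_tracks) that is True at padded positions, following
    the key_padding_mask convention of torch.nn.MultiheadAttention.
    """
    max_tracks = max(q.shape[0] for q in query_list)
    batch = len(query_list)
    padded = query_list[0].new_zeros(batch, max_tracks, embed_dim)
    pad_mask = torch.ones(batch, max_tracks, dtype=torch.bool,
                          device=query_list[0].device)
    for i, q in enumerate(query_list):
        padded[i, : q.shape[0]] = q
        pad_mask[i, : q.shape[0]] = False  # real queries are not masked
    return padded, pad_mask

# Example: two images carrying 3 and 5 active track queries.
queries = [torch.randn(3, 256), torch.randn(5, 256)]
padded, mask = pad_track_queries(queries)
print(padded.shape, mask.shape)  # torch.Size([2, 5, 256]) torch.Size([2, 5])
```

The padded positions would also need to be excluded from the matching and loss computation, which is part of the extra work mentioned above.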