Sorry for the late reply. We have not conducted an analysis of the computation cost for the PETR series. The "get_flops.py" you found in the repo is not used for calculating...
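If you want a rough estimate yourself, a generic counter such as fvcore can be applied to the image backbone alone. The snippet below is only a sketch on a plain ResNet-50 (fvcore and torchvision are assumptions, not part of this repo), since PETR's full forward with img_metas is not handled by such tools out of the box:

```python
# Illustrative only: rough FLOP counting with fvcore on a standalone backbone.
import torch
from fvcore.nn import FlopCountAnalysis
from torchvision.models import resnet50

model = resnet50().eval()
dummy = torch.randn(1, 3, 224, 224)  # one 224x224 image

flops = FlopCountAnalysis(model, dummy)
print(f"backbone FLOPs: {flops.total() / 1e9:.2f} G")
```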
Hi, thanks for pointing out this problem! As you mentioned, the extrinsic parameters (3D PE in our paper) as well as the gt_maps should be transformed together when applying...
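As a rough illustration of keeping the two consistent (not the repo's actual augmentation code; the function and variable names here are placeholders, and it assumes a global yaw rotation with extrinsics that map ego-frame points to the camera frame and a 2D (H, W) BEV GT map):

```python
# Hypothetical sketch: apply the same yaw rotation to camera extrinsics
# and to the BEV GT map so the 3D PE and the labels stay geometrically aligned.
import numpy as np
from scipy.ndimage import rotate as ndi_rotate

def yaw_rotation(theta: float) -> np.ndarray:
    """4x4 homogeneous rotation about the z (up) axis."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.eye(4)
    R[:2, :2] = [[c, -s], [s, c]]
    return R

def augment(extrinsics: np.ndarray, gt_map: np.ndarray, theta_deg: float):
    """extrinsics: (N, 4, 4) ego->camera transforms; gt_map: (H, W) labels."""
    R = yaw_rotation(np.deg2rad(theta_deg))
    # If T maps ego points to the camera frame and ego points are rotated by R,
    # the new extrinsic is T @ R^-1 so that T' (R p) == T p still holds.
    new_extrinsics = extrinsics @ np.linalg.inv(R)
    # Rotate the BEV GT by the same angle (the sign depends on the map's
    # axis convention); nearest-neighbour keeps the labels discrete.
    new_map = ndi_rotate(gt_map, angle=theta_deg, axes=(0, 1),
                         reshape=False, order=0)
    return new_extrinsics, new_map
```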
Yes, the batch size is 1 for both training and evaluation.
Thanks for pointing out this issue. Indeed, there is a silly bug in calculating the IoU metric. We will first update the README to remind the users of PETRv2...
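For reference, a minimal per-map IoU looks like the sketch below (this is not the repo's evaluation code; the threshold and map layout are placeholders):

```python
# Minimal sketch of IoU over one binary BEV map:
# IoU = |pred AND gt| / |pred OR gt|.
import torch

def binary_iou(pred: torch.Tensor, gt: torch.Tensor, thr: float = 0.5) -> float:
    pred_bin = pred > thr          # binarize the predicted score map
    gt_bin = gt > 0.5              # GT is already {0, 1}
    inter = (pred_bin & gt_bin).sum().item()
    union = (pred_bin | gt_bin).sum().item()
    return inter / union if union > 0 else float("nan")
```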
@shwoo93 I also reproduced the result of the original SSD with the RFB training scheme and got 79.04%, so I think it is fair to compare this result with the RFB...
> I am sorry for the late reply. We use this checkpoint (https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window7_224_22kto1k.pth), which is pretrained on ImageNet22k and finetuned on ImageNet1k with a 224x224 input size. The training config for...
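If it helps, the checkpoint can be fetched and inspected like this (just an illustration; the official Swin releases typically wrap the weights in a "model" key, which is an assumption here, not something from this thread):

```python
# Download the Swin-L checkpoint and count its tensors; backbone construction omitted.
import torch

url = ("https://github.com/SwinTransformer/storage/releases/download/"
       "v1.0.0/swin_large_patch4_window7_224_22kto1k.pth")
state = torch.hub.load_state_dict_from_url(url, map_location="cpu")
ckpt = state.get("model", state)  # official Swin releases usually nest weights under "model"
print(len(ckpt), "tensors in the checkpoint")
```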
Yes, the MOTRv2 model can be converted to TensorRT. In our practice, it runs at 14 ms with INT8 quantization. In the future, we may release the TensorRT version of...
Since the paper is still under review, I think it will also be released in the near future, depending on the submission result. Currently, we have only released the DanceTrack one.
We first convert MOTRv2 to an ONNX model and then convert the ONNX model to a TensorRT one. As for the input dict question, you may rethink the forward process and refer...
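A rough sketch of that two-step path is below, using a toy module only; MOTRv2's real forward takes a dict of tracking state, so in practice you would wrap it so the traced graph sees plain tensors. The model, file names, and shapes here are placeholders, not the exported model we used:

```python
# Step 1: PyTorch -> ONNX on a stand-in module (placeholder for the wrapped tracker).
import torch

class TinyNet(torch.nn.Module):
    def forward(self, x):
        return torch.relu(x).mean(dim=(2, 3))

model = TinyNet().eval()
dummy = torch.randn(1, 3, 128, 128)

torch.onnx.export(model, dummy, "toy.onnx", opset_version=13,
                  input_names=["image"], output_names=["feat"])

# Step 2: ONNX -> TensorRT engine, e.g. with the trtexec CLI
# (INT8 mode additionally needs a calibration dataset):
#   trtexec --onnx=toy.onnx --saveEngine=toy.engine --int8
```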