mmdetection-to-tensorrt
Performs well on the NVIDIA AGX platform, and some questions about batch inference
Thanks for your help. I have tested this tool on the NVIDIA AGX platform, and we get nearly a 3x speedup compared with using mmdetection directly. That's an amazing result! And I have another question, about batch inference. mmdetection 2.4 now supports batch inference, and it would be useful to add batch inference to this tool. The existing batch inference method in this tool does not work well, right?

By the way, this is a really good tool for people who use mmdetection. I know this is your part-time project, and there are still some inference questions that came up while testing. Are you planning to publish a more detailed tutorial? I will use this tool for a long time, and I would like to contribute if I can, such as writing docs, hhhh. My email: [email protected] and wechat: heboyong
Hi
Batch inference works on most models. Just set the opt_shape_param:
opt_shape_param=[
    [
        [1, 3, 320, 320],     # minimum input shape
        [1, 3, 800, 1312],    # shape TensorRT optimizes for
        [4, 3, 1344, 1344],   # maximum input shape (batch of 4)
    ]
]
This should give you a model with batch support (max_batch_size=4).
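For reference, here is a minimal end-to-end conversion sketch based on this project's Python API. It assumes the mmdet2trt() entry point; cfg_path, weight_path, and save_path are placeholders for your own files, and fp16_mode is optional:

import torch
from mmdet2trt import mmdet2trt

# min / optimize / max input shapes; the first dimension of the max
# shape sets the largest batch the engine will accept (here, 4)
opt_shape_param = [
    [
        [1, 3, 320, 320],
        [1, 3, 800, 1312],
        [4, 3, 1344, 1344],
    ]
]

# cfg_path and weight_path are placeholders for your mmdetection
# config file and checkpoint
trt_model = mmdet2trt(
    cfg_path,
    weight_path,
    opt_shape_param=opt_shape_param,
    fp16_mode=True,   # optional; enable if the device supports FP16
    device="cuda:0",
)

# save the converted model for later inference
torch.save(trt_model.state_dict(), save_path)

At inference time you can then stack up to four preprocessed images into a single [N, 3, H, W] tensor and run them through the engine in one pass.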
And of course, I do need more help on this project. I will create a group on QQ or WeChat later. Let's work together to make this project better.
@grimoire Would you like to invite me to the WeChat group? My WeChat id: jintianiloveu
@heboyong @jinfagang I have created a QQ group: 1107959378. Join if you want to discuss or participate in this project.