mmdetection-to-tensorrt
Convert MMDetection models to TensorRT; supports fp16, int8, batch input, dynamic shape, etc.
@grimoire Debug output during TRT inference:
mmdet2trt/apis create_wrap_detector isinstance(trt_model, str)
torch2trt_dynamic.py initial TRTModule input_names, output_names: None None
mmdet2trt init_trt_model
[01/28/2022-16:45:39] [TRT] [W] TensorRT was linked against cuBLAS/cuBLAS LT 10.2.3 but loaded cuBLAS/cuBLAS LT 10.2.2...
Hello. I successfully installed this lib; thanks for your recommendation. I tested the simple demo and produced my own TensorRT .pth file (checkpoint). My model is a customized mask_rcnn_r50_fpn_fp16_1x_coco. First image demo...
I have converted an MMDet model using this tool and obtained an output 'model.engine'. However, when using DeepStream (with the amirstan plugin) the inference does not work as...
1. For model conversion I modified Infrence.py like this: trt_model = mmdet2trt(cfg_path, args.checkpoint, output_names=["num_detections", "boxes", "scores", "classes", "masks"], fp16_mode=args.fp16, device=args.device, enable_mask=True) torch.save(trt_model.state_dict(), args.save_path) 2. DeepStream config file: output-bbox-name=bbox output-blob-names=num_detections;boxes;scores;classes;masks parse-bbox-func-name=NvDsInferParseMmdet custom-lib-path=nvdsinfer_custom_impl_fasterRCNN/libamirstan_plugin.so
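For context, an engine exported with the output_names above yields batched tensors that a parser must unpack per image. Below is a minimal sketch in plain NumPy (independent of mmdet2trt and DeepStream; the array shapes and the `unpack_detections` helper are assumptions inferred from the output names, not the plugin's actual parser):

```python
import numpy as np

def unpack_detections(num_detections, boxes, scores, classes, score_thr=0.3):
    """Unpack batched detector outputs into per-image detection lists.

    Assumed shapes for batch size B and max detections N:
      num_detections: (B,)   boxes: (B, N, 4)
      scores: (B, N)         classes: (B, N)
    Only the first num_detections[b] rows of image b are valid.
    """
    results = []
    for b in range(len(num_detections)):
        n = int(num_detections[b])
        keep = scores[b, :n] >= score_thr
        results.append({
            "boxes": boxes[b, :n][keep],
            "scores": scores[b, :n][keep],
            "classes": classes[b, :n][keep].astype(int),
        })
    return results

# Toy batch: one image, 3 valid detections padded out to N=100.
num = np.array([3])
bxs = np.zeros((1, 100, 4))
bxs[0, :3] = [[0, 0, 10, 10], [5, 5, 20, 20], [1, 1, 2, 2]]
scr = np.zeros((1, 100))
scr[0, :3] = [0.9, 0.6, 0.1]
cls = np.zeros((1, 100))
cls[0, :3] = [1, 2, 3]

out = unpack_detections(num, bxs, scr, cls)
print(len(out[0]["boxes"]))  # → 2 (the 0.1-score box is filtered out)
```

The key detail is slicing with `num_detections` first: the padded rows beyond it contain garbage, which is a common source of "inference does not work" symptoms when a custom parser reads all N rows.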
**Describe the bug** I'm using `mmdet2trt ../mmdetection/configs/detr/detr_r50_8x2_150e_coco.py ./detr_r50_8x2_150e_coco_20201130_194835-2c4b8974.pth detr.trt` to convert DETR to TensorRT, but I get the following error: Use load_from_local loader /opt/conda/lib/python3.8/site-packages/torch/nn/functional.py:718: UserWarning: Named tensors and all their...
WARNING:root:can't find wrap module for type:, use instead.
[TensorRT] ERROR: Parameter check failed at: ../builder/Layers.cpp::TopKLayer::3499, condition: k > 0 && k 0
../builder/Layers.cpp:3534 Aborting...
Traceback (most recent call last): File...
**Describe the bug** First, thank you very much for your contribution. When I tried to convert a custom Side-Aware Boundary Localization (SABL) model, I encountered [checkSanity.cpp::checkSanity::106] Error Code 2: Internal Error....
**Describe the bug** I installed the newest MMDetection version but cannot convert a simple Faster R-CNN model. ``` mim download mmdet --config faster_rcnn_x101_64x4d_fpn_1x_coco --dest . mmdet2trt --save-engine=true --min-scale 1 3...
I am trying to convert TridentNet to an engine, but I get an error: load checkpoint from local path: /home/ariel/mmdetection2/tridentnet/epoch_33.pth can't find wrap module for type:, use instead. can't find wrap module...
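The recurring "can't find wrap module for type:, use instead." warning (the type names were stripped by the page's HTML rendering) points at a converter-registry lookup: each supported module type maps to a TensorRT wrapper, and unknown types fall back to a default, which often fails later in the build. A minimal illustrative sketch of that mechanism (the `WRAP_REGISTRY` dict and `find_wrap_module` name are hypothetical, not mmdet2trt's actual API):

```python
import warnings

# Hypothetical converter registry: module type name -> wrapper name.
WRAP_REGISTRY = {
    "Conv2d": "ConvWrapper",
    "BatchNorm2d": "BNWrapper",
}

def find_wrap_module(type_name, default="IdentityWrapper"):
    """Look up a wrapper for a module type, warning on fallback.

    Mirrors the shape of the log line
    "can't find wrap module for type:<X>, use <default> instead."
    """
    wrapper = WRAP_REGISTRY.get(type_name)
    if wrapper is None:
        warnings.warn(
            f"can't find wrap module for type:{type_name}, "
            f"use {default} instead.")
        return default
    return wrapper

print(find_wrap_module("Conv2d"))       # → ConvWrapper (registered)
print(find_wrap_module("TridentConv"))  # falls back with a warning
```

When a custom model such as TridentNet contains layer types with no registered converter, the fallback keeps conversion going but the resulting engine may be incomplete, so these warnings are worth resolving before debugging downstream errors.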