
ctdet2onnx path question

Open Tony-Hou opened this issue 5 years ago • 15 comments

Question 1: Is the current repo compatible with TensorRT 6.0?

Question 2: I want to convert ctdet_coco_resdcn101.pth to ONNX, but I can't find the path "lib/models/networks" in your current repo.

Question 3: To convert ctdet_coco_dla_2x.pth to ctdet_coco_dla_2x.onnx, where does lib.opts come from? The script uses `from lib.opts import opts` and `from lib.models.model import create_model, load_model`.

Tony-Hou avatar Dec 23 '19 09:12 Tony-Hou

@Tony-Hou You can clone https://github.com/xingyizhou/CenterNet; you will find lib/... there. The .pth file can be downloaded from https://drive.google.com/open?id=1px-Xg7jXSC79QqgsD1AAGJQkuf5m0zh_

I haven't tested the code on TensorRT 6.

CaoWGG avatar Dec 23 '19 10:12 CaoWGG

@CaoWGG Thank you very much!

Tony-Hou avatar Dec 23 '19 10:12 Tony-Hou

@CaoWGG Are the steps below correct?
`cp -r mmdetection/mmdet/ops/dcn/ CenterNet/src/lib/models/networks/`, then `cd CenterNet/src/lib/models/networks`. There is no setup.py file there. What are the correct steps?

Tony-Hou avatar Dec 23 '19 12:12 Tony-Hou

@Tony-Hou I have put dcn in readme/dcn. You can copy it directly to CenterNet/src/lib/models/networks/, then cd CenterNet/src/lib/models/networks/dcn/.

Notice: put the ONNX conversion Python script in CenterNet/src/, then run it from there.

CaoWGG avatar Dec 23 '19 12:12 CaoWGG
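
As a quick sanity check after copying the dcn package, a minimal sketch, assuming the extension built successfully and the check is run from CenterNet/src (the module path is an assumption based on the layout discussed in this issue; adjust it if yours differs):

```python
# Minimal import check, run from CenterNet/src after building the dcn
# extension copied into lib/models/networks/. The path below is an
# assumption based on the layout described in this issue.
from lib.models.networks.dcn.modules.deform_conv import ModulatedDeformConvPack

print(ModulatedDeformConvPack)  # should print the class without an ImportError
```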

@CaoWGG Thanks! I want to convert ctdet_coco_resdcn18.pth to ONNX. Is the Python script below correct?

```python
from lib.opts import opts
from lib.models.model import create_model, load_model
from types import MethodType
import torch
import torch.onnx as onnx
from torch.onnx import OperatorExportTypes
from collections import OrderedDict


def resnet_dcn_forward(self, x):
    x = self.conv1(x)
    x = self.bn1(x)
    x = self.relu(x)
    x = self.maxpool(x)
    x = self.layer1(x)
    x = self.layer2(x)
    x = self.layer3(x)
    x = self.layer4(x)
    x = self.deconv_layers(x)
    ret = []  # change the dict output to a list so it can be exported
    for head in self.heads:
        ret.append(self.__getattr__(head)(x))
    return [ret]


forward = {'resdcn': resnet_dcn_forward}

# change lib/opts.py add_argument('task', default='ctdet'....) to
# add_argument('--task', default='ctdet'....)
opt = opts().init()
opt.arch = 'resdcn_18'
opt.heads = OrderedDict([('hm', 80), ('reg', 2), ('wh', 2)])
opt.head_conv = 256 if 'dla' in opt.arch else 64
print(opt)
model = create_model(opt.arch, opt.heads, opt.head_conv)
model.forward = MethodType(forward[opt.arch.split('_')[0]], model)
load_model(model, '../../ctdet_coco_resdcn18.pth')
model.eval()
model.cuda()
input = torch.zeros([1, 3, 512, 512]).cuda()
onnx.export(model, input, "./ctdet_coco_resdcn18.onnx", verbose=True,
            operator_export_type=OperatorExportTypes.ONNX)
```

Tony-Hou avatar Dec 23 '19 14:12 Tony-Hou

@Tony-Hou

  1. `types` is part of the Python 3 standard library.
  2. You can try it.

CaoWGG avatar Dec 23 '19 14:12 CaoWGG
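
If the export above runs through, a minimal way to sanity-check the resulting file, assuming the onnx Python package is installed and the script wrote ./ctdet_coco_resdcn18.onnx:

```python
# Structural check of the exported model with the onnx package.
import onnx

model = onnx.load("./ctdet_coco_resdcn18.onnx")
onnx.checker.check_model(model)  # raises if the graph is malformed
# Three outputs are expected, one per head (hm, reg, wh).
print([out.name for out in model.graph.output])
```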

@CaoWGG Did you build the CenterNet environment with PyTorch 1.0?
My current environment is PyTorch 1.2, and there are two problems:

`from torch.utils.ffi import create_extension` raises: ImportError: torch.utils.ffi is deprecated. Please use cpp extensions instead.

https://github.com/xingyizhou/CenterNet/issues/3

But even with PyTorch 1.0.1 I still get the torch.utils.ffi deprecation error. Another problem: I checked modules/deform_conv.py and DCN does not exist there. What should I do next?

from .modules.deform_conv import (DeformConv, ModulatedDeformConv,
ImportError: cannot import name 'DCN' from 'lib.models.networks.dcn.modules.deform_conv' (/root/work/CenterNet/src/lib/models/networks/dcn/modules/deform_conv.py)

Tony-Hou avatar Dec 24 '19 04:12 Tony-Hou
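
On the torch.utils.ffi error: ffi-based builds were removed after PyTorch 1.0, and extensions of this kind are normally built with torch.utils.cpp_extension instead. A rough sketch of that style of setup.py, with placeholder module and source file names (use whatever .cpp/.cu files the dcn package actually ships):

```python
# Sketch of a cpp_extension-based setup.py; the module name and source file
# names below are placeholders, not the actual files in the dcn package.
from setuptools import setup
from torch.utils.cpp_extension import BuildExtension, CUDAExtension

setup(
    name='deform_conv',
    ext_modules=[
        CUDAExtension(
            name='deform_conv_cuda',
            sources=['src/deform_conv_cuda.cpp',
                     'src/deform_conv_cuda_kernel.cu'],
        ),
    ],
    cmdclass={'build_ext': BuildExtension},
)
```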

@Tony-Hou
1. You need to change `from .DCNv2.dcn_v2 import DCN` to `from .dcn.modules.deform_conv import ModulatedDeformConvPack as DCN` in pose_dla_dcn.py and resnet_dcn.py.
2. Sorry, that is my fault. I have updated my code. You can refer to here.

CaoWGG avatar Dec 24 '19 05:12 CaoWGG

@CaoWGG Thanks! Could I add you on WeChat?

Tony-Hou avatar Dec 24 '19 05:12 Tony-Hou

Not compatible with TensorRT 6; it gives this error:

WARNING: ONNX model has a newer ir_version (0.0.4) than this parser was built against (0.0.3).
Parsing model
While parsing node number 0 [Conv -> "401"]:
ERROR: /onnx-tensorrt/ModelImporter.cpp:537 In function importModel:
[5] Assertion failed: tensors.count(input_name)

Also, under CUDA 10 you need to change CMakeLists.txt to build onnx2trt.

lucasjinreal avatar Dec 24 '19 09:12 lucasjinreal
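
To narrow down what the parser rejects, a small diagnostic sketch, assuming the onnx Python package is installed (the file name is only an example):

```python
# Print the IR version the model was written with and the graph inputs,
# which is what the tensors.count(input_name) assertion checks against.
import onnx

model = onnx.load("ctdet_coco_dla_2x.onnx")  # example file name
print("ir_version:", model.ir_version)
print("graph inputs:", [i.name for i in model.graph.input])
print("node 0 inputs:", list(model.graph.node[0].input))
```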

@jinfagang Can you share your changes to CMakeLists.txt?

CaoWGG avatar Dec 24 '19 13:12 CaoWGG

@CaoWGG Add cublas to target_link_libraries for the plugin.

lucasjinreal avatar Dec 25 '19 02:12 lucasjinreal

@jinfagang thanks

CaoWGG avatar Dec 25 '19 03:12 CaoWGG

@jinfagang How do you modify the CMakeLists for the plugin? For instance, `target_link_libraries(plugin libcublas.so)`? Thanks.

universebang avatar Mar 06 '20 02:03 universebang

> Not compatible with TensorRT 6; it gives this error:
>
> WARNING: ONNX model has a newer ir_version (0.0.4) than this parser was built against (0.0.3).
> Parsing model
> While parsing node number 0 [Conv -> "401"]:
> ERROR: /onnx-tensorrt/ModelImporter.cpp:537 In function importModel:
> [5] Assertion failed: tensors.count(input_name)
>
> Also, under CUDA 10 you need to change CMakeLists.txt to build onnx2trt.

I changed the plugin's CMakeLists from `target_link_libraries(nvonnxparser_plugin ${TENSORRT_LIBRARY} cuda cudart cublas)` to `target_link_libraries(nvonnxparser_plugin ${TENSORRT_LIBRARY} ${CUDA_LIBRARIES})` and it doesn't work. Is there any solution for TensorRT 6?

yzpfyang avatar May 12 '20 09:05 yzpfyang