dcnv2_trt
How to implement the DCNv2_TRT plugin in onnx-tensorrt
First of all, thank you for the awesome implementation.
I compiled the source code successfully and the plugin is found when exporting the ONNX model, but I got an error about the output dimensions of DCNv2, and I think the cause is related to builtin_op_importers.cpp.
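As far as I understand, with a dynamic-shape plugin the output shape comes from the plugin's getOutputDimensions override rather than from builtin_op_importers.cpp itself, so that is where I would check first. Below is a minimal sketch of my own (not the repo's actual code) of how the standard convolution output formula can be expressed there; mOutChannels, mKernel, mStride, mPadding and mDilation are assumed plugin fields parsed from the node attributes, and the exact method signature varies slightly between TensorRT versions:

#include <NvInfer.h>

using namespace nvinfer1;

// Sketch only: assumes input 0 is the NCHW feature map and that the plugin
// stores its convolution hyper-parameters in the fields named below.
DimsExprs DCNv2PluginDynamic::getOutputDimensions(
    int outputIndex, const DimsExprs* inputs, int nbInputs, IExprBuilder& exprBuilder)
{
    DimsExprs out;
    out.nbDims = 4;
    out.d[0] = inputs[0].d[0];                     // batch follows the input
    out.d[1] = exprBuilder.constant(mOutChannels); // channels come from the weights
    // Spatial dims: H_out = (H + 2*pad - dilation*(kernel - 1) - 1) / stride + 1
    for (int i = 0; i < 2; ++i)
    {
        const IDimensionExpr* padded = exprBuilder.operation(DimensionOperation::kSUM,
            *inputs[0].d[2 + i], *exprBuilder.constant(2 * mPadding));
        const IDimensionExpr* window = exprBuilder.operation(DimensionOperation::kSUB,
            *padded, *exprBuilder.constant(mDilation * (mKernel - 1) + 1));
        const IDimensionExpr* strided = exprBuilder.operation(DimensionOperation::kFLOOR_DIV,
            *window, *exprBuilder.constant(mStride));
        out.d[2 + i] = exprBuilder.operation(DimensionOperation::kSUM,
            *strided, *exprBuilder.constant(1));
    }
    return out;
}

If the dimensions reported there don't match what the ONNX graph expects, that would explain the output-dim error.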
I found some differences compared with other versions of the DCNv2 plugin code, such as:
void DCNv2PluginDynamic::configurePlugin(const DynamicPluginTensorDesc* inputs, int nbInputs,
    const DynamicPluginTensorDesc* outputs, int nbOutputs)
{
    // Validate input arguments
    assert(nbInputs == 4);
    assert(nbOutputs == 1);
Could you explain what the four inputs mean?
Here is my Netron screenshot of the input attributes:
Thank you in advance.
I ran into this problem too. Have you solved it?
The author's C++ code doesn't document what the inputs mean; you can refer to this instead: https://github.com/lesliejackson/TensorRT-DCNv2-Plugin
I couldn't run the code without the input nodes being clarified in C++, so I referred to that repo.
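From what I can tell from other DCNv2 TensorRT ports (I have not verified this against this repo, so please check the exporter/Python side), the four inputs are usually the feature map, the offset, the mask and the convolution weight, with the bias passed separately as a plugin attribute. A sketch of how I would annotate the function quoted above:

#include <cassert>
#include <NvInfer.h>

using namespace nvinfer1;

// Assumed layout (K = kernel size, deform_groups = deformable groups):
//   inputs[0] : feature map  (N, C_in,                H, W)
//   inputs[1] : offset       (N, 2*K*K*deform_groups, H, W)
//   inputs[2] : mask         (N,   K*K*deform_groups, H, W)
//   inputs[3] : weight       (C_out, C_in/groups,     K, K)
void DCNv2PluginDynamic::configurePlugin(const DynamicPluginTensorDesc* inputs, int nbInputs,
    const DynamicPluginTensorDesc* outputs, int nbOutputs)
{
    assert(nbInputs == 4);
    assert(nbOutputs == 1);
    // If the channel dimensions are static, the offset should carry twice as
    // many channels as the mask (an x/y pair per sampling point vs. a single
    // modulation scalar), which is a quick way to confirm the ordering.
    if (inputs[1].desc.dims.d[1] > 0 && inputs[2].desc.dims.d[1] > 0)
    {
        assert(inputs[1].desc.dims.d[1] == 2 * inputs[2].desc.dims.d[1]);
    }
}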
Thanks.
Hello, the code behind that link seems to require writing an InferPlugin.cpp file, but I couldn't find that file on master. Did you run into this issue too?
The author didn't mention needing an InferPlugin.cpp file. Could you attach the link and point out where it is needed?
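If the only thing InferPlugin.cpp does in that repo is register the plugin creator (I have not checked), an alternative that avoids the extra file is to register it statically in the creator's own source file:

#include <NvInfer.h>

// Static registration: the macro adds the creator to TensorRT's global plugin
// registry so the ONNX parser can resolve the node by plugin name/version.
// DCNv2PluginDynamicCreator is assumed to be the creator class declared next
// to DCNv2PluginDynamic in this repo.
REGISTER_TENSORRT_PLUGIN(DCNv2PluginDynamicCreator);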
I got it working. Thanks!