TensorRT-Yolov3

About leaky layer and upsample layer

Open faedtodd opened this issue 5 years ago • 3 comments

In `tensorRTWrapper/code/include/PluginFactory.h`, lines 45–50:

    if (isLeakyRelu(layerName)) {
        assert(nbWeights == 0 && weights == nullptr);
        mPluginLeakyRelu.emplace_back(std::unique_ptr<INvPlugin, void (*)(INvPlugin*)>(
            createPReLUPlugin(NEG_SLOPE), nvPluginDeleter));
        return mPluginLeakyRelu.back().get();
    ...

Could someone explain what this code is doing?

faedtodd avatar Mar 29 '19 06:03 faedtodd

The TensorRT parser cannot handle the negative slope directly (i.e. the leaky ReLU variant), so I added it as a plugin. As described in the TensorRT header, the PReLU plugin layer performs leaky ReLU for 4D tensors: given an input value x, the layer computes the output as x if x > 0 and negative_slope * x if x <= 0.
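The formula above can be sketched in a few lines of numpy (the 0.1 slope is an assumption for illustration; the actual value is whatever `NEG_SLOPE` is defined as in the repo):

```python
import numpy as np

NEG_SLOPE = 0.1  # assumed value for illustration; see NEG_SLOPE in PluginFactory.h

def leaky_relu(x, neg_slope=NEG_SLOPE):
    """The PReLU plugin's formula: x if x > 0, neg_slope * x otherwise."""
    return np.where(x > 0, x, neg_slope * x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(leaky_relu(x))  # [-0.2  -0.05  0.    1.    3.  ]
```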

lewes6369 avatar Mar 31 '19 10:03 lewes6369

> The TensorRT parser cannot handle the negative slope directly (i.e. the leaky ReLU variant), so I added it as a plugin. As described in the TensorRT header, the PReLU plugin layer performs leaky ReLU for 4D tensors: given an input value x, the layer computes the output as x if x > 0 and negative_slope * x if x <= 0.

So the ReLU layer is going to be replaced by a PReLU plugin layer, which behaves the same as the leaky layer, and the upsample layer will be replaced by your custom upsample plugin because Caffe doesn't have one?

faedtodd avatar Apr 02 '19 05:04 faedtodd

Yes. In the yolov3 model, the ReLU layer is actually a leaky ReLU layer, and the upsample layer is not supported by TensorRT out of the box, so both are added as plugins. We have to do it this way to get the same results in TensorRT.
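For reference, YOLOv3's upsample layer is plain nearest-neighbor upsampling with stride 2; the custom plugin just has to reproduce this behavior on an NCHW tensor. A minimal numpy sketch (the function name is mine, not from the repo):

```python
import numpy as np

def upsample_nearest(x, scale=2):
    """Nearest-neighbor upsampling of an NCHW tensor: each spatial
    element is repeated `scale` times along H and W, which is what
    YOLOv3's stride-2 upsample layer does."""
    return x.repeat(scale, axis=2).repeat(scale, axis=3)

x = np.arange(4, dtype=np.float32).reshape(1, 1, 2, 2)
y = upsample_nearest(x)
print(y.shape)  # (1, 1, 4, 4)
```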

lewes6369 avatar Apr 07 '19 14:04 lewes6369