pva-faster-rcnn

Do "C_Relu" really speed up the network?

Open catsdogone opened this issue 8 years ago • 4 comments

I tested C_ReLU (with a Scale layer after the Concat) in a network. The original conv layer is 528×7×7; I changed it to the C_ReLU structure, so the conv layer is now 264×7×7. But the training time increased unexpectedly. I use CUDA 8.0 and cuDNN 5.1. I am confused. Can you give me some advice? Thank you very much. @sanghoon
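For readers unfamiliar with the structure being discussed: below is a minimal PyTorch sketch of a C.ReLU-style block, for illustration only (the repo itself defines this in Caffe prototxt, and the class name `CReLUConv` and its parameters are hypothetical). The convolution computes only half of the target channels, a negated copy is concatenated to restore the full width, and a per-channel scale/shift plays the role of the Scale layer after Concat mentioned above.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CReLUConv(nn.Module):
    """Illustrative C.ReLU block: conv(half channels) -> concat(y, -y) -> scale/shift -> ReLU."""
    def __init__(self, in_ch, out_ch, kernel_size=3, stride=1, padding=1):
        super().__init__()
        # The conv produces only half of the target channels;
        # negated copies supply the other half after concatenation.
        self.conv = nn.Conv2d(in_ch, out_ch // 2, kernel_size,
                              stride=stride, padding=padding, bias=False)
        # Per-channel scale and shift applied after the concat.
        self.scale = nn.Parameter(torch.ones(out_ch))
        self.shift = nn.Parameter(torch.zeros(out_ch))

    def forward(self, x):
        y = self.conv(x)
        y = torch.cat([y, -y], dim=1)  # concat(conv, -conv)
        y = y * self.scale.view(1, -1, 1, 1) + self.shift.view(1, -1, 1, 1)
        return F.relu(y)

# Example: a 528-channel output computed from only 264 convolution channels.
x = torch.randn(1, 256, 14, 14)
print(CReLUConv(256, 528)(x).shape)  # torch.Size([1, 528, 14, 14])
```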

catsdogone avatar Dec 21 '16 07:12 catsdogone

I think this is a Python problem, not C_ReLU.

songjmcn avatar Dec 22 '16 03:12 songjmcn

@songjmcn But there is no Python layer in my new net, and I use the C++ interface of Caffe.

catsdogone avatar Dec 22 '16 06:12 catsdogone

The training uses Python.

songjmcn avatar Dec 22 '16 06:12 songjmcn

Hi @catsdogone, we've added C_ReLU mainly for test-time efficiency. I'm not 100% sure whether it can accelerate network training, and as @songjmcn mentioned, it will depend on the implementation.
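One way to check the claim empirically is to time the original full-width conv against a C.ReLU block. The sketch below is a hedged example using the illustrative `CReLUConv` class defined earlier in this thread (not code from the repo); forward-only timing roughly corresponds to "test efficiency", and adding a backward pass would time training instead.

```python
import time
import torch
import torch.nn as nn

def bench(module, x, iters=50):
    # Warm up, then average the time of `iters` forward passes.
    for _ in range(5):
        module(x)
    start = time.perf_counter()
    for _ in range(iters):
        module(x)
    return (time.perf_counter() - start) / iters

x = torch.randn(8, 256, 14, 14)
plain = nn.Sequential(nn.Conv2d(256, 528, 3, padding=1), nn.ReLU())
crelu = CReLUConv(256, 528)  # from the illustrative sketch above
print("plain conv:", bench(plain, x))
print("c.relu    :", bench(crelu, x))
```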

sanghoon avatar Dec 29 '16 05:12 sanghoon