pva-faster-rcnn
Do "C_Relu" really speed up the network?
I tested C_Relu (with a scale layer after the concat) in a network. The original conv layer is 528×7×7; I changed this layer to the C_Relu structure, so the conv layer is now 264×7×7. But the training time increased unexpectedly. I use CUDA 8.0 and cuDNN 5.1. I'm confused. Can you give me some advice? Thank you very much. @sanghoon
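For reference, C.ReLU halves the conv layer's output channels and recovers the full width by concatenating the negated response before the ReLU (with a scale/shift after the concat, as described above). A minimal NumPy sketch of the forward pass, with illustrative names; the scalar `scale`/`shift` here stand in for the learned per-channel Scale layer:

```python
import numpy as np

def c_relu(x, scale=1.0, shift=0.0):
    """C.ReLU forward pass (illustrative sketch).

    Concatenates x and -x along the channel axis, then applies ReLU,
    followed by a scale/shift. Input of shape (N, C, H, W) produces
    output of shape (N, 2C, H, W).
    """
    concat = np.concatenate([x, -x], axis=1)  # negation doubles the channels
    return np.maximum(concat, 0.0) * scale + shift

# A conv layer followed by C.ReLU needs only half the output channels
# (e.g. 264 instead of 528) to yield the same width after the concat.
x = np.random.randn(1, 264, 7, 7)
y = c_relu(x)
print(y.shape)  # (1, 528, 7, 7)
```

Note this only halves the conv computation; the concat, negation, and scale add their own (small) overhead, which is one reason the wall-clock effect can differ between training and inference.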
I think this is a Python problem, not C_Relu.
@songjmcn But there is no Python layer in my new net, and I use the C++ interface of Caffe.
The training uses Python.
Hi @catsdogone, we've added C_ReLU mainly for test-time efficiency. I'm not 100% sure whether it can accelerate network training, and as @songjmcn mentioned, it will depend on the implementation.