Jay Mahadeokar
--extra_num_outputs could be reduced to 1024 1024, and -n to 128. The rest of the params should remain the same, I think. Use -h for more help on the params.
@kaishijeng did the above params work for you? I am closing this for now, feel free to re-open it if you have additional questions.
@kaishijeng The Squeezenet architecture is a lot different from resnet / vgg in terms of feature map sizes. I am not sure which layers we would attach the detection heads to. If...
Sounds good! You need to modify the params relu_stage1_block3, relu_stage2_block5, relu_stage3_block2, relu_stage4_block2, relu_stage5_block2 to relu_stage1_block1, relu_stage2_block1, relu_stage3_block1, relu_stage4_block1, relu_stage5_block1. Also, extra_blocks could be 2 2 (or your choice; more blocks will...
Please specify -n as 64. Note that the bottleneck block has 3 layers with 64, 64, 256 filters, whereas the normal block has 2 layers with 64, 64 filters. Since the 1st conv layer has 64...
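To make the filter-count difference concrete, here is a minimal sketch (the helper name `block_filters` is hypothetical, not part of the tool): a ResNet bottleneck block stacks 3 conv layers with n, n, 4n filters, while a normal (basic) block stacks 2 conv layers with n, n filters.

```python
def block_filters(block_type, n):
    """Per-layer filter counts for a ResNet block (illustrative helper).

    A bottleneck block has 3 conv layers (n, n, 4n filters);
    a normal (basic) block has 2 conv layers (n, n filters).
    """
    if block_type == "bottleneck":
        return [n, n, 4 * n]
    return [n, n]

print(block_filters("bottleneck", 64))  # [64, 64, 256]
print(block_filters("normal", 64))      # [64, 64]
```

So with -n 64 the bottleneck variant produces the 64, 64, 256 pattern mentioned above.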
Who is someone? :-) I believe the scale_param filler is 1 by default; see [this](https://github.com/BVLC/caffe/blob/master/src/caffe/proto/caffe.proto#L1084-L1089). Refer to this [discussion](https://github.com/BVLC/caffe/pull/3591#issuecomment-176527937) on how the Scale layer is used with BatchNorm. I am able to properly...
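For reference, the usual Caffe pattern is a BatchNorm layer immediately followed by a Scale layer with a learnable bias, since Caffe's BatchNorm only normalizes and does not learn the affine scale/shift itself. A sketch of that prototxt fragment (layer and blob names are placeholders):

```
layer {
  name: "bn1"
  type: "BatchNorm"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "scale1"
  type: "Scale"
  bottom: "conv1"
  top: "conv1"
  scale_param {
    bias_term: true
  }
}
```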
@airRobotCool, closing this issue. Please feel free to reopen if you have any other questions.
Glad you found it useful! If you use it to generate any variants of residual networks, feel free to contribute the models back via a PR.