LargeMargin_Softmax_Loss
the deploy.prototxt of LargeMargin_Softmax_Loss
After finishing training, how can I use LargeMargin_Softmax_Loss in the deploy.prototxt? Thank you!
You should export the LargeMarginInnerProduct weights into a new InnerProduct layer without a bias term.
I have the same question: how should the deploy.prototxt of LargeMargin_Softmax_Loss be edited?
May I ask how to export the LargeMarginInnerProduct weights into a new InnerProduct layer without a bias term in the deploy file?
Based on my understanding, when I train the network, I use:
layer {
  name: "fc9"
  type: "LargeMarginInnerProduct"
  bottom: "fc8"
  bottom: "label"
  top: "fc9"
  top: "lambda"
  param {
    name: "fc9"
    lr_mult: 10
  }
  largemargin_inner_product_param {
    num_output: 2
    type: SINGLE
    base: 0
    type: QUADRUPLE
    base: 1000
    gamma: 0.000025
    power: 35
    iteration: 0
    lambda_min: 0
    weight_filler {
      type: "msra"
    }
  }
  include {
    phase: TRAIN
  }
}
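As an aside, the base, gamma, power, and lambda_min fields above control the annealed lambda value that the layer also emits as its second top. My reading of the L-Softmax implementation is that lambda starts at base and decays toward lambda_min over training iterations; a small sketch (assuming that interpretation) in Python:

```python
def lsoftmax_lambda(iteration, base=1000.0, gamma=0.000025, power=35.0,
                    lambda_min=0.0):
    """Annealed lambda as I read the L-Softmax code: starts at `base`
    (here 1000, matching the prototxt) and decays toward `lambda_min`
    as the iteration count grows."""
    return max(lambda_min, base * (1.0 + gamma * iteration) ** (-power))

print(lsoftmax_lambda(0))  # 1000.0 at iteration 0
```

A large lambda keeps the layer close to a plain softmax early in training; as it decays, the large-margin term dominates.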
In the deploy, I put:
layer {
  name: "fc9"
  type: "InnerProduct"
  bottom: "fc8"
  top: "fc9"
  param {
    lr_mult: 10
    decay_mult: 1
  }
  inner_product_param {
    num_output: 2
  }
}
Then I got an error when testing with the Python interface:
Check failed: target_blobs.size() == source_layer.blobs_size() (2 vs. 1) Incompatible number of blobs for layer fc9 *** Check failure stack trace: ***
I guess it is because LargeMarginInnerProduct has two bottoms or two tops. Could you please let me know where I am wrong? Thanks a lot. @mnikitin @wy1iu @ydwen
@gxstudy Your deploy fc layer should look like:
layer {
  name: "fc9"
  type: "InnerProduct"
  bottom: "fc8"
  top: "fc9"
  inner_product_param {
    num_output: 2
    bias_term: false
  }
}
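With bias_term: false, the deploy layer holds exactly one blob (the weight matrix), which matches the single blob the training layer saved; that is why the earlier check failed with "2 vs. 1". At test time this fc9 is a plain inner product. A minimal numpy sketch of what it computes (shapes are illustrative; only num_output: 2 comes from the prototxt):

```python
import numpy as np

num_output, dim = 2, 64                     # num_output from the prototxt; dim is illustrative
rng = np.random.default_rng(0)
W = rng.standard_normal((num_output, dim))  # the single weight blob of fc9
x = rng.standard_normal(dim)                # bottom "fc8" feature vector

scores = W @ x                              # InnerProduct with bias_term: false
assert scores.shape == (num_output,)        # one score per class, no bias added
```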
Thank you so much, it works. @mnikitin