
the deploy.prototxt of LargeMargin_Softmax_Loss

Open qinxianyuzi opened this issue 6 years ago • 5 comments

After finishing training, how can I use LargeMargin_Softmax_Loss in the deploy.prototxt? Thank you!

qinxianyuzi avatar Oct 19 '17 12:10 qinxianyuzi

You should export the LargeMarginInnerProduct weights into a new InnerProduct layer without a bias term.

mnikitin avatar Oct 19 '17 12:10 mnikitin

Same question: what should the deploy.prototxt for LargeMargin_Softmax_Loss look like? How do I edit the deploy.prototxt?

cocowf avatar Jan 21 '18 01:01 cocowf

May I ask how to export the LargeMarginInnerProduct weights into a new InnerProduct layer without a bias term in the deploy file?

Based on my understanding, when I train the network, I use:

layer {
  name: "fc9"
  type: "LargeMarginInnerProduct"
  bottom: "fc8"
  bottom: "label"
  top: "fc9"
  top: "lambda"
  param {
    name: "fc9"
    lr_mult: 10
  }
  largemargin_inner_product_param {
    num_output: 2
    type: QUADRUPLE
    base: 1000
    gamma: 0.000025
    power: 35
    iteration: 0
    lambda_min: 0
    weight_filler {
      type: "msra"
    }
  }
  include {
    phase: TRAIN
  }
}
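For reference, the `largemargin_inner_product_param` values above (`base`, `gamma`, `power`, `iteration`, `lambda_min`) control the annealing of the lambda weight that blends the standard softmax target with the large-margin target during training. A minimal sketch of that schedule, based on my reading of the layer's implementation (the formula is an assumption, not authoritative):

```python
# Sketch of the lambda annealing schedule in LargeMarginInnerProduct
# (assumed form: lambda starts at `base` and decays toward `lambda_min`):
#   lambda(iter) = max(lambda_min, base * (1 + gamma * iter) ** (-power))

def largemargin_lambda(iteration, base=1000.0, gamma=0.000025,
                       power=35.0, lambda_min=0.0):
    # With the config above: base=1000, gamma=0.000025, power=35, lambda_min=0
    return max(lambda_min, base * (1.0 + gamma * iteration) ** (-power))

# lambda decays monotonically as training progresses, so the large-margin
# target gradually takes over from the plain softmax target.
for it in (0, 10000, 50000, 100000):
    print(it, largemargin_lambda(it))
```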

In the deploy file, I put:

layer {
  name: "fc9"
  type: "InnerProduct"
  bottom: "fc8"
  top: "fc9"
  param {
    lr_mult: 10
    decay_mult: 1
  }
  inner_product_param {
    num_output: 2
  }
}

Then I got an error when testing through the Python interface:

Check failed: target_blobs.size() == source_layer.blobs_size() (2 vs. 1) Incompatible number of blobs for layer fc9
*** Check failure stack trace: ***

I guess it is because the LargeMarginInnerProduct layer has two bottoms or two tops. Could you please let me know where I am wrong? Thanks a lot. @mnikitin @wy1iu @ydwen

gxstudy avatar Apr 11 '18 21:04 gxstudy

@gxstudy Your deploy fc layer should look like:

layer {
  name: "fc9"
  type: "InnerProduct"
  bottom: "fc8"
  top: "fc9"
  inner_product_param {
    num_output: 2
    bias_term: false
  }
}
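For context on why `bias_term: false` fixes the error: when Caffe loads a caffemodel, it matches layers by name and requires the target layer to have the same number of parameter blobs as the saved layer. LargeMarginInnerProduct stores a single blob (the weight matrix, no bias), while a default InnerProduct layer has two (weights plus bias), hence the `2 vs. 1` check failure. A plain-Python stand-in for that check (illustrative only, not the Caffe API):

```python
# Plain-Python model of the blob-count check behind
# "Incompatible number of blobs for layer fc9" (not the Caffe API).

def copy_trained_layer(target_blob_count, source_blobs):
    """Mimic Caffe's check when copying saved blobs into a deploy layer."""
    if target_blob_count != len(source_blobs):
        raise RuntimeError(
            "Check failed: target_blobs.size() == source_layer.blobs_size() "
            f"({target_blob_count} vs. {len(source_blobs)})"
        )
    return list(source_blobs)  # blobs are copied one-for-one

# LargeMarginInnerProduct saved exactly one blob: the weight matrix.
trained_blobs = ["fc9_weight_matrix"]

# Default InnerProduct (bias_term: true) expects weights + bias -> 2 blobs.
try:
    copy_trained_layer(2, trained_blobs)
except RuntimeError as err:
    print("bias_term: true  ->", err)

# With bias_term: false the deploy layer expects a single blob -> OK.
print("bias_term: false ->", copy_trained_layer(1, trained_blobs))
```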

mnikitin avatar Apr 12 '18 08:04 mnikitin

Thank you so much, it works. @mnikitin

gxstudy avatar Apr 13 '18 00:04 gxstudy