LargeMargin_Softmax_Loss

Implementation of "Large-Margin Softmax Loss for Convolutional Neural Networks" (ICML 2016).

12 LargeMargin_Softmax_Loss issues

Greetings, I'm reading the paper. Could anyone explain why the margin angle is equal to (m - 1) / (m + 1) * theta_{1,2}? Thanks in...
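A sketch of the derivation from the paper's binary-class analysis, assuming equal weight norms ||W_1|| = ||W_2|| and writing theta_{1,2} for the angle between W_1 and W_2 (notation follows the paper):

```latex
% L-Softmax decision boundary for class 1:  m\theta_1 = \theta_2
% L-Softmax decision boundary for class 2:  \theta_1 = m\theta_2
\begin{align*}
\theta_1 + \theta_2 = \theta_{1,2},\ \theta_2 = m\theta_1
  &\ \Rightarrow\ \theta_1 = \tfrac{\theta_{1,2}}{m+1} \quad \text{(class-1 boundary)}\\
\theta_1 + \theta_2 = \theta_{1,2},\ \theta_1 = m\theta_2
  &\ \Rightarrow\ \theta_2 = \tfrac{\theta_{1,2}}{m+1} \quad \text{(class-2 boundary)}\\
\text{angular margin}
  &= \theta_{1,2} - \tfrac{\theta_{1,2}}{m+1} - \tfrac{\theta_{1,2}}{m+1}
   = \tfrac{m-1}{m+1}\,\theta_{1,2}.
\end{align*}
```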

Thank you for sharing! I used LargeMargin_Softmax_Loss to train a model on CASIA-WebFace. But what criterion should I use to evaluate this model, Euclidean distance?
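Not something the repo prescribes, but since L-Softmax enlarges the angular margin between classes, verification features are commonly compared with cosine similarity rather than Euclidean distance. A minimal sketch; the helper name and the idea of thresholding pairwise scores are assumptions, not part of this repo:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical helper: score a pair of feature vectors extracted from the
// layer just before the loss; a higher cosine similarity suggests the same
// identity. Thresholding this score gives a verification decision.
double CosineSimilarity(const std::vector<double>& a,
                        const std::vector<double>& b) {
  double dot = 0.0, norm_a = 0.0, norm_b = 0.0;
  for (std::size_t i = 0; i < a.size(); ++i) {
    dot += a[i] * b[i];
    norm_a += a[i] * a[i];
    norm_b += b[i] * b[i];
  }
  return dot / (std::sqrt(norm_a) * std::sqrt(norm_b) + 1e-12);
}
```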

When I use your mnist_train_test.prototxt to train on MNIST, the loss is NaN. What should I do to overcome this issue?

Hi there! The same problem as #18 happened to me when training the model, and my case was even more severe: my training accuracy decreased from 90% to 50%....

Why do you do this before each iteration? The original fully connected layer does not have this step: caffe_set(M_*N_, (Dtype)0., bottom_diff);
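A plausible reason (an assumption about this layer's backward pass, not confirmed here): if the gradient is built up from several margin-related terms with +=, the buffer must be cleared first, whereas the stock InnerProduct layer writes its gradient in a single gemm call that overwrites the buffer (beta = 0). A standalone sketch of the difference:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Accumulating backward: several partial gradients are added into the same
// buffer, so it must start at zero (the std::fill below plays the role of
// caffe_set in the question).
void BackwardAccumulate(const std::vector<double>& term_a,
                        const std::vector<double>& term_b,
                        std::vector<double>* bottom_diff) {
  std::fill(bottom_diff->begin(), bottom_diff->end(), 0.0);
  for (std::size_t i = 0; i < bottom_diff->size(); ++i) {
    (*bottom_diff)[i] += term_a[i];  // first contribution
    (*bottom_diff)[i] += term_b[i];  // second contribution accumulates
  }
}

// Overwriting backward: a single assignment (like gemm with beta = 0)
// replaces whatever was in the buffer, so no prior clearing is needed.
void BackwardOverwrite(const std::vector<double>& grad,
                       std::vector<double>* bottom_diff) {
  for (std::size_t i = 0; i < bottom_diff->size(); ++i) {
    (*bottom_diff)[i] = grad[i];
  }
}
```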

```
Blob<Dtype> sign_0_;               // sign_0 = sign(cos_theta)
// for DOUBLE type
Blob<Dtype> cos_theta_quadratic_;
// for TRIPLE type
Blob<Dtype> sign_1_;               // sign_1 = sign(abs(cos_theta) - 0.5)
Blob<Dtype> sign_2_;               // sign_2 = ...
```
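For context, these cached blobs appear to implement the paper's piecewise function via multiple-angle formulas. A sketch for the DOUBLE case (m = 2), assuming the paper's definition psi(theta) = (-1)^k cos(m*theta) - 2k on [k*pi/m, (k+1)*pi/m]:

```latex
% For m = 2, \cos(2\theta) = 2\cos^2\theta - 1, and k is 0 or 1 depending on
% the sign of \cos\theta. Writing s_0 = \mathrm{sign}(\cos\theta):
\begin{align*}
\psi(\theta) &=
\begin{cases}
2\cos^2\theta - 1,        & \cos\theta \ge 0 \ (k = 0)\\
-(2\cos^2\theta - 1) - 2, & \cos\theta < 0 \ (k = 1)
\end{cases}\\
             &= 2\,s_0\cos^2\theta - 1.
\end{align*}
```

This is presumably why the layer caches sign(cos_theta) and cos_theta squared; the TRIPLE and higher cases would use higher multiple-angle formulas and the additional sign terms.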

After finishing training, how can I use LargeMargin_Softmax_Loss in the deploy.prototxt? Thank you!

When I train LeNet with ReLU as the activation function, the accuracy is 0.1, but when I use PReLU, the accuracy is 0.98. I don't know why...