Focal-Loss
focal-loss = 87.3365
when i change SoftmaxWithLoss to FocalLoss, at the beginning of training the loss becomes 87.3365. When using SoftmaxWithLoss, there is no such problem. Is the code still in the testing phase?
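(As a side note on that particular value: in single precision, `-log(FLT_MIN)` is about 87.3365, which is what a clamped log loss reports when the predicted probability underflows to the smallest positive float. A minimal check, assuming that is what is happening here:)

```python
import math

# FLT_MIN: the smallest positive normal single-precision float (2^-126)
FLT_MIN = 1.1754943508222875e-38

# A log loss that clamps probabilities at FLT_MIN saturates at -log(FLT_MIN)
max_loss = -math.log(FLT_MIN)
print(max_loss)  # ~87.3365
```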
hi @liangzimei,
sorry about that, i just finished the code. i will test it in the next few days.
btw, can u show how you define the focal loss in your train.prototxt?
hi @liangzimei,
i have fixed the bugs in the loss and the gradient. now it works, please have a try.
for more details, u can see #2
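(For context, the loss under discussion is the focal loss, `FL(p_t) = -alpha * (1 - p_t)^gamma * log(p_t)`. A minimal numpy sketch of the forward pass; the function name, the clamp epsilon, and the `alpha`/`gamma` defaults here are illustrative assumptions, not the repo's actual parameters:)

```python
import numpy as np

def focal_loss(probs, labels, gamma=2.0, alpha=0.25):
    """Per-sample focal loss: -alpha * (1 - p_t)^gamma * log(p_t).

    probs: (N, C) softmax probabilities; labels: (N,) class indices.
    """
    # p_t is the probability assigned to the true class of each sample
    p_t = probs[np.arange(len(labels)), labels]
    p_t = np.clip(p_t, 1e-10, 1.0)  # avoid log(0)
    return -alpha * (1.0 - p_t) ** gamma * np.log(p_t)

probs = np.array([[0.9, 0.1],
                  [0.4, 0.6]])
labels = np.array([0, 1])
# the confident correct prediction (p_t = 0.9) is down-weighted
# much more strongly than the uncertain one (p_t = 0.6)
print(focal_loss(probs, labels))
```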
@zimenglan-sysu-512, thanks for your reminder. my train.prototxt is:

```
layer {
  name: "fc3_new_1"
  type: "InnerProduct"
  ......
}
layer {
  name: "loss_focal"
  type: "FocalLoss"
  bottom: "fc3_new_1"
  bottom: "label"
  top: "loss_focal"
  loss_param {
    normalize: true
    normalization: FULL
  }
}
```

when i updated the cpp and cu just now, the loss became nan during training.
hi @liangzimei,
if i try a small dataset using pvanet, it's ok, but it encounters the NaN when i use a larger dataset. i am still trying to fix it.
hi @liangzimei,
i have fixed the NaN problem. i guess that the op `(1 - p_t)^gamma / (1 - p_t)` causes the problem when `1 - p_t` is very small.
now you can have a try.
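(The failure mode described above can be reproduced in a few lines: computing `(1 - p_t)^gamma / (1 - p_t)` as a literal ratio gives `0/0 = NaN` when `p_t` reaches 1, while clamping `1 - p_t` away from zero keeps the gradient term finite. This numpy sketch is only an illustration of the issue, not the repo's actual CUDA code; the `EPS` floor is an assumption:)

```python
import numpy as np

GAMMA = 2.0
EPS = 1e-10  # assumed floor keeping the denominator away from zero

def unstable_term(p_t, gamma=GAMMA):
    # literal (1 - p_t)^gamma / (1 - p_t): becomes 0/0 = NaN at p_t = 1
    one_minus = 1.0 - p_t
    return one_minus ** gamma / one_minus

def stable_term(p_t, gamma=GAMMA):
    # clamp 1 - p_t before dividing, so the ratio stays finite
    one_minus = np.maximum(1.0 - p_t, EPS)
    return one_minus ** gamma / one_minus

p = np.array([0.5, 0.999999, 1.0])
with np.errstate(divide="ignore", invalid="ignore"):
    print(unstable_term(p))  # last entry is nan
print(stable_term(p))        # finite everywhere
```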