
Reproduction of Focal Loss in Caffe

8 Focal-Loss issues, sorted by recently updated

Hi @liuyuisanai, when I train the model for 2-class classification, I get this error. So I tried changing the output num of the fc layer from 2 to 1. It...

When I run this code, I get this error: Check failed: bottom[0]->count() == bottom[1]->count() (20 vs. 10) SIGMOID_CROSS_ENTROPY_LOSS layer inputs must have the same count. How do I solve it?

I set alpha to 0.5 and gamma to 0.5, and the loss goes to NaN. After changing gamma to 1, the loss became normal. I didn't see any restriction on gamma...
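For reference when debugging NaN reports like the one above, here is a minimal NumPy sketch of the focal loss from the paper, FL(p_t) = -alpha_t (1 - p_t)^gamma log(p_t). This is not this repo's Caffe implementation; the `eps` clamp is an assumption added to illustrate one common source of NaN (taking log of a probability that underflows to 0), independent of the gamma value.

```python
import numpy as np

def focal_loss(p, y, alpha=0.25, gamma=2.0, eps=1e-12):
    """Per-example binary focal loss.

    p: predicted probability of the positive class (scalar or array).
    y: label in {0, 1}.
    eps: guards log(0); without a clamp like this, a probability that
         saturates to exactly 0 or 1 yields -inf / NaN losses.
    """
    p = np.asarray(p, dtype=np.float64)
    y = np.asarray(y)
    p_t = np.where(y == 1, p, 1.0 - p)          # probability of the true class
    alpha_t = np.where(y == 1, alpha, 1.0 - alpha)
    return -alpha_t * (1.0 - p_t) ** gamma * np.log(p_t + eps)
```

With gamma = 0 and alpha = 0.5 this reduces to 0.5 times the ordinary cross-entropy, which is a quick sanity check for any implementation.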

Any idea why I would be getting this error: ``` *** Aborted at 1504167002 (unix time) try "date -d @1504167002" if you are using GNU date *** PC: @ 0x2b47a0e52cc9...

Please help, I got this error: error LNK2001: unresolved external symbol "protected: virtual void __cdecl caffe::FocalLossLayer::Forward_gpu(class std::vector const &,class std::vector const &)" (?Forward_gpu@?$FocalLossLayer@N@caffe@@MEAAXAEBV?$vector@PEAV?$Blob@N@caffe@@V?$allocator@PEAV?$Blob@N@caffe@@@std@@@std@@0@Z) E:\DXD\caffefl\caffe\windows\caffe\focal_loss_layer.obj caffe

In the training phase we use a SoftmaxWithLoss layer and replace it with Softmax at test time. If I use FocalLoss in training, should it still be replaced by Softmax?

Hi, to normalize the loss you divide it by the total count of anchors, but the paper suggests dividing by the number of positive anchors only. Why did you do that? Thanks.
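To make the distinction in the question above concrete, here is a small sketch contrasting the two normalizations. The function name and shapes are hypothetical, not from this repo; the paper's convention is to divide the summed loss by the number of positive (foreground) anchors, while the question says this repo divides by the total anchor count.

```python
import numpy as np

def normalized_loss(per_anchor_loss, labels, by_positives=True):
    """Sum per-anchor losses, then normalize.

    by_positives=True  -> divide by the number of positive anchors
                          (the paper's convention).
    by_positives=False -> divide by the total anchor count
                          (what the question says this repo does).
    """
    total = per_anchor_loss.sum()
    if by_positives:
        n = max(int((labels == 1).sum()), 1)  # guard against zero positives
    else:
        n = len(labels)
    return total / n
```

Because negative anchors vastly outnumber positives in dense detection, the two choices differ by orders of magnitude and effectively rescale the learning rate.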

Hi @sciencefans, thank you for sharing your code. I want to know: does Focal Loss work well? How much improvement does it give over the baseline?