
Remove obsolete loss layer check

longjon opened this issue Mar 11 '15 • 5 comments

With the nd blob update, it's no longer true that the num dimension has to be the same for both bottoms of every loss layer (e.g., softmax loss layer with axis = 0).

I'm not sure if additional checks should be added to other specific loss layers to compensate.

longjon avatar Mar 11 '15 02:03 longjon
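
For context, the check in question lives in the base class's Reshape. A rough sketch (approximating src/caffe/layers/loss_layer.cpp; exact details may differ by version):

```cpp
// Sketch of the base-class check under discussion (approximate).
template <typename Dtype>
void LossLayer<Dtype>::Reshape(
    const vector<Blob<Dtype>*>& bottom, const vector<Blob<Dtype>*>& top) {
  // The obsolete check: with nd blobs, e.g. a softmax loss with axis = 0
  // legitimately has bottoms whose first dimensions differ.
  CHECK_EQ(bottom[0]->num(), bottom[1]->num())
      << "The data and label should have the same number.";
  vector<int> loss_shape(0);  // loss layers output a scalar (0-axis blob)
  top[0]->Reshape(loss_shape);
}
```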

> (e.g., softmax loss layer with axis = 0)

Good call, didn't think of that case. I guess everything besides AccuracyLayer and SoftmaxLossLayer should theoretically have the check re-added, since no other loss layers have changed since nd blobs, AFAIK? But it probably wouldn't make sense to re-add it in some cases (e.g. EuclideanLossLayer, which I think already checks that the shapes match).

jeffdonahue avatar Mar 11 '15 02:03 jeffdonahue
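
For reference, EuclideanLossLayer's own Reshape already enforces a stricter per-layer check, roughly (sketch; details may vary by version):

```cpp
// Rough sketch of EuclideanLossLayer::Reshape: it verifies that both
// bottoms agree in every dimension past the first, so the base class's
// num() check adds little for this layer.
template <typename Dtype>
void EuclideanLossLayer<Dtype>::Reshape(
    const vector<Blob<Dtype>*>& bottom, const vector<Blob<Dtype>*>& top) {
  LossLayer<Dtype>::Reshape(bottom, top);
  CHECK_EQ(bottom[0]->count(1), bottom[1]->count(1))
      << "Inputs must have the same dimension.";
  diff_.ReshapeLike(*bottom[0]);  // buffer for the elementwise difference
}
```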

Redirected from #3097. One simple fix is to remove the CHECK statement for num(), but that would silently ignore real dimension mismatches. It would probably be better to generalize all loss layers to arbitrary dimensions, move the "axis" property into loss_param, and let the base loss_layer check the dimensions. @longjon, what do you think?

YutingZhang avatar Sep 24 '15 00:09 YutingZhang
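
A generalized base-class check along these lines might look roughly like the following, mirroring the softmax loss's convention. This is a purely hypothetical sketch: LossParameter carries no axis field today, so loss_param().axis() is an assumed addition.

```cpp
// Hypothetical generalized check, assuming an `axis` field were added to
// LossParameter as proposed. Dimensions before `axis` (outer) and after
// it (inner) each index one independent prediction/label pair, following
// the softmax loss convention.
template <typename Dtype>
void LossLayer<Dtype>::Reshape(
    const vector<Blob<Dtype>*>& bottom, const vector<Blob<Dtype>*>& top) {
  const int axis = bottom[0]->CanonicalAxisIndex(
      this->layer_param_.loss_param().axis());  // hypothetical parameter
  const int outer_num = bottom[0]->count(0, axis);
  const int inner_num = bottom[0]->count(axis + 1);
  CHECK_EQ(outer_num * inner_num, bottom[1]->count())
      << "Number of labels must match number of predictions.";
  vector<int> loss_shape(0);  // loss layers output a scalar
  top[0]->Reshape(loss_shape);
}
```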

Right, generalizing all loss layers would be nice, but as a simpler first step we should probably (a) remove this check from the base class, and (b) go through each loss layer and individually re-add this or another check, according to whatever makes sense for that layer. You're welcome to upgrade this PR to include (b)!

longjon avatar Sep 24 '15 09:09 longjon
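
Step (a) is mechanical; with the check gone, the base Reshape would reduce to something like this sketch:

```cpp
// Sketch of the base-class Reshape after step (a): the num() check is
// removed, and shape validation becomes each subclass's responsibility
// under step (b).
template <typename Dtype>
void LossLayer<Dtype>::Reshape(
    const vector<Blob<Dtype>*>& bottom, const vector<Blob<Dtype>*>& top) {
  vector<int> loss_shape(0);  // loss layers still output a scalar
  top[0]->Reshape(loss_shape);
}
```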

Related: #2814

shelhamer avatar Apr 14 '17 08:04 shelhamer

Does this still need to be open? Was it solved by #2814?

JamesAshwood07 avatar Oct 17 '25 09:10 JamesAshwood07