Remove obsolete loss layer check
With the nd blob update, it's no longer true that the num dimension has to match between the two bottoms of every loss layer (e.g., a softmax loss layer with axis = 0).
I'm not sure whether additional checks should be added to specific loss layers to compensate.
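For reference, the base-class check in question looks roughly like this (a sketch from memory of Caffe's loss layer source, so treat the header path and message wording as approximate):

```cpp
#include <vector>

#include "caffe/layers/loss_layer.hpp"

namespace caffe {

template <typename Dtype>
void LossLayer<Dtype>::Reshape(
    const vector<Blob<Dtype>*>& bottom, const vector<Blob<Dtype>*>& top) {
  // This is the obsolete check: with nd blobs and a loss axis other than 1,
  // the two bottoms' num() need not agree (e.g. softmax loss with axis = 0).
  CHECK_EQ(bottom[0]->num(), bottom[1]->num())
      << "The data and label should have the same number.";
  vector<int> loss_shape(0);  // loss layers produce a scalar (0-axis) output
  top[0]->Reshape(loss_shape);
}

}  // namespace caffe
```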
> (e.g., softmax loss layer with axis = 0)
Good call, I didn't think of that case. I guess everything besides AccuracyLayer and SoftmaxLossLayer (which already do their own axis-aware checks; rough sketch below) should theoretically have the check re-added, since AFAIK no other loss layers have changed since the nd blob update? But it probably wouldn't make sense to re-add it in some cases (e.g. EuclideanLossLayer, which I think already checks that the shapes match).
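To illustrate, here is an approximate sketch (from memory of softmax_loss_layer.cpp) of the axis-aware shape check SoftmaxWithLossLayer performs in its own Reshape; outer_num_ and inner_num_ follow Caffe's member-naming conventions:

```cpp
// Resolve the softmax axis, then compare label count against the product
// of all prediction dimensions except the one the softmax runs over.
softmax_axis_ =
    bottom[0]->CanonicalAxisIndex(this->layer_param_.softmax_param().axis());
outer_num_ = bottom[0]->count(0, softmax_axis_);   // dims before the axis
inner_num_ = bottom[0]->count(softmax_axis_ + 1);  // dims after the axis
CHECK_EQ(outer_num_ * inner_num_, bottom[1]->count())
    << "Number of labels must match number of predictions.";
```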
Redirected from #3097. One simple fix is to remove the CHECK statement for num(), but that would silently ignore genuine dimension mismatches. It would probably be better to generalize all loss layer types to arbitrary dimensions, move the "axis" property into loss_param, and let the base loss_layer check the dimensions. @longjon what do you think?
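For concreteness, a hypothetical sketch of that proposal; note the `axis` field on loss_param is assumed here and does not exist in Caffe's actual caffe.proto:

```cpp
// Hypothetical generalized base-class check: predictions and labels must
// agree on every axis before the loss axis, instead of on raw num().
template <typename Dtype>
void LossLayer<Dtype>::Reshape(
    const vector<Blob<Dtype>*>& bottom, const vector<Blob<Dtype>*>& top) {
  const int axis = bottom[0]->CanonicalAxisIndex(
      this->layer_param_.loss_param().axis());  // hypothetical proto field
  CHECK_EQ(bottom[0]->count(0, axis), bottom[1]->count(0, axis))
      << "Data and label must have matching dimensions up to the loss axis.";
  vector<int> loss_shape(0);  // scalar loss output, as before
  top[0]->Reshape(loss_shape);
}
```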
Right, generalizing all loss layers would be nice, but as a simpler first step we should probably (a) remove this check from the base class, and (b) go through each of the loss layers and individually re-add this or another check, according to whatever makes sense for each layer. You're welcome to upgrade this PR to include (b)!
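As a concrete example of what (b) looks like, EuclideanLossLayer already enforces its own stricter per-layer condition; a rough sketch from memory of euclidean_loss_layer.cpp:

```cpp
template <typename Dtype>
void EuclideanLossLayer<Dtype>::Reshape(
    const vector<Blob<Dtype>*>& bottom, const vector<Blob<Dtype>*>& top) {
  LossLayer<Dtype>::Reshape(bottom, top);
  // Per-layer check: both bottoms must match in every axis beyond the first.
  CHECK_EQ(bottom[0]->count(1), bottom[1]->count(1))
      << "Inputs must have the same dimension.";
  diff_.ReshapeLike(*bottom[0]);
}
```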
Related: #2814
Does this still need to be open? Was it solved in #2814?