Weighted_Softmax_Loss
Possible bug in CPU backward
In `Backward_cpu`:

```cpp
for (int k = 0; k < dim; ++k) {
  bottom_diff[i * dim + k * inner_num_ + j] *= w;
}
```

Shouldn't `dim` here be `channels`?
@pkuyym I think you're right.
I have checked this; I found that `dim = channels`.
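(Note: if this layer follows Caffe's stock `SoftmaxWithLossLayer`, then `dim` is computed as `prob_.count() / outer_num_ = channels * inner_num_`, so `dim == channels` only when `inner_num_ == 1`, i.e. plain classification with no spatial dimensions. A tiny standalone check with hypothetical shapes shows why the loop bound matters once `inner_num_ > 1`:)

```cpp
#include <cstdio>

// Hypothetical shapes for illustration (not from the repo): C=3, H=W=2.
int main() {
  const int channels = 3, height = 2, width = 2;
  const int inner_num = height * width;   // 4, Caffe's inner_num_
  const int dim = channels * inner_num;   // 12, Caffe's count / outer_num_
  const int i = 0, j = 0;                 // first sample, first spatial position

  // With the loop bound `k < dim`, the last index touched is
  // i*dim + (dim-1)*inner_num + j = 44, far outside sample 0's range [0, 12).
  printf("k < dim:      last index = %d\n", i * dim + (dim - 1) * inner_num + j);
  // With `k < channels`, the last index is 8, which stays inside the sample.
  printf("k < channels: last index = %d\n", i * dim + (channels - 1) * inner_num + j);
  return 0;
}
```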
I have found an issue: in `Backward_cpu`,

```cpp
for (int k = 0; k < dim; ++k) {
  bottom_diff[i * dim + k * inner_num_ + j] *= w;
}
```

should be

```cpp
bottom_diff[i * dim + label_value * inner_num_ + j] *= w;
```
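(For reference, a minimal sketch of the backward weighting under the `k < channels` reading; names like `weights` are hypothetical, it assumes a Caffe-style layout with `dim = channels * inner_num_`, and it is an illustration, not the repository's code. Since the weighted loss is `w * L`, differentiating scales the gradient of every channel by `w`, so the single-line variant quoted above would leave the non-label channels unweighted:)

```cpp
// Minimal sketch of the backward weighting under the `k < channels` reading.
// Assumptions (not from the repo): Caffe-style layout with
// dim = channels * inner_num, and a hypothetical per-position weight array.
template <typename Dtype>
void weighted_softmax_backward(const Dtype* prob, const Dtype* label,
                               const Dtype* weights, Dtype* bottom_diff,
                               int outer_num, int channels, int inner_num) {
  const int dim = channels * inner_num;
  for (int i = 0; i < outer_num; ++i) {
    for (int j = 0; j < inner_num; ++j) {
      const int label_value = static_cast<int>(label[i * inner_num + j]);
      const Dtype w = weights[i * inner_num + j];
      for (int k = 0; k < channels; ++k) {
        const int idx = i * dim + k * inner_num + j;
        // Unweighted softmax-loss gradient: p_k - 1{k == label_value}.
        const Dtype g = prob[idx] - (k == label_value ? Dtype(1) : Dtype(0));
        // The weight multiplies the whole per-sample loss, so it scales the
        // gradient of every channel, not only the label channel.
        bottom_diff[idx] = w * g;
      }
    }
  }
}
```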