Weighted_Softmax_Loss

Possible bug in cpu backward

Open pkuyym opened this issue 8 years ago • 3 comments

In Backward_cpu:

    for (int k = 0; k < dim; ++k) {
      bottom_diff[i * dim + k * inner_num_ + j] *= w;
    }

Should dim here be channels?

pkuyym • Mar 25 '16
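
For reference, here is a minimal standalone sketch (not this repository's source) of the indexing in question, assuming the usual Caffe SoftmaxWithLoss conventions: blob layout N x C x H x W, outer_num_ = N, inner_num_ = H * W, and dim = count / outer_num_ = channels * inner_num_. The shapes are illustrative only:

```cpp
#include <cstdio>
#include <vector>

int main() {
  // Illustrative shapes: N x C x H x W = 2 x 3 x 2 x 2
  const int outer_num = 2;               // N
  const int channels  = 3;               // C
  const int inner_num = 4;               // H * W
  const int dim = channels * inner_num;  // per-sample block = C * H * W

  std::vector<float> bottom_diff(outer_num * dim, 1.0f);
  const float w = 2.0f;                  // per-sample weight
  const int i = 0, j = 0;                // sample 0, spatial position 0

  // Loop bound as written in the issue: k runs up to dim.
  // When inner_num > 1, the index i*dim + k*inner_num + j leaves
  // sample i's block as soon as k >= channels.
  for (int k = 0; k < dim; ++k) {
    const int idx = i * dim + k * inner_num + j;
    if (idx >= (i + 1) * dim) {
      std::printf("k = %d already points outside sample %d (idx = %d)\n", k, i, idx);
    }
  }

  // With channels as the bound, each channel of sample i at position j
  // is scaled exactly once.
  for (int k = 0; k < channels; ++k) {
    bottom_diff[i * dim + k * inner_num + j] *= w;
  }
  return 0;
}
```

Running this prints a line for every k >= channels, i.e. exactly the indices the dim bound would touch outside sample i's block whenever inner_num_ > 1.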

@pkuyym I think you're right.

stormkingz • Dec 30 '16

I have checked this and found that dim = channels.

mahaoyanghb • Sep 15 '19
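
That is what one would expect for a plain classification setup: with the usual Caffe definitions, dim = count / outer_num_ = channels * inner_num_, so dim equals channels exactly when inner_num_ = 1 (H = W = 1). A small arithmetic sketch with illustrative shapes:

```cpp
#include <cstdio>

int main() {
  // Plain classification, blob N x C x 1 x 1: inner_num_ = 1
  int channels = 10, inner_num = 1;
  std::printf("dim = %d, channels = %d\n", channels * inner_num, channels);  // dim = 10

  // Spatial / per-pixel loss, blob N x C x 16 x 16: inner_num_ = 256
  channels = 10; inner_num = 16 * 16;
  std::printf("dim = %d, channels = %d\n", channels * inner_num, channels);  // dim = 2560
  return 0;
}
```

So the dim-versus-channels question above only matters when the loss is applied per spatial position; for a plain classifier the loop as written behaves the same either way.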

I have found an issue. In Backward_cpu,

    for (int k = 0; k < dim; ++k) {
      bottom_diff[i * dim + k * inner_num_ + j] *= w;
    }

should be

    bottom_diff[i * dim + label_value * inner_num_ + j] *= w;

mahaoyanghb • Sep 15 '19
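
For comparison, here is a hedged standalone sketch of the two weighting schemes discussed in this thread. The variable names mirror the snippets above; this is not the repository's actual source, and which scheme is intended depends on how the forward pass defines the weighted loss:

```cpp
#include <vector>

// Scheme A: scale the gradient of every channel of sample i at spatial
// position j by the sample weight w (the intent of the loop in Backward_cpu).
void weight_all_channels(std::vector<float>& bottom_diff, int i, int j,
                         int channels, int inner_num, int dim, float w) {
  for (int k = 0; k < channels; ++k) {
    bottom_diff[i * dim + k * inner_num + j] *= w;
  }
}

// Scheme B: scale only the label channel's gradient, as proposed in this comment.
void weight_label_channel(std::vector<float>& bottom_diff, int i, int j,
                          int label_value, int inner_num, int dim, float w) {
  bottom_diff[i * dim + label_value * inner_num + j] *= w;
}
```

Note that if the forward pass computes the per-sample loss as w * (-log p_label), the chain rule gives w * (p_k - 1{k == label_value}) for every channel k, which corresponds to scheme A; scheme B corresponds to a different weighting of the loss.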