
Loss error

Open · amkonshin opened this issue 3 years ago · 1 comment

I have some issues training with PyTorch. I rewrote the loss function so it is the same as yours, trained as you recommended, and got bad results. After investigating, I noticed that the network predicts awful probabilities in the channel 0 and channel 1 outputs: they are all above 1, so you can't filter good detections with a threshold of 0.5 or anything like that. Then I re-ran training while printing the L1, obj, and noobj losses, and the obj and noobj losses always drop to 0 after a few iterations. The reason is your logloss function:

```python
def logloss(Ptrue, Pred, szs, eps=10e-10):
    b, h, w, ch = szs
    Pred = tf.clip_by_value(Pred, eps, 1.)
    Pred = -tf.log(Pred)
    Pred = Pred * Ptrue
    Pred = tf.reshape(Pred, (b, h*w*ch))
    Pred = tf.reduce_sum(Pred, 1)
    return Pred
```

It equals 0 whenever the predicted values are at or above 1 (since ln 1 = 0). So the network learns to predict big values, and that is not how it is supposed to work, because later we need these probabilities to pick good detections, run NMS, and so on. So what is the point of using this loss function, or am I wrong somewhere?
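To make the failure mode concrete, here is a minimal NumPy mirror of the clip-then-log loss above (illustration only, not the repo's code; shapes are simplified to `(batch, cells)`). It shows that any prediction at or above 1 is clipped to exactly 1, so its `-log` term, and hence its gradient, vanishes:

```python
import numpy as np

def logloss_np(Ptrue, Pred, eps=1e-9):
    """NumPy sketch of the TF logloss: clip, -log, mask by Ptrue, sum."""
    Pred = np.clip(Pred, eps, 1.0)   # anything >= 1 is clipped to exactly 1
    Pred = -np.log(Pred)             # -log(1) == 0, so those terms vanish
    Pred = Pred * Ptrue              # only positions with Ptrue == 1 contribute
    return Pred.sum(axis=1)

Ptrue = np.array([[1.0, 1.0]])
good  = np.array([[0.9, 0.8]])   # proper probabilities: nonzero loss
blown = np.array([[3.7, 12.0]])  # ReLU-style outputs above 1: loss is 0

print(logloss_np(Ptrue, good))   # small positive value
print(logloss_np(Ptrue, blown))  # exactly [0.] -- no gradient signal at all
```

This is why an unbounded last-layer activation makes the obj/noobj losses collapse to zero: the loss only constrains predictions from below, so pushing outputs past 1 is a trivial minimum. A Sigmoid (or softmax over the two channels) keeps the outputs inside (0, 1), where the loss behaves as intended.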

amkonshin · Nov 17 '21

Btw, I've found an error in my code: I used ReLU instead of Sigmoid as the last-layer activation. But I still can't understand why we need 2 filters for obj and noobj. Why can't we use 1 filter with the object probability and use BCELoss?
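A quick NumPy sketch of the question (my own illustration, not the repo's code): if the noobj channel is tied to the obj channel as its complement, e.g. via a softmax over the two filters, then the two separate log-loss terms add up to exactly the standard single-channel BCE. In that sense the 2-filter formulation is BCE written with two heads:

```python
import numpy as np

def two_channel_loss(obj_mask, p_obj, p_noobj, eps=1e-9):
    """Repo-style loss: separate -log terms for the obj and noobj channels."""
    l_obj   = -(obj_mask       * np.log(np.clip(p_obj,   eps, 1.0))).sum()
    l_noobj = -((1 - obj_mask) * np.log(np.clip(p_noobj, eps, 1.0))).sum()
    return l_obj + l_noobj

def bce_single_channel(obj_mask, p_obj, eps=1e-9):
    """Plain binary cross-entropy on one sigmoid-activated channel."""
    p = np.clip(p_obj, eps, 1.0 - eps)
    return -(obj_mask * np.log(p) + (1 - obj_mask) * np.log(1 - p)).sum()

mask = np.array([1.0, 0.0, 1.0])
p    = np.array([0.9, 0.2, 0.6])   # hypothetical single-channel outputs

# With p_noobj = 1 - p_obj (as a 2-way softmax would give), the two agree:
print(two_channel_loss(mask, p, 1 - p))
print(bce_single_channel(mask, p))
```

So a single sigmoid channel with BCELoss should be functionally equivalent here; the only case where two filters add expressiveness is if they are activated independently, so that `p_noobj` is not forced to equal `1 - p_obj`.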

amkonshin · Nov 18 '21