triplet-loss-pytorch
Why is validation loss < training loss?
Excuse me, during training, why is the validation loss less than the training loss?
Because of overfitting? I think you should ask a more detailed question on StackOverflow.
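For what it's worth, a validation loss below the training loss is not necessarily a sign that something is wrong: regularization such as dropout is active when the training loss is computed but switched off during validation, and the logged training loss is often an average over an epoch in which the weights are still improving. A minimal sketch in plain PyTorch (the network, shapes, and dropout rate here are illustrative, not this repo's model) showing how the same batch can score differently in train and eval mode:

```python
import torch
import torch.nn as nn

# Illustrative embedding network (not this repo's model): dropout is active in
# train() mode but disabled in eval() mode, so the loss reported during
# training can sit above the loss measured at validation time.
embed = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(64, 32),
)
criterion = nn.TripletMarginLoss(margin=1.0)

# One random batch of (anchor, positive, negative) triplets.
anchor, positive, negative = torch.randn(3, 256, 128).unbind(0)

embed.train()  # dropout on, as when the training loss is logged
loss_train_mode = criterion(embed(anchor), embed(positive), embed(negative))

embed.eval()   # dropout off, as during validation
with torch.no_grad():
    loss_eval_mode = criterion(embed(anchor), embed(positive), embed(negative))

print(f"train-mode loss: {loss_train_mode.item():.4f}")
print(f"eval-mode loss:  {loss_eval_mode.item():.4f}")  # often lower on the same data
```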
If the loss always equals 0, the accuracy result is very bad.
If you are referring to the accuracy displayed during training, you can ignore it, because it has nothing to do with the performance of the model.
Is this the problem, or do you mean that after training, a model that achieves 0 loss performs very poorly as a classifier?
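Since the training-time accuracy is not meaningful here, the usual way to judge a metric-learning model is to measure how well the learned embeddings separate the classes after training, e.g. with a k-nearest-neighbour classifier in embedding space. A rough sketch in plain PyTorch (the `knn_accuracy` helper and the `model` / `train_x` / `test_x` names are hypothetical, not part of this repo):

```python
import torch

def knn_accuracy(train_emb, train_labels, test_emb, test_labels, k=5):
    """Classify each test embedding by majority vote among its k nearest
    training embeddings (Euclidean distance) and return the accuracy."""
    dists = torch.cdist(test_emb, train_emb)           # (n_test, n_train) pairwise distances
    knn_idx = dists.topk(k, largest=False).indices     # indices of the k nearest neighbours
    knn_labels = train_labels[knn_idx]                 # (n_test, k) neighbour labels
    preds = knn_labels.mode(dim=1).values              # majority vote per test point
    return (preds == test_labels).float().mean().item()

# Hypothetical usage with an already-trained embedding network `model`:
# model.eval()
# with torch.no_grad():
#     train_emb = model(train_x)
#     test_emb = model(test_x)
# print(knn_accuracy(train_emb, train_y, test_emb, test_y))
```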