Question about hyperparameter tuning and model comparison.
I'm a little confused about the practical difference between `loss` and `metrics`.
I have developed a model for predicting the next item a visitor will click in an e-shop, so I'm primarily interested in the actual click being in the top 5 results.
- During hyperparameter tuning, should I minimize the model's `loss` or maximise the `top_5_categorical_accuracy`?
- When comparing different model variations, should I look for a lower `loss` or a higher `top_5_categorical_accuracy`?
I appreciate any explanation you can provide.
When you train a model, you minimize the loss: the loss is what gradient descent uses to update the weights. The accuracy metrics only help you evaluate how the model performs; they don't affect training. I know you asked this question over a year ago, but thought I would post anyway in case someone else stumbles across the question.
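A tiny NumPy sketch of that split (all names and data here are made up for illustration): the gradient step is computed from the loss, while the accuracy is just read off for monitoring and never produces a gradient.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))             # toy features
w_true = np.array([1.0, -2.0, 0.5, 0.0])
y = (X @ w_true > 0).astype(float)        # toy binary "click" labels

w = np.zeros(4)
lr = 0.5
for _ in range(100):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))    # sigmoid predictions
    grad = X.T @ (p - y) / len(y)         # gradient of the log loss
    w -= lr * grad                        # the LOSS drives this update
    accuracy = np.mean((p > 0.5) == y)    # metric: monitored only, no gradient
```

The accuracy line could be deleted without changing what the model learns, which is the practical difference between the two.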
In general, you want to tune and evaluate on the thing you actually care about - in this case, top-k accuracy.
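A minimal sketch of comparing model variants that way (NumPy only; the probability matrices and labels are invented for illustration): compute both the cross-entropy loss and top-k accuracy on held-out clicks, then pick the variant with the higher top-k accuracy, since that is the quantity the question actually cares about.

```python
import numpy as np

def top_k_accuracy(probs, labels, k=5):
    """Fraction of rows whose true label is among the k highest-scored items."""
    top_k = np.argsort(probs, axis=1)[:, -k:]   # indices of the k best items per row
    return float(np.mean([y in row for y, row in zip(labels, top_k)]))

def cross_entropy(probs, labels):
    """Mean negative log-probability assigned to the true label."""
    return float(-np.mean(np.log(probs[np.arange(len(labels)), labels])))

# Toy held-out predictions over 6 items for 2 visits:
probs = np.array([
    [0.60, 0.20, 0.10, 0.05, 0.03, 0.02],
    [0.02, 0.03, 0.05, 0.10, 0.20, 0.60],
])
labels = np.array([5, 0])  # the items actually clicked

acc_at_5 = top_k_accuracy(probs, labels, k=5)
loss = cross_entropy(probs, labels)
```

Two variants can easily disagree between the two numbers (one can be slightly better calibrated, lowering the loss, while the other ranks the clicked item into the top 5 more often), which is why the selection criterion should be the top-k accuracy here.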