recommenders

Question about hyperparameter tuning and model comparison.

Open YannisPap opened this issue 3 years ago • 2 comments

I'm a little confused about the practical difference between loss and metrics.

I have developed a model for predicting the next item a visitor will click in an e-shop, so I'm primarily interested in the actual click being in the top 5 results.

  1. During hyperparameter tuning, should I minimize the model's loss or maximise the top_5_categorical_accuracy?
  2. When comparing different model variations, should I look for a lower loss or a higher top_5_categorical_accuracy?

I appreciate any explanation you can provide.

YannisPap avatar Sep 21 '21 08:09 YannisPap

When you tune a model, you want to minimize the loss: the loss is what gradient descent actually optimizes. The accuracy metrics help you evaluate how the model performs. I know you asked this question over a year ago, but I thought I'd post anyway in case someone else stumbles across it.
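To make the distinction concrete, here is a minimal NumPy sketch (the function names and random data are illustrative, not from the recommenders library): the cross-entropy loss is a smooth quantity the optimizer can follow, while top-5 accuracy is the discrete business metric the original question cares about. The two usually move together, but they are not the same number.

```python
import numpy as np

def cross_entropy(probs, labels):
    # Mean negative log-probability of the true class:
    # the smooth quantity gradient descent minimizes.
    return -np.mean(np.log(probs[np.arange(len(labels)), labels]))

def top_k_accuracy(probs, labels, k=5):
    # Fraction of samples whose true class is among the
    # k highest-scoring predictions: the metric you report.
    top_k = np.argsort(probs, axis=1)[:, -k:]
    return np.mean([labels[i] in top_k[i] for i in range(len(labels))])

# Illustrative data: 100 visitors, 20 candidate items.
rng = np.random.default_rng(0)
logits = rng.normal(size=(100, 20))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
labels = rng.integers(0, 20, size=100)

print("loss:", cross_entropy(probs, labels))
print("top-5 accuracy:", top_k_accuracy(probs, labels, k=5))
```

Note that top-k accuracy is piecewise constant (it only changes when an item enters or leaves the top k), which is why it cannot serve as a training loss but works fine as a tuning objective.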

yoklday avatar Nov 29 '22 18:11 yoklday

In general, you want to tune and evaluate on the thing you actually care about; in this case, that is top-k accuracy. Train with the loss, but select hyperparameters and compare model variations by their top-5 accuracy on held-out data.
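The advice above can be sketched as a tiny model-selection loop (everything here is hypothetical: the variant names and score matrices stand in for whatever your real candidates produce on a validation set). Each variant is still trained by minimizing its loss; the comparison between variants uses the metric.

```python
import numpy as np

def top_k_accuracy(scores, labels, k=5):
    # True class among the k highest-scoring predictions, averaged over samples.
    top_k = np.argsort(scores, axis=1)[:, -k:]
    return np.mean([labels[i] in top_k[i] for i in range(len(labels))])

def select_best_variant(variant_scores, val_labels, k=5):
    # Rank candidate models by the metric you actually care about,
    # not by their training loss.
    results = {name: top_k_accuracy(scores, val_labels, k)
               for name, scores in variant_scores.items()}
    best = max(results, key=results.get)
    return best, results

# Illustrative validation scores for two made-up variants.
val_labels = np.array([0, 1, 2])
variants = {
    "wide_model": np.eye(3),                # always ranks the true item first
    "narrow_model": np.full((3, 3), 1 / 3), # uninformative uniform scores
}
best, results = select_best_variant(variants, val_labels, k=1)
print(best, results)
```

In practice you would plug the same idea into your tuner's objective (most hyperparameter tuners let you maximize an arbitrary validation metric rather than minimize the loss).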

patrickorlando avatar Nov 30 '22 00:11 patrickorlando