hyperparameter-optimization

Best vs mean CV score

Open · MiladShahidi opened this issue 6 years ago · 1 comment

Thank you for this helpful notebook. Can you explain why you return the best CV score in the objective function? Isn't that an overly optimistic (upward-biased) measure of performance? I believe the results of the n-fold cross-validation should be averaged and returned as the score/loss. Am I missing something?

MiladShahidi avatar Sep 21 '18 08:09 MiladShahidi

You are correct that the average metric across all the folds should be used as the measure of performance. If you look at the code for the objective function, I do return the fold-averaged ROC AUC:

best_score = np.max(cv_results['auc-mean'])  # best fold-averaged AUC across boosting rounds
loss = 1 - best_score                        # Hyperopt minimizes, so convert to a loss

auc-mean is the AUC averaged over the cross-validation folds, recorded at each boosting round. The np.max is taken over boosting rounds (i.e., it selects the best number of iterations), not over folds, so no single fold's score is being cherry-picked.
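To make the distinction concrete, here is a minimal sketch in plain Python that mimics the shape of LightGBM's cv_results: the fold scores are simulated (all numbers below are made up for illustration), the mean is taken over folds at each round, and only then is the max taken over rounds.

```python
# Simulated per-round, per-fold AUCs (3 folds x 4 boosting rounds).
# These values are hypothetical, purely to illustrate the computation.
fold_aucs = [
    [0.70, 0.72, 0.71],  # round 1
    [0.74, 0.75, 0.73],  # round 2
    [0.76, 0.78, 0.77],  # round 3
    [0.75, 0.77, 0.76],  # round 4 (slightly worse: mild overfitting)
]

# Analogue of cv_results['auc-mean']: average over folds at each round.
auc_mean = [sum(round_scores) / len(round_scores) for round_scores in fold_aucs]

best_score = max(auc_mean)  # best round's fold-averaged AUC, not the best single fold
loss = 1 - best_score       # Hyperopt minimizes, so convert AUC to a loss

print(best_score)  # 0.77 (round 3's fold average)
```

Note that the max over folds at round 3 would be 0.78, while the fold average is 0.77; the objective uses the latter, which is why the returned score is not upward-biased across folds.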

WillKoehrsen avatar Sep 21 '18 20:09 WillKoehrsen