
Cross-validation with different criteria than the accuracy metric

Open aaroncaffrey opened this issue 6 years ago • 1 comment

Feature suggestion:

Currently, as far as I can tell, cross-validation is performed using only the accuracy metric. However, accuracy can be very misleading for imbalanced classes, where the number of training examples per class differs. For example, with 90 positive and 10 negative training examples, 90% accuracy can be achieved by a classifier that always predicts the positive class.

Therefore, it is desirable to be able to cross-validate against other metrics, such as F1, Recall, Precision, MCC, Average Precision (closely related to the area under the Precision-Recall curve), AUC ROC, and so on.
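To illustrate why accuracy alone is not enough, here is a small sketch (plain Python, hypothetical labels matching the 90/10 example above) that scores an always-positive classifier with several of these metrics. Accuracy and even F1 look healthy, while MCC exposes the degenerate predictions:

```python
import math

# Hypothetical fold: 90 positive and 10 negative true labels,
# scored by a classifier that always predicts the positive class.
y_true = [1] * 90 + [0] * 10
y_pred = [1] * 100

# Confusion-matrix counts
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # 90
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # 0
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # 10
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # 0

accuracy  = (tp + tn) / len(y_true)                    # 0.9 -- looks fine
precision = tp / (tp + fp) if tp + fp else 0.0         # 0.9
recall    = tp / (tp + fn) if tp + fn else 0.0         # 1.0
f1 = (2 * precision * recall / (precision + recall)
      if precision + recall else 0.0)                  # ~0.947 -- still looks fine

# Matthews correlation coefficient; defined as 0 when the denominator is 0
denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
mcc = (tp * tn - fp * fn) / denom if denom else 0.0    # 0.0 -- reveals the problem
```

Selecting hyperparameters by accuracy (or even F1) here would happily accept a model that has learned nothing about the minority class, which is why an MCC or AUC-based cross-validation option would be valuable.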

Is it possible in future to add a command line parameter to the training program to select such an alternative cross-validation performance metric?

LIBSVM has a similar optional tool here: https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/eval/index.html

Thank you.

aaroncaffrey avatar May 29 '19 18:05 aaroncaffrey

Thanks! We will improve it. Please stay tuned.

zeyiwen avatar May 30 '19 04:05 zeyiwen