CloudForest
Report specificity, sensitivity etc for binary classification with `-test`
I made a small patch on my own fork so that `growforest -test` reports a bit more data when it finishes. It looks like this:
```
Error: 0.06121835978431722
Accuracy: 48510 / 51673 = 0.9387881485495326
True Negatives 24999 / Total Negatives 26585 = Specificity (True Negative Rate) 0.940342
True Positives 23511 / Total Positives 25088 = Sensitivity (True Positive Rate) 0.937141
True Positives 23511 / Predicted Positives 25097 = Precision (Positive Predictive Value) 0.936805
True Negatives 24999 / Predicted Negatives 26576 = Negative Predictive Value 0.940661
F1 Score: 0.936973
```
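For reference, all of the statistics above follow directly from the four confusion-matrix counts. A minimal standalone sketch in Go (this is not CloudForest's actual code; the `binaryMetrics` helper is hypothetical):

```go
package main

import "fmt"

// Metrics holds the derived binary-classification statistics.
type Metrics struct {
	Accuracy, Specificity, Sensitivity, Precision, NPV, F1 float64
}

// binaryMetrics derives the statistics from raw confusion-matrix counts.
func binaryMetrics(tp, tn, fp, fn float64) Metrics {
	m := Metrics{
		Accuracy:    (tp + tn) / (tp + tn + fp + fn),
		Specificity: tn / (tn + fp), // true negative rate
		Sensitivity: tp / (tp + fn), // true positive rate (recall)
		Precision:   tp / (tp + fp), // positive predictive value
		NPV:         tn / (tn + fn), // negative predictive value
	}
	m.F1 = 2 * m.Precision * m.Sensitivity / (m.Precision + m.Sensitivity)
	return m
}

func main() {
	// Counts recovered from the output above: TP=23511, TN=24999,
	// FP = predicted positives - TP = 25097 - 23511 = 1586,
	// FN = predicted negatives - TN = 26576 - 24999 = 1577.
	m := binaryMetrics(23511, 24999, 1586, 1577)
	fmt.Printf("Sensitivity %.6f  Specificity %.6f  F1 %.6f\n",
		m.Sensitivity, m.Specificity, m.F1)
	// Prints: Sensitivity 0.937141  Specificity 0.940342  F1 0.936973
}
```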
I didn't open a PR because my patch simply assumes classification with two categories (which is what I always do) and doesn't check whether that is actually the case.
Would this be useful in general? If so, I can add the checks so it only runs when it makes sense, and submit it.
I'm not sure how useful it is to have the utility report these (I mostly export the predictions and do my validation elsewhere using ROC AUC), but they could certainly live in the code somewhere for others to use if needed. Thoughts?
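For anyone taking the export-and-validate route, ROC AUC is easy to compute from exported scores and labels. A minimal sketch in Go, using the rank-based (Mann-Whitney U) formulation; this helper is illustrative and not part of CloudForest:

```go
package main

import (
	"fmt"
	"sort"
)

// rocAUC computes the area under the ROC curve via the Mann-Whitney U
// statistic: the probability that a randomly chosen positive example
// scores higher than a randomly chosen negative one. Tied scores get
// half credit through average ranks.
func rocAUC(scores []float64, labels []bool) float64 {
	n := len(scores)
	idx := make([]int, n)
	for i := range idx {
		idx[i] = i
	}
	sort.Slice(idx, func(a, b int) bool { return scores[idx[a]] < scores[idx[b]] })

	// Assign average 1-based ranks to runs of tied scores.
	ranks := make([]float64, n)
	for i := 0; i < n; {
		j := i
		for j < n && scores[idx[j]] == scores[idx[i]] {
			j++
		}
		avg := float64(i+j+1) / 2 // average of ranks i+1 .. j
		for k := i; k < j; k++ {
			ranks[idx[k]] = avg
		}
		i = j
	}

	var pos, rankSum float64
	for i, lab := range labels {
		if lab {
			pos++
			rankSum += ranks[i]
		}
	}
	neg := float64(n) - pos
	return (rankSum - pos*(pos+1)/2) / (pos * neg)
}

func main() {
	scores := []float64{0.9, 0.8, 0.7, 0.3, 0.2}
	labels := []bool{true, true, false, true, false}
	fmt.Printf("AUC = %.3f\n", rocAUC(scores, labels))
	// Prints: AUC = 0.833 (5 of the 6 positive/negative pairs are ordered correctly)
}
```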