Notebook on ROC, AUC
Hey, I pinged you yesterday. Can you assign me?
Hey @PetalsOnWind can I take up this issue?
@kritikaparmar-programmer Sure. Go ahead
Can I work on this issue as a GSSoC'21 participant?
Here is the concept I will use to solve this issue:
- ROC: the Receiver Operating Characteristic curve measures how well a model can distinguish between classes. It plots the TPR (true positive rate) against the FPR (false positive rate), both of which can be calculated from the confusion matrix.
- AUC: the Area Under the Curve represents the degree of separability.
- If AUC → 1, the classes are well separated, while AUC → 0 indicates poor performance. We can use ROC-AUC curves to compare models, and different thresholds, to find which gives better classification.
Here are some links for reference:
- https://towardsdatascience.com/understanding-auc-roc-curve-68b2303cc9c5
- https://machinelearningmastery.com/roc-curves-and-precision-recall-curves-for-classification-in-python/
- https://scikit-learn.org/stable/auto_examples/model_selection/plot_roc.html
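To make the idea concrete, here is a minimal sketch of how the notebook could compute ROC and AUC with scikit-learn. The dataset and the logistic-regression model are just illustrative choices; any probabilistic binary classifier would work:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, roc_auc_score
from sklearn.model_selection import train_test_split

# Illustrative synthetic binary-classification dataset
X, y = make_classification(n_samples=1000, n_classes=2, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Any classifier with predict_proba works; logistic regression is just an example
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]  # probability of the positive class

# TPR and FPR at every threshold, derived from the confusion matrix
fpr, tpr, thresholds = roc_curve(y_test, scores)
auc = roc_auc_score(y_test, scores)
print(f"AUC = {auc:.3f}")  # closer to 1 means better class separability
```

The `(fpr, tpr)` pairs can then be passed to `matplotlib` to draw the ROC curve, and the same steps repeated for other models or thresholds to compare them.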
Hey @PetalsOnWind, I have an understanding of ROC/AUC and I would like to explain it in detail w.r.t. the maths behind it. Please assign me this issue. I am a GSSoC'21 participant.