Statistics-and-Econometrics-for-Data-Science

Notebook on ROC, AUC

Open PetalsOnWind opened this issue 5 years ago • 6 comments

PetalsOnWind avatar Dec 08 '20 14:12 PetalsOnWind

Hey, I pinged you yesterday. Can you assign me?

suhas142 avatar Dec 09 '20 18:12 suhas142

Hey @PetalsOnWind can I take up this issue?

@kritikaparmar-programmer Sure. Go ahead

PetalsOnWind avatar Dec 26 '20 18:12 PetalsOnWind

Can I work on this issue as a GS-SOC'21 participant?

RidhimaKohli avatar Mar 08 '21 18:03 RidhimaKohli

Here is the concept with which I will solve this issue:

  • ROC: the Receiver Operating Characteristic graph measures how well a model can distinguish between the classes. It is plotted as TPR (true positive rate) against FPR (false positive rate), both of which can be calculated from the confusion matrix.
  • AUC: the Area Under the Curve represents the degree of separability.
  • An AUC close to 1 implies that the classes are separated well, while an AUC close to 0 indicates poor performance (an AUC near 0.5 means the model is no better than random guessing). We can use ROC-AUC curves to compare models and different thresholds to find which one gives better classification.
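The idea above can be sketched with scikit-learn's `roc_curve` and `roc_auc_score`; the synthetic dataset and choice of logistic regression here are illustrative, not part of the original issue:

```python
# Minimal ROC/AUC sketch: score a classifier's probabilities at every
# threshold, then summarize the curve with a single AUC value.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data (illustrative only)
X, y = make_classification(n_samples=1000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
scores = model.predict_proba(X_te)[:, 1]  # probability of the positive class

# TPR and FPR at each candidate threshold, derived from the confusion matrix
fpr, tpr, thresholds = roc_curve(y_te, scores)
auc = roc_auc_score(y_te, scores)
print(f"AUC = {auc:.3f}")  # closer to 1 means better class separation
```

Plotting `tpr` against `fpr` (e.g. with matplotlib) gives the ROC curve itself; the diagonal from (0, 0) to (1, 1) is the random-guessing baseline with AUC 0.5.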

Here are some links for reference:

  • https://towardsdatascience.com/understanding-auc-roc-curve-68b2303cc9c5
  • https://machinelearningmastery.com/roc-curves-and-precision-recall-curves-for-classification-in-python/
  • https://scikit-learn.org/stable/auto_examples/model_selection/plot_roc.html

RidhimaKohli avatar Mar 14 '21 11:03 RidhimaKohli

Hey @PetalsOnWind, I have an understanding of ROC/AUC and I would like to explain it in detail with respect to the maths behind it. Please assign me this issue. I am a GSSoC'21 participant.

aishwaryachand avatar Apr 09 '21 22:04 aishwaryachand