
DeLong returns AUROC > 1.0

Open e-pet opened this issue 6 months ago • 0 comments

I discovered a case for which delong_roc_variance returns an AUROC value (ever so slightly) greater than 1.0.

from numpy import array

y_true = array([0, 1, 1, 0, 1, 1, 0, 0, 0, 1, 1, 1, 1, 0, 0, 1, 1, 1, 1, 0, 1, 1,
       1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 1,
       0, 1, 0, 1, 0, 1])

y_pred_prob = array([0.00721543, 0.53823202, 0.8211329 , 0.24135367, 0.61904833,
       0.51323922, 0.50633062, 0.32820338, 0.0909322 , 0.75047184,
       0.68729921, 0.9778512 , 0.71006862, 0.4194739 , 0.31955396,
       0.84841144, 0.75825972, 0.9168877 , 0.86957666, 0.29215992,
       0.77219873, 0.87827124, 0.57069033, 0.5206118 , 0.73469778,
       0.54911986, 0.63431484, 0.69279543, 0.96744707, 0.98569492,
       0.82269733, 0.9818143 , 0.57780473, 0.86393161, 0.69898409,
       0.95322689, 0.66892306, 0.82946166, 0.78274066, 0.84924395,
       0.87428222, 0.37256549, 0.33853178, 0.79422342, 0.14013082,
       0.98968488, 0.2856893 , 0.95359197, 0.13624191, 0.72447603])

auroc, variance = delong_roc_variance(y_true, y_pred_prob)  # yields 1.0000001

This is probably just a numerical rounding error somewhere and certainly nothing grave, but it might nevertheless be nice to force auroc <= 1.0? (I'm not sure whether it could also indicate a very subtle bug in the DeLong implementation.)
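A minimal sketch of the kind of clamp suggested above, assuming the fix is applied to the point estimate after delong_roc_variance returns (the clamped_auroc helper name is hypothetical, not part of the library):

```python
import numpy as np

def clamped_auroc(auroc):
    # Hypothetical post-processing step: clamp the DeLong point estimate
    # into [0, 1] to absorb tiny floating-point overshoot such as 1.0000001.
    return float(np.clip(auroc, 0.0, 1.0))

clamped_auroc(1.0000001)  # 1.0
clamped_auroc(0.73)       # 0.73, unchanged
```

This only masks the symptom, of course; if the overshoot came from a real bug rather than rounding, the clamp would hide it.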

FWIW, sklearn.metrics.roc_auc_score returns 1.0 for this case.
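For an independent cross-check that doesn't depend on either implementation, the AUROC can also be computed directly from its Mann-Whitney pairwise definition (the mann_whitney_auroc function below is my own sketch, not from the library):

```python
import numpy as np

def mann_whitney_auroc(y_true, y_score):
    # AUROC as the Mann-Whitney statistic: the fraction of
    # (positive, negative) pairs where the positive is ranked higher,
    # counting ties as 0.5. Cannot exceed 1.0 by construction.
    y_true = np.asarray(y_true)
    y_score = np.asarray(y_score)
    pos = y_score[y_true == 1]
    neg = y_score[y_true == 0]
    diff = pos[:, None] - neg[None, :]  # all pairwise score differences
    wins = (diff > 0).sum() + 0.5 * (diff == 0).sum()
    return wins / (len(pos) * len(neg))
```

On the data above this should agree with sklearn's 1.0... err, rather, with whatever the exact rank-based value is, since it works in counts rather than accumulated floating-point sums.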

-- Eike

e-pet · Jun 25 '25 14:06