
How can I get the corresponding RECALL when the iou threshold is 0.5?

Open cmjkqyl opened this issue 1 year ago • 5 comments

I changed iou_thrs in mmdet/datasets/coco.py to [0.5] and then got the evaluation results shown below. [screenshot of evaluation output] I want to know whether the 0.887 in the picture is the correct precision at the 0.5 IoU threshold, and whether 0.899 is the correct recall at that threshold, because the metric labels printed in front of them confuse me.

cmjkqyl avatar Oct 21 '23 17:10 cmjkqyl

0.5:0.5 means IoU = 0.5 only

hhaAndroid avatar Oct 23 '23 01:10 hhaAndroid

> 0.5:0.5 means IoU = 0.5 only

[screenshot: IMG_20231024_150123]

According to the printed results, the recall of the model should be 0.899. However, when I use tools/analysis_tools/confusion_matrix.py to print the confusion matrix, the recall calculated from that matrix is 0.85. The two values don't match. Can you tell me the possible cause of this discrepancy, and which value is the correct recall?

cmjkqyl avatar Oct 24 '23 07:10 cmjkqyl

@cmjkqyl This could be because the two scripts have different calculation processes. I suggest you rely on the COCO mAP (mean Average Precision) as the reference metric.

hhaAndroid avatar Oct 25 '23 01:10 hhaAndroid
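For context on where the printed recall comes from: pycocotools' COCOeval stores per-threshold recall in `eval['recall']`, an array of shape `[T, K, A, M]` (IoU thresholds × categories × area ranges × maxDets settings). A minimal numpy sketch of how the summary line is derived, using a synthetic array (the values are illustrative, not from a real run):

```python
import numpy as np

# COCOeval's eval['recall'] has shape [T, K, A, M]:
# T = IoU thresholds, K = categories, A = area ranges, M = maxDets settings.
# Synthetic stand-in for a run with iou_thrs=[0.5] and 3 classes.
T, K, A, M = 1, 3, 4, 3
recall = np.full((T, K, A, M), -1.0)      # -1 marks "no ground truth"
recall[0, :, 0, 2] = [0.90, 0.88, 0.91]   # area='all', maxDets=100

# The printed AR line is the mean over classes, excluding -1 entries,
# mirroring what COCOeval's summarize step does.
vals = recall[0, :, 0, 2]
ar = float(vals[vals > -1].mean())
print(round(ar, 3))  # 0.897
```

So when iou_thrs is set to [0.5], the AR@[0.5:0.5] number is this per-class mean recall at IoU = 0.5, averaged by the COCO matching rules (score-ranked greedy matching, up to 100 detections per image), which differ from the confusion-matrix script's counting.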

@hhaAndroid thank you for your reply. In our task, mAP is definitely the most important metric. But indicators such as RECALL or FPR are also important references. I think many users will also have this need. It would be helpful if you could analyze the source of the difference between these two values or the correct RECALL.

cmjkqyl avatar Nov 04 '23 14:11 cmjkqyl

You can just add a few lines of code at the end of tools/analysis_tools/confusion_matrix.py:

import numpy as np

# confusion_matrix: rows are ground-truth classes, columns are predictions
TP = np.diag(confusion_matrix)              # correctly classified per class
FP = np.sum(confusion_matrix, axis=0) - TP  # predicted as the class, but wrong
FN = np.sum(confusion_matrix, axis=1) - TP  # ground truth of the class, missed

precision = TP / (TP + FP)
recall = TP / (TP + FN)

and then print them.
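For reference, here is the same computation as a self-contained demonstration on a toy two-class confusion matrix (the numbers are illustrative, not from a real model):

```python
import numpy as np

# Toy confusion matrix: rows = ground-truth classes, columns = predictions.
confusion_matrix = np.array([[80., 20.],
                             [10., 90.]])

TP = np.diag(confusion_matrix)              # diagonal = correct per class
FP = np.sum(confusion_matrix, axis=0) - TP  # column sum minus diagonal
FN = np.sum(confusion_matrix, axis=1) - TP  # row sum minus diagonal

precision = TP / (TP + FP)
recall = TP / (TP + FN)
print(recall)  # [0.8 0.9]
```

Note that if the script's confusion matrix includes an extra background row/column (as mmdetection's confusion_matrix.py does, for unmatched detections and ground truths), you may want to slice it off before averaging the per-class values.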

ddracoo avatar Feb 03 '24 22:02 ddracoo