mmdetection
How can I get the corresponding RECALL when the iou threshold is 0.5?
I changed iou_thrs in mmdet/datasets/coco.py to [0.5], and then got the evaluation results shown below.
I want to know whether the 0.887 in the picture is the correct precision at the 0.5 IoU threshold, and whether 0.899 is the correct recall at that threshold, because the prefixes printed in front of them confuse me.
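For context, here is a minimal sketch (assuming mmdetection 2.x) of the thresholds that CocoDataset.evaluate builds when iou_thrs is left at None, and of the override described above:

```python
import numpy as np

# Standard COCO-style IoU thresholds 0.50, 0.55, ..., 0.95 -- roughly what
# mmdet 2.x constructs in CocoDataset.evaluate when iou_thrs is None.
iou_thrs = np.linspace(.5, 0.95, int(np.round((0.95 - .5) / .05)) + 1, endpoint=True)

# Replacing them with a single value restricts evaluation to IoU=0.5 only,
# which is the change described above.
iou_thrs = [0.5]
```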
0.5:0.5 means IoU=0.5 only.
According to the printed results, the recall of the model should be 0.899. However, when I use tools/analysis_tools/confusion_matrix.py to print the confusion matrix, the recall calculated from that matrix is 0.85. The two values don't match. Can you tell me the possible cause, and which value is the correct recall?
@cmjkqyl This could be because the two scripts use different calculation procedures. I suggest you rely on the COCO mAP (mean Average Precision) as the reference metric.
@hhaAndroid thank you for your reply. In our task, mAP is certainly the most important metric, but indicators such as recall and FPR are also important references, and I think many users have the same need. It would be helpful if you could analyze the source of the difference between these two values, or tell us which recall is correct.
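One plausible source of the gap (a hedged guess, not a confirmed analysis of either script): COCOeval computes recall from all detections ranked by score, up to maxDets, without any score cutoff, while confusion_matrix.py first discards detections below its --score-thr argument (0.3 by default, if I read the script correctly), so low-score true positives never enter the matrix. A toy illustration with made-up numbers:

```python
import numpy as np

# Hypothetical data: three detections that all match a ground-truth box,
# out of four ground-truth boxes in total.
scores = np.array([0.9, 0.6, 0.25])
num_gt = 4

# COCO-style: detections are ranked by score but none are discarded.
recall_no_cut = len(scores) / num_gt              # 0.75

# Confusion-matrix style: detections under the score threshold are dropped
# first, so the 0.25-score true positive is lost.
recall_with_cut = (scores >= 0.3).sum() / num_gt  # 0.50

print(recall_no_cut, recall_with_cut)
```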
You can just add a few lines of code at the end of tools/analysis_tools/confusion_matrix.py:
import numpy as np

# confusion_matrix is the array computed earlier in the script;
# rows are ground-truth labels, columns are predicted labels.
TP = np.diag(confusion_matrix)
FP = np.sum(confusion_matrix, axis=0) - TP  # column sum minus diagonal
FN = np.sum(confusion_matrix, axis=1) - TP  # row sum minus diagonal
precision = TP / (TP + FP)
recall = TP / (TP + FN)

and then print them.
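If you also want per-class numbers, a small loop over the arrays above works (a minimal sketch; note that the matrix built by the script has an extra row/column for the background class, if I remember its layout correctly, so the last entry may not correspond to a real class):

```python
# Print per-class precision/recall from the arrays computed above.
for idx, (p, r) in enumerate(zip(precision, recall)):
    print(f'class {idx}: precision={p:.3f}, recall={r:.3f}')
```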