
Bug found in ConfusionMatrix.from_detections

Open chiggins2024 opened this issue 1 year ago • 4 comments

Search before asking

  • [X] I have searched the Supervision issues and found no similar bug report.

Bug

Issue found when producing a confusion matrix for object detection: the FN count appears to be added to the matrix incorrectly. Here is the code that was problematic for me. When I removed the else condition, I got the correct TP value. It seems that num_classes ends up at the same position as detection_classes[matched_detection_idx[j]].

```
for i, true_class_value in enumerate(true_classes):
    j = matched_true_idx == i
    print("sum(j)", sum(j))
    if matches.shape[0] > 0 and sum(j) == 1:
        result_matrix[
            true_class_value, detection_classes[matched_detection_idx[j]]
        ] += 1  # TP
    else:
        result_matrix[true_class_value, num_classes] += 1  # FN
```

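For reference, here is a minimal, self-contained sketch of how that loop fills the matrix on toy inputs. The array values and the single pre-computed match are made up for illustration; only the loop itself mirrors the snippet above.

```python
import numpy as np

# Hypothetical toy setup: 2 classes plus one extra row/column for FP/FN.
num_classes = 2
true_classes = np.array([0, 1])       # two ground-truth boxes
detection_classes = np.array([0])     # a single detection of class 0

# Pretend IoU matching paired detection 0 with ground truth 0.
matches = np.array([[0, 0, 0.9]])     # columns: (true_idx, det_idx, iou)
matched_true_idx = matches[:, 0].astype(int)
matched_detection_idx = matches[:, 1].astype(int)

result_matrix = np.zeros((num_classes + 1, num_classes + 1))
for i, true_class_value in enumerate(true_classes):
    j = matched_true_idx == i
    if matches.shape[0] > 0 and sum(j) == 1:
        result_matrix[
            true_class_value, detection_classes[matched_detection_idx[j]]
        ] += 1  # TP
    else:
        result_matrix[true_class_value, num_classes] += 1  # FN

# GT 0 lands at [0, 0] as a TP; unmatched GT 1 lands in the FN
# column at [1, num_classes].
print(result_matrix)
```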
### Environment

_No response_

### Minimal Reproducible Example

_No response_

### Additional

_No response_

### Are you willing to submit a PR?

- [ ] Yes I'd like to help by submitting a PR!

chiggins2024 avatar Oct 24 '24 15:10 chiggins2024

Hi @chiggins2024 👋

Thank you for the report. We'll check it as soon as we can! As we're transitioning away from the legacy MeanAveragePrecision and ConfusionMatrix, most likely the fix will come as a new ConfusionMatrix version.

LinasKo avatar Nov 01 '24 12:11 LinasKo

Hi, thanks! Do you know when we can anticipate this new ConfusionMatrix version to be ready?

chiggins2024 avatar Nov 04 '24 15:11 chiggins2024

Any update on this? When will the new ConfusionMatrix be ready?

Buckler89 avatar Jan 28 '25 22:01 Buckler89

The detection evaluation function matches predicted boxes to ground truths solely based on IoU, without considering class agreement during matching. Afterwards, it compares class IDs and counts mismatched classes as false positives, but by then the correct prediction with lower IoU has already been discarded. This causes incorrect counting of true positives and false positives, leading to inaccurate evaluation metrics.
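A toy example of the failure mode (all boxes, IoUs, and class values below are made up; the duplicate-resolution step is stood in for by a simple argmax):

```python
import numpy as np

# One ground truth of class 0, two detections overlapping it: a
# wrong-class box with higher IoU and a correct-class box with lower IoU.
iou_batch = np.array([[0.85, 0.60]])  # IoU of GT 0 vs detections 0 and 1
true_classes = np.array([0])
detection_classes = np.array([1, 0])  # detection 0 has the wrong class
iou_threshold = 0.5

# Class-agnostic matching keeps only the highest-IoU pair per GT,
# so the wrong-class detection wins and the correct one is discarded.
matched_idx = np.asarray(iou_batch > iou_threshold).nonzero()
matches = np.stack(
    (matched_idx[0], matched_idx[1], iou_batch[matched_idx]), axis=1
)
best_per_gt = matches[np.argmax(matches[:, 2])]
kept_detection = int(best_per_gt[1])

# The surviving match has class 1 against a class-0 GT, so the
# correct class-0 detection never gets counted as a TP.
print(detection_classes[kept_detection])
```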

I corrected this by replacing this code

```
matched_idx = np.asarray(iou_batch > iou_threshold).nonzero()

if matched_idx[0].shape[0]:
    matches = np.stack(
        (matched_idx[0], matched_idx[1], iou_batch[matched_idx]), axis=1
    )
    matches = ConfusionMatrix._drop_extra_matches(matches=matches)
else:
    matches = np.zeros((0, 3))
```

with the following:

```
matched_idx = np.asarray(iou_batch > iou_threshold).nonzero()

if matched_idx[0].shape[0]:
    # Filter matches by class equality
    valid_matches_mask = (
        detection_classes[matched_idx[1]] == true_classes[matched_idx[0]]
    )
    if np.any(valid_matches_mask):
        valid_true_idx = matched_idx[0][valid_matches_mask]
        valid_pred_idx = matched_idx[1][valid_matches_mask]

        ious = iou_batch[valid_true_idx, valid_pred_idx]
        matches = np.stack((valid_true_idx, valid_pred_idx, ious), axis=1)

        # Now drop extra matches with highest IoU per GT/pred
        matches = ConfusionMatrix._drop_extra_matches(matches=matches)
    else:
        matches = np.zeros((0, 3))
else:
    matches = np.zeros((0, 3))
```
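A quick sanity check of the class-equality filter on toy arrays (the values are made up; the duplicate-dropping step is omitted because only one candidate survives the filter here):

```python
import numpy as np

# One GT of class 0; detection 0 has the wrong class but higher IoU,
# detection 1 has the right class but lower IoU.
iou_batch = np.array([[0.85, 0.60]])
true_classes = np.array([0])
detection_classes = np.array([1, 0])
iou_threshold = 0.5

matched_idx = np.asarray(iou_batch > iou_threshold).nonzero()

# Filter candidate matches by class equality before resolving duplicates.
valid_matches_mask = (
    detection_classes[matched_idx[1]] == true_classes[matched_idx[0]]
)
valid_true_idx = matched_idx[0][valid_matches_mask]
valid_pred_idx = matched_idx[1][valid_matches_mask]
ious = iou_batch[valid_true_idx, valid_pred_idx]
matches = np.stack((valid_true_idx, valid_pred_idx, ious), axis=1)

# Only the correct-class detection (index 1, IoU 0.6) survives,
# so the GT is counted as a TP instead of an FN + FP pair.
print(matches)
```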

panagiotamoraiti avatar May 27 '25 14:05 panagiotamoraiti