
mAP all 0 using my own gt and inference

Open Loong789 opened this issue 2 years ago • 4 comments

I want to use the code from this tutorial to evaluate my own 3D detection ground truth and predictions:

https://github.com/waymo-research/waymo-open-dataset/blob/master/tutorial/tutorial_camera_only.ipynb

However, when I run compute_let_detection_metrics on my own data, every mAP value in the resulting metrics_dict is 0.

Loong789 · Aug 16 '23 07:08

I modified the parse_metrics_objects_binary_files function because both my ground truth and predictions are in JSON format, and I use my modified version to build the eval_dict. Its format remains the same as the original:

```python
eval_dict = {
    'prediction_frame_id': [],
    'prediction_bbox': [],
    'prediction_type': [],
    'prediction_score': [],
    'ground_truth_frame_id': [],
    'ground_truth_bbox': [],
    'ground_truth_type': [],
    'ground_truth_difficulty': [],
}
```

However, when I input this dictionary into compute_let_detection_metrics, I'm getting all mAP values as 0.
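A minimal sketch of how a dict in this format could be assembled from JSON (the JSON field names `frame_id`, `box`, `type`, `score`, and `difficulty` are hypothetical placeholders for my own layout, and the dtypes simply mirror the tensors shown in the next comment):

```python
import json

import tensorflow as tf


def build_eval_dict(prediction_json, ground_truth_json):
    """Builds an eval_dict from two hypothetical JSON files (sketch only)."""
    with open(prediction_json) as f:
        preds = json.load(f)
    with open(ground_truth_json) as f:
        gts = json.load(f)

    return {
        # Frame ids must use the same values on both sides so the evaluator
        # can match predictions to ground truth per frame.
        'prediction_frame_id': tf.constant([p['frame_id'] for p in preds], tf.int32),
        # Each box is the 7-DOF [center_x, center_y, center_z, length, width,
        # height, heading] layout used by the Waymo box metrics.
        'prediction_bbox': tf.constant([p['box'] for p in preds], tf.float32),
        'prediction_type': tf.constant([p['type'] for p in preds], tf.int32),
        'prediction_score': tf.constant([p['score'] for p in preds], tf.float32),
        'ground_truth_frame_id': tf.constant([g['frame_id'] for g in gts], tf.int32),
        'ground_truth_bbox': tf.constant([g['box'] for g in gts], tf.float32),
        'ground_truth_type': tf.constant([g['type'] for g in gts], tf.int32),
        'ground_truth_difficulty': tf.constant([g['difficulty'] for g in gts], tf.int32),
    }
```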

Loong789 · Aug 16 '23 07:08

my eval_dict:

{'prediction_frame_id': <tf.Tensor: shape=(3113,), dtype=int32, numpy= array([1664177764, 1664177764, 1664177764, ..., 1663915546, 1663915546, 1663915546], dtype=int32)>, 'prediction_bbox': <tf.Tensor: shape=(3113, 7), dtype=float32, numpy= array([[-18.237066 , 0.21223623, 26.317488 , ..., 1.9052734 , 4.6679688 , 0.8667859 ], [-31.60234 , 0.19234297, 41.1731 , ..., 1.9384766 , 4.6054688 , 0.86156267], [-81.71167 , 0.44039583, 101.23588 , ..., 2.0078125 , 4.7226562 , 0.87564856], ..., [ 18.21825 , -1.5972904 , 35.34692 , ..., 1.9619141 , 4.5078125 , 1.2166243 ], [ 12.860676 , -0.47492164, 23.250347 , ..., 2.2246094 , 6.3515625 , 2.727902 ], [ 14.691335 , -0.9401689 , 29.226725 , ..., 1.9541016 , 4.5898438 , 2.8041542 ]], dtype=float32)>, 'prediction_type': <tf.Tensor: shape=(3113,), dtype=int32, numpy=array([1, 1, 1, ..., 1, 1, 1], dtype=int32)>, 'prediction_score': <tf.Tensor: shape=(3113,), dtype=float32, numpy= array([0.96240234, 0.9482422 , 0.48388672, ..., 0.09088135, 0.05749512, 0.05459595], dtype=float32)>, 'ground_truth_frame_id': <tf.Tensor: shape=(330,), dtype=int32, numpy= array([1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664177764, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664153045, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1664502722, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665213701, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665382203, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 
1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1665377974, 1663914893, 1663914893, 1663914893, 1663914893, 1663914893, 1663914893, 1663914893, 1663914893, 1663914893, 1663914893, 1663914893, 1663914893, 1663914893, 1663914893, 1663914893, 1663914893, 1663914893, 1663914893, 1663914893, 1663914893, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1665302222, 1664425127, 1664425127, 1664425127, 1664425127, 1664425127, 1664425127, 1664425127, 1664425127, 1664425127, 1664425127, 1664425127, 1664425127, 1664425127, 1664425127, 1663915546, 1663915546, 1663915546, 1663915546, 1663915546, 1663915546, 1663915546, 1663915546, 1663915546, 1663915546], dtype=int32)>, 'ground_truth_bbox': <tf.Tensor: shape=(330, 7), dtype=float32, numpy= array([[ 5.8067226e+01, 9.7487718e-01, 1.0188284e+00, ..., 2.0480144e+00, 2.4068022e+00, 1.1158246e-02], [ 7.6088852e+01, -2.3517072e+00, 7.9206568e-01, ..., 1.9108844e+00, 1.9433117e+00, 2.0753147e-02], [ 5.2290707e+01, 4.4977088e+00, 6.1230433e-01, ..., 1.8461144e+00, 1.7497402e+00, -1.3990463e-02], ..., [-6.1083694e+01, 8.5613041e+00, 5.1825398e-01, ..., 1.9321867e+00, 1.5277758e+00, -3.5613656e-01], [ 1.3475568e+02, 1.3814440e+01, -1.8174309e+00, ..., 8.7663484e-01, 1.0886487e+00, 4.0723112e-01], [ 5.2634931e+00, -2.6153786e+01, 1.0193750e+00, ..., 2.4543688e+00, 2.7000000e+00, -3.1098187e+00]], dtype=float32)>, 'ground_truth_type': <tf.Tensor: shape=(330,), dtype=int32, numpy= array([1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 4, 4, 2, 2, 2, 2, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 4, 1, 1, 4, 1, 1, 1, 4, 4, 2, 0, 1, 2, 4, 2, 1, 4, 4, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 4, 4, 4, 4, 2, 0, 0, 0, 1, 1, 4, 0, 0, 1, 4, 4, 1, 4, 1, 1, 4, 2, 2, 2, 4, 4, 4, 4, 2, 4, 4, 4, 1, 1, 1, 1, 1, 1, 1, 1, 4, 1, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 1, 1, 0, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 4, 2, 4, 2, 2, 1, 4, 4, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 4, 2, 4, 2, 2, 2, 2, 2, 2, 2, 4, 4, 1, 1, 2, 2, 2, 4, 0, 0, 2, 2, 4, 0, 4, 2, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 4, 4, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1], dtype=int32)>, 'ground_truth_difficulty': <tf.Tensor: shape=(330,), dtype=int32, numpy= array([1, 2, 3, 3, 1, 2, 1, 2, 1, 1, 1, 2, 1, 3, 1, 3, 2, 2, 2, 2, 2, 2, 1, 2, 3, 2, 2, 2, 1, 1, 1, 3, 3, 3, 3, 1, 1, 2, 1, 1, 3, 3, 1, 1, 1, 2, 1, 1, 2, 1, 2, 2, 2, 3, 2, 3, 1, 
2, 3, 2, 3, 3, 2, 1, 2, 1, 3, 2, 1, 2, 2, 1, 1, 3, 3, 1, 3, 1, 3, 2, 1, 2, 2, 3, 3, 1, 1, 3, 2, 1, 3, 2, 2, 1, 1, 3, 3, 1, 3, 3, 2, 3, 3, 2, 3, 2, 2, 3, 1, 1, 2, 3, 1, 3, 1, 2, 2, 2, 1, 2, 1, 2, 2, 2, 2, 1, 2, 3, 2, 2, 1, 2, 3, 3, 2, 2, 1, 3, 1, 3, 2, 1, 3, 2, 1, 3, 3, 3, 1, 3, 3, 3, 3, 1, 2, 3, 1, 2, 1, 1, 2, 3, 2, 1, 3, 3, 1, 2, 2, 1, 2, 3, 2, 1, 1, 1, 2, 2, 1, 3, 3, 1, 2, 2, 1, 1, 1, 1, 1, 2, 1, 2, 1, 3, 1, 3, 1, 3, 1, 3, 3, 2, 1, 3, 3, 1, 3, 1, 3, 1, 2, 2, 2, 1, 3, 3, 2, 1, 2, 2, 1, 3, 3, 2, 3, 1, 1, 2, 3, 1, 3, 2, 2, 1, 2, 2, 2, 3, 3, 2, 3, 1, 1, 2, 1, 1, 1, 1, 3, 3, 2, 1, 1, 1, 2, 3, 2, 2, 3, 2, 3, 3, 3, 3, 3, 1, 1, 3, 3, 2, 1, 3, 3, 3, 1, 3, 3, 2, 3, 1, 3, 2, 3, 1, 3, 3, 3, 3, 2, 3, 1, 2, 1, 2, 1, 1, 3, 3, 1, 3, 2, 2, 3, 1, 2, 3, 2, 2, 2, 3, 3, 3, 2, 2, 3, 3, 1, 1, 1, 2, 2, 2, 2, 2, 3, 1, 1, 1, 2, 1], dtype=int32)>}
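A quick way to sanity-check tensors like these before running the evaluation (a minimal sketch; the shape and dtype relationships follow directly from the dump above):

```python
def check_eval_dict(eval_dict):
    """Asserts the basic shape invariants visible in the dump above."""
    num_pred = eval_dict['prediction_bbox'].shape[0]
    num_gt = eval_dict['ground_truth_bbox'].shape[0]

    # Every per-prediction tensor needs one entry per predicted box, and every
    # box is 7-DOF: [center_x, center_y, center_z, length, width, height, heading].
    assert eval_dict['prediction_bbox'].shape == (num_pred, 7)
    assert eval_dict['prediction_frame_id'].shape == (num_pred,)
    assert eval_dict['prediction_type'].shape == (num_pred,)
    assert eval_dict['prediction_score'].shape == (num_pred,)
    assert eval_dict['ground_truth_bbox'].shape == (num_gt, 7)
    assert eval_dict['ground_truth_frame_id'].shape == (num_gt,)
    assert eval_dict['ground_truth_type'].shape == (num_gt,)
    assert eval_dict['ground_truth_difficulty'].shape == (num_gt,)

    # Predictions are only matched against ground truth within the same frame,
    # so the two frame-id sets should overlap.
    pred_frames = set(eval_dict['prediction_frame_id'].numpy().tolist())
    gt_frames = set(eval_dict['ground_truth_frame_id'].numpy().tolist())
    assert pred_frames & gt_frames, 'No frame ids shared between predictions and GT.'
```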

Loong789 · Aug 16 '23 07:08

my metrics_dict:

```
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_2/LET-mAP : 0.0
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_2/LET-mAPH : 0.0
OBJECT_TYPE_TYPE_VEHICLE_LEVEL_2/LET-mAPL : 0.0
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_2/LET-mAP : 0.0
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_2/LET-mAPH : 0.0
OBJECT_TYPE_TYPE_PEDESTRIAN_LEVEL_2/LET-mAPL : 0.0
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_2/LET-mAP : 0.0
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_2/LET-mAPH : 0.0
OBJECT_TYPE_TYPE_CYCLIST_LEVEL_2/LET-mAPL : 0.0
CAMERA_TYPE_VEHICLE_FRONT_LEVEL_2/LET-mAP : 0.0
CAMERA_TYPE_VEHICLE_FRONT_LEVEL_2/LET-mAPH : 0.0
CAMERA_TYPE_VEHICLE_FRONT_LEVEL_2/LET-mAPL : 0.0
CAMERA_TYPE_VEHICLE_FRONT-LEFT_LEVEL_2/LET-mAP : 0.0
CAMERA_TYPE_VEHICLE_FRONT-LEFT_LEVEL_2/LET-mAPH : 0.0
CAMERA_TYPE_VEHICLE_FRONT-LEFT_LEVEL_2/LET-mAPL : 0.0
CAMERA_TYPE_VEHICLE_FRONT-RIGHT_LEVEL_2/LET-mAP : 0.0
CAMERA_TYPE_VEHICLE_FRONT-RIGHT_LEVEL_2/LET-mAPH : 0.0
CAMERA_TYPE_VEHICLE_FRONT-RIGHT_LEVEL_2/LET-mAPL : 0.0
CAMERA_TYPE_VEHICLE_SIDE-LEFT_LEVEL_2/LET-mAP : 0.0
CAMERA_TYPE_VEHICLE_SIDE-LEFT_LEVEL_2/LET-mAPH : 0.0
CAMERA_TYPE_VEHICLE_SIDE-LEFT_LEVEL_2/LET-mAPL : 0.0
CAMERA_TYPE_VEHICLE_SIDE-RIGHT_LEVEL_2/LET-mAP : 0.0
CAMERA_TYPE_VEHICLE_SIDE-RIGHT_LEVEL_2/LET-mAPH : 0.0
CAMERA_TYPE_VEHICLE_SIDE-RIGHT_LEVEL_2/LET-mAPL : 0.0
CAMERA_TYPE_PEDESTRIAN_FRONT_LEVEL_2/LET-mAP : 0.0
CAMERA_TYPE_PEDESTRIAN_FRONT_LEVEL_2/LET-mAPH : 0.0
CAMERA_TYPE_PEDESTRIAN_FRONT_LEVEL_2/LET-mAPL : 0.0
CAMERA_TYPE_PEDESTRIAN_FRONT-LEFT_LEVEL_2/LET-mAP : 0.0
CAMERA_TYPE_PEDESTRIAN_FRONT-LEFT_LEVEL_2/LET-mAPH : 0.0
CAMERA_TYPE_PEDESTRIAN_FRONT-LEFT_LEVEL_2/LET-mAPL : 0.0
CAMERA_TYPE_PEDESTRIAN_FRONT-RIGHT_LEVEL_2/LET-mAP : 0.0
CAMERA_TYPE_PEDESTRIAN_FRONT-RIGHT_LEVEL_2/LET-mAPH : 0.0
CAMERA_TYPE_PEDESTRIAN_FRONT-RIGHT_LEVEL_2/LET-mAPL : 0.0
CAMERA_TYPE_PEDESTRIAN_SIDE-LEFT_LEVEL_2/LET-mAP : 0.0
CAMERA_TYPE_PEDESTRIAN_SIDE-LEFT_LEVEL_2/LET-mAPH : 0.0
CAMERA_TYPE_PEDESTRIAN_SIDE-LEFT_LEVEL_2/LET-mAPL : 0.0
CAMERA_TYPE_PEDESTRIAN_SIDE-RIGHT_LEVEL_2/LET-mAP : 0.0
CAMERA_TYPE_PEDESTRIAN_SIDE-RIGHT_LEVEL_2/LET-mAPH : 0.0
CAMERA_TYPE_PEDESTRIAN_SIDE-RIGHT_LEVEL_2/LET-mAPL : 0.0
CAMERA_TYPE_CYCLIST_FRONT_LEVEL_2/LET-mAP : 0.0
CAMERA_TYPE_CYCLIST_FRONT_LEVEL_2/LET-mAPH : 0.0
CAMERA_TYPE_CYCLIST_FRONT_LEVEL_2/LET-mAPL : 0.0
CAMERA_TYPE_CYCLIST_FRONT-LEFT_LEVEL_2/LET-mAP : 0.0
CAMERA_TYPE_CYCLIST_FRONT-LEFT_LEVEL_2/LET-mAPH : 0.0
CAMERA_TYPE_CYCLIST_FRONT-LEFT_LEVEL_2/LET-mAPL : 0.0
CAMERA_TYPE_CYCLIST_FRONT-RIGHT_LEVEL_2/LET-mAP : 0.0
CAMERA_TYPE_CYCLIST_FRONT-RIGHT_LEVEL_2/LET-mAPH : 0.0
CAMERA_TYPE_CYCLIST_FRONT-RIGHT_LEVEL_2/LET-mAPL : 0.0
CAMERA_TYPE_CYCLIST_SIDE-LEFT_LEVEL_2/LET-mAP : 0.0
CAMERA_TYPE_CYCLIST_SIDE-LEFT_LEVEL_2/LET-mAPH : 0.0
CAMERA_TYPE_CYCLIST_SIDE-LEFT_LEVEL_2/LET-mAPL : 0.0
CAMERA_TYPE_CYCLIST_SIDE-RIGHT_LEVEL_2/LET-mAP : 0.0
CAMERA_TYPE_CYCLIST_SIDE-RIGHT_LEVEL_2/LET-mAPH : 0.0
CAMERA_TYPE_CYCLIST_SIDE-RIGHT_LEVEL_2/LET-mAPL : 0.0
RANGE_TYPE_VEHICLE_[0, 30)_LEVEL_2/LET-mAP : 0.0
RANGE_TYPE_VEHICLE_[0, 30)_LEVEL_2/LET-mAPH : 0.0
RANGE_TYPE_VEHICLE_[0, 30)_LEVEL_2/LET-mAPL : 0.0
RANGE_TYPE_VEHICLE_[30, 50)_LEVEL_2/LET-mAP : 0.0
RANGE_TYPE_VEHICLE_[30, 50)_LEVEL_2/LET-mAPH : 0.0
RANGE_TYPE_VEHICLE_[30, 50)_LEVEL_2/LET-mAPL : 0.0
RANGE_TYPE_VEHICLE_[50, +inf)_LEVEL_2/LET-mAP : 0.0
RANGE_TYPE_VEHICLE_[50, +inf)_LEVEL_2/LET-mAPH : 0.0
RANGE_TYPE_VEHICLE_[50, +inf)_LEVEL_2/LET-mAPL : 0.0
RANGE_TYPE_PEDESTRIAN_[0, 30)_LEVEL_2/LET-mAP : 0.0
RANGE_TYPE_PEDESTRIAN_[0, 30)_LEVEL_2/LET-mAPH : 0.0
RANGE_TYPE_PEDESTRIAN_[0, 30)_LEVEL_2/LET-mAPL : 0.0
RANGE_TYPE_PEDESTRIAN_[30, 50)_LEVEL_2/LET-mAP : 0.0
RANGE_TYPE_PEDESTRIAN_[30, 50)_LEVEL_2/LET-mAPH : 0.0
RANGE_TYPE_PEDESTRIAN_[30, 50)_LEVEL_2/LET-mAPL : 0.0
RANGE_TYPE_PEDESTRIAN_[50, +inf)_LEVEL_2/LET-mAP : 0.0
RANGE_TYPE_PEDESTRIAN_[50, +inf)_LEVEL_2/LET-mAPH : 0.0
RANGE_TYPE_PEDESTRIAN_[50, +inf)_LEVEL_2/LET-mAPL : 0.0
RANGE_TYPE_CYCLIST_[0, 30)_LEVEL_2/LET-mAP : 0.0
RANGE_TYPE_CYCLIST_[0, 30)_LEVEL_2/LET-mAPH : 0.0
RANGE_TYPE_CYCLIST_[0, 30)_LEVEL_2/LET-mAPL : 0.0
RANGE_TYPE_CYCLIST_[30, 50)_LEVEL_2/LET-mAP : 0.0
RANGE_TYPE_CYCLIST_[30, 50)_LEVEL_2/LET-mAPH : 0.0
RANGE_TYPE_CYCLIST_[30, 50)_LEVEL_2/LET-mAPL : 0.0
RANGE_TYPE_CYCLIST_[50, +inf)_LEVEL_2/LET-mAP : 0.0
RANGE_TYPE_CYCLIST_[50, +inf)_LEVEL_2/LET-mAPH : 0.0
RANGE_TYPE_CYCLIST_[50, +inf)_LEVEL_2/LET-mAPL : 0.0
```

Loong789 · Aug 16 '23 07:08

Hi @Loong789,

I don't see an obvious bug here. Before looking into further details:

  1. Can you verify that, with the ground truth and predictions provided in the tutorial, you get results similar to the colab's, which are non-zero?
  2. Can you pass the same boxes as both prediction and ground truth to validate a perfect-detection scenario? (See the sketch below.)
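For instance, check 2 could look roughly like the following (a sketch only; it assumes the eval_dict posted above and that compute_let_detection_metrics accepts its entries as keyword arguments, as described in the earlier comments):

```python
import copy

import tensorflow as tf

# Start from the eval_dict that produced the all-zero metrics and reuse the
# ground-truth boxes as "predictions" with full confidence.
perfect_dict = copy.copy(eval_dict)
perfect_dict['prediction_frame_id'] = eval_dict['ground_truth_frame_id']
perfect_dict['prediction_bbox'] = eval_dict['ground_truth_bbox']
perfect_dict['prediction_type'] = eval_dict['ground_truth_type']
perfect_dict['prediction_score'] = tf.ones_like(
    eval_dict['ground_truth_type'], dtype=tf.float32)

# With identical boxes on both sides, a correctly wired pipeline should report
# (near-)perfect LET-mAP rather than 0.
metrics_dict = compute_let_detection_metrics(**perfect_dict)

# Print the breakdowns (assuming a dict-like return, as shown in the earlier comment).
for name, value in metrics_dict.items():
    print(f'{name}: {value}')
```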

Thanks, Wayne

hfslyc · Aug 18 '23 20:08