PyTorch-YOLOv3

Why did my test only achieve mAP: 0.409?

Open CS-Jackson opened this issue 6 years ago • 28 comments

I tried `python test.py --weights_path weights/yolov3.weights`, but only got mAP: 0.409.

CS-Jackson avatar Oct 03 '18 09:10 CS-Jackson

I got mAP: 0.409 too.

underfitting avatar Oct 06 '18 05:10 underfitting

same here

ThatAIGeek avatar Oct 10 '18 13:10 ThatAIGeek

I believe this commit changed the test accuracy: https://github.com/eriklindernoren/PyTorch-YOLOv3/commit/e9994d6a18f018e2c76985e038b669113aa44468

ThatAIGeek avatar Oct 10 '18 13:10 ThatAIGeek

I got mAP 0.4648 with a confidence threshold of 0.2 (which I believe is the default for the original implementation).

ThatAIGeek avatar Oct 10 '18 14:10 ThatAIGeek
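For context on why the threshold matters: the confidence threshold gates which raw detections enter the evaluation at all, so lowering it keeps more boxes and trades precision for recall, which shifts the measured mAP. A minimal sketch of that filtering step (the column layout and function name are assumptions for illustration, not the repo's actual code):

```python
import numpy as np

def filter_by_confidence(detections: np.ndarray, conf_thres: float = 0.2) -> np.ndarray:
    """Keep only detections whose objectness score clears the threshold.

    Rows are assumed to be [x1, y1, x2, y2, obj_conf, cls_conf, cls_pred].
    A lower conf_thres keeps more boxes (higher recall, lower precision),
    which changes the mAP the evaluation script reports.
    """
    return detections[detections[:, 4] >= conf_thres]
```

With a 0.2 threshold, a box at objectness 0.1 would be dropped before NMS and matching ever run.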

Same here. How to fix it?

AVK636 avatar Oct 15 '18 07:10 AVK636

I noticed that the AP of some classes is 0. What is the problem?

GNAYUOHZ avatar Oct 15 '18 11:10 GNAYUOHZ

Same Here.

WJtomcat avatar Oct 16 '18 01:10 WJtomcat

same here. with

python3.6 test.py --batch_size 10 --n_cpu 8

gave me an mAP of 0.4856844428591997

lolongcovas avatar Oct 18 '18 16:10 lolongcovas

Same here. How to fix it?

92ypli avatar Oct 19 '18 10:10 92ypli

mAP: 0.40963824871843535. How can the accuracy be improved?

shayxurui avatar Oct 20 '18 01:10 shayxurui

The metric currently calculated seems to be the COCO mAP, not mAP_50. In the original tech report, the mAP is 33.0 for YOLOv3 608 × 608 with Darknet-53. In this code, however, images are resized to 416 × 416, and when I set --img_size to 608 I get an mAP of almost zero. I am not sure why.

okdimok avatar Oct 20 '18 10:10 okdimok
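The distinction matters because COCO "mAP" averages AP over IoU thresholds 0.5 to 0.95, while mAP_50 only requires IoU ≥ 0.5, so a sloppier box can count as a hit under mAP_50 but miss at the stricter thresholds. A self-contained IoU sketch (not the repo's implementation) makes the gap concrete:

```python
def box_iou(a, b):
    """IoU of two [x1, y1, x2, y2] boxes.

    A detection matching a ground-truth box at IoU >= 0.5 counts for mAP_50,
    but COCO-style mAP also scores matches at stricter thresholds up to 0.95,
    so the same predictions yield a lower COCO mAP than mAP_50.
    """
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)
```

For example, a prediction offset by half a box side has IoU 1/7 ≈ 0.14 and misses at every threshold, while a near-perfect box at IoU 0.6 scores under mAP_50 but not at IoU 0.75.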

Does anyone know how to improve this accuracy?

perrywu1989 avatar Oct 30 '18 05:10 perrywu1989

Same here...How to fix it? Thanks

Dev2022 avatar Oct 30 '18 08:10 Dev2022

The metric currently calculated seems to be the COCO mAP, not mAP_50. In the original tech report, the mAP is 33.0 for YOLOv3 608 × 608 with Darknet-53. In this code, however, images are resized to 416 × 416, and when I set --img_size to 608 I get an mAP of almost zero. I am not sure why.

Setting --img_size does not work here, because the DataLoader/Dataset initialization ignores img_size; you have to fix it yourself.

1243France avatar Nov 07 '18 03:11 1243France
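To illustrate what the Dataset should do with img_size: the evaluation images are padded to a square and then resized to the network input size, so a hard-coded 416 would explain --img_size 608 failing. A minimal, self-contained sketch of that preprocessing (the function name is an assumption; the repo uses skimage for the actual resize, stood in for here by nearest-neighbour index sampling):

```python
import numpy as np

def pad_to_square_and_resize(img: np.ndarray, img_size: int) -> np.ndarray:
    """Pad an HxWxC image to a square with 128-valued borders, then resize
    to img_size x img_size by nearest-neighbour sampling.

    If the Dataset hard-codes 416 instead of threading img_size through to
    this step, the network receives inputs that do not match --img_size.
    """
    h, w, _ = img.shape
    dim_diff = abs(h - w)
    pad1, pad2 = dim_diff // 2, dim_diff - dim_diff // 2
    # Pad the shorter dimension so the image becomes square.
    if h <= w:
        pad = ((pad1, pad2), (0, 0), (0, 0))
    else:
        pad = ((0, 0), (pad1, pad2), (0, 0))
    padded = np.pad(img, pad, mode="constant", constant_values=128)
    side = padded.shape[0]
    idx = np.arange(img_size) * side // img_size  # nearest-neighbour indices
    return padded[idx][:, idx]
```

The fix described above amounts to making sure the value of --img_size actually reaches this step, rather than a constant.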

Same here. pytorch=0.4.1,use default parameter to test,class '70' and class '78' got zero,why?

sloan96 avatar Dec 20 '18 14:12 sloan96

same here!

Heath-zyl avatar Dec 28 '18 02:12 Heath-zyl

Did anyone here resolve this issue?

normster avatar Jan 13 '19 09:01 normster

how to resolve this problem?

houweidong avatar Feb 23 '19 04:02 houweidong

The repo below tests to about 0.58 mAP on COCO using the original YOLOv3 weights: https://github.com/ultralytics/yolov3

If you run python3 test.py you should see:

      Image      Total  Precision     Recall        mAP
       5000       5000      0.633      0.598      0.589

mAP Per Class:
         person: 0.7397
        bicycle: 0.4354
            car: 0.4884
      motorbike: 0.6372
      aeroplane: 0.8263
            bus: 0.7101
          train: 0.7713
          truck: 0.3599
           boat: 0.3982
  traffic light: 0.4359
   fire hydrant: 0.7410
      stop sign: 0.7251
  parking meter: 0.4293
          bench: 0.2846
           bird: 0.4764
            cat: 0.6460
            dog: 0.5972
          horse: 0.6855
          sheep: 0.4297
            cow: 0.4343
       elephant: 0.8016
           bear: 0.6418
          zebra: 0.7726
        giraffe: 0.8707
       backpack: 0.2034
       umbrella: 0.5101
        handbag: 0.1676
            tie: 0.5130
       suitcase: 0.3754
        frisbee: 0.6494
           skis: 0.4402
      snowboard: 0.5657
    sports ball: 0.5956
           kite: 0.5647
   baseball bat: 0.5436
 baseball glove: 0.5312
     skateboard: 0.7109
      surfboard: 0.6562
  tennis racket: 0.7707
         bottle: 0.3868
     wine glass: 0.4738
            cup: 0.4165
           fork: 0.3319
          knife: 0.2303
          spoon: 0.2031
           bowl: 0.3590
         banana: 0.3034
          apple: 0.1920
       sandwich: 0.3489
         orange: 0.2760
       broccoli: 0.3100
         carrot: 0.1926
        hot dog: 0.4404
          pizza: 0.5814
          donut: 0.4284
           cake: 0.4452
          chair: 0.3541
           sofa: 0.4362
    pottedplant: 0.3704
            bed: 0.5254
    diningtable: 0.3670
         toilet: 0.8059
      tvmonitor: 0.6290
         laptop: 0.6277
          mouse: 0.6213
         remote: 0.3764
       keyboard: 0.5638
     cell phone: 0.2963
      microwave: 0.5795
           oven: 0.4246
        toaster: 0.0000
           sink: 0.5452
   refrigerator: 0.5449
           book: 0.1520
          clock: 0.6236
           vase: 0.4339
       scissors: 0.2896
     teddy bear: 0.5438
     hair drier: 0.0000
     toothbrush: 0.2697

fourth-archive avatar Feb 23 '19 11:02 fourth-archive

The repo below tests to about 0.58 mAP on COCO using the original YOLOv3 weights: https://github.com/ultralytics/yolov3 [full per-class table quoted above]
The mAP calculation in the repo you pointed to is wrong; this was brought up in https://github.com/ultralytics/yolov3/issues/7. It calculates mAP per image and then averages those values, which can yield an mAP higher than the true value.

houweidong avatar Feb 23 '19 12:02 houweidong

@houweidong yes, I think the repo computes one mAP per image (the average of the APs of all classes present in that image), then averages the 5000 per-image mAPs to get the overall mAP.

What should the correct mAP method be? Maybe I can submit a PR.

fourth-archive avatar Feb 23 '19 22:02 fourth-archive
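For reference, the standard scheme accumulates true/false positives per class over the whole dataset, builds one precision-recall curve per class, integrates it into an AP, and only then averages across classes; averaging per-image mAPs inflates the score because images with few easy objects contribute perfect mAPs. A hedged sketch of the per-class, VOC-style AP (function name and signature are illustrative, not the repo's code):

```python
import numpy as np

def average_precision(tp: np.ndarray, conf: np.ndarray, n_gt: int) -> float:
    """AP for ONE class, accumulated over the whole dataset.

    tp    : 1/0 per detection of this class (true positive or not),
            pooled across all images.
    conf  : confidence of each detection.
    n_gt  : total ground-truth boxes of this class in the dataset.
    """
    order = np.argsort(-conf)          # rank all detections by confidence
    tp = tp[order]
    tp_cum = np.cumsum(tp)
    fp_cum = np.cumsum(1 - tp)
    recall = tp_cum / max(n_gt, 1)
    precision = tp_cum / (tp_cum + fp_cum)
    # Area under the precision-recall curve with the usual monotone envelope.
    mrec = np.concatenate(([0.0], recall, [1.0]))
    mpre = np.concatenate(([0.0], precision, [0.0]))
    for i in range(mpre.size - 1, 0, -1):
        mpre[i - 1] = max(mpre[i - 1], mpre[i])
    idx = np.where(mrec[1:] != mrec[:-1])[0]
    return float(np.sum((mrec[idx + 1] - mrec[idx]) * mpre[idx + 1]))
```

mAP is then the plain mean of these per-class APs; a class with many missed objects drags its AP down no matter how it is distributed across images, which the per-image average hides.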

Hi, this should be resolved in the latest version. You can see the updated measurements in the README.

eriklindernoren avatar Apr 23 '19 11:04 eriklindernoren

I got the mAP: 0.5145

nanhui69 avatar Oct 23 '19 07:10 nanhui69

I got the mAP: 0.5145

me too..

falex-ml avatar Oct 27 '19 09:10 falex-ml

Also, training for 70 epochs from the pretrained weights only reaches 0.18 mAP :(

falex-ml avatar Oct 27 '19 09:10 falex-ml

I got the mAP: 0.5145

me too..

metooooo

soldier828 avatar Feb 25 '20 06:02 soldier828

I got the mAP: 0.5145

Me too. What on earth is wrong?

densechen avatar Oct 25 '20 13:10 densechen

Refer to here: issue

densechen avatar Oct 25 '20 13:10 densechen