
Has anyone else run into this when training on their own dataset: the loss is extremely low, and the pixel accuracy and mIoU are exactly the same for every test image? See the log below.

anqin5211314 opened this issue 5 years ago • 15 comments

2019-11-20 11:16:10,422 semantic_segmentation INFO: Iters: 880/139700 || Lr: 0.000099 || Loss: 0.0033 || Cost Time: 0:06:46 || Estimated Time: 17:46:33
2019-11-20 11:16:15,154 semantic_segmentation INFO: Iters: 890/139700 || Lr: 0.000099 || Loss: 0.0026 || Cost Time: 0:06:50 || Estimated Time: 17:46:48
2019-11-20 11:16:19,786 semantic_segmentation INFO: Iters: 900/139700 || Lr: 0.000099 || Loss: 0.0047 || Cost Time: 0:06:55 || Estimated Time: 17:46:44
2019-11-20 11:16:24,414 semantic_segmentation INFO: Iters: 910/139700 || Lr: 0.000099 || Loss: 0.0116 || Cost Time: 0:07:00 || Estimated Time: 17:46:44
2019-11-20 11:16:49,455 semantic_segmentation INFO: Start validation, Total sample: 114
2019-11-20 11:16:58,204 semantic_segmentation INFO: Sample: 1, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:04,222 semantic_segmentation INFO: Sample: 2, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:04,642 semantic_segmentation INFO: Sample: 3, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:05,219 semantic_segmentation INFO: Sample: 4, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:05,779 semantic_segmentation INFO: Sample: 5, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:06,346 semantic_segmentation INFO: Sample: 6, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:06,908 semantic_segmentation INFO: Sample: 7, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:07,470 semantic_segmentation INFO: Sample: 8, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:08,031 semantic_segmentation INFO: Sample: 9, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:08,584 semantic_segmentation INFO: Sample: 10, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:09,134 semantic_segmentation INFO: Sample: 11, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:09,682 semantic_segmentation INFO: Sample: 12, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:10,076 semantic_segmentation INFO: Sample: 13, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:10,621 semantic_segmentation INFO: Sample: 14, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:11,180 semantic_segmentation INFO: Sample: 15, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:11,576 semantic_segmentation INFO: Sample: 16, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:12,127 semantic_segmentation INFO: Sample: 17, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:12,690 semantic_segmentation INFO: Sample: 18, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:13,259 semantic_segmentation INFO: Sample: 19, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:13,659 semantic_segmentation INFO: Sample: 20, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:14,219 semantic_segmentation INFO: Sample: 21, validation pixAcc: 100.000, mIoU: 50.000
2019-11-20 11:17:14,797 semantic_segmentation INFO: Sample: 22, validation pixAcc: 100.000, mIoU: 50.000

anqin5211314 avatar Nov 21 '19 12:11 anqin5211314

@Tramac Awesome author, please point me in the right direction, thanks!

anqin5211314 avatar Nov 21 '19 12:11 anqin5211314

Maybe there is something wrong with the label.

Tramac avatar Dec 16 '19 06:12 Tramac
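
Editor's note, expanding on the hint above (not from the thread; the mask paths below are placeholders): a quick sanity check is to load a few ground-truth masks and print their shape and unique pixel values. For a two-class problem the masks should be single-channel arrays containing only class indices (e.g. 0 and 1, plus an ignore index if you use one). Masks that contain only the background class are one common cause of exactly this pattern, since a model predicting all background then scores pixAcc 100 and mIoU 50 on every image.

```python
import numpy as np
from PIL import Image

# Hypothetical paths -- replace with a few of your own ground-truth masks.
mask_paths = [
    "datasets/mydata/train/label/0001.png",
    "datasets/mydata/train/label/0002.png",
]

for path in mask_paths:
    mask = np.array(Image.open(path))
    # Expect a single-channel mask of class indices, e.g. {0, 1} for two classes.
    # Values like 255, or an (H, W, 3) shape, usually mean the masks still need
    # to be converted to class indices before training.
    print(path, "shape:", mask.shape, "unique values:", np.unique(mask))
```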

Hi, if I want to use my own dataset, which dataloader script should I use? Please let me know when you see this question.

swjtulinxi avatar Dec 19 '19 06:12 swjtulinxi

Hi, if I want to use my own dataset, which dataloader script should I use? Please let me know when you see this question.

It depends on your dataset directory structure.

Tramac avatar Dec 19 '19 07:12 Tramac

Your code is designed for datasets like Cityscapes, VOC 2012, and so on, but I want to use my own dataset. Its directory structure is: train (img, label), val (img, label), test (img, label).

swjtulinxi avatar Dec 19 '19 07:12 swjtulinxi

You can refer to the script cityscapes.py; it is similar to your file structure.

Tramac avatar Dec 19 '19 07:12 Tramac
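
Editor's note, expanding on the pointer above: below is a rough, generic sketch of a dataset for a train/val/test layout with img and label subfolders. It uses a plain torch.utils.data.Dataset rather than the repository's own base class (which cityscapes.py builds on and which also handles resizing, cropping, and other transforms), so treat it only as an outline of the image/mask pairing logic. The class name, paths, class count, and normalization constants are assumptions.

```python
import os
import numpy as np
import torch
from PIL import Image
from torch.utils.data import Dataset
import torchvision.transforms as T

class MySegmentation(Dataset):
    """Pairs images with masks from <root>/<split>/img and <root>/<split>/label."""
    NUM_CLASS = 2  # assumption: adjust to your dataset

    def __init__(self, root="datasets/mydata", split="train"):
        img_dir = os.path.join(root, split, "img")
        mask_dir = os.path.join(root, split, "label")
        self.images = sorted(os.path.join(img_dir, f) for f in os.listdir(img_dir))
        self.masks = sorted(os.path.join(mask_dir, f) for f in os.listdir(mask_dir))
        assert len(self.images) == len(self.masks), "image/mask count mismatch"
        self.to_tensor = T.Compose([
            T.ToTensor(),
            T.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
        ])

    def __getitem__(self, index):
        img = Image.open(self.images[index]).convert("RGB")
        mask = Image.open(self.masks[index])  # single-channel mask of class indices
        img = self.to_tensor(img)
        mask = torch.from_numpy(np.array(mask)).long()
        return img, mask

    def __len__(self):
        return len(self.images)
```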

from core.nn import _C

ImportError: cannot import name '_C'. There is no _C in the files.

swjtulinxi avatar Dec 19 '19 08:12 swjtulinxi
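
Editor's note: this ImportError usually means a compiled C/CUDA extension was never built, and the thread does not resolve it. The snippet below is only a diagnostic sketch, assuming _C backs optional custom ops; it guards the import so you can check whether the rest of the pipeline runs, but building the extension itself is the proper fix.

```python
# Diagnostic sketch only, not a fix from the thread.
try:
    from core.nn import _C  # compiled extension shipped with the repository
    HAS_C_EXT = True
except ImportError:
    _C = None
    HAS_C_EXT = False
    print("core.nn._C is not built; ops that depend on it will be unavailable.")
```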

Hi, according to your results, why are they lower than the paper's results? Do you have any explanation?

swjtulinxi avatar Dec 22 '19 03:12 swjtulinxi

I met the same problem. Could you tell me how you solved it? Thanks.

YangYangGirl avatar May 13 '20 01:05 YangYangGirl

Hi, I ran into trouble when trying to train on my own dataset. I've already written a dataloader file and renamed it to mydata.py, for example, and I added the name "mydata" in the code as an argument. But when I ran the command "python train.py --model fcn32s --backbone vgg16 --dataset mydata --lr 0.01 --epochs 50", this error occurred: train.py: error: argument --dataset: invalid choice: 'mydata' (choose from 'pascal_voc', 'pascal_aug', 'ade20k', 'citys', 'sbu'). Do you know what I should modify? Thanks a million for your help!

Kittywyk avatar Sep 14 '20 11:09 Kittywyk

Please check your dataloader.

Tramac avatar Sep 14 '20 11:09 Tramac

Sorry to bother you again, but I think something went wrong when I added the choice, because the error says there is no "mydata" dataset option. Besides adding the argument in parse_args() of train.py, where else do I need to make changes? Sincerely looking forward to your reply, thanks!

Kittywyk avatar Sep 14 '20 12:09 Kittywyk

Hi, I've solved the problem. It was indeed a problem with my dataloader, thanks!

Kittywyk avatar Sep 14 '20 14:09 Kittywyk

Hi, I've solved the problem. It was indeed a problem with my dataloader, thanks!

Please, how did you solve the problem?

yao123yuhui avatar Nov 16 '21 13:11 yao123yuhui

Please, how did you solve the problem?

Sorry, I've forgotten the details a bit since I no longer use this project. But if I recall correctly, I added my customized class to the dataloader/__init__.py file and it worked. Best.

Kittywyk avatar Nov 17 '21 05:11 Kittywyk
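
Editor's note for anyone hitting the same "invalid choice" error: the fix described above amounts to registering the new dataset class where the loaders are looked up. A rough sketch of what that registration typically looks like in core/data/dataloader/__init__.py follows; the dict and function names are assumptions based on the command-line error quoted earlier, so adapt them to the actual file. If train.py restricts --dataset with an argparse choices list, the new key has to be added there as well.

```python
# core/data/dataloader/__init__.py (sketch; existing entries elided, names assumed)
from .mydata import MySegmentation  # your custom dataset module, e.g. mydata.py

datasets = {
    # ... keep the existing entries: 'pascal_voc', 'pascal_aug', 'ade20k', 'citys', 'sbu' ...
    'mydata': MySegmentation,  # new key so --dataset mydata is accepted
}

def get_segmentation_dataset(name, **kwargs):
    """Look up a dataset class by the name passed via --dataset."""
    return datasets[name.lower()](**kwargs)
```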