
can't test "High Quality Segmentation for Ultra High-resolution Images"

Open trinh-hoang-hiep opened this issue 2 years ago • 16 comments

I ran test.py but hit this error: the "_seg.png" files are missing.

```
$ python test.py --dir ./data/DUTS-TE --model ./model_10000 --output ./output --clear
/home/hoang/anaconda3/envs/tranSOD/lib/python3.6/site-packages/torchvision/io/image.py:11: UserWarning: Failed to load image Python extension: /home/hoang/anaconda3/envs/tranSOD/lib/python3.6/site-packages/torchvision/image.so: undefined symbol: _ZNK2at10TensorBase21__dispatch_contiguousEN3c1012MemoryFormatE
  warn(f"Failed to load image Python extension: {e}")
```

```
before_Parser_time: 1659253874.6776164
Hyperparameters: {'dir': './data/DUTS-TE', 'model': './model_10000', 'output': './output', 'global_only': False, 'L': 900, 'stride': 450, 'clear': True, 'ade': False}
ASPP_4level
12 images found
```

```
before_for_time: 1659253881.0989463 ; before_for_time - before_Parser_time: 6.421329975128174
Traceback (most recent call last):
  File "test.py", line 106, in <module>
    for im, seg, gt, name, crm_data in progressbar.progressbar(val_loader):
  File "/home/hoang/anaconda3/envs/tranSOD/lib/python3.6/site-packages/progressbar/shortcuts.py", line 10, in progressbar
    for result in progressbar(iterator):
  File "/home/hoang/anaconda3/envs/tranSOD/lib/python3.6/site-packages/progressbar/bar.py", line 547, in __next__
    value = next(self._iterable)
  File "/home/hoang/anaconda3/envs/tranSOD/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 521, in __next__
    data = self._next_data()
  File "/home/hoang/anaconda3/envs/tranSOD/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 561, in _next_data
    data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
  File "/home/hoang/anaconda3/envs/tranSOD/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/hoang/anaconda3/envs/tranSOD/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/hoang/Desktop/luanvan/implicitmodel/4cham/Entity/High-Quality-Segmention/dataset/offline_dataset_crm_pad32.py", line 138, in __getitem__
    im, seg, gt = self.load_tuple(self.im_list[idx])
  File "/home/hoang/Desktop/luanvan/implicitmodel/4cham/Entity/High-Quality-Segmention/dataset/offline_dataset_crm_pad32.py", line 110, in load_tuple
    seg = Image.open(im[:-7]+'_seg.png').convert('L')
  File "/home/hoang/anaconda3/envs/tranSOD/lib/python3.6/site-packages/PIL/Image.py", line 2975, in open
    fp = builtins.open(filename, "rb")
FileNotFoundError: [Errno 2] No such file or directory: './data/DUTS-TE/sun_abzsyxfgntlvd_seg.png'
```

trinh-hoang-hiep avatar Jul 31 '22 09:07 trinh-hoang-hiep

I suggest you insert a pdb breakpoint or a print before line 138 of dataset/offline_dataset_crm_pad32.py to see whether you can load the image manually. Thanks.
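A minimal sketch of that manual check, assuming the naming rule visible in the traceback (`seg = Image.open(im[:-7] + '_seg.png')`); the image file name below is hypothetical:

```python
# The [:-7] slice mirrors the loader in offline_dataset_crm_pad32.py,
# which strips a 7-character suffix (e.g. "_im.png") before appending "_seg.png".
import os

def seg_path_for(im_path: str) -> str:
    # Derive the coarse-mask path the loader will try to open.
    return im_path[:-7] + "_seg.png"

im = "./data/DUTS-TE/sun_abzsyxfgntlvd_im.png"  # hypothetical image path
print(seg_path_for(im))                  # -> ./data/DUTS-TE/sun_abzsyxfgntlvd_seg.png
print(os.path.isfile(seg_path_for(im)))  # False while the coarse masks are missing
```

If the second print is False, the FileNotFoundError above is expected: the dataset simply has no coarse masks yet.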

tcShen avatar Jul 31 '22 09:07 tcShen

> I suggest you insert a pdb breakpoint or a print before line 138 of dataset/offline_dataset_crm_pad32.py to see whether you can load the image manually. Thanks.

I can load the img variable from the image .png file, but for the seg variable the dataset is missing the "_seg.png" files. I used --dir ./data/DUTS-TE, but there is no "*_seg.png" file there.

trinh-hoang-hiep avatar Jul 31 '22 10:07 trinh-hoang-hiep

So, if you don't want to modify the dataset code, please change your dataset's format. Or keep your dataset as-is and modify the code that loads *_seg.png. Thanks.
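One hedged way to "change the dataset format" is to copy coarse masks produced by another model next to their images under the `<name>_seg.png` convention. The directory names and the "_im.png" suffix below are my assumptions inferred from the traceback, not something the repo prescribes:

```python
# Sketch: place each coarse mask beside its image as <stem>_seg.png,
# matching the loader's im[:-7] + '_seg.png' naming rule.
import glob
import os
import shutil

def place_coarse_masks(data_dir: str, mask_dir: str) -> list:
    """Copy coarse masks from mask_dir into data_dir as <stem>_seg.png; return new paths."""
    placed = []
    for im_path in glob.glob(os.path.join(data_dir, "*_im.png")):
        stem = os.path.basename(im_path)[:-7]         # drop the "_im.png" suffix
        src = os.path.join(mask_dir, stem + ".png")   # coarse mask from another model
        dst = os.path.join(data_dir, stem + "_seg.png")
        if os.path.isfile(src) and not os.path.isfile(dst):
            shutil.copy(src, dst)
            placed.append(dst)
    return placed

print(place_coarse_masks("./data/DUTS-TE", "./coarse_masks"))  # hypothetical paths
```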

tcShen avatar Jul 31 '22 16:07 tcShen

> So, if you don't want to modify the dataset code, please change your dataset's format. Or keep your dataset as-is and modify the code that loads *_seg.png. Thanks.

Thank you for your answer. What is the meaning of the seg variable and the "_seg.png" image? I want to modify the code so that the seg variable is used to produce the "coord" and "cell" variables.

trinh-hoang-hiep avatar Jul 31 '22 17:07 trinh-hoang-hiep

[image]

I tried testing with model_10000, taking seg from the grayscale image itself, but the result is very weird:

```python
seg = Image.open(im[:-4]+'.png').convert('L')
seg = self.resize_bi(crop_lambda(Image.open(im).convert('L')))
```

trinh-hoang-hiep avatar Aug 01 '22 10:08 trinh-hoang-hiep

Hi, have you run the Entity coarse segmentation network yet? I get the following error when running it with the instances2017 dataset:

```
ERROR [08/05 15:18:06 d2.engine.train_loop]: Exception during training:
Traceback (most recent call last):
  File "/home/hndx/detectron2-main/detectron2/engine/train_loop.py", line 149, in train
    self.run_step()
  File "/home/hndx/detectron2-main/detectron2/engine/defaults.py", line 494, in run_step
    self._trainer.run_step()
  File "/home/hndx/detectron2-main/detectron2/engine/train_loop.py", line 268, in run_step
    data = next(self._data_loader_iter)
  File "/home/hndx/detectron2-main/detectron2/data/common.py", line 234, in __iter__
    for d in self.dataset:
  File "/home/hndx/anaconda3/envs/llz0/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 521, in __next__
    data = self._next_data()
  File "/home/hndx/anaconda3/envs/llz0/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 1203, in _next_data
    return self._process_data(data)
  File "/home/hndx/anaconda3/envs/llz0/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 1229, in _process_data
    data.reraise()
  File "/home/hndx/anaconda3/envs/llz0/lib/python3.7/site-packages/torch/_utils.py", line 434, in reraise
    raise exception
AssertionError: Caught AssertionError in DataLoader worker process 0.
Original Traceback (most recent call last):
  File "/home/hndx/anaconda3/envs/llz0/lib/python3.7/site-packages/torch/utils/data/_utils/worker.py", line 287, in _worker_loop
    data = fetcher.fetch(index)
  File "/home/hndx/anaconda3/envs/llz0/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 32, in fetch
    data.append(next(self.dataset_iter))
  File "/home/hndx/detectron2-main/detectron2/data/common.py", line 201, in __iter__
    yield self.dataset[idx]
  File "/home/hndx/detectron2-main/detectron2/data/common.py", line 90, in __getitem__
    data = self._map_func(self._dataset[cur_idx])
  File "/home/hndx/detectron2-main/detectron2/utils/serialize.py", line 26, in __call__
    return self._obj(*args, **kwargs)
  File "/home/hndx/detectron2-main/detectron2/projects/EntitySeg/entityseg/data/dataset_mapper.py", line 197, in __call__
    instances.instanceid = instance_id_list
  File "/home/hndx/detectron2-main/detectron2/structures/instances.py", line 66, in __setattr__
    self.set(name, val)
  File "/home/hndx/detectron2-main/detectron2/structures/instances.py", line 84, in set
    ), "Adding a field of length {} to a Instances of length {}".format(data_len, len(self))  ##lizhi long
AssertionError: Adding a field of length 0 to a Instances of length 2
```

longlizhi avatar Aug 05 '22 07:08 longlizhi

> detectron2-main

I've seen it in several issues, but I don't know what detectron2-main means. What does it do?

trinh-hoang-hiep avatar Aug 05 '22 07:08 trinh-hoang-hiep

It's the download of detectron2, following the README of the Entity network.

longlizhi avatar Aug 05 '22 07:08 longlizhi

> It's the download of detectron2, following the README of the Entity network.

I think this repo has two projects: one is Entity, the other is High-Quality-Segmention. Which one are you working with? I'm working with High-Quality-Segmention.

trinh-hoang-hiep avatar Aug 09 '22 06:08 trinh-hoang-hiep

I'm working with Entity. The first picture shows the README file of the Entity network. Which one have you worked with?

longlizhi avatar Aug 09 '22 08:08 longlizhi

“High Quality Segmentation for Ultra High-resolution Images” doesn't need detectron2. Thanks.

tcShen avatar Aug 11 '22 08:08 tcShen

I got it, thanks.


longlizhi avatar Aug 11 '22 08:08 longlizhi

> I tried testing with model_10000, taking seg from the grayscale image itself, but the result is very weird: `seg = Image.open(im[:-4]+'.png').convert('L')` `seg = self.resize_bi(crop_lambda(Image.open(im).convert('L')))`

So, what's your problem? The coarse mask from a segmentation model is needed. Thanks.

tcShen avatar Aug 11 '22 08:08 tcShen

I used the coarse mask from PSPNet.


longlizhi avatar Aug 11 '22 08:08 longlizhi

> coarse mask

I understood seg.png as the raw mask from another segmentation neural network. So "High Quality Segmentation for Ultra High-Resolution Images" is a post-processing step, right?
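Under that reading, a small sketch of writing a coarse mask (from PSPNet or any other model; here just a dummy binary array) in the grayscale "L" format the loader expects. The `_seg.png` naming mirrors the loader's `im[:-7] + '_seg.png'`; the helper name is my own:

```python
# Sketch: serialize a coarse binary mask as the grayscale PNG the refinement
# stage reads back with Image.open(...).convert('L').
import numpy as np
from PIL import Image

def save_coarse_mask(mask01, im_path: str) -> str:
    """mask01: HxW array with values in {0, 1}; writes <stem>_seg.png next to the image."""
    seg_path = im_path[:-7] + "_seg.png"  # mirrors im[:-7] + '_seg.png' in the loader
    # Scale {0, 1} to {0, 255} so the mask is visible as an 8-bit grayscale image.
    Image.fromarray((np.asarray(mask01) * 255).astype(np.uint8), mode="L").save(seg_path)
    return seg_path
```

Feeding the raw grayscale photo in as seg (as tried above) gives weird results precisely because the model expects a mask like this, not image intensities.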

trinh-hoang-hiep avatar Aug 11 '22 09:08 trinh-hoang-hiep

Yeah!


longlizhi avatar Aug 11 '22 09:08 longlizhi