
TypeError: an integer is required (got type tuple)

Open consideragain opened this issue 3 years ago • 11 comments

Hi, thanks for sharing your work. There is an error when I run the sample training code:

TRAINING: Epoch: 1 / 360; LR: 1.0e-03; losses: [total: 3.1942 (3.1942); ] bench: [iou: 0.0106 (0.0106); ] : 0%| | 1/1000 [00:07<1:59:20, 7.17s/batches]

Traceback (most recent call last):
  File "/home/tt/zyj_ws/hyperseg/configs/train/cityscapes_efficientnet_b1_hyperseg-m.py", line 43, in <module>
    main(exp_dir, train_dataset=train_dataset, val_dataset=val_dataset, train_img_transforms=train_img_transforms,
  File "/home/tt/zyj_ws/hyperseg/train.py", line 248, in main
    epoch_loss, epoch_iou = proces_epoch(train_loader, train=True)
  File "/home/tt/zyj_ws/hyperseg/train.py", line 104, in proces_epoch
    for i, (input, target) in enumerate(pbar):
  File "/home/tt/anaconda3/envs/zyjenv/lib/python3.9/site-packages/tqdm/std.py", line 1185, in __iter__
    for obj in iterable:
  File "/home/tt/anaconda3/envs/zyjenv/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 517, in __next__
    data = self._next_data()
  File "/home/tt/anaconda3/envs/zyjenv/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 1179, in _next_data
    return self._process_data(data)
  File "/home/tt/anaconda3/envs/zyjenv/lib/python3.9/site-packages/torch/utils/data/dataloader.py", line 1225, in _process_data
    data.reraise()
  File "/home/tt/anaconda3/envs/zyjenv/lib/python3.9/site-packages/torch/_utils.py", line 429, in reraise
    raise self.exc_type(msg)
TypeError: Caught TypeError in DataLoader worker process 1.

Original Traceback (most recent call last):
  File "/home/tt/anaconda3/envs/zyjenv/lib/python3.9/site-packages/torch/utils/data/_utils/worker.py", line 202, in _worker_loop
    data = fetcher.fetch(index)
  File "/home/tt/anaconda3/envs/zyjenv/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/tt/anaconda3/envs/zyjenv/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/tt/zyj_ws/hyperseg/datasets/cityscapes.py", line 220, in __getitem__
    image, target = self.transforms(image, target)
  File "/home/tt/zyj_ws/hyperseg/datasets/seg_transforms.py", line 78, in __call__
    input = list(t(*input))
  File "/home/tt/zyj_ws/hyperseg/datasets/seg_transforms.py", line 334, in __call__
    lbl = F.pad(lbl, (int(self.size[1] - lbl.size[0]), 0), self.lbl_fill, self.padding_mode).copy()
  File "/home/tt/anaconda3/envs/zyjenv/lib/python3.9/site-packages/torchvision/transforms/functional.py", line 426, in pad
    return F_pil.pad(img, padding=padding, fill=fill, padding_mode=padding_mode)
  File "/home/tt/anaconda3/envs/zyjenv/lib/python3.9/site-packages/torchvision/transforms/functional_pil.py", line 153, in pad
    image = ImageOps.expand(img, border=padding, **opts)
  File "/home/tt/anaconda3/envs/zyjenv/lib/python3.9/site-packages/PIL/ImageOps.py", line 403, in expand
    draw.rectangle((0, 0, width - 1, height - 1), outline=color, width=border)
  File "/home/tt/anaconda3/envs/zyjenv/lib/python3.9/site-packages/PIL/ImageDraw.py", line 259, in rectangle
    self.draw.draw_rectangle(xy, ink, 0, width)
TypeError: an integer is required (got type tuple)

According to information I found online, this may be a problem in the transforms. Can you give me some suggestions on how to fix it?

consideragain avatar Jul 15 '21 09:07 consideragain

I'll look into it. What are your PyTorch and Torchvision versions?

YuvalNirkin avatar Jul 15 '21 11:07 YuvalNirkin

Thanks for your reply. The versions are as follows:
pytorch --------------- 1.8.1
torchvision ----------- 0.9.1
opencv-python --------- 4.5.2.54
cudatoolkit ----------- 10.2.89

consideragain avatar Jul 16 '21 02:07 consideragain

I am not able to recreate this issue with the same versions.

YuvalNirkin avatar Jul 16 '21 08:07 YuvalNirkin

What are your Python and Ubuntu versions?

consideragain avatar Jul 16 '21 09:07 consideragain

I have the same issue when I run the training code.

Traceback (most recent call last):
  File "E:\CODE\hyperseg\configs\train\vocsbd_efficientnet_b3_hyperseg-l.py", line 38, in <module>
    main(exp_dir, train_dataset=train_dataset, val_dataset=val_dataset, train_img_transforms=train_img_transforms,
  File "E:\CODE\hyperseg\train.py", line 248, in main
    epoch_loss, epoch_iou = proces_epoch(train_loader, train=True)
  File "E:\CODE\hyperseg\train.py", line 104, in proces_epoch
    for i, (input, target) in enumerate(pbar):
  File "E:\ana\envs\hyperseg\lib\site-packages\tqdm\std.py", line 1185, in __iter__
    for obj in iterable:
  File "E:\ana\envs\hyperseg\lib\site-packages\torch\utils\data\dataloader.py", line 521, in __next__
    data = self._next_data()
  File "E:\ana\envs\hyperseg\lib\site-packages\torch\utils\data\dataloader.py", line 1203, in _next_data
    return self._process_data(data)
  File "E:\ana\envs\hyperseg\lib\site-packages\torch\utils\data\dataloader.py", line 1229, in _process_data
    data.reraise()
  File "E:\ana\envs\hyperseg\lib\site-packages\torch\_utils.py", line 425, in reraise
    raise self.exc_type(msg)
TypeError: Caught TypeError in DataLoader worker process 0.

Original Traceback (most recent call last):
  File "E:\ana\envs\hyperseg\lib\site-packages\torch\utils\data\_utils\worker.py", line 287, in _worker_loop
    data = fetcher.fetch(index)
  File "E:\ana\envs\hyperseg\lib\site-packages\torch\utils\data\_utils\fetch.py", line 44, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "E:\ana\envs\hyperseg\lib\site-packages\torch\utils\data\_utils\fetch.py", line 44, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "E:\CODE\hyperseg\datasets\voc_sbd.py", line 94, in __getitem__
    img, target = self.transforms(img, target)
  File "E:\CODE\hyperseg\datasets\seg_transforms.py", line 78, in __call__
    input = list(t(*input))
  File "E:\CODE\hyperseg\datasets\seg_transforms.py", line 243, in __call__
    lbl = F.pad(lbl, padding, self.lbl_fill, self.padding_mode)
  File "E:\ana\envs\hyperseg\lib\site-packages\torchvision\transforms\functional.py", line 454, in pad
    return F_pil.pad(img, padding=padding, fill=fill, padding_mode=padding_mode)
  File "E:\ana\envs\hyperseg\lib\site-packages\torchvision\transforms\functional_pil.py", line 153, in pad
    image = ImageOps.expand(img, border=padding, **opts)
  File "E:\ana\envs\hyperseg\lib\site-packages\PIL\ImageOps.py", line 403, in expand
    draw.rectangle((0, 0, width - 1, height - 1), outline=color, width=border)
  File "E:\ana\envs\hyperseg\lib\site-packages\PIL\ImageDraw.py", line 259, in rectangle
    self.draw.draw_rectangle(xy, ink, 0, width)
TypeError: an integer is required (got type tuple)

My versions are as follows:
pytorch --------------- 1.9.0
torchvision ----------- 0.10.0
cudatoolkit ----------- 11.1.1

Artoria1998 avatar Jul 20 '21 10:07 Artoria1998

I don't have access to an Ubuntu machine at the moment, so I'll need someone to validate a solution. The issue probably originates from this line in seg_transforms.py. Converting the padding to a list of ints might solve the issue.
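The suggestion above can be sketched like this. It is a minimal, hypothetical helper (the name `pad_amounts` and the sizes are illustrative, not the repo's actual code); the point is that every padding entry handed to torchvision's `F.pad`, and through it to PIL, ends up a plain Python int:

```python
import numpy as np

def pad_amounts(target_size, img_size):
    # Pad on the right/bottom up to target_size, clamped at zero.
    # Casting each entry with int() ensures PIL receives plain Python
    # ints rather than NumPy scalar types.
    pad = np.maximum(np.array(target_size) - np.array(img_size), 0)
    return [0, 0] + [int(p) for p in pad]
```

For example, `pad_amounts((512, 1024), (400, 1024))` yields `[0, 0, 112, 0]`, with every entry a plain `int`.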

YuvalNirkin avatar Jul 27 '21 07:07 YuvalNirkin

I'm not sure I understand you correctly. I changed that line to "padding = [0, 0] + list(np.maximum(self.padding - np.array(img.size), 0))", but that didn't work; I get the same error.

consideragain avatar Jul 28 '21 02:07 consideragain

It might be that the list contains floats instead of ints?
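One way to check that speculation in isolation (the sizes below are made-up stand-ins for `self.padding` and `img.size`): wrapping the NumPy result in `list()` keeps NumPy scalar types, which are not instances of Python's `int`, so the cast-free change above would not have normalized the types:

```python
import numpy as np

# Made-up sizes standing in for self.padding and img.size.
padding = [0, 0] + list(np.maximum(np.array((512, 512)) - np.array((400, 480)), 0))

# The first two entries are plain ints, but the computed ones are NumPy
# integer scalars (e.g. numpy.int64), which fail isinstance(..., int).
print([type(p).__name__ for p in padding])
print(all(isinstance(p, int) for p in padding))  # False
```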

YuvalNirkin avatar Jul 30 '21 05:07 YuvalNirkin

Please replace your Pillow version: pip install Pillow==7.1.2

ruanhailiang avatar Aug 04 '21 04:08 ruanhailiang

My Ubuntu was reinstalled, so I can't try that idea. I'm giving up on reproducing this code for now. Thanks for your help.

consideragain avatar Aug 04 '21 08:08 consideragain

I think the problem was the version of the Pillow package, as @ruanhailiang suggested. I was using Pillow 8.3.1 and getting the same issue described here. Installing a clean Pillow 7.1.2 solved the issue for me. Thanks for the help!
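For reference, both tracebacks fail inside PIL's `ImageOps.expand` when it is given a palette-mode ("P") label image and a tuple border, which is exactly the call shape that broke on Pillow 8.3.1. A minimal sanity check with toy sizes (assuming a Pillow release without the regression, such as the 7.1.2 pin above):

```python
from PIL import Image, ImageOps

# Tiny palette-mode image standing in for a segmentation label map.
lbl = Image.new("P", (4, 4))
lbl.putpalette([0, 0, 0] * 256)

# border=(1, 2) pads 1 px left/right and 2 px top/bottom; on Pillow
# 8.3.1 this call raised "TypeError: an integer is required (got type
# tuple)" instead of returning the expanded image.
out = ImageOps.expand(lbl, border=(1, 2), fill=255)
print(out.size)  # (6, 8)
```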

mavibirdesmi avatar Sep 01 '21 10:09 mavibirdesmi

I have added an exact environment installation yml; this should solve the issue.

YuvalNirkin avatar Nov 13 '22 20:11 YuvalNirkin