
error: (-215:Assertion failed) !dsize.empty() in function 'resize'

jieruyao49 opened this issue 4 years ago · 5 comments

When I train on the Mo dataset, I get the following error:

Scaling parameters by 0.12 to account for a batch size of 1.
Per-GPU batch size is less than the recommended limit for batch norm. Disabling batch norm.
loading annotations into memory...
Done (t=0.22s)
creating index...
index created!
loading annotations into memory...
Done (t=0.20s)
creating index...
index created!
Initializing weights...
Begin training!

/home/rubyyao/PycharmProjects/yolact-master/utils/augmentations.py:315: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray.
  mode = random.choice(self.sample_options)
/home/rubyyao/PycharmProjects/yolact-master/utils/augmentations.py:315: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray.
  mode = random.choice(self.sample_options)
/home/rubyyao/PycharmProjects/yolact-master/utils/augmentations.py:315: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray.
  mode = random.choice(self.sample_options)
/home/rubyyao/PycharmProjects/yolact-master/utils/augmentations.py:315: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray.
  mode = random.choice(self.sample_options)
[  0]       0 || B: 7.360 | C: 20.863 | M: 8.260 | S: 56.083 | T: 92.567 || ETA: 190 days, 15:24:52 || timer: 2.574
[  0]      10 || B: 5.726 | C: 17.737 | M: 7.301 | S: 37.395 | T: 68.159 || ETA: 42 days, 1:35:57 || timer: 0.650
[  0]      20 || B: 6.969 | C: 14.140 | M: 6.866 | S: 20.999 | T: 48.974 || ETA: 39 days, 14:47:49 || timer: 0.556
[  0]      30 || B: 6.986 | C: 11.675 | M: 7.733 | S: 14.896 | T: 41.289 || ETA: 35 days, 23:01:34 || timer: 0.164
[  0]      40 || B: 6.781 | C: 9.815 | M: 8.140 | S: 11.550 | T: 36.285 || ETA: 35 days, 23:02:25 || timer: 0.541
[  0]      50 || B: 6.862 | C: 9.072 | M: 8.370 | S: 9.719 | T: 34.024 || ETA: 34 days, 17:45:54 || timer: 0.253
[  0]      60 || B: 7.093 | C: 8.399 | M: 8.293 | S: 8.365 | T: 32.149 || ETA: 33 days, 14:14:49 || timer: 0.228
[  0]      70 || B: 6.889 | C: 7.832 | M: 7.734 | S: 7.375 | T: 29.830 || ETA: 32 days, 19:37:17 || timer: 0.148
[  0]      80 || B: 6.705 | C: 7.282 | M: 7.371 | S: 6.566 | T: 27.923 || ETA: 32 days, 13:55:47 || timer: 0.563
Traceback (most recent call last):
  File "/home/rubyyao/PycharmProjects/yolact-master/train.py", line 509, in <module>
    train()
  File "/home/rubyyao/PycharmProjects/yolact-master/train.py", line 275, in train
    for datum in data_loader:
  File "/home/rubyyao/anaconda3/envs/yolact_pytorch/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 819, in __next__
    return self._process_data(data)
  File "/home/rubyyao/anaconda3/envs/yolact_pytorch/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 846, in _process_data
    data.reraise()
  File "/home/rubyyao/anaconda3/envs/yolact_pytorch/lib/python3.7/site-packages/torch/_utils.py", line 369, in reraise
    raise self.exc_type(msg)
cv2.error: Caught error in DataLoader worker process 1.
Original Traceback (most recent call last):
  File "/home/rubyyao/anaconda3/envs/yolact_pytorch/lib/python3.7/site-packages/torch/utils/data/_utils/worker.py", line 178, in _worker_loop
    data = fetcher.fetch(index)
  File "/home/rubyyao/anaconda3/envs/yolact_pytorch/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/rubyyao/anaconda3/envs/yolact_pytorch/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/rubyyao/PycharmProjects/yolact-master/data/coco.py", line 94, in __getitem__
    im, gt, masks, h, w, num_crowds = self.pull_item(index)
  File "/home/rubyyao/PycharmProjects/yolact-master/data/coco.py", line 162, in pull_item
    {'num_crowds': num_crowds, 'labels': target[:, 4]})
  File "/home/rubyyao/PycharmProjects/yolact-master/utils/augmentations.py", line 694, in __call__
    return self.augment(img, masks, boxes, labels)
  File "/home/rubyyao/PycharmProjects/yolact-master/utils/augmentations.py", line 55, in __call__
    img, masks, boxes, labels = t(img, masks, boxes, labels)
  File "/home/rubyyao/PycharmProjects/yolact-master/utils/augmentations.py", line 162, in __call__
    masks = cv2.resize(masks, (width, height))
cv2.error: OpenCV(4.5.2) /tmp/pip-req-build-oxjbfc17/opencv/modules/imgproc/src/resize.cpp:3688: error: (-215:Assertion failed) !dsize.empty() in function 'resize'

jieruyao49 avatar Jun 01 '21 06:06 jieruyao49

Hey @jieruyao49, were you able to fix this error? If so, what did you do to fix it? I'm facing the same error. Thank you!

InvincibleKnight avatar Jun 07 '21 09:06 InvincibleKnight

Hi, I have the same issue.

Vaaaaaalllll avatar Sep 15 '21 08:09 Vaaaaaalllll

same here

xanjay avatar Nov 28 '21 17:11 xanjay

This indicates that the number of objects (mask channels) in an image is more than ~500. OpenCV's resize cannot process that many channels: https://github.com/opencv/opencv/issues/14770
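
A minimal sketch (not from this thread) that reproduces the limit outside of YOLACT; the array sizes below are arbitrary and chosen only for illustration:

import cv2
import numpy as np

h, w, target = 138, 138, (275, 275)   # arbitrary sizes, for illustration only

# Up to 512 channels (OpenCV's CV_CN_MAX) the resize succeeds.
ok = cv2.resize(np.zeros((h, w, 512), dtype=np.float32), target)
print(ok.shape)                        # (275, 275, 512)

# With more than 512 channels the same call fails
# (see the OpenCV issue linked above).
try:
    cv2.resize(np.zeros((h, w, 513), dtype=np.float32), target)
except cv2.error as err:
    print(err)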

Hydr8O avatar Dec 09 '21 16:12 Hydr8O

As @Hydr8O said, if the number of objects in an image is more than ~500 (actually more than 512), the OpenCV resize call crashes. To solve this problem, modify the line

masks = cv2.resize(masks, (width, height))

in /yolact-master/utils/augmentations.py (~line 162) to:

cv_limit = 512  # OpenCV's CV_CN_MAX: resize handles at most 512 channels at once
if masks.shape[2] <= cv_limit:
    masks = cv2.resize(masks, (width, height))
else:
    # Split the masks into chunks of at most 512 along the channel axis,
    # resize each chunk, and concatenate them back together.
    # np.atleast_3d guards the case where the final chunk has a single
    # channel, since cv2.resize then returns a 2-D array.
    masks = np.concatenate(
        [np.atleast_3d(cv2.resize(masks[:, :, i:i + cv_limit], (width, height)))
         for i in range(0, masks.shape[2], cv_limit)],
        axis=2)
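
A quick self-contained check of this chunked resize (the sizes and the fake 513-mask array are made up for illustration; 513 channels exercises the final single-channel chunk):

import cv2
import numpy as np

width = height = 200                   # arbitrary target size, for illustration only
cv_limit = 512

# 513 fake masks: one full chunk of 512 plus a final single-channel chunk.
masks = np.zeros((100, 100, 513), dtype=np.float32)

masks = np.concatenate(
    [np.atleast_3d(cv2.resize(masks[:, :, i:i + cv_limit], (width, height)))
     for i in range(0, masks.shape[2], cv_limit)],
    axis=2)
print(masks.shape)                     # (200, 200, 513)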

VABer-dv avatar Apr 26 '22 11:04 VABer-dv