
IndexError: list index out of range

Open hadign20 opened this issue 1 year ago • 2 comments

I am getting a strange error when I try to run inference:

nnUNetv2_predict -i "/NNunet/Data/nnUNet_raw/Dataset500_CRLM/imagesTs/" -o "/Output/Dataset500_CRLM/3dfullers/" -d 500 -c 3d_fullres -p nnUNetPlans

I have 503 test images. Last week I ran this command and it went through 221 images without a problem. But suddenly, without me changing anything, it fails with two errors, even when I run it on a single image.

This error shows up at the beginning of the process:

0%| | 0/315 [00:00<?, ?it/s]
...
9%|▉ | 28/315 [00:21<02:47, 1.71it/s]
Process SpawnProcess-4:
Traceback (most recent call last):
  File "/home/ghahreh/miniconda3/envs/e1/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/home/ghahreh/miniconda3/envs/e1/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/lila/home/ghahreh/nnUNet/nnunetv2/inference/data_iterators.py", line 57, in preprocess_fromfiles_save_to_queue
    raise e
  File "/lila/home/ghahreh/nnUNet/nnunetv2/inference/data_iterators.py", line 31, in preprocess_fromfiles_save_to_queue
    data, seg, data_properties = preprocessor.run_case(list_of_lists[idx],
  File "/lila/home/ghahreh/nnUNet/nnunetv2/preprocessing/preprocessors/default_preprocessor.py", line 139, in run_case
    data, seg = self.run_case_npy(data, seg, data_properties, plans_manager, configuration_manager,
  File "/lila/home/ghahreh/nnUNet/nnunetv2/preprocessing/preprocessors/default_preprocessor.py", line 78, in run_case_npy
    data = self._normalize(data, seg, configuration_manager,
  File "/lila/home/ghahreh/nnUNet/nnunetv2/preprocessing/preprocessors/default_preprocessor.py", line 183, in _normalize
    scheme = configuration_manager.normalization_schemes[c]
IndexError: list index out of range

9%|▉ | 29/315 [00:22<02:47, 1.71it/s]
...
11%|█▏ | 36/315 [00:26<02:43, 1.71it/s]

And this error shows up at the end:

99%|█████████▉| 119/120 [01:17<00:00, 1.53it/s]
100%|██████████| 120/120 [01:18<00:00, 1.53it/s]
Traceback (most recent call last):
  File "/home/ghahreh/miniconda3/envs/e1/lib/python3.10/multiprocessing/resource_sharer.py", line 138, in _serve
    with self._listener.accept() as conn:
  File "/home/ghahreh/miniconda3/envs/e1/lib/python3.10/multiprocessing/connection.py", line 465, in accept
    deliver_challenge(c, self._authkey)
  File "/home/ghahreh/miniconda3/envs/e1/lib/python3.10/multiprocessing/connection.py", line 740, in deliver_challenge
    response = connection.recv_bytes(256)        # reject large message
  File "/home/ghahreh/miniconda3/envs/e1/lib/python3.10/multiprocessing/connection.py", line 216, in recv_bytes
    buf = self._recv_bytes(maxlength)
  File "/home/ghahreh/miniconda3/envs/e1/lib/python3.10/multiprocessing/connection.py", line 414, in _recv_bytes
    buf = self._recv(4)
  File "/home/ghahreh/miniconda3/envs/e1/lib/python3.10/multiprocessing/connection.py", line 379, in _recv
    chunk = read(handle, remaining)
ConnectionResetError: [Errno 104] Connection reset by peer

Traceback (most recent call last):
  File "/home/ghahreh/miniconda3/envs/e1/bin/nnUNetv2_predict", line 8, in <module>
    sys.exit(predict_entry_point())
  File "/lila/home/ghahreh/nnUNet/nnunetv2/inference/predict_from_raw_data.py", line 838, in predict_entry_point
    predictor.predict_from_files(args.i, args.o, save_probabilities=args.save_probabilities,
  File "/lila/home/ghahreh/nnUNet/nnunetv2/inference/predict_from_raw_data.py", line 249, in predict_from_files
    return self.predict_from_data_iterator(data_iterator, save_probabilities, num_processes_segmentation_export)
  File "/lila/home/ghahreh/nnUNet/nnunetv2/inference/predict_from_raw_data.py", line 342, in predict_from_data_iterator
    for preprocessed in data_iterator:
  File "/lila/home/ghahreh/nnUNet/nnunetv2/inference/data_iterators.py", line 109, in preprocessing_iterator_fromfiles
    raise RuntimeError('Background workers died. Look for the error message further up! If there is '
RuntimeError: Background workers died. Look for the error message further up! If there is none then your RAM was full and the worker was killed by the OS. Use fewer workers or get more RAM in that case!

I tried on two different systems (one Linux, one Windows) but I get the same error.

Please let me know how to fix this.

hadign20 · Dec 12 '23

Is it possible that the image in question has a different shape than the other ones, i.e. more channels?
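One way to check this is to compare each test image's channel count against the number of normalization schemes in the plans. Below is a minimal sketch using SimpleITK, assuming .nii.gz inputs and the nnUNetPlans.json layout of nnU-Net v2; the preprocessed-folder path is only a guess for this dataset.

```python
# Sketch: flag test images whose channel count does not match the plans.
# The plans path below is an assumption; the imagesTs path is taken from the issue.
import json
from pathlib import Path

import SimpleITK as sitk

plans_file = Path("/NNunet/Data/nnUNet_preprocessed/Dataset500_CRLM/nnUNetPlans.json")  # hypothetical location
plans = json.loads(plans_file.read_text())
expected = len(plans["configurations"]["3d_fullres"]["normalization_schemes"])

for image in sorted(Path("/NNunet/Data/nnUNet_raw/Dataset500_CRLM/imagesTs/").glob("*.nii.gz")):
    itk_img = sitk.ReadImage(str(image))
    # An extra array dimension or a vector pixel type typically shows up as extra
    # channels after loading, which is what makes normalization_schemes[c] go out of range.
    channels = itk_img.GetSize()[3] if itk_img.GetDimension() == 4 else itk_img.GetNumberOfComponentsPerPixel()
    if channels != expected:
        print(f"{image.name}: {channels} channel(s) in the file, plans expect {expected}")
```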

dojoh · Jan 10 '24

Why not make nnUNetv2_predict convert any input image to grayscale without an alpha channel? Or at least throw a more meaningful error message rather than this blunt IndexError: list index out of range?

I do this in all my code; with Pillow it is very simple.
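As an illustration of that Pillow approach: a sketch only, with placeholder folder names, and not part of nnU-Net itself.

```python
# Sketch of the Pillow-based workaround: force every input to a single grayscale
# channel (no alpha) before handing it to nnU-Net. Folder names are placeholders.
from pathlib import Path

from PIL import Image

src = Path("imagesTs_rgb")    # hypothetical folder with RGB/RGBA inputs
dst = Path("imagesTs_gray")
dst.mkdir(exist_ok=True)

for png in sorted(src.glob("*.png")):
    # "L" = one 8-bit grayscale channel; color and alpha are dropped, so the
    # channel count matches a model trained on single-channel images.
    Image.open(png).convert("L").save(dst / png.name)
```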

Sorry, I should do a PR but I don't have the time right now.

alanwilter · Mar 20 '24