unet
Problem in test generator
Thank you very much for this code. Training runs without any problem, but prediction fails after the 25th image:

25/30 [========================>.....] - ETA: 0s
Exception in thread Thread-2:
Traceback (most recent call last):
  File "D:\miniconda\lib\threading.py", line 916, in _bootstrap_inner
    self.run()
  File "D:\miniconda\lib\threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "D:\miniconda\lib\site-packages\keras\engine\training.py", line 612, in data_generator_task
    generator_output = next(self._generator)
StopIteration

26/30 [=========================>....] - ETA: 0s
Traceback (most recent call last):
  File "main.py", line 21, in <module>
    results = model.predict_generator(testGene,30,verbose=1)
  File "D:\miniconda\lib\site-packages\keras\legacy\interfaces.py", line 88, in wrapper
    return func(*args, **kwargs)
  File "D:\miniconda\lib\site-packages\keras\engine\training.py", line 2108, in predict_generator
    outs = self.predict_on_batch(x)
  File "D:\miniconda\lib\site-packages\keras\engine\training.py", line 1696, in predict_on_batch
    outputs = self.predict_function(ins)
  File "D:\miniconda\lib\site-packages\keras\backend\tensorflow_backend.py", line 2229, in __call__
    feed_dict=feed_dict)
  File "D:\miniconda\lib\site-packages\tensorflow\python\client\session.py", line 900, in run
    run_metadata_ptr)
  File "D:\miniconda\lib\site-packages\tensorflow\python\client\session.py", line 1111, in _run
    str(subfeed_t.get_shape())))
ValueError: Cannot feed value of shape () for Tensor 'input_1:0', which has shape '(?, 256, 256, 1)'

Please help.
Check that the 25th image exists and is a valid image with the same dimensions as the others, and that you actually have 30 images in the folder being read.
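If you want to check this programmatically, here is a minimal sketch; it assumes the repo's default layout of 30 test images named 0.png ... 29.png under data/membrane/test, and uses skimage, which data.py already imports:

import os
import skimage.io as io

test_dir = "data/membrane/test"   # assumption: the repo's default test folder
shapes = set()
for i in range(30):
    path = os.path.join(test_dir, "%d.png" % i)
    if not os.path.exists(path):
        print("missing:", path)
        continue
    shapes.add(io.imread(path).shape)
print("distinct image shapes found:", shapes)   # should contain exactly one shape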
Thank you for the response. I have touched nothing; I only downloaded the code and executed it. Please help.
@vcvishal I encountered this error when running model.predict_generator().
Traceback (abridged):
...
line 2272, in predict_generator
generator_output = next(output_generator)
StopIteration
In my case, I inserted max_queue_size=1 into the model.predict_generator() call, and the code no longer produced the StopIteration error. After that, the saveResults() method ran normally.
I was inspired by the last answer here.
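For reference, a minimal sketch of what the modified prediction step in main.py could look like; the helper names unet, testGenerator and saveResult are taken from this repo's model.py and data.py, while the weights file name and test path are assumptions:

from model import unet
from data import testGenerator, saveResult

# Assumptions: weights were saved as unet_membrane.hdf5 during training and
# the 30 test images live under data/membrane/test, as in the original main.py.
model = unet()
model.load_weights("unet_membrane.hdf5")

testGene = testGenerator("data/membrane/test")

# max_queue_size=1 reportedly avoids the StopIteration by keeping the prefetch
# thread from running past the end of this finite generator.
results = model.predict_generator(testGene, 30, verbose=1, max_queue_size=1)

saveResult("data/membrane/test", results)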
Having the same problem. I did not change the code or data.
21/30 [====================>.........] - ETA: 0s
Exception in thread Thread-105:
Traceback (most recent call last):
  File "/home/x/anaconda3/envs/unet/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/home/x/anaconda3/envs/unet/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "/home/x/anaconda3/envs/unet/lib/python3.5/site-packages/keras/engine/training.py", line 612, in data_generator_task
    generator_output = next(self._generator)
StopIteration

Traceback (most recent call last):
  File "/home/x/anaconda3/envs/unet/lib/python3.5/site-packages/IPython/core/interactiveshell.py", line 2878, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-2-b4a4a913e277>", line 1, in <module>
    runfile('/home/x/PycharmProjects/unet/main.py', wdir='/home/x/PycharmProjects/unet')
  File "/snap/pycharm-professional/159/helpers/pydev/_pydev_bundle/pydev_umd.py", line 197, in runfile
    pydev_imports.execfile(filename, global_vars, local_vars) # execute the script
  File "/snap/pycharm-professional/159/helpers/pydev/_pydev_imps/_pydev_execfile.py", line 18, in execfile
    exec(compile(contents+"\n", file, 'exec'), glob, loc)
  File "/home/x/PycharmProjects/unet/main.py", line 25, in <module>
    results = model.predict_generator(testGene, 30, verbose=1)
  File "/home/x/anaconda3/envs/unet/lib/python3.5/site-packages/keras/legacy/interfaces.py", line 88, in wrapper
    return func(*args, **kwargs)
  File "/home/x/anaconda3/envs/unet/lib/python3.5/site-packages/keras/engine/training.py", line 2120, in predict_generator
    outs = self.predict_on_batch(x)
  File "/home/x/anaconda3/envs/unet/lib/python3.5/site-packages/keras/engine/training.py", line 1705, in predict_on_batch
    outputs = self.predict_function(ins)
  File "/home/x/anaconda3/envs/unet/lib/python3.5/site-packages/keras/backend/tensorflow_backend.py", line 2269, in __call__
    **self.session_kwargs)
  File "/home/x/anaconda3/envs/unet/lib/python3.5/site-packages/tensorflow/python/client/session.py", line 895, in run
    run_metadata_ptr)
  File "/home/x/anaconda3/envs/unet/lib/python3.5/site-packages/tensorflow/python/client/session.py", line 1100, in _run
    % (np_val.shape, subfeed_t.name, str(subfeed_t.get_shape())))
ValueError: Cannot feed value of shape () for Tensor 'input_1:0', which has shape '(?, 256, 256, 1)'
@vcvishal In my case, when I insert max_queue_size=1 into model.predict_generator(), this is what I get:
TypeError: predict_generator() got an unexpected keyword argument 'max_queue_size'
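A possible explanation, for what it is worth: Keras releases before roughly 2.1 named this argument max_q_size rather than max_queue_size, so a hedged sketch that works with either spelling would be:

# Try the newer keyword first and fall back to the pre-2.1 name.
try:
    results = model.predict_generator(testGene, 30, verbose=1, max_queue_size=1)
except TypeError:
    results = model.predict_generator(testGene, 30, verbose=1, max_q_size=1)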
This problem might be caused by a memory overflow. I tried changing the steps parameter from 30 to 20 in
results = model.predict_generator(testGene,30,verbose=1)
in main.py, and it works.
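A related, hedged sketch: making the steps argument match the number of images actually present keeps predict_generator() from asking the generator for more batches than it can yield. The *.png pattern and the num_image keyword of testGenerator are assumptions based on this repo's data.py:

import glob
import os

test_dir = "data/membrane/test"   # assumption: default test folder
num_test = len(glob.glob(os.path.join(test_dir, "*.png")))

testGene = testGenerator(test_dir, num_image=num_test)
results = model.predict_generator(testGene, num_test, verbose=1)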