
5.14 Training Convnet with data-augmentation generators

Open ghimireadarsh opened this issue 4 years ago • 2 comments

Why is the batch_size 32? Should it not be 20, since steps_per_epoch = 100? There is an issue when using the generator this way, because steps_per_epoch * batch_size does not match the amount of data the generator can supply, so the current Keras generator requires repeat() to keep producing batches. Correct me if I am wrong (a sketch of the setup under discussion follows below).

ghimireadarsh avatar Oct 10 '20 07:10 ghimireadarsh
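For context, a minimal sketch of the training-generator setup being discussed. The augmentation parameters and batch_size=32 follow listing 5.14; the train_dir path and the 2,000-image cats-and-dogs training split are assumptions based on the book's earlier listings.

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Assumed path to the 2,000-image training split built earlier in the chapter.
train_dir = 'cats_and_dogs_small/train'

train_datagen = ImageDataGenerator(
    rescale=1. / 255,
    rotation_range=40,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True)

train_generator = train_datagen.flow_from_directory(
    train_dir,
    target_size=(150, 150),
    batch_size=32,        # listing 5.14 uses 32 here
    class_mode='binary')

# With batch_size=32 and steps_per_epoch=100, one epoch asks for
# 100 * 32 = 3,200 images, but the directory only holds 2,000.
# With batch_size=20 (as the earlier, non-augmented listing used),
# 100 * 20 = 2,000, which exactly matches the training set.
```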

steps_per_epoch should be equal to samples // batch_size (see the sketch below).

ghimireadarsh avatar Oct 19 '20 13:10 ghimireadarsh
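A minimal sketch of that suggestion, assuming train_generator, validation_generator, and model are the objects built in the notebook; .samples is the sample count exposed by flow_from_directory iterators.

```python
batch_size = 32
steps_per_epoch = train_generator.samples // batch_size        # 2000 // 32 = 62
validation_steps = validation_generator.samples // batch_size  # 1000 // 32 = 31

history = model.fit(
    train_generator,
    steps_per_epoch=steps_per_epoch,
    epochs=100,
    validation_data=validation_generator,
    validation_steps=validation_steps)
```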

I have the same question. When I use batch_size=32, it gives me this error:

```
63/100 [=================>............] - ETA: 7s - loss: 0.6931 - acc: 0.5150
WARNING:tensorflow:Your input ran out of data; interrupting training. Make sure that your dataset or generator can generate at least steps_per_epoch * epochs batches (in this case, 10000 batches). You may need to use the repeat() function when building your dataset.
```

cocoma16 avatar Mar 31 '21 18:03 cocoma16
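The 63/100 in that log matches the numbers above: 2,000 training images at batch_size=32 give ceil(2000/32) = 63 batches, after which Keras treats the iterator as exhausted. Besides shrinking steps_per_epoch as suggested earlier, the repeat() route the warning mentions can look roughly like the sketch below; it assumes the generator yields (image_batch, label_batch) pairs of 150x150 RGB images with binary labels, and the output_signature is an assumption based on that.

```python
import tensorflow as tf

# Wrap the Keras directory iterator in a tf.data.Dataset so it can be repeated.
train_ds = tf.data.Dataset.from_generator(
    lambda: train_generator,
    output_signature=(
        tf.TensorSpec(shape=(None, 150, 150, 3), dtype=tf.float32),
        tf.TensorSpec(shape=(None,), dtype=tf.float32))
).repeat()

history = model.fit(
    train_ds,
    steps_per_epoch=100,   # no longer limited by the directory size
    epochs=100,
    validation_data=validation_generator,
    validation_steps=50)
```

With the dataset repeating, steps_per_epoch just defines how many batches count as one epoch instead of being capped by the number of images on disk.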