
About generator in SSD_training.ipynb

MAGI003769 opened this issue 6 years ago • 1 comment

Thanks for your brilliant work on this Keras port of SSD, but I have a question about the generator constructed in the training demo SSD_training.ipynb. In the __init__ function:

self.train_batches = len(train_keys)
self.val_batches = len(val_keys)

These two attributes specify the number of batches (i.e. steps) per epoch. I think this initialization makes the number of train/validation batches equal to the number of train/validation samples. Does that mean the batch size should be one?
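To illustrate the concern, here is a minimal sketch (the `steps_per_epoch` helper is hypothetical, not from the notebook) of how the number of steps per epoch would normally be derived from the sample count and batch size. Setting it to `len(train_keys)` is only correct when the batch size is 1; otherwise each "epoch" would iterate over the data multiple times:

```python
import math

def steps_per_epoch(num_samples, batch_size):
    # Number of generator steps needed to see every sample exactly once.
    return math.ceil(num_samples / batch_size)

# With self.train_batches = len(train_keys), the training loop requests
# len(train_keys) batches per epoch. If each batch carries batch_size
# samples, one "epoch" then covers the dataset batch_size times over,
# unless batch_size == 1.
print(steps_per_epoch(1000, 1))   # 1000 steps: matches len(train_keys)
print(steps_per_epoch(1000, 16))  # 63 steps for one true pass
```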

From my superficial understanding, this could be a problem that significantly prolongs training and makes the model overfit, since the validation loss increases as training goes on. If my thinking is wrong, please forgive me; I'm only a beginner in the field of object detection and deep learning.

Hope you can share your opinion. Thanks for your patience.

MAGI003769 avatar Apr 05 '18 13:04 MAGI003769

If I understand your question correctly, the difference between the train and validation batch sizes does not affect training quality, because the validation phase is prediction only (no weight updates). The validation batch size affects only the speed of the validation phase, and its upper limit depends on your memory size (the more memory you have, the larger the batch size you can use).

Also, you can in fact use a different batch size on different epochs, or even on different steps, although that is not common practice. The significance of the training batch size lies in how SGD works; read about that method if you want more details. The batch size at prediction time has nothing to do with SGD; it only affects speed and memory usage.

Kwentar avatar Jun 18 '18 11:06 Kwentar