kaggle_ndsb2017
question about the batch size in step 2 detector
The code is:

```python
model.fit_generator(train_gen, len(train_files) / 1, 12,
                    validation_data=holdout_gen,
                    nb_val_samples=len(holdout_files) / 1,
                    callbacks=[checkpoint, checkpoint_fixed_name, learnrate_scheduler])
```

URL: https://github.com/juliandewit/kaggle_ndsb2017/blob/master/step2_train_nodule_detector.py#L387
Why divide by 1? Shouldn't it be divided by the batch size instead?
Hello, that was just a convenience thing I used to test on smaller training sets while experimenting, e.g. I changed it to / 20 to train on a smaller training set. You can remove the " / 1 " altogether; it has no relation to the batch size.
Sorry for the confusion. I'm afraid there will be lots of confusing things like this.
Thanks for your reply. In your code, steps_per_epoch = len(train_files), but I think it should be len(train_files) / batch_size.
Hello mileyan, where is that code? As you describe it you would be right, but I do not see that code. Note that I wrote my own data iterator, so the behaviour/code might not be what you expect.
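For anyone else confused by this: the keyword `nb_val_samples` in the call suggests the repo targets Keras 1, where the second positional argument of `fit_generator` is `samples_per_epoch` (a count of *samples*), not Keras 2's `steps_per_epoch` (a count of *batches*). If that reading is right, passing `len(train_files)` already means "one full pass over the training set", and dividing by the batch size is only needed under the Keras 2 convention. A minimal sketch of the difference, with illustrative numbers (`n_train`, `batch_size` are assumptions, not values from the repo):

```python
import math

def batches_per_epoch_keras1(samples_per_epoch, batch_size):
    # Keras 1 fit_generator counts *samples*: it keeps pulling batches
    # from the generator until samples_per_epoch samples have been seen.
    return math.ceil(samples_per_epoch / batch_size)

def batches_per_epoch_keras2(steps_per_epoch):
    # Keras 2 fit_generator counts *batches* directly.
    return steps_per_epoch

n_train, batch_size = 1000, 16  # illustrative only

# Keras 1 style: pass len(train_files) as samples_per_epoch
print(batches_per_epoch_keras1(n_train, batch_size))    # 63 batches per epoch

# Keras 2 style: pass len(train_files) // batch_size as steps_per_epoch
print(batches_per_epoch_keras2(n_train // batch_size))  # 62 batches per epoch
```

Either way the epoch covers roughly one pass over the data; the disagreement in this thread is just about which convention the argument follows.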