
question about the batch size in step 2 detector

Open mileyan opened this issue 7 years ago • 3 comments

The code is:

```python
model.fit_generator(train_gen, len(train_files) / 1, 12,
                    validation_data=holdout_gen,
                    nb_val_samples=len(holdout_files) / 1,
                    callbacks=[checkpoint, checkpoint_fixed_name, learnrate_scheduler])
```

URL: https://github.com/juliandewit/kaggle_ndsb2017/blob/master/step2_train_nodule_detector.py#L387

Why divide by 1? Shouldn't it be divided by the batch size?

mileyan avatar May 19 '17 08:05 mileyan

Hello, that was just a convenience thing I used to test on smaller train sets while experimenting, i.e. I made it `/ 20` to train on a smaller train set. You can remove the `/ 1` altogether; it has no relation to the batch size.
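(For illustration: in Keras 1.x, which this code targets, `fit_generator`'s second argument counts *samples* per epoch, so dividing it simply shrinks the epoch. A minimal sketch with made-up numbers; `train_files` here is a hypothetical stand-in for the real file list.)

```python
# The "/ N" divisor just shrinks how many samples Keras 1.x draws per epoch;
# it has nothing to do with the batch size.
train_files = ["scan_%d.png" % i for i in range(1000)]  # hypothetical file list

full_epoch = len(train_files) / 1    # original code: train on the whole set
quick_epoch = len(train_files) / 20  # experiment run: 5% of the set per epoch

print(full_epoch)   # 1000.0
print(quick_epoch)  # 50.0
```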

Sorry for the confusion; I'm afraid there will be lots of such confusing things.

juliandewit avatar May 19 '17 09:05 juliandewit

Thanks for your reply. In your code, `steps_per_epoch = len(train_files)`, but I think it should be `len(train_files) / batch_size`.

mileyan avatar May 19 '17 09:05 mileyan

Hello mileyan, where is that code? As you put it, you are right, but I do not see that code. Note that I wrote my own data iterator, so the behaviour/code might not be what you expect.
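(The disagreement likely comes down to which Keras generation is assumed. The repo's `nb_val_samples` keyword indicates Keras 1.x, whose `fit_generator` takes `samples_per_epoch` in *samples*, so `len(train_files)` is consistent; only the Keras 2 replacement, `steps_per_epoch`, counts *batches* and would be divided by the batch size. A sketch of the arithmetic with toy numbers, not values from the repo:)

```python
# Epoch accounting in the two Keras generations (toy numbers).
n_samples = 1280  # hypothetical len(train_files)
batch_size = 16   # hypothetical batch size

# Keras 1.x: samples_per_epoch counts samples; Keras divides internally.
samples_per_epoch = n_samples              # 1280
# Keras 2.x: steps_per_epoch counts batches, as mileyan suggests.
steps_per_epoch = n_samples // batch_size  # 80

print(samples_per_epoch, steps_per_epoch)  # 1280 80
```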

juliandewit avatar May 22 '17 07:05 juliandewit