CUDA out of memory
Hi, thanks for your repo. How can I find the batch_size? I couldn't find it in train.py. Please help.
Hey, it's set in the AspectRatioBasedSampler in train.py, right before dataloader_train is created:

```python
sampler = AspectRatioBasedSampler(dataset_train, batch_size=4, drop_last=False)
```
Hi, sorry, I didn't understand what you mean. Please explain more. Thank you.
Hi, you can change the batch size on line 69, in the constructor of AspectRatioBasedSampler. See my code snippet above; you can set batch_size as a parameter there.
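In case it helps to see what batch_size controls: the sampler essentially groups dataset indices into chunks of batch_size, and a smaller batch_size means fewer images per step, which lowers GPU memory use. A minimal sketch of just that grouping logic (an illustration only; the real AspectRatioBasedSampler also sorts images by aspect ratio before grouping, which is omitted here):

```python
def group_indices(num_samples, batch_size, drop_last=False):
    """Group dataset indices 0..num_samples-1 into batches of batch_size."""
    indices = list(range(num_samples))
    batches = [indices[i:i + batch_size]
               for i in range(0, num_samples, batch_size)]
    if drop_last and batches and len(batches[-1]) < batch_size:
        batches.pop()  # discard the final, smaller batch
    return batches

# With 10 images and batch_size=4, the last batch holds the 2 leftovers:
print(group_indices(10, batch_size=4))
# [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
print(group_indices(10, batch_size=4, drop_last=True))
# [[0, 1, 2, 3], [4, 5, 6, 7]]
```

So for a CUDA out-of-memory error, lowering batch_size in that constructor call (e.g. from 4 to 2 or 1) is the usual first fix.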
Thank you very much, dear Lukas Struppek!