FOD-Net
Question about the batch size
Hi, thank you for sharing your work. I'm very interested in what your training phase looks like. In your paper you mention that your batch size is 64; what I'd like to know is whether a batch here means 64 FODs or 64 patches.
If it is 64 patches per batch, I noticed that your learning rate is 0.01, which seems too large to me and would make the network train painfully. If a batch is 64 FODs, how did you solve the memory issues?
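To illustrate why the distinction matters for memory, here is a rough back-of-the-envelope estimate of one float32 batch under the two readings of "batch size 64". The shapes used (9³ patches, 96³ volumes, 45 SH coefficients) are hypothetical examples for illustration, not the actual FOD-Net configuration:

```python
# Rough memory estimate for one float32 batch under the two readings
# of "batch size 64". All shapes below are hypothetical examples,
# not the actual FOD-Net configuration.
def batch_bytes(batch_size, spatial, sh_coeffs, bytes_per_val=4):
    """Bytes for a batch of cubic FOD tensors with `sh_coeffs` channels."""
    return batch_size * (spatial ** 3) * sh_coeffs * bytes_per_val

# 64 small patches (e.g. 9x9x9 voxels, 45 SH coefficients each)
patch_mib = batch_bytes(64, 9, 45) / 2**20
# 64 whole FOD volumes (e.g. 96x96x96 voxels)
volume_mib = batch_bytes(64, 96, 45) / 2**20

print(f"64 patches:  {patch_mib:.1f} MiB")   # around 8 MiB
print(f"64 volumes:  {volume_mib:.1f} MiB")  # around 9.5 GiB
```

Under these assumed shapes, 64 patches fit easily on a single GPU, while 64 whole volumes would not (and that is before counting activations and gradients), which is why the interpretation of "batch" changes the feasibility of the stated setup.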
Looking forward to your reply.
Kind regards,
Jia