nsynth_wavenet
batch size
Hello. When you train for 400 000 or 200 000 iterations, which batch size are you using? Is it 28? On 1 GPU (NVIDIA GeForce 1080 with 11178 MiB of memory) I cannot increase the batch size beyond 2. This means that in order to match your 400 000 steps, supposing your batch size is 28, I would have to train for 14 × 400 000 steps. Is that right?
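Assuming the goal is to process the same total number of training examples, the scaling in the question works out as follows (a sketch; the batch size of 28 and the 400 000-step count are figures from the question, not confirmed repo settings):

```python
# Rough equivalence in total examples seen, assuming the reference run
# uses batch_size=28 for 400 000 steps (figures from the question above,
# not confirmed nsynth_wavenet settings).
reference_batch = 28
reference_steps = 400_000
local_batch = 2  # what fits on one GTX 1080 in this case

total_examples = reference_batch * reference_steps
equivalent_steps = total_examples // local_batch

print(equivalent_steps)  # 5 600 000 steps, i.e. 14 * 400 000
```

Note this only equates the number of examples seen; as the reply below points out, it does not guarantee the same convergence behaviour.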
Hi, a larger batch size usually leads to more stable gradients and convergence in fewer iterations. However, there is no linear relation between convergence speed and batch size.
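One common way to get the gradient stability of a large batch on a memory-limited GPU is gradient accumulation: average the gradients from several small micro-batches before each parameter update. This is a general technique, not code from the nsynth_wavenet repo; the sketch below uses a toy least-squares model in NumPy to show that averaging 14 micro-batch gradients of size 2 reproduces the full batch-28 gradient:

```python
import numpy as np

# Toy gradient-accumulation demo (general technique, not repo code):
# averaging gradients over 14 micro-batches of 2 equals one gradient
# computed on the full batch of 28, when all micro-batches are equal size.
rng = np.random.default_rng(0)
true_w = 3.0
x = rng.normal(size=28)
y = true_w * x + 0.01 * rng.normal(size=28)

def grad(w, xb, yb):
    # Gradient of the mean squared error 0.5 * (w*x - y)^2 w.r.t. w.
    return np.mean((w * xb - yb) * xb)

w = 0.0
micro = 2
accum = 0.0
for i in range(0, 28, micro):
    accum += grad(w, x[i:i + micro], y[i:i + micro])
accum /= 28 // micro  # average over the 14 micro-batch gradients

full = grad(w, x, y)  # gradient on the full batch of 28
print(np.isclose(accum, full))  # True: accumulation matches the big batch
```

This makes each update's gradient as stable as a batch-28 update, though wall-clock time per update grows with the number of micro-batches, so it does not by itself make training faster.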