
batch size

Open DLZML001 opened this issue 6 years ago • 1 comments

Hello. When you trained for 400,000 or 200,000 iterations, which batch size did you use? Was it 28? On one GPU (an NVIDIA GeForce 1080 with 11178 MiB of memory) I cannot increase the batch size beyond 2. Does that mean that, to match your 400,000 steps at a supposed batch size of 28, I would have to train for 14 × 400,000 steps?

DLZML001 avatar Oct 16 '18 10:10 DLZML001

Hi, a larger batch size usually leads to more stable gradients and convergence in fewer iterations. However, there is no linear relation between convergence speed and batch size.

bfs18 avatar Oct 17 '18 03:10 bfs18
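As a side note on the memory constraint discussed above: when a GPU can only fit a micro-batch of 2, gradient accumulation can simulate a larger effective batch size without extra memory. The sketch below is a minimal, hypothetical NumPy illustration (a toy linear model, not the actual nsynth_wavenet training code) showing that averaging the gradients of 14 micro-batches of 2 reproduces the gradient of one batch of 28.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup, purely illustrative: a linear model with MSE loss.
X = rng.normal(size=(28, 4))  # one "full" batch of 28 examples
y = rng.normal(size=28)
w = rng.normal(size=4)

def grad(Xb, yb, w):
    """Gradient of mean((Xb @ w - yb)**2) with respect to w."""
    return 2.0 * Xb.T @ (Xb @ w - yb) / len(yb)

# Gradient computed on the full batch of 28 at once.
g_full = grad(X, y, w)

# Same gradient, accumulated over 14 micro-batches of size 2
# (what a memory-limited GPU would compute) and then averaged.
acc = np.zeros_like(w)
for i in range(0, 28, 2):
    acc += grad(X[i:i + 2], y[i:i + 2], w)
g_accum = acc / 14

print(np.allclose(g_full, g_accum))
```

This only equalizes the gradient per optimizer step; as noted in the reply above, it does not make convergence scale linearly with batch size, and per-step wall-clock time grows with the number of micro-batches.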