
batch_size

Open fncode246 opened this issue 3 years ago • 1 comment

According to the table, the batch_size is 3. But what is the segment length of each waveform during training? (Is it 3 s? If so, what does a batch_size of 3 correspond to?) The default batch_size in the train.py code is 128, so which value is the actual batch_size?

Thanks a lot.
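For what it's worth, a training batch for a time-domain separation model like this is usually a tensor of shape (batch_size, segment_samples), so the batch size and the segment length are independent settings. A minimal sketch, assuming the paper's 8 kHz sample rate and a hypothetical 4 s segment length:

```python
import numpy as np

# Illustrative only: assumed values, not read from this repo's train.py.
sample_rate = 8000       # 8 kHz, as in the Conv-TasNet paper
segment_seconds = 4      # assumed segment length in seconds
batch_size = 3           # batch size from the table in question

# One training batch: batch_size waveform segments stacked together.
batch = np.zeros((batch_size, sample_rate * segment_seconds))
print(batch.shape)  # → (3, 32000)
```

So batch_size counts how many such segments are processed per training step, regardless of how long each segment is.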

fncode246 avatar Apr 16 '21 09:04 fncode246

That's also my question. From reading the original paper, each waveform is shaped into 4 s segments at 8 kHz, and they use a 10-hour training set and a 5-hour test set. That's quite large: with a batch size of 3, one epoch would take 3000 iterations, which is very slow. By the way, the paper's authors seem to have trained the model for 30 hours.
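A quick back-of-the-envelope check of that iteration count, assuming the numbers above (10 h of training audio cut into non-overlapping 4 s segments, batch size 3):

```python
# Assumed values from the discussion above, not taken from the repo's code.
train_hours = 10         # size of the training set
segment_seconds = 4      # segment length per training example
batch_size = 3           # batch size from the table

total_seconds = train_hours * 3600               # 36000 s of audio
num_segments = total_seconds // segment_seconds  # 9000 segments
iters_per_epoch = num_segments // batch_size     # iterations per epoch
print(iters_per_epoch)  # → 3000
```

which matches the 3000 iterations per epoch mentioned above.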

JnveLee avatar Mar 11 '22 17:03 JnveLee