Conv-TasNet
batch_size
According to the table, the batch_size is 3. But what is the segment length of each waveform during training? (Is it 3 seconds, and if so, what does that batch_size correspond to?) The default batch_size in the train.py code is 128, so which value is the actual batch_size?
Thanks a lot.
That's also my question. From reading the original paper, the waveforms are shaped into 4-second segments at 8 kHz, and they prepare 10 hours of training data and 5 hours of test data, which is quite large. If we use a batch size of 3, training would take about 3000 iterations per epoch, which is too slow. By the way, the paper's authors seem to have trained the model for 30 hours.
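For reference, here is the arithmetic behind that iteration estimate as a small Python sketch. The 10 h / 4 s / batch-size-3 figures are just the numbers quoted above, not values read from the repo:

```python
# Back-of-the-envelope iterations per epoch, assuming the figures quoted
# in this thread (10 h of training audio, 4 s segments, batch size 3).
train_hours = 10          # training set size quoted above
segment_seconds = 4       # segment length from the paper
batch_size = 3            # batch size from the table

num_segments = train_hours * 3600 // segment_seconds  # 9000 segments
iterations_per_epoch = num_segments // batch_size      # 3000 iterations

print(num_segments, iterations_per_epoch)  # 9000 3000
```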