
Why batch_size = 1

[Open] bezorro opened this issue 1 year ago • 4 comments

Hi, I noticed that batch_size = 1 (refer to link). Is there a reason for this? I think training might be faster with a larger batch_size.

bezorro · Jul 20 '22 13:07

Hi, because the data samples have different lengths, the default collate() in the dataloader can't work if batch_size != 1.

TonyLRJ · Jul 21 '22 03:07
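For context, PyTorch's default collate stacks the samples of a batch along a new first dimension, which requires every sample to have the same shape. A minimal reproduction of the failure, with illustrative shapes rather than FaceFormer's actual ones:

```python
import torch
from torch.utils.data.dataloader import default_collate

a = torch.randn(120, 64)  # e.g. a 120-frame feature sequence
b = torch.randn(95, 64)   # a shorter clip from the same dataset

try:
    default_collate([a, b])  # stacking requires equal sizes, so this raises
except RuntimeError as e:
    print("default_collate fails:", e)
```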

Thanks @TonyLRJ, so we could implement a custom collate_fn for the dataloader to support collating data of different lengths? If so, I will try to fix it. Are there any other reasons, e.g. might the model be unable to run a forward pass with batch_size != 1?

bezorro · Jul 21 '22 03:07
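A rough sketch of the kind of collate_fn being proposed: pad every sequence in the batch to the longest one and return a padding mask so downstream attention can ignore the padded frames. This assumes each sample is a single time-major tensor, which is a simplification of the actual FaceFormer data (audio plus vertices):

```python
import torch
from torch.nn.utils.rnn import pad_sequence
from torch.utils.data import DataLoader

def pad_collate(batch):
    # Hypothetical collate_fn, not from the repo. batch: list of (T_i, D) tensors.
    lengths = torch.tensor([x.shape[0] for x in batch])
    padded = pad_sequence(batch, batch_first=True)  # (B, T_max, D), zero-padded
    # True marks padded positions, matching nn.Transformer's key_padding_mask
    mask = torch.arange(padded.shape[1])[None, :] >= lengths[:, None]
    return padded, lengths, mask

# Toy dataset: four sequences of different lengths, feature dim 64
dataset = [torch.randn(t, 64) for t in (120, 95, 80, 140)]
loader = DataLoader(dataset, batch_size=4, collate_fn=pad_collate)
padded, lengths, mask = next(iter(loader))
print(padded.shape, lengths.tolist())  # torch.Size([4, 140, 64]) [120, 95, 80, 140]
```

In the real model, the mask would also have to be passed into the attention layers, and the loss would need to be computed only over unpadded frames.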

Hi, have you fixed this problem? If so, could you please share the solution?

oliver8459 · Sep 07 '22 07:09

I found that the main difficulty is how to handle linear_interpolation in batch mode with padding.

icech · Nov 24 '22 06:11
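One possible way around that, sketched under the assumption that the padded features arrive at 50 Hz and need to reach 30 fps (what the repo's single-clip linear_interpolation does): interpolate each sample only over its valid frames and re-pad the results. The wrapper below is an illustration, not code from the repo:

```python
import torch
import torch.nn.functional as F

def batched_linear_interpolation(feats, lengths, input_fps=50, output_fps=30):
    # Hypothetical batched wrapper, not from the FaceFormer repo.
    # feats: (B, T_max, D) zero-padded features; lengths: (B,) true frame counts
    B, T_max, D = feats.shape
    out_lens = [int(l * output_fps / input_fps) for l in lengths.tolist()]
    out = feats.new_zeros(B, max(out_lens), D)
    for i, (l, ol) in enumerate(zip(lengths.tolist(), out_lens)):
        # F.interpolate expects (N, C, T), so slice the valid region and transpose
        x = feats[i, :l].t().unsqueeze(0)                                 # (1, D, l)
        y = F.interpolate(x, size=ol, mode='linear', align_corners=True)  # (1, D, ol)
        out[i, :ol] = y.squeeze(0).t()                                    # (ol, D)
    mask = torch.arange(out.shape[1])[None, :] >= torch.tensor(out_lens)[:, None]
    return out, mask  # mask: True where padded
```

The per-sample loop avoids interpolating across the padding boundary; for small batches its overhead should be negligible next to the model's forward pass.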