
BatchSize Specified

HiIcy opened this issue 2 years ago • 2 comments

If I want to use 1p to train, what batch size do I need to allocate? Or is there a formula to compute it?

HiIcy — Mar 06 '22

Hi, we use batch_size=1024 for most of our experiments.
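If 1024 doesn't fit, a common approach (not specific to this repo) is the linear scaling rule: when you shrink the batch size, scale the learning rate down proportionally. A minimal sketch, where the reference learning rate `1.6e-3` is a hypothetical placeholder, not a value from this thread:

```python
REFERENCE_BATCH_SIZE = 1024  # batch size used in the authors' experiments
REFERENCE_LR = 1.6e-3        # hypothetical reference learning rate (placeholder)

def scaled_lr(batch_size, ref_bs=REFERENCE_BATCH_SIZE, ref_lr=REFERENCE_LR):
    """Linear scaling rule: learning rate proportional to batch size."""
    return ref_lr * batch_size / ref_bs

# e.g. training with batch_size=256 on a single device:
print(scaled_lr(256))
```

Note this is only a heuristic; very small batches may also need more warmup epochs or gradient accumulation to emulate the larger effective batch size.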

zihangJiang — Mar 16 '22

Oh, 1024 is too large for my setup...

HiIcy — Apr 10 '22