TokenLabeling
Batch size specification
If I want to train with 1p, what batch size should I allocate? Or is there a formula to compute it? Please advise.
Hi, we use batch_size=1024 for most of our experiments.
Oh, 1024 is too large...
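If 1024 does not fit in memory on one device, a common workaround is gradient accumulation: run several small micro-batches and step the optimizer once, so the effective batch stays 1024. Below is a minimal PyTorch sketch, not code from this repo; the model, micro-batch size, and learning rate are placeholder assumptions.

```python
import torch
import torch.nn.functional as F

# Hypothetical settings: effective batch of 1024 built from
# micro-batches small enough to fit on a single GPU.
effective_batch_size = 1024
micro_batch_size = 64                                   # assumption: whatever fits in memory
accum_steps = effective_batch_size // micro_batch_size  # 16 micro-batches per optimizer step

model = torch.nn.Linear(768, 1000)                      # stand-in for the real model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

def train_epoch(loader):
    optimizer.zero_grad()
    for step, (x, y) in enumerate(loader):
        loss = F.cross_entropy(model(x), y)
        # Divide by accum_steps so accumulated gradients average,
        # matching a single pass over the full effective batch.
        (loss / accum_steps).backward()
        if (step + 1) % accum_steps == 0:
            optimizer.step()
            optimizer.zero_grad()

# Toy usage with random data standing in for a real DataLoader.
loader = [(torch.randn(micro_batch_size, 768),
           torch.randint(0, 1000, (micro_batch_size,))) for _ in range(32)]
train_epoch(loader)
```

If you instead train with a genuinely smaller batch, a common heuristic (the linear scaling rule) is to scale the learning rate proportionally, i.e. lr = base_lr * (your_batch_size / 1024); that is a general rule of thumb, not a guarantee from the authors here.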