
A bug appears when batch_size is set to 1

Open liuhui0401 opened this issue 3 years ago • 1 comment

When I set batch_size to 1, I get the error "ValueError: Expected more than 1 value per channel when training, got input size torch.Size([1, 512, 1, 1])". How can I solve this problem?

liuhui0401 avatar Nov 30 '21 08:11 liuhui0401

OK, I figured it out. The BatchNorm layer needs more than one value per channel in the batch to compute its statistics (mean and variance) during training, so a batch of a single 1×1 feature map fails.

liuhui0401 avatar Dec 24 '21 02:12 liuhui0401
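For anyone hitting the same error, here is a minimal sketch (not the BottleneckTransformers code itself) that reproduces the failure with a bare `nn.BatchNorm2d` and shows two common workarounds; the simplest fix in practice is to use batch_size >= 2 (with `drop_last=True` in the DataLoader so the last incomplete batch is skipped).

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm2d(512)
x = torch.randn(1, 512, 1, 1)  # batch_size = 1, 1x1 spatial size

# In training mode, BatchNorm needs more than one value per channel to
# estimate batch statistics, so this raises the ValueError from the issue.
bn.train()
try:
    bn(x)
except ValueError as e:
    print(e)  # Expected more than 1 value per channel when training, ...

# Workaround 1: evaluation mode uses the stored running mean/var instead of
# batch statistics, so a single sample is fine (only valid for inference).
bn.eval()
out = bn(x)

# Workaround 2 (assumes you are free to swap the norm layer): GroupNorm does
# not depend on the batch dimension, so it works with batch_size = 1 even in
# training mode.
gn = nn.GroupNorm(num_groups=32, num_channels=512)
out = gn(x)
```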