BottleneckTransformers
A bug occurs when batch_size is set to 1
When I set batch_size to 1, the following error is raised: "ValueError: Expected more than 1 value per channel when training, got input size torch.Size([1, 512, 1, 1])". I wonder how to solve this problem.
OK, I got it. A BatchNorm layer needs more than one value per channel to compute the batch statistics (mean and variance) during training, so a batch of size 1 with a 1x1 feature map cannot be normalized in training mode.
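For anyone hitting the same error, here is a minimal sketch that reproduces it with a standalone `nn.BatchNorm2d` (the shapes match the traceback above) and shows two common workarounds. The `DataLoader` line is illustrative, not this repo's actual training code:

```python
import torch
import torch.nn as nn

# BatchNorm2d computes per-channel statistics over N*H*W values.
# With batch_size=1 and a 1x1 feature map there is exactly one value
# per channel, so training-mode normalization is undefined.
bn = nn.BatchNorm2d(512)
x = torch.randn(1, 512, 1, 1)

bn.train()
try:
    bn(x)  # raises "ValueError: Expected more than 1 value per channel ..."
except ValueError as e:
    print(e)

# Workaround 1: eval mode uses the stored running statistics,
# so a single sample is fine at inference time.
bn.eval()
out = bn(x)

# Workaround 2 (during training): keep batch_size > 1, e.g. drop the
# last incomplete batch so no size-1 batch reaches the model:
# loader = DataLoader(dataset, batch_size=32, drop_last=True)
```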