hiddenlayer
Unable to handle BatchNorm1d in PyTorch
A simple example builds fine, with no issues.
But when a BatchNorm1d layer is added between the layers, the following error is raised:
Expected more than 1 value per channel when training, got input size torch.Size([1, 100])
It seems like BatchNorm1d is being treated as BatchNorm2d?
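For what it's worth, the same error can be reproduced in plain PyTorch without hiddenlayer: BatchNorm1d in training mode rejects a batch of size 1, because batch statistics cannot be computed from a single sample. My guess (unconfirmed) is that hiddenlayer traces the model with a batch-size-1 dummy input and hits that check, rather than mixing up BatchNorm1d and BatchNorm2d. A minimal sketch:

```python
import torch
import torch.nn as nn

# BatchNorm1d in training mode needs more than one value per channel
# to compute batch statistics, so a batch of size 1 raises ValueError:
bn = nn.BatchNorm1d(100)
bn.train()
x = torch.randn(1, 100)  # batch size 1, matching torch.Size([1, 100])
try:
    bn(x)
except ValueError as err:
    msg = str(err)
print(msg)  # "Expected more than 1 value per channel when training, ..."

# In eval mode the running statistics are used instead of batch
# statistics, so the same batch-size-1 input passes through fine:
bn.eval()
y = bn(x)
print(y.shape)  # torch.Size([1, 100])
```

If that reading is right, passing a dummy input with batch size >= 2, or calling `model.eval()` before building the graph, might work around it (an untested assumption on my part).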