Batchnorm layer
Hi,
I am rather new to the concept of BNNs and am wondering whether there is a reason the BatchNorm affine transformation parameters are not treated as random variables, i.e. why that transformation is deterministic.
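For context, here is a minimal sketch of what I mean by stochastic affine parameters, using the usual mean-field reparameterization trick with a standard-normal prior. This is just an illustration, not code from the library; the names (`StochasticBatchNorm2d`, `mu_weight`, `rho_weight`, etc.) are hypothetical:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StochasticBatchNorm2d(nn.Module):
    """Hypothetical sketch: BatchNorm whose gamma/beta are random variables."""

    def __init__(self, num_features, eps=1e-5, momentum=0.1):
        super().__init__()
        self.eps = eps
        self.momentum = momentum
        # Variational posterior parameters for gamma (weight) and beta (bias)
        self.mu_weight = nn.Parameter(torch.ones(num_features))
        self.rho_weight = nn.Parameter(torch.full((num_features,), -3.0))
        self.mu_bias = nn.Parameter(torch.zeros(num_features))
        self.rho_bias = nn.Parameter(torch.full((num_features,), -3.0))
        self.register_buffer("running_mean", torch.zeros(num_features))
        self.register_buffer("running_var", torch.ones(num_features))

    def forward(self, x):
        # Sample gamma and beta via the reparameterization trick
        sigma_w = F.softplus(self.rho_weight)
        sigma_b = F.softplus(self.rho_bias)
        weight = self.mu_weight + sigma_w * torch.randn_like(sigma_w)
        bias = self.mu_bias + sigma_b * torch.randn_like(sigma_b)
        out = F.batch_norm(
            x, self.running_mean, self.running_var, weight, bias,
            self.training, self.momentum, self.eps)
        # KL of the sampled affine parameters against a N(0, 1) prior
        kl = self._kl(self.mu_weight, sigma_w) + self._kl(self.mu_bias, sigma_b)
        return out, kl

    @staticmethod
    def _kl(mu, sigma):
        # Closed-form KL( N(mu, sigma^2) || N(0, 1) )
        return (torch.log(1.0 / sigma) + (sigma ** 2 + mu ** 2) / 2.0 - 0.5).sum()
```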
Furthermore, batchnorm always returns the KL term (as zero), whereas for the other layers one can disable returning it, for better integration with PyTorch's Sequential wrapper etc.
Overall, I am confused: how is the current batchnorm implementation different from vanilla PyTorch, apart from returning a constant zero as the second value?
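In other words, as far as I can tell the layer behaves roughly like the following sketch (my paraphrase, not the actual source): a plain deterministic BatchNorm that additionally returns a zero KL, seemingly just to match the `(output, kl)` interface of the Bayesian layers:

```python
import torch
import torch.nn as nn


class BatchNormLikeLayer(nn.Module):
    """My paraphrase of the behavior I am asking about."""

    def __init__(self, num_features):
        super().__init__()
        self.bn = nn.BatchNorm2d(num_features)  # plain deterministic batch norm

    def forward(self, x):
        out = self.bn(x)
        kl = torch.zeros(1, device=x.device)  # constant zero KL
        return out, kl
```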
Thanks in advance, Felix