
Batchnorm layer

Open fzimmermann89 opened this issue 3 years ago • 0 comments

Hi,

I am rather new to the concept of BNNs and am wondering whether there is a reason the BatchNorm affine transformation parameters are not treated as random variables, i.e., why the transformation is deterministic.
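
To illustrate what I mean, here is a purely hypothetical sketch (class and parameter names are my own, not taken from the library) of a batchnorm whose affine scale and shift are Gaussian random variables sampled with the reparameterization trick, analogous to the library's reparameterization layers:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesianAffineBatchNorm2d(nn.Module):
    """Hypothetical sketch: batchnorm whose affine weight/bias are
    Gaussian random variables sampled via the reparameterization trick."""

    def __init__(self, num_features, eps=1e-5, momentum=0.1):
        super().__init__()
        self.eps, self.momentum = eps, momentum
        # Variational parameters for weight (scale) and bias (shift)
        self.weight_mu = nn.Parameter(torch.ones(num_features))
        self.weight_rho = nn.Parameter(torch.full((num_features,), -3.0))
        self.bias_mu = nn.Parameter(torch.zeros(num_features))
        self.bias_rho = nn.Parameter(torch.full((num_features,), -3.0))
        self.register_buffer("running_mean", torch.zeros(num_features))
        self.register_buffer("running_var", torch.ones(num_features))

    def forward(self, x):
        weight_sigma = F.softplus(self.weight_rho)
        bias_sigma = F.softplus(self.bias_rho)
        # Reparameterization trick: sample the affine parameters
        weight = self.weight_mu + weight_sigma * torch.randn_like(weight_sigma)
        bias = self.bias_mu + bias_sigma * torch.randn_like(bias_sigma)
        out = F.batch_norm(x, self.running_mean, self.running_var,
                           weight, bias, self.training, self.momentum, self.eps)
        # KL of the sampled affine parameters against a N(0, 1) prior
        kl = self._kl(self.weight_mu, weight_sigma) + self._kl(self.bias_mu, bias_sigma)
        return out, kl

    @staticmethod
    def _kl(mu, sigma):
        # KL( N(mu, sigma^2) || N(0, 1) ), summed over features
        return (-torch.log(sigma) + (sigma**2 + mu**2) / 2 - 0.5).sum()
```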

Furthermore, batchnorm always returns the KL term (as zero), whereas for the other layers one can disable it for better integration with the PyTorch Sequential wrapper, etc. (see the sketch below).
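
Concretely, this is the integration issue I mean. If I read the code correctly, the other layers accept a `return_kl` flag so they can emit a plain tensor, while the batchnorm layer always returns an `(output, kl)` tuple:

```python
import torch
import torch.nn as nn
from bayesian_torch.layers import LinearReparameterization

x = torch.randn(2, 8)

# Other layers can suppress the KL return value, so they compose
# cleanly inside nn.Sequential:
lin = LinearReparameterization(in_features=8, out_features=4)
out = lin(x, return_kl=False)  # plain tensor, Sequential-friendly

# The Bayesian batchnorm (as far as I can tell) always returns an
# (output, kl) tuple with kl == 0, so in a pattern like
#     nn.Sequential(conv, batchnorm, nn.ReLU())
# the ReLU would receive a tuple and fail.
```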

Overall, I am confused: how is the current batchnorm implementation different from vanilla PyTorch, apart from returning a constant zero as the second value? (Sketch below.)
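
To make the question concrete, my reading is that the current layer is behaviorally equivalent to something like the following (a minimal sketch of my understanding, not the actual implementation):

```python
import torch
import torch.nn as nn

class DeterministicBatchNorm2d(nn.Module):
    """What the library's batchnorm appears to reduce to (my reading):
    standard batchnorm plus a constant-zero KL term."""

    def __init__(self, num_features):
        super().__init__()
        self.bn = nn.BatchNorm2d(num_features)  # deterministic affine params

    def forward(self, x):
        kl = torch.zeros(1, device=x.device)  # no random variables -> KL is 0
        return self.bn(x), kl
```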

Thanks in advance, Felix

fzimmermann89 · Sep 22 '22 07:09