
count running_mean and running_var params for BN

Open mostafaelhoushi opened this issue 5 years ago • 2 comments

For batch norm layers, please count the running_mean and running_var parameters as well.

mostafaelhoushi avatar Aug 09 '19 18:08 mostafaelhoushi

This isn't included since running_mean and running_var are running statistics, not learnable parameters (the learnable parameters in this case being beta and gamma). They aren't really learnable parameters.

Here are docs for reference: https://pytorch.org/docs/stable/nn.html#batchnorm1d

Naireen avatar Jan 02 '20 04:01 Naireen
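To illustrate the distinction, here is a minimal check in PyTorch (the feature size 64 is arbitrary) showing that running_mean and running_var are registered as buffers rather than as learnable parameters:

```python
import torch.nn as nn

bn = nn.BatchNorm2d(64)

# Learnable parameters: weight (gamma) and bias (beta).
print([(name, p.numel()) for name, p in bn.named_parameters()])
# [('weight', 64), ('bias', 64)]

# Non-learnable buffers: running statistics plus the batch counter.
print([(name, b.numel()) for name, b in bn.named_buffers()])
# [('running_mean', 64), ('running_var', 64), ('num_batches_tracked', 1)]
```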

Thanks @Naireen. According to the link you provided, I am quoting the following:

Also by default, during training this layer keeps running estimates of its computed mean and variance, which are then used for normalization during evaluation.

If track_running_stats is set to False, this layer then does not keep running estimates, and batch statistics are instead used during evaluation time as well.

So, my understanding is that we do need to count the running_mean and running_var parameters if the track_running_stats option is set to True.

mostafaelhoushi avatar Jan 05 '20 19:01 mostafaelhoushi
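One possible way to account for them, along the lines suggested above, would be to report buffer sizes separately from learnable parameters. A rough sketch (count_stats is a hypothetical helper, not torchsummary's actual implementation):

```python
import torch.nn as nn

def count_stats(model):
    """Count learnable parameters and registered buffers separately."""
    n_params = sum(p.numel() for p in model.parameters())
    n_buffers = sum(b.numel() for b in model.buffers())
    return n_params, n_buffers

model = nn.Sequential(nn.Conv2d(3, 64, 3, bias=False), nn.BatchNorm2d(64))
params, buffers = count_stats(model)
print(params)   # 1728 (conv weight) + 128 (BN weight and bias) = 1856
print(buffers)  # 64 + 64 + 1 = 129 (running_mean, running_var, num_batches_tracked)
```

With track_running_stats=False the batch norm layer registers no running statistics, so the buffer count for that layer would be zero.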