
About y_bpp and the Gaussian entropy model

Open YuKDseele opened this issue 11 months ago • 4 comments

Hello, I have a question about y_bpp and normalization.

In CompressAI's implementation of the Gaussian entropy model, the likelihood that goes into y_bpp is estimated after centering the input on the predicted means and scaling by the (lower-bounded) scales:

half = float(0.5)

# center the input on the predicted means, if provided
if means is not None:
    values = inputs - means
else:
    values = inputs

# clamp the scales from below to keep the division stable
scales = self.lower_bound_scale(scales)

# probability mass of N(0, scales^2) over the unit-width bin around each value;
# abs() folds negative values onto positive ones via the symmetry of the Gaussian CDF
values = torch.abs(values)
upper = self._standardized_cumulative((half - values) / scales)
lower = self._standardized_cumulative((-half - values) / scales)
likelihood = upper - lower
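
For context, y_bpp is then derived from this likelihood in the usual way, as the total negative log2-likelihood normalized by the number of pixels in the input image. A minimal sketch of that step (the shapes and num_pixels here are hypothetical stand-ins, not values from the library):

import math
import torch

# hypothetical shapes: latents for a 256x256 input image
B, C, H, W = 1, 192, 16, 16
likelihood = torch.rand(B, C, H, W).clamp_min(1e-9)  # stand-in for upper - lower
num_pixels = B * 256 * 256

# bits per pixel: sum of -log2(likelihood), normalized by the image pixel count
y_bpp = torch.log(likelihood).sum() / (-math.log(2) * num_pixels)
print(y_bpp)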

Does this mean that during training, what is effectively fitted to the standard normal distribution is not (y - means) / scales but rather torch.abs(y - means) / self.lower_bound_scale(scales)? I need to normalize the latent y into a standard spherical normal vector so that I can compute the spatial correlation.
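
What I have in mind is roughly the following (normalize_latent is my own hypothetical helper; I am assuming means and scales are the same tensors passed to the entropy model, and 0.11 mirrors CompressAI's usual minimum scale):

import torch
from compressai.ops import LowerBound

# same lower-bounding of the scales that GaussianConditional applies internally
lower_bound_scale = LowerBound(0.11)

def normalize_latent(y, means, scales):
    # intended to map y toward a standard spherical normal vector
    return (y - means) / lower_bound_scale(scales)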

YuKDseele · Dec 24 '24 12:12