PyTorch-Learned-Image-Compression-with-GMM-and-Attention

gaussian1 = torch.distributions.laplace.Laplace(mean1, sigma1) ?

Open JiangWeibeta opened this issue 3 years ago • 3 comments

I think this should be torch.distributions.normal.Normal. Why did you use Laplace here? Could you tell me the reason? Thank you.

JiangWeibeta avatar Nov 13 '21 05:11 JiangWeibeta

I also want to know.

MillionLee avatar Nov 22 '21 11:11 MillionLee

I was also interested in this question when I wrote this code. The base code was copied from here, and I tested both torch.distributions.normal.Normal and Laplace in the code. The results show that the two methods perform similarly. If you compare the density curves of the Gaussian and Laplace distributions, you can see that their shapes are similar.

LiuLei95 avatar Nov 24 '21 11:11 LiuLei95
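The comparison described above can be sketched as follows. This is a minimal illustration of the discretized-likelihood formulation commonly used in learned compression (CDF of y + 0.5 minus CDF of y − 0.5), not the repository's exact code; the tensor values are made up:

```python
import torch

# Hypothetical latent values with per-element mean/scale predictions
# (in the real model these come from the hyperprior/context network).
y = torch.tensor([0.2, -1.5, 3.0])
mean = torch.tensor([0.0, -1.0, 2.5])
scale = torch.tensor([1.0, 0.5, 2.0])

def discretized_likelihood(dist):
    # Probability mass of the quantization bin around y.
    return dist.cdf(y + 0.5) - dist.cdf(y - 0.5)

p_normal = discretized_likelihood(torch.distributions.normal.Normal(mean, scale))
p_laplace = discretized_likelihood(torch.distributions.laplace.Laplace(mean, scale))

# Estimated bit cost under each model.
bits_normal = -torch.log2(p_normal).sum()
bits_laplace = -torch.log2(p_laplace).sum()
print(bits_normal.item(), bits_laplace.item())
```

Both distributions are unimodal, symmetric, and parameterized by a location and a scale, so the resulting bin probabilities (and hence the rate estimates) tend to be close when the scale parameters are learned end-to-end.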

Thanks for your reply! I also tested Gaussian and Laplace at the same time, and they have similar performance, as you said. I think this is because a mixture of three Gaussian distributions or three Laplace distributions can both model the latent representation well.

MillionLee avatar Nov 24 '21 12:11 MillionLee
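The three-component mixture mentioned in the last comment can be sketched like this. This is a hedged illustration of a per-element mixture likelihood, with made-up parameters rather than the repository's actual entropy model; swapping the distribution class switches between a Gaussian and a Laplace mixture:

```python
import torch

# Hypothetical per-element mixture parameters, 3 components each
# (in the real model these are predicted by the network).
y = torch.tensor([0.2, -1.5])
means = torch.tensor([[0.0, 1.0, -1.0], [-2.0, -1.0, 0.0]])
scales = torch.tensor([[1.0, 0.5, 0.8], [0.6, 1.2, 0.9]])
# Mixture weights normalized with softmax so each row sums to 1.
weights = torch.softmax(torch.tensor([[0.1, 0.5, 0.4],
                                      [0.3, 0.3, 0.4]]), dim=-1)

def mixture_likelihood(dist_cls):
    d = dist_cls(means, scales)
    yk = y.unsqueeze(-1)  # broadcast each y against its 3 components
    per_comp = d.cdf(yk + 0.5) - d.cdf(yk - 0.5)
    return (weights * per_comp).sum(-1)

p_gmm = mixture_likelihood(torch.distributions.normal.Normal)
p_lmm = mixture_likelihood(torch.distributions.laplace.Laplace)
```

With three components, either base distribution has enough flexibility to fit multimodal or heavy-tailed latents, which is consistent with the similar rate-distortion results both commenters observed.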