
Question about discretized logistic likelihood function

Open janelawrence opened this issue 1 year ago • 0 comments

Hi, I am confused about why we scale the value of the x0 sample from x1 to [-1, 1].

I understand why, when x lies in (-1, 1), the log-likelihood becomes L_{t-1} at t = 0.

But what about when x = 1 or x = -1? What role do those endpoints play in the loss function?

I also don't understand the setup of the discretized log-likelihood function. I only loosely get the idea that x0 is an image, so each pixel takes a discrete value in {0, ..., 255}. But why does the integral give the probability mass of x0?
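For reference, here is my understanding of the setup from the DDPM paper (Ho et al. 2020), sketched in plain Python. The 256 pixel levels are mapped to [-1, 1], each level owns a bin of half-width 1/255, and the probability mass of a pixel value is the integral of the decoder Gaussian over that bin; the edge levels x0 = -1 and x0 = 1 get bins extending to -inf and +inf. The function name and the 1e-12 clamp are my own, for illustration:

```python
import math

def std_normal_cdf(x):
    # CDF of the standard normal via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def discretized_log_likelihood(x0, mean, std):
    """Log p(x0 | mean, std) for one pixel, with x0 on the 256-level grid in [-1, 1]."""
    # Interior bins integrate the Gaussian over (x0 - 1/255, x0 + 1/255);
    # the edge bins run to -inf (x0 = -1) and +inf (x0 = 1), so the 256
    # bin masses partition the real line and sum to exactly 1.
    upper = math.inf if x0 >= 1.0 else x0 + 1.0 / 255.0
    lower = -math.inf if x0 <= -1.0 else x0 - 1.0 / 255.0
    cdf_hi = 1.0 if upper == math.inf else std_normal_cdf((upper - mean) / std)
    cdf_lo = 0.0 if lower == -math.inf else std_normal_cdf((lower - mean) / std)
    return math.log(max(cdf_hi - cdf_lo, 1e-12))  # clamp to avoid log(0)

# Sanity check: the masses of all 256 bins sum to 1 for any mean/std.
levels = [-1.0 + 2.0 * k / 255.0 for k in range(256)]
total = sum(math.exp(discretized_log_likelihood(x, 0.1, 0.5)) for x in levels)
```

So (if I read it right) the integral turns a continuous Gaussian density into a proper probability mass function over the 256 discrete pixel values, and the infinite edge bins are what make the masses sum to 1.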

Any help would be greatly appreciated!

janelawrence avatar Nov 20 '23 03:11 janelawrence