DenoisingDiffusionProbabilityModel-ddpm-

Question about variance calculation in sampling step

Open HIT-LiuChen opened this issue 1 year ago • 5 comments

In Diffusion.py line 66, posterior_var is computed. Why does line 77 use torch.cat([self.posterior_var[1:2], self.betas[1:]]) to extract the variance, instead of using posterior_var directly?

    def p_mean_variance(self, x_t, t):
        # below: only log_variance is used in the KL computations
        var = torch.cat([self.posterior_var[1:2], self.betas[1:]])    # I think var should equal self.posterior_var
        var = extract(var, t, x_t.shape)

        eps = self.model(x_t, t)
        xt_prev_mean = self.predict_xt_prev_mean_from_eps(x_t, t, eps=eps)

        return xt_prev_mean, var

HIT-LiuChen avatar Mar 23 '23 08:03 HIT-LiuChen

I have the same doubt. What's the reason for doing this?

ljw919 avatar May 09 '23 09:05 ljw919

I have the same question. Shouldn't the variance be self.posterior_var?

kache1995 avatar May 25 '23 07:05 kache1995

Have you solved the problem? Can you fill me in?

tsWen0309 avatar Sep 01 '23 03:09 tsWen0309

@zoubohao Could you answer this issue? Thank you!

MetaInsight7 avatar Dec 16 '23 15:12 MetaInsight7

In the DDPM paper, it seems that var can be either posterior_var or betas, so I don't understand why they are concatenated here.

chenchen278 avatar Apr 23 '24 03:04 chenchen278
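For context, here is a minimal numpy sketch (mirroring the torch computation under the standard DDPM linear beta schedule; the schedule constants are assumptions, not taken from the repo). It shows one plausible reason for the concatenation: the true posterior variance is exactly zero at t = 0, which would break any log-variance computation, so the code keeps a single nonzero entry there and falls back to sigma_t^2 = beta_t (the other valid choice in the DDPM paper) for t >= 1:

```python
import numpy as np

# Standard DDPM linear beta schedule (Ho et al. 2020); T and the
# endpoints are typical values, assumed here for illustration.
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alphas_bar = np.cumprod(alphas)
alphas_bar_prev = np.append(1.0, alphas_bar[:-1])  # alpha_bar_{t-1}, with alpha_bar_{-1} := 1

# True posterior variance: beta_t * (1 - alpha_bar_{t-1}) / (1 - alpha_bar_t)
posterior_var = betas * (1.0 - alphas_bar_prev) / (1.0 - alphas_bar)

# At t = 0, alpha_bar_prev = 1, so the numerator vanishes:
print(posterior_var[0])  # 0.0 -- log(0) would blow up in sampling

# The concatenation in the issue keeps posterior_var[1] as the t = 0
# entry (nonzero) and uses betas for all later steps:
var = np.concatenate([posterior_var[1:2], betas[1:]])
print(var[0] > 0)  # True
```

So var and posterior_var agree in spirit (both are valid sigma_t^2 choices per the paper), but the concatenated version avoids the degenerate zero at the first step.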