
Inquiry about the Role of use_dynamic_rescale in Training vs. Inference

Open · gracezhao1997 opened this issue 1 year ago · 3 comments

Thanks for the great work! I am currently digging into the functionality of the `use_dynamic_rescale` parameter in your project and have run into a point of confusion that I hope you can clarify.

It appears that during the training phase, `use_dynamic_rescale` is applied to the input data `x_t` (`x = x * extract_into_tensor(self.scale_arr, t, x.shape)`). During inference, however, the rescaling seems to be applied to the predicted `x0` (`prev_scale_t = torch.full(size, self.ddim_scale_arr_prev[index], device=device)`). This discrepancy, where the adjustment is made to the inputs during training but to the predictions at inference time, raises a question about how the training and inference processes stay aligned. Is there any reference for this strategy?
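
For concreteness, here is a minimal sketch of how I read the two code paths. `extract_into_tensor`, `scale_arr`, and `ddim_scale_arr_prev` are names taken from the repository; the function bodies below, including the ratio `prev_scale_t / scale_t` applied to `pred_x0`, are my own simplified paraphrase of what I think is going on, not the actual implementation:

```python
import torch

def extract_into_tensor(arr, t, x_shape):
    # Gather the per-timestep value for each batch element and reshape it
    # so it broadcasts over the remaining dimensions of x.
    out = arr.gather(-1, t)
    return out.reshape(t.shape[0], *((1,) * (len(x_shape) - 1)))

def apply_train_rescale(x, t, scale_arr):
    # Training side (as quoted above): the per-timestep scale multiplies
    # the tensor x that goes through the diffusion training step.
    # scale_arr: 1-D tensor of per-timestep scales, t: LongTensor of shape [B].
    return x * extract_into_tensor(scale_arr, t, x.shape)

def apply_ddim_rescale(pred_x0, index, ddim_scale_arr, ddim_scale_arr_prev, size, device):
    # Inference side (as quoted above): scales are looked up per DDIM step.
    # ddim_scale_arr / ddim_scale_arr_prev are assumed to be numpy arrays here.
    scale_t = torch.full(size, ddim_scale_arr[index], device=device)
    prev_scale_t = torch.full(size, ddim_scale_arr_prev[index], device=device)
    # My guess at the intent: re-express the predicted x0 in the scale of the
    # previous timestep so that x_{t-1} stays consistent with the scaled inputs
    # the model saw during training.
    return pred_x0 * (prev_scale_t / scale_t)
```

If the inference-side correction is indeed just carrying the training-side scaling forward from one timestep to the next in this way, a pointer to the derivation or to the paper that introduces this trick would be very helpful.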

gracezhao1997 · May 05 '24 07:05

Same here, hope @Doubiiu can share some insights

DarrenZhaoFR · Aug 07 '24 07:08

Same confusion

Chuge0335 · Sep 07 '24 08:09

Any update on this question? I'm also a little confused and can't find related literature about this strategy.

Vincent-luo · Dec 02 '24 02:12