Malte Prinzler
changing the environment.yml as below and reinstalling the environment solved the issue for me.
```yaml
name: panohead
channels:
  - pytorch
  - nvidia
  - defaults
dependencies:
  - _libgcc_mutex=0.1=main
  - _openmp_mutex=5.1=1_gnu
  - ...
```
I faced the same issue. Has anyone found a solution / explanation for this? It seems like this line's only purpose is to prevent timestep=0 from being used in the ddim...
just checked: there are no visible qualitative differences in the image synthesis between the original implementation and my suggested fix, but now you can sample DDIM with 1000 steps
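For reference, a minimal sketch of what I mean (my own simplification of the uniform-schedule logic, not the repo's exact code): the +1 shift keeps timestep 0 out of the schedule, but with 1000 DDIM steps the last entry then points one past the end of the alpha table.

```python
import numpy as np

# Minimal sketch of a uniform DDIM timestep schedule, assuming the "+1" shift
# exists only to keep timestep 0 out of the schedule (not the repo's exact code).
num_ddpm_timesteps = 1000          # length of the alphas_cumprod table

def make_uniform_ddim_steps(num_ddim_timesteps, shift_by_one=True):
    c = num_ddpm_timesteps // num_ddim_timesteps
    steps = np.arange(0, num_ddpm_timesteps, c)
    return steps + 1 if shift_by_one else steps

print(make_uniform_ddim_steps(1000)[-1])                      # 1000 -> indexes past a 1000-entry alpha table
print(make_uniform_ddim_steps(1000, shift_by_one=False)[-1])  # 999  -> stays in range
```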
@TorAP Yes, I did. What questions do you have?
@TorAP Sorry, I just noticed that you're talking about LDMs. I have only trained ControlNets so far, i.e. just LDM finetuning, so I haven't worked much with this repository here.
@haohang96 Maybe the term "normalized" was a bit misleading. Referring to equation 4 from the DDIM paper (https://arxiv.org/pdf/2010.02502.pdf), we get x_1 = sqrt(alpha_1) * x_0 + sqrt(1 - alpha_1) * epsilon, which is obviously != x_0...
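To make that concrete, here is a toy numerical check (the alpha value is made up purely for illustration):

```python
import numpy as np

# Eq. 4 from the DDIM paper: x_t = sqrt(alpha_t) * x_0 + sqrt(1 - alpha_t) * eps.
# Even at t=1, x_1 only equals x_0 if alpha_1 == 1, which is never the case.
rng = np.random.default_rng(0)
x_0 = rng.standard_normal(4)      # some clean sample
eps = rng.standard_normal(4)      # Gaussian noise
alpha_1 = 0.9999                  # hypothetical value of alpha_bar at t=1

x_1 = np.sqrt(alpha_1) * x_0 + np.sqrt(1.0 - alpha_1) * eps
print(np.abs(x_1 - x_0).max())    # small but non-zero -> x_1 != x_0
```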
My point was more that, currently, the predictions for x_1 are effectively used as the final output of the denoising process, which is not correct.
1. I think you are referring to equation 12 instead of 21.
2. I think you are exactly deriving what I tried to express, namely, that with the current implementation,...
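For clarity, this is the deterministic DDIM update (eq. 12 with sigma_t = 0) in my own notation; the point is that a proper final step, with alpha_prev = alpha_0 := 1 as defined in the paper, reduces to the predicted x_0 rather than stopping at x_1:

```python
import numpy as np

# Deterministic DDIM update (eq. 12 of the paper with sigma_t = 0), written in
# my own notation as a sketch, not as the repo's implementation.
def ddim_step(x_t, eps_theta, alpha_t, alpha_prev):
    pred_x0 = (x_t - np.sqrt(1.0 - alpha_t) * eps_theta) / np.sqrt(alpha_t)
    return np.sqrt(alpha_prev) * pred_x0 + np.sqrt(1.0 - alpha_prev) * eps_theta

# With alpha_prev = 1.0 (i.e. alpha_0 := 1), the second term vanishes and the
# step returns pred_x0, i.e. the model's estimate of the clean image.
```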
It seems like the noise in the depth maps comes from using bilinear interpolation during downsampling. Would it be possible to upload the depth files with nearest-neighbor downsampling applied?...
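For illustration, something along these lines is what I have in mind (downsample_depth is just a hypothetical helper; nearest-neighbor resizing avoids blending foreground and background depths across edges):

```python
import cv2
import numpy as np

# Hypothetical helper: nearest-neighbor downsampling never averages depth
# values across object boundaries, unlike bilinear interpolation.
def downsample_depth(depth: np.ndarray, size: tuple) -> np.ndarray:
    # size is (width, height), as expected by cv2.resize
    return cv2.resize(depth, size, interpolation=cv2.INTER_NEAREST)
```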
Is there any news on this? I would also like to reproduce the numbers from the paper.