v-diffusion-pytorch
split up sample() to allow backwards pass
This refactor splits the existing sample() call into subroutines and drops the @torch.no_grad() decorator, so that gradients can flow through the sampler and x can be optimised. It could be cleaned up further, but this version is currently working well with downstream code.
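A minimal sketch of the idea, with assumed names: `sample_step` is a hypothetical per-step subroutine factored out of sample(), and `ToyModel` stands in for the real v-objective model. Because the step is not wrapped in @torch.no_grad(), a loss on the sampler's output can backpropagate into x:

```python
import torch

def sample_step(model, x, t_cur, t_next):
    """One denoising step, differentiable w.r.t. x.

    The v-objective update below (alpha = cos t, sigma = sin t,
    DDIM-style step) is an assumed simplification for illustration.
    """
    v = model(x, t_cur)
    alpha_cur, sigma_cur = torch.cos(t_cur), torch.sin(t_cur)
    pred = x * alpha_cur - v * sigma_cur    # predicted clean image
    eps = x * sigma_cur + v * alpha_cur     # predicted noise
    alpha_next, sigma_next = torch.cos(t_next), torch.sin(t_next)
    return pred * alpha_next + eps * sigma_next


class ToyModel(torch.nn.Module):
    """Stand-in for the diffusion model so the sketch runs standalone."""
    def forward(self, x, t):
        return 0.1 * x


model = ToyModel()
x = torch.randn(1, 3, requires_grad=True)
out = sample_step(model, x, torch.tensor(1.0), torch.tensor(0.5))

# With @torch.no_grad() removed, gradients reach x through the step:
loss = out.pow(2).sum()
loss.backward()
print(x.grad is not None)
```

Calling such a step in a loop reproduces sample() while keeping the whole trajectory differentiable, which is what the downstream optimisation of x relies on.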
(This branch has been caught up, but the CLIP submodule dependency was also removed in f96d121. I can clean this up further by reverting that commit and flattening this history if you are interested in incorporating these changes.)