David Marx

Results: 175 comments of David Marx

re: noise annealing (via https://github.com/LAION-AI/notebooks/blob/main/DALLE2-Prior%2BDeep-Image-Prior.ipynb):

```python
noise_ramp = 1 - min(1, itt / iterations)
net_input_noised = net_input
if input_noise_strength:
    phi = min(1, noise_ramp * input_noise_strength) * math.pi / 2
    noise...
```
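The snippet is cut off after `noise...`, but the `phi` term suggests a cosine/sine blend of input and noise whose strength anneals to zero over the run. A minimal self-contained sketch of that kind of annealed noise injection (variable names and the blending step are our reconstruction, not the notebook's exact code):

```python
import math
import random

def anneal_noise(net_input, itt, iterations, input_noise_strength):
    """Blend the network input with Gaussian noise whose strength
    decays to zero as itt approaches iterations.

    Hypothetical reconstruction of the truncated snippet above;
    net_input is modeled here as a plain list of floats.
    """
    noise_ramp = 1 - min(1, itt / iterations)
    if not input_noise_strength:
        return net_input
    phi = min(1, noise_ramp * input_noise_strength) * math.pi / 2
    noise = [random.gauss(0, 1) for _ in net_input]
    # cos/sin mixing keeps the overall magnitude roughly constant:
    # phi = 0 returns the input unchanged, phi = pi/2 returns pure noise
    return [math.cos(phi) * x + math.sin(phi) * n
            for x, n in zip(net_input, noise)]
```

At the final step (`itt == iterations`) the ramp hits zero, so the function returns the input untouched.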

current blocker: a lot of dependent code assumes the latent is a single tensor. downstream operations on the image representation attempt to call methods on it like `clone` that don't have...
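One way around this kind of blocker is a thin wrapper that holds the component tensors and forwards the tensor-like methods downstream code expects. A hypothetical sketch (the class name and structure are ours, not from the codebase):

```python
class LatentTuple:
    """Hypothetical wrapper so downstream code that expects a single
    tensor-like latent (and calls methods such as .clone() on it)
    keeps working when the latent is really several tensors."""

    def __init__(self, *tensors):
        self.tensors = list(tensors)

    def clone(self):
        # delegate to each component; components are assumed to expose
        # their own .clone(), as torch tensors do
        return LatentTuple(*(t.clone() for t in self.tensors))

    def __iter__(self):
        # let callers unpack the components when they need them
        return iter(self.tensors)
```

Other frequently used tensor methods (`detach`, `to`, etc.) could be forwarded the same way as they turn up.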

migrate to branch `dip_ema_simple`

TargetImage constructor deprecation should be good to go, should clean out some debugging messages and maybe also unnecessary(?) device assignments that were added after 96eaae2.

I think the better solution here is to add EMA as a submodule on the image models that need it using ema-pytorch and just deprecate the EMAImage class entirely
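For reference, the core update an EMA wrapper like ema-pytorch maintains is just an exponential moving average of the parameters. A minimal sketch of that update rule on a scalar (names here are ours, not the library's API):

```python
class SimpleEMA:
    """Minimal sketch of the exponential-moving-average update that an
    EMA submodule applies to tracked values."""

    def __init__(self, value, beta=0.99):
        self.beta = beta
        self.value = value  # shadow copy of the tracked parameter

    def update(self, new_value):
        # standard EMA step: shadow <- beta * shadow + (1 - beta) * new
        self.value = self.beta * self.value + (1 - self.beta) * new_value
        return self.value
```

Attaching something like this as a submodule of each image model that needs it would localize the EMA state with the model, rather than keeping a separate EMAImage class.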

https://github.com/pytti-tools/frame-interpolation

washed-out brown could be random latent? if that's the case, the issue is that if we have an init image, we should initialize from the latent from inverting the image. I...

early frames are in fact the GAN-inverted init image. maybe it would help if we EMA'd the latent?

tried cranking up the EMA, helped a tiny bit maybe. looks like the real help comes from turning up the direct image weight. I think what's going on here is...

maybe this is the MSE regularization kicking in? if so, maybe we could get around this by adopting the "smart_encode" strategy used in `PixelImage.encode_image`?