Pablo Pernias

Results 31 comments of Pablo Pernias

If I'm not mistaken, you should be able to do something like: ```python
z_sample = fixed_noise.clone()
for transform in reversed(flow.transforms):
    z_sample = transform.inverse(z_sample)
``` Using this you can also experiment...
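The reversed-inverse loop above can be exercised end to end with toy invertible transforms (a minimal sketch — the `Affine` class is a hypothetical stand-in for the real flow layers, not the library's API):

```python
class Affine:
    """Toy invertible transform: z -> z * scale + shift."""
    def __init__(self, scale, shift):
        self.scale, self.shift = scale, shift

    def forward(self, z):
        return z * self.scale + self.shift

    def inverse(self, z):
        return (z - self.shift) / self.scale


class Flow:
    """Minimal container mimicking `flow.transforms` from the snippet."""
    def __init__(self, transforms):
        self.transforms = transforms


flow = Flow([Affine(2.0, 1.0), Affine(0.5, -3.0)])
fixed_noise = 4.0

# Forward pass through the flow in order.
x = fixed_noise
for transform in flow.transforms:
    x = transform.forward(x)

# Invert by walking the transforms in reverse, as in the comment above.
z_sample = x
for transform in reversed(flow.transforms):
    z_sample = transform.inverse(z_sample)
```

Walking the list with `reversed()` matters: each `inverse` must undo the *last* transform applied, so the order of the backward pass is the mirror of the forward pass.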

You can avoid a lot of those extra steps, also you can run it at 256 by just reducing the batch size as long as Colab gives you a GPU...

I hadn't tried resuming until now, and you're correct: it seems to be broken 🤔 No matter what you pass as the argument, it is displayed as `None` when...

Sure, just edit `config.py` and, under `class Training(Config):`, add `ckpt: str = None`, like this: ```python
class Training(Config):
    size: StrictInt
    iter: StrictInt = 800000
    batch: StrictInt = 16
    n_sample:...
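A minimal sketch of why the checkpoint argument was being dropped and how the added field fixes it (this is an assumption about the plumbing, shown with plain `argparse` rather than the repo's config loader; the path is a made-up example):

```python
import argparse

# The CLI already accepts the flag; the problem described above is that the
# config class had no `ckpt` field to carry the parsed value into training.
parser = argparse.ArgumentParser()
parser.add_argument("--ckpt", type=str, default=None)

args = parser.parse_args(["--ckpt", "checkpoint/010000.pt"])

# With `ckpt: str = None` added to Training, the value survives into the config.
conf = {"ckpt": args.ckpt}


def maybe_resume(conf):
    """Hypothetical resume hook: only load state when a checkpoint was given.
    In the real script this branch would call torch.load(conf["ckpt"])."""
    if conf["ckpt"] is not None:
        return f"resumed from {conf['ckpt']}"
    return "training from scratch"
```

Without the field, the default `None` shadows whatever was passed on the command line, which matches the behaviour reported above.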

To me, it would maybe make more sense to verify that your dataset has no broken images before running any training on it, rather than silently failing on those...
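A pre-flight check along those lines could be sketched like this (an assumption, not the repo's code: it only inspects file signatures, so truncated files slip through — a fuller check would decode each file, e.g. with Pillow's `Image.open(path)` followed by `verify()`):

```python
from pathlib import Path

# Magic bytes for the common training-image formats.
SIGNATURES = {
    b"\xff\xd8\xff": "jpeg",
    b"\x89PNG\r\n\x1a\n": "png",
}


def broken_images(paths):
    """Return the paths whose bytes do not start with a known image signature."""
    bad = []
    for path in map(Path, paths):
        head = path.read_bytes()[:8]
        if not any(head.startswith(sig) for sig in SIGNATURES):
            bad.append(path)
    return bad
```

Running this once over the dataset directory and fixing (or removing) the flagged files keeps the failure loud and up-front, instead of buried inside the training loop.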

Yeah, that could be a problem because by scaling the Fourier features you are altering the frequencies. My thought was that, if we keep the frequencies below the cutoff value...
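The frequency-altering effect is easy to verify numerically (a plain-Python illustration of the point above, not the model's code): scaling the coordinate fed to a sinusoidal feature multiplies its effective frequency, which is why a scaled feature can end up above a fixed cutoff.

```python
import math


def zero_crossings(freq, scale, n=1000):
    """Count sign changes of sin(2*pi*freq*scale*x) sampled on [0, 1].

    The count grows roughly linearly with freq * scale, so doubling the
    scale behaves like doubling the frequency of the feature."""
    ys = [math.sin(2 * math.pi * freq * scale * i / n) for i in range(n + 1)]
    return sum(1 for a, b in zip(ys, ys[1:]) if a * b < 0)
```

A feature at frequency 4 with scale 2 oscillates exactly like a feature at frequency 8 with scale 1, which is the aliasing concern when the product crosses the cutoff.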

Even if we do not achieve equivariance, and we do have those weird deformations, allowing the model to modify the scale can be beneficial, in the same way as your...

@MHRosenberg It should be possible, and the changes are quite simple: First, make it so the `self.affine_fourier` linear layer returns 6 values instead of 4, two for rotation, two for...
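Since the comment is cut off, the following is only one plausible reading of the 6-value split — the original 4 outputs taken as rotation (2) plus translation (2), with the 2 extra values parameterizing per-axis scale, consistent with the scale discussion above. The function and its parameter layout are assumptions, not the paper's code:

```python
import numpy as np


def apply_affine(coords, params):
    """Apply a predicted affine transform to a sampling grid.

    coords: (N, 2) coordinates fed to the Fourier features.
    params: 6 values from the (assumed) widened affine_fourier layer —
    (cos, sin) for rotation, (tx, ty) for translation, (sx, sy) for the
    hypothetical per-axis scale. This split is a guess based on the
    truncated comment above."""
    c, s, tx, ty, sx, sy = params
    # Normalise the rotation pair so it parameterizes a valid rotation.
    norm = np.hypot(c, s) + 1e-8
    c, s = c / norm, s / norm
    rot = np.array([[c, -s], [s, c]])
    return (coords * np.array([sx, sy])) @ rot.T + np.array([tx, ty])
```

With this layout, the only other change needed is widening the linear layer from 4 to 6 outputs; everything downstream consumes the transformed coordinates as before.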

@wzmsltw I'm also building a custom model inspired by this paper on the CelebA dataset, and I found something similar happens. I think in my case it's still early in...

Unfortunately no. I have the feeling that the issue is with the sampling method, but it might also be related to how the tokens are masked during training :/