taming-transformers
Taming Transformers for High-Resolution Image Synthesis
I am really enjoying experimenting with this! I found a cell error in the Colab notebook, and just ignored it and proceeded with subsequent cells. The cell in question is...
TL;DR: custom training is great! Is there a good config or way to debug the quality of results on small-ish datasets? --- I've managed to train my own custom models using...
Hello, Is there a way to extract the embedding(s) which are fed to the decoder? I want to feed them to another downstream task.
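A minimal sketch of the pattern being asked about: grab the quantized latents returned by the encode step, since those are what the decoder consumes. The `TinyVQLike` module below is a hypothetical stand-in (not the repo's actual `VQModel`) so the snippet runs without taming-transformers installed; the real model's `encode` similarly returns the quantized latents as its first element.

```python
import torch
import torch.nn as nn

class TinyVQLike(nn.Module):
    """Hypothetical stand-in mimicking a VQ autoencoder's encode/decode interface."""
    def __init__(self, ch=8):
        super().__init__()
        self.encoder = nn.Conv2d(3, ch, kernel_size=4, stride=2, padding=1)
        self.decoder = nn.ConvTranspose2d(ch, 3, kernel_size=4, stride=2, padding=1)

    def encode(self, x):
        z = self.encoder(x)
        # taming's VQModel.encode returns a tuple (quant, emb_loss, info);
        # we mimic that shape here with dummy extra values
        return z, torch.tensor(0.0), None

    def decode(self, z):
        return self.decoder(z)

model = TinyVQLike().eval()
x = torch.randn(1, 3, 32, 32)
with torch.no_grad():
    z, _, _ = model.encode(x)  # z: the latents fed to the decoder
# z can now be passed to a downstream task instead of (or before) model.decode(z)
```

The same idea applies to the real model: call its encode step, keep the first returned tensor, and hand it to your downstream network.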
Hi, While training the Net2Net transformer model, after I have trained the VQ-GAN, I see interesting images being generated in the logs, with `no_pix` in their name, which probably means...
Thanks for your great work! I ran into some problems when training an unconditional transformer on ImageNet. Here is my config file, and I run it on 1...
Hello, Thanks for the awesome work! I have a small doubt. I noticed that in the second stage (GPT), for all the conditional generation tasks (from class/segmap, etc.), the same tokens are...
Using the installed `taming` module from other folders causes the `LPIPS VGG` weights to be downloaded again. This causes unnecessary model replication. Saving the weights to a shared cache folder prevents this...
Hi! Thanks for the contribution, amazing work! I am trying to reproduce the FID you report for each of the datasets but am not obtaining the same values as you... Could...
just fyi, here is another tool, [mis|ab]using VQGAN with CLIP to generate images (including new Gumbel-F8 model) repo: https://github.com/eps696/aphantasia colab: https://colab.research.google.com/github/eps696/aphantasia/blob/master/CLIP_VQGAN.ipynb examples: https://twitter.com/eps696/status/1411808854191529986
Do you (intend to) support multi-GPU inference through torch's nn.DataParallel class? For example: ``` model = vqgan.VQModel(**config.model.params) model = nn.DataParallel(model) ``` Thank you!