MIX-GAN
Code for the paper "Lessons Learned from the Training of GANs on Artificial Datasets" and beyond.
Some recent state-of-the-art generative models in ONE notebook.
This repo implements any method that can match the following regular expression:
(MIX-)?(GAN|WGAN|BigGAN|MHingeGAN|AMGAN|StyleGAN|StyleGAN2)(\+ADA|\+CR|\+EMA|\+GP|\+R1|\+SA|\+SN)*
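For instance, a quick check of which method names the pattern accepts (an illustrative snippet, not part of the repo):

```python
import re

# The regular expression from above, anchored with fullmatch so the whole
# method name must conform to the naming scheme.
PATTERN = re.compile(
    r"(MIX-)?(GAN|WGAN|BigGAN|MHingeGAN|AMGAN|StyleGAN|StyleGAN2)"
    r"(\+ADA|\+CR|\+EMA|\+GP|\+R1|\+SA|\+SN)*"
)

for name in ["WGAN+GP", "MIX-StyleGAN2+ADA+EMA", "MHingeGAN+SA+SN", "DCGAN"]:
    print(f"{name}: {'covered' if PATTERN.fullmatch(name) else 'not covered'}")
```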
Major dependencies
- For the GPU implementation, `tensorflow>=2` or `tensorflow-gpu==1.14` (some modifications to the calculation of IS and FID will be necessary; see my other repos).
- For the TPU implementation, `tensorflow>=2.4` or `tf-nightly` will be necessary.
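A quick way to verify the environment before running a notebook (an illustrative check, not part of the repo):

```python
import tensorflow as tf

# Expect >= 2 for the GPU notebook, or >= 2.4 / nightly for the TPU notebook.
print("TensorFlow version:", tf.__version__)

# For the GPU implementation, at least one GPU should be visible.
print("GPUs:", tf.config.list_physical_devices("GPU"))
```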
Free GPU training on Colab
This implementation supports TensorFlow's automatic mixed-precision training, which can reduce GPU memory usage and training time dramatically. It is therefore recommended to upgrade to Colab Pro in order to get GPUs with Tensor Cores. Training MIX-MHingeGAN with 10 generators and 10 discriminators takes only 1.5 days on a single Tesla V100.
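For reference, this is how automatic mixed precision is typically enabled with the TF 2.x Keras API; the notebook may wire it up differently (a minimal sketch, not the repo's exact code):

```python
import tensorflow as tf

# Compute in float16 where safe, keep variables in float32.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

# In a custom training loop, wrap the optimizer so float16 gradients are
# loss-scaled before being applied (hypothetical learning rate shown).
optimizer = tf.keras.optimizers.Adam(2e-4)
optimizer = tf.keras.mixed_precision.LossScaleOptimizer(optimizer)

# Tip: keep the final logits/outputs in float32 for numerical stability,
# e.g. tf.keras.layers.Activation("linear", dtype="float32").
```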
Free TPU training on Colab
Coming soon...
Training on Cloud TPUs
- First, disable Stackdriver Logging to avoid unnecessary charges.
- Create Cloud TPUs; the TPU software version should be at least `2.4.0` or `nightly`.
- Fill in `TPU_NAMES` and `ZONE` in the above notebook for TPUs, set up the environment variables `LOG` and `DATA`, and run the notebook (a minimal connection sketch follows this list).
- Delete the TPUs.
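A minimal sketch of the TPU initialization these steps rely on, assuming placeholder values for the TPU name and zone (the notebook's actual variables are `TPU_NAMES` and `ZONE`):

```python
import tensorflow as tf

# Placeholder values; in the notebook these come from TPU_NAMES and ZONE.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(
    tpu="my-tpu-name", zone="us-central1-f"
)
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

# Models and optimizers are then created under the strategy scope.
with strategy.scope():
    pass  # build generator(s) and discriminator(s) here
```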
References
https://github.com/igul222/improved_wgan_training
https://github.com/biuyq/CT-GAN
https://github.com/google/compare_gan
https://github.com/ajbrock/BigGAN-PyTorch
https://github.com/taki0112/BigGAN-Tensorflow
https://github.com/brain-research/self-attention-gan
https://github.com/ilyakava/BigGAN-PyTorch
https://github.com/NVlabs/stylegan2
https://github.com/NVlabs/stylegan2-ada
Citation
@article{tang2020lessons,
  title={Lessons Learned from the Training of GANs on Artificial Datasets},
  author={Tang, Shichang},
  journal={arXiv preprint arXiv:2007.06418},
  year={2020}
}