MIX-GAN

Some recent state-of-the-art generative models in ONE notebook.

This repo implements any method that matches the following regular expression:

(MIX-)?(GAN|WGAN|BigGAN|MHingeGAN|AMGAN|StyleGAN|StyleGAN2)(\+ADA|\+CR|\+EMA|\+GP|\+R1|\+SA|\+SN)*

Major dependencies

  • For the GPU implementation, tensorflow>=2 or tensorflow-gpu==1.14 (some modifications to the IS and FID calculations are necessary; see my other repos).
  • For the TPU implementation, tensorflow>=2.4 or tf-nightly is required (a quick version check is sketched below).
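
A minimal sketch for checking that the installed TensorFlow meets these requirements and that an accelerator is visible; the version thresholds are simply the ones listed above:

```python
# Sketch: verify the TensorFlow version and list visible accelerators.
import tensorflow as tf

print("TensorFlow version:", tf.__version__)            # expect >=2 for GPU, >=2.4 for TPU
print("Visible GPUs:", tf.config.list_physical_devices("GPU"))
```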

Free GPU training on Colab

Example In Colab

This implementation supports TensorFlow's automatic mixed-precision training, which can dramatically reduce GPU memory usage and training time. It is therefore recommended to upgrade to Colab Pro in order to get GPUs with Tensor Cores. Training MIX-MHingeGAN with 10 generators and 10 discriminators takes only 1.5 days on a single Tesla V100.
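
A minimal sketch of how mixed precision can be enabled in TensorFlow 2.4+ (older TF 2.x versions expose the same functionality under tf.keras.mixed_precision.experimental); the toy model here is only a placeholder, not the repo's architecture:

```python
# Sketch: enable automatic mixed precision in TensorFlow 2.4+.
# On GPUs with Tensor Cores (e.g. V100), compute runs in float16
# while variables are kept in float32.
import tensorflow as tf

tf.keras.mixed_precision.set_global_policy("mixed_float16")

# Any layers built after setting the policy use float16 compute.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="relu"),
    # Keep the final output in float32 for numerical stability.
    tf.keras.layers.Dense(1, dtype="float32"),
])
```

With a custom training loop, the optimizer should additionally be wrapped in tf.keras.mixed_precision.LossScaleOptimizer so that float16 gradients do not underflow.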

Free TPU training on Colab

Coming soon...

Training on Cloud TPUs

  • First, disable Stackdriver Logging to avoid unnecessary charges.
  • Create Cloud TPUs; the TPU software version should be at least 2.4.0 or nightly.
  • Fill in TPU_NAMES and ZONE in the TPU notebook above, set the LOG and DATA environment variables, and run the notebook (a minimal TPU connection sketch follows this list).
  • Delete the TPUs when training is finished.
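
A minimal sketch of how a TensorFlow 2.4+ job connects to a Cloud TPU; the tpu and zone values are placeholders standing in for the TPU_NAMES and ZONE fields in the notebook:

```python
# Sketch: resolve and initialize a Cloud TPU, then build a TPUStrategy.
import tensorflow as tf

resolver = tf.distribute.cluster_resolver.TPUClusterResolver(
    tpu="my-tpu-name",    # placeholder for one entry of TPU_NAMES
    zone="us-central1-f", # placeholder for ZONE
)
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

strategy = tf.distribute.TPUStrategy(resolver)
print("TPU cores:", strategy.num_replicas_in_sync)

# Variables and models created under strategy.scope() are replicated
# across the TPU cores.
with strategy.scope():
    model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
```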

References

https://github.com/igul222/improved_wgan_training
https://github.com/biuyq/CT-GAN
https://github.com/google/compare_gan
https://github.com/ajbrock/BigGAN-PyTorch
https://github.com/taki0112/BigGAN-Tensorflow
https://github.com/brain-research/self-attention-gan
https://github.com/ilyakava/BigGAN-PyTorch
https://github.com/NVlabs/stylegan2
https://github.com/NVlabs/stylegan2-ada

Citation

@article{tang2020lessons,
  title={Lessons Learned from the Training of GANs on Artificial Datasets},
  author={Tang, Shichang},
  journal={arXiv preprint arXiv:2007.06418},
  year={2020}
}
