DOI: 10.1145/3283254.3283282

On the convergence and mode collapse of GAN

Published: 04 December 2018

    Abstract

    The generative adversarial network (GAN) is a powerful generative model, but it suffers from several problems, such as convergence instability and mode collapse. To overcome these drawbacks, this paper presents a novel GAN architecture consisting of one generator and two different discriminators. Since GAN training is analogous to a minimax game, the proposed architecture works as follows. The generator (G) aims to produce realistic-looking samples that fool both discriminators. The first discriminator (D1) assigns high scores to samples from the data distribution, while the second (D2) conversely favors samples from the generator. Specifically, ResBlock and minibatch discrimination (MD) architectures are adopted in D1 to improve sample diversity, and the leaky rectified linear unit (Leaky ReLU) and batch normalization (BN) are replaced by the scaled exponential linear unit (SELU) in D2 to alleviate the convergence problem. A new loss function that minimizes the KL divergence is designed to better optimize the model. Extensive experiments on the CIFAR-10/100 datasets demonstrate that the proposed method effectively addresses both convergence and mode collapse.
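The one-generator, two-discriminator game described above can be sketched numerically. The abstract does not give the paper's exact KL-minimizing loss, so this hypothetical sketch (the names `d1_loss`, `d2_loss`, and `g_loss` are ours, not the authors') uses standard cross-entropy terms purely to illustrate the opposing roles: D1 scores real data high, D2 conversely scores generated data high, and G tries to fool both.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def d1_loss(real_logits, fake_logits):
    # D1 rewards samples from the data distribution:
    # maximize log D1(x) + log(1 - D1(G(z)))
    return -np.mean(np.log(sigmoid(real_logits)) +
                    np.log(1.0 - sigmoid(fake_logits)))

def d2_loss(real_logits, fake_logits):
    # D2 conversely favors samples from the generator:
    # maximize log(1 - D2(x)) + log D2(G(z))
    return -np.mean(np.log(1.0 - sigmoid(real_logits)) +
                    np.log(sigmoid(fake_logits)))

def g_loss(fake_logits_d1, fake_logits_d2):
    # G tries to fool both: push D1's score on fakes up
    # and D2's score on fakes down
    return -np.mean(np.log(sigmoid(fake_logits_d1)) +
                    np.log(1.0 - sigmoid(fake_logits_d2)))

# Toy logits: "real" samples scored high, "fake" samples scored low.
real = np.array([3.0, 2.5])
fake = np.array([-3.0, -2.5])
```

With these toy logits, `d1_loss(real, fake)` is small (D1 is winning) while `d1_loss(fake, real)` is large (D1 is fooled); the two discriminators pull the generator in opposite directions, which is the mechanism the paper exploits against mode collapse.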

    Supplementary Material

    ZIP File (a21-zhang.zip)
    Supplemental material.
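The abstract replaces Leaky ReLU and batch normalization with SELU in D2. For reference, SELU (Klambauer et al. 2017) is defined as selu(x) = λx for x > 0 and λα(eˣ − 1) otherwise, with fixed constants α ≈ 1.6733 and λ ≈ 1.0507 chosen so activations self-normalize toward zero mean and unit variance without BN. A minimal sketch of both activations:

```python
import numpy as np

# Fixed SELU constants from Klambauer et al. (2017); with these values,
# activations drift toward zero mean and unit variance in deep nets,
# removing the need for batch normalization.
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x):
    # lambda * x for x > 0, lambda * alpha * (exp(x) - 1) for x <= 0
    x = np.asarray(x, dtype=float)
    return LAMBDA * np.where(x > 0, x, ALPHA * np.expm1(x))

def leaky_relu(x, slope=0.2):
    # The activation SELU replaces in D2.
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, slope * x)
```

Note that SELU saturates at −λα ≈ −1.758 for very negative inputs, unlike Leaky ReLU, which is unbounded below.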



    Published In

    SA '18: SIGGRAPH Asia 2018 Technical Briefs
    December 2018
    135 pages
    ISBN:9781450360623
    DOI:10.1145/3283254
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. GAN
    2. convergence
    3. mode collapse

    Qualifiers

    • Research-article

    Funding Sources

    • Natural Science Fund of China
    • The Fundamental Research Funds for the Central Universities

    Conference

    SA '18
    Sponsor:
    SA '18: SIGGRAPH Asia 2018
    December 4 - 7, 2018
    Tokyo, Japan

    Acceptance Rates

    Overall Acceptance Rate 178 of 869 submissions, 20%

    Article Metrics

    • Downloads (last 12 months): 257
    • Downloads (last 6 weeks): 27

    Cited By

    • (2024) Challenges of Using Synthetic Data Generation Methods for Tabular Microdata. Applied Sciences 14(14):5975, 9-Jul-2024. DOI: 10.3390/app14145975
    • (2024) Improving the Leaking of Augmentations in Data-Efficient GANs via Adaptive Negative Data Augmentation. IEEE/CVF WACV 2024, 5400-5409, 3-Jan-2024. DOI: 10.1109/WACV57701.2024.00533
    • (2024) Improving the Fairness of the Min-Max Game in GANs Training. IEEE/CVF WACV 2024, 2898-2907, 3-Jan-2024. DOI: 10.1109/WACV57701.2024.00289
    • (2024) Effective Intrusion Detection in Highly Imbalanced IoT Networks With Lightweight S2CGAN-IDS. IEEE Internet of Things Journal 11(9):15140-15151, 1-May-2024. DOI: 10.1109/JIOT.2023.3342638
    • (2024) Advancements in Generative AI: A Comprehensive Review of GANs, GPT, Autoencoders, Diffusion Model, and Transformers. IEEE Access 12:69812-69837, 2024. DOI: 10.1109/ACCESS.2024.3397775
    • (2024) Pix2Pix Hyperparameter Optimisation Prediction. Procedia Computer Science 225:1009-1018, 4-Mar-2024. DOI: 10.1016/j.procs.2023.10.088
    • (2023) A Comprehensive Evaluation of Generalizability of Deep Learning-Based Hi-C Resolution Improvement Methods. Genes 15(1):54, 29-Dec-2023. DOI: 10.3390/genes15010054
    • (2023) Exploration of Metrics and Datasets to Assess the Fidelity of Images Generated by Generative Adversarial Networks. Applied Sciences 13(19):10637, 24-Sep-2023. DOI: 10.3390/app131910637
    • (2023) Generation of synthetic EEG data for training algorithms supporting the diagnosis of major depressive disorder. Frontiers in Neuroscience 17, 2-Oct-2023. DOI: 10.3389/fnins.2023.1219133
    • (2023) What can Discriminator do? Towards Box-free Ownership Verification of Generative Adversarial Networks. IEEE/CVF ICCV 2023, 4986-4996, 1-Oct-2023. DOI: 10.1109/ICCV51070.2023.00462
