Dynamically Grown Generative Adversarial Networks

Lanlan Liu, Yuting Zhang, Jia Deng, Stefano Soatto

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Scopus citations


Recent work introduced progressive network growing as a promising way to ease the training of large GANs, but the model design and architecture-growing strategy remain under-explored and require manual design for different image data. In this paper, we propose a method to dynamically grow a GAN during training, jointly and automatically optimizing the network architecture and its parameters. The method embeds architecture search as an interleaving step with gradient-based training to periodically seek the optimal architecture-growing strategy for the generator and the discriminator. It enjoys the benefits of both eased training, owing to progressive growing, and improved performance, owing to a broader architecture design space. Experimental results demonstrate new state-of-the-art performance in image generation. Observations from the search procedure also provide constructive insights into GAN model design, such as the generator-discriminator balance and the choice of convolutional layers.
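
The loop below is a minimal sketch of the interleaved grow-and-train procedure the abstract describes, not the authors' implementation. All names (train_gan_steps, candidate_growth_actions, apply_action, evaluate) are hypothetical placeholders standing in for ordinary adversarial training, a pool of growth actions, architecture expansion, and a proxy quality score.

```python
# Hypothetical sketch of interleaving gradient-based GAN training with
# periodic architecture-growing search. Placeholder logic only.
import copy
import random

def train_gan_steps(gan, steps):
    """Placeholder: run `steps` of ordinary adversarial training."""
    gan["trained_steps"] += steps
    return gan

def candidate_growth_actions():
    """Placeholder: growth choices for the generator (G) and discriminator (D)."""
    return [
        ("grow_G", "add_conv_layer"),
        ("grow_D", "add_conv_layer"),
        ("grow_both", "increase_resolution"),
        ("no_growth", None),
    ]

def apply_action(gan, action):
    """Placeholder: expand the current architecture according to `action`."""
    grown = copy.deepcopy(gan)
    grown["architecture"].append(action)
    return grown

def evaluate(gan):
    """Placeholder: proxy quality score, higher is better (e.g., negative FID)."""
    return random.random()

def dynamically_grow_gan(total_phases=5, steps_per_phase=1000):
    gan = {"architecture": [], "trained_steps": 0}
    for phase in range(total_phases):
        # 1. Gradient-based training of the current architecture.
        gan = train_gan_steps(gan, steps_per_phase)
        # 2. Search step: briefly try each growth action and keep the
        #    candidate with the best proxy score.
        best_score, best_gan = float("-inf"), gan
        for action in candidate_growth_actions():
            candidate = apply_action(gan, action)
            candidate = train_gan_steps(candidate, steps_per_phase // 10)
            score = evaluate(candidate)
            if score > best_score:
                best_score, best_gan = score, candidate
        gan = best_gan
    return gan
```

In this sketch, the search over growth actions replaces the hand-designed growing schedule of earlier progressive-growing GANs; the "no_growth" option lets the procedure defer growing when the current architecture still benefits from plain training.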

Original language: English (US)
Title of host publication: 35th AAAI Conference on Artificial Intelligence, AAAI 2021
Publisher: Association for the Advancement of Artificial Intelligence
Number of pages: 8
ISBN (Electronic): 9781713835974
State: Published - 2021
Event: 35th AAAI Conference on Artificial Intelligence, AAAI 2021 - Virtual, Online
Duration: Feb 2, 2021 – Feb 9, 2021

Publication series

Name: 35th AAAI Conference on Artificial Intelligence, AAAI 2021


Conference: 35th AAAI Conference on Artificial Intelligence, AAAI 2021
City: Virtual, Online

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence


