Fine-grained analysis of optimization and generalization for overparameterized two-layer neural networks

Sanjeev Arora, Simon S. Du, Wei Hu, Zhiyuan Li, Ruosong Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Scopus citations

Abstract

Recent works have cast some light on the mystery of why deep nets fit any data and generalize despite being very overparameterized. This paper analyzes training and generalization for a simple 2-layer ReLU net with random initialization, and provides the following improvements over recent works: (i) Using a tighter characterization of training speed than recent papers, an explanation for why training a neural net with random labels leads to slower training, as originally observed in [Zhang et al., ICLR'17]. (ii) A generalization bound independent of network size, using a data-dependent complexity measure. Our measure distinguishes clearly between random labels and true labels on MNIST and CIFAR, as shown by experiments. Moreover, recent papers require sample complexity to increase (slowly) with the size, while our sample complexity is completely independent of the network size. (iii) Learnability of a broad class of smooth functions by 2-layer ReLU nets trained via gradient descent. The key idea is to track dynamics of training and generalization via properties of a related kernel.
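The "related kernel" the abstract refers to is, in related literature, the Gram matrix of a fixed kernel induced by the ReLU architecture at random initialization, and the data-dependent complexity measure is a label-dependent quadratic form in its inverse. The sketch below is an illustrative reconstruction under those assumptions, not the paper's exact definitions: the closed-form kernel for unit-norm inputs and the constant in the measure are taken from the surrounding NTK literature.

```python
import numpy as np

def ntk_gram(X):
    """Gram matrix of a two-layer-ReLU kernel at random initialization
    (closed form assumed from the NTK literature; rows of X unit-norm)."""
    G = np.clip(X @ X.T, -1.0, 1.0)               # pairwise cosines x_i . x_j
    return G * (np.pi - np.arccos(G)) / (2 * np.pi)

def complexity_measure(X, y):
    """Label-dependent complexity sqrt(2 * y^T H^{-1} y / n), the kind of
    size-independent quantity the abstract describes (constant assumed)."""
    n = len(y)
    H = ntk_gram(X)
    return np.sqrt(2.0 * (y @ np.linalg.solve(H, y)) / n)
```

On this reading, "distinguishing random labels from true labels" means that for the same inputs X, random labels y tend to make `complexity_measure(X, y)` larger than structured labels do, because random y has more mass on the small-eigenvalue directions of H.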

Original language: English (US)
Title of host publication: 36th International Conference on Machine Learning, ICML 2019
Publisher: International Machine Learning Society (IMLS)
Pages: 477-502
Number of pages: 26
ISBN (Electronic): 9781510886988
State: Published - Jan 1 2019
Event: 36th International Conference on Machine Learning, ICML 2019 - Long Beach, United States
Duration: Jun 9 2019 - Jun 15 2019

Publication series

Name: 36th International Conference on Machine Learning, ICML 2019
Volume: 2019-June

Conference

Conference: 36th International Conference on Machine Learning, ICML 2019
Country: United States
City: Long Beach
Period: 6/9/19 - 6/15/19

All Science Journal Classification (ASJC) codes

  • Education
  • Computer Science Applications
  • Human-Computer Interaction


Cite this

Arora, S., Du, S. S., Hu, W., Li, Z., & Wang, R. (2019). Fine-grained analysis of optimization and generalization for overparameterized two-layer neural networks. In 36th International Conference on Machine Learning, ICML 2019 (pp. 477-502). (36th International Conference on Machine Learning, ICML 2019; Vol. 2019-June). International Machine Learning Society (IMLS).