Adding one neuron can eliminate all bad local minima

Shiyu Liang, Ruoyu Sun, Jason D. Lee, R. Srikant

Research output: Contribution to journal › Conference article › peer-review

50 Scopus citations

Abstract

One of the main difficulties in analyzing neural networks is the non-convexity of the loss function, which may have many bad local minima. In this paper, we study the loss landscape of neural networks for binary classification tasks. Under mild assumptions, we prove that after adding one special neuron with a skip connection to the output, or one special neuron per layer, every local minimum is a global minimum.
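The construction in the paper augments the network output with a single exponential neuron fed by a skip connection from the input, and adds a quadratic penalty on that neuron's output weight. A minimal sketch of this idea is below; the exponential form a * exp(wᵀx + b) follows the paper's construction, while the parameter names (a, w, b, lam) and the function signatures are illustrative, not the authors' code.

```python
import numpy as np

def augmented_output(f_x, x, a, w, b):
    """Original network output f(x; theta) plus the special neuron's
    contribution a * exp(w @ x + b), connected to the input via a
    skip connection (as in the paper's construction)."""
    return f_x + a * np.exp(np.dot(w, x) + b)

def regularized_loss(losses, a, lam=1e-3):
    """Empirical loss plus the quadratic penalty lam * a**2 on the
    special neuron's output weight. The theory shows that at every
    local minimum of this regularized loss, a = 0, so the extra
    neuron vanishes and the remaining network is globally optimal."""
    return np.mean(losses) + lam * a**2

# When a = 0, the augmented network reduces exactly to the original one.
x = np.array([1.0, -2.0])
w = np.array([0.5, 0.25])
print(augmented_output(0.3, x, a=0.0, w=w, b=0.1))  # 0.3
```

Note that the neuron is an analysis device: because every local minimum of the regularized problem has a = 0, training the augmented network is, at optimality, equivalent to training the original one.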

Original language: English (US)
Pages (from-to): 4350-4360
Number of pages: 11
Journal: Advances in Neural Information Processing Systems
Volume: 2018-December
State: Published - 2018
Externally published: Yes
Event: 32nd Conference on Neural Information Processing Systems, NeurIPS 2018 - Montreal, Canada
Duration: Dec 2 2018 - Dec 8 2018

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

