Toward deeper understanding of neural networks: The power of initialization and a dual view on expressivity

Amit Daniely, Roy Frostig, Yoram Singer

Research output: Contribution to journal › Conference article › peer-review

194 Scopus citations

Abstract

We develop a general duality between neural networks and compositional kernel Hilbert spaces. We introduce the notion of a computation skeleton, an acyclic graph that succinctly describes both a family of neural networks and a kernel space. Random neural networks are generated from a skeleton through node replication followed by sampling from a normal distribution to assign weights. The kernel space consists of functions that arise by compositions, averaging, and non-linear transformations governed by the skeleton's graph topology and activation functions. We prove that random networks induce representations which approximate the kernel space. In particular, it follows that random weight initialization often yields a favorable starting point for optimization despite the worst-case intractability of training neural networks.
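The abstract's central claim, that a randomly initialized network induces a representation approximating the compositional kernel, can be checked numerically for the simplest skeleton: a single fully connected ReLU layer. The sketch below is illustrative only and not taken from the paper; the sqrt(2)-normalized ReLU, unit-norm inputs, and the closed-form dual activation (the degree-1 arc-cosine kernel) are assumptions made for this example.

```python
# Minimal numerical sketch (illustrative, not the paper's code): compare the inner
# product of random ReLU features with the compositional kernel for a one-layer skeleton.
import numpy as np

rng = np.random.default_rng(0)

def random_relu_features(x, W):
    """Representation induced by a random one-layer ReLU network with m hidden units."""
    m = W.shape[0]
    return np.sqrt(2.0 / m) * np.maximum(W @ x, 0.0)

def relu_dual_activation(rho):
    """Assumed dual activation of the normalized ReLU at correlation rho
    (degree-1 arc-cosine kernel)."""
    rho = np.clip(rho, -1.0, 1.0)
    return (np.sqrt(1.0 - rho**2) + (np.pi - np.arccos(rho)) * rho) / np.pi

d, m = 50, 20000                         # input dimension, number of hidden units
W = rng.standard_normal((m, d))          # i.i.d. N(0, 1) weights, as in the random model

x = rng.standard_normal(d); x /= np.linalg.norm(x)   # unit-norm inputs
y = rng.standard_normal(d); y /= np.linalg.norm(y)

empirical = random_relu_features(x, W) @ random_relu_features(y, W)
kernel = relu_dual_activation(x @ y)
print(f"empirical inner product: {empirical:.4f}, compositional kernel: {kernel:.4f}")
```

For large m the two printed values should agree closely, which is the finite-sample counterpart of the approximation result stated in the abstract.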

Original language: English (US)
Pages (from-to): 2261-2269
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
State: Published - 2016
Event: 30th Annual Conference on Neural Information Processing Systems, NIPS 2016 - Barcelona, Spain
Duration: Dec 5, 2016 - Dec 10, 2016

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
