Gaussian Process Surrogate Models for Neural Networks

Michael Y. Li, Erin Grant, Thomas L. Griffiths

Research output: Contribution to journal › Conference article › peer-review

4 Scopus citations

Abstract

Not being able to understand and predict the behavior of deep learning systems makes it hard to decide which architecture and algorithm to use for a given problem. In science and engineering, modeling is a methodology used to understand complex systems whose internal processes are opaque. Modeling replaces a complex system with a simpler, more interpretable surrogate. Drawing inspiration from this, we construct a class of surrogate models for neural networks using Gaussian processes. Rather than deriving kernels for infinite neural networks, we learn kernels empirically from the naturalistic behavior of finite neural networks. We demonstrate that our approach captures existing phenomena related to the spectral bias of neural networks, and then show that our surrogate models can be used to solve practical problems such as identifying which points most influence the behavior of specific neural networks and predicting which architectures and algorithms will generalize well for specific datasets.
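The core idea, fitting a Gaussian process surrogate to the observed input-output behavior of a finite trained network rather than deriving an infinite-width kernel analytically, can be sketched as follows. This is not the authors' implementation; it is a minimal illustration using scikit-learn, where the network architecture, probe grid, and RBF kernel choice are all assumptions made for the example.

```python
# Hedged sketch: fit a GP surrogate to a trained network's behavior.
# The "complex system" is a small MLP; the surrogate's kernel
# hyperparameters are learned empirically from the network's outputs.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

# Training data for the network (a simple 1-D regression task).
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X).ravel()

# The opaque system: a finite neural network trained on the data.
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                   random_state=0)
net.fit(X, y)

# Probe the network's naturalistic behavior on a grid of inputs.
X_probe = np.linspace(-3.0, 3.0, 100).reshape(-1, 1)
y_net = net.predict(X_probe)

# Surrogate: a GP fit to the network's predictions. Kernel
# hyperparameters are tuned by marginal-likelihood maximization,
# i.e. learned from the finite network's behavior.
kernel = ConstantKernel(1.0) * RBF(length_scale=1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(X_probe, y_net)

# The surrogate now predicts, with uncertainty, what the network
# will do on new inputs.
X_test = np.array([[0.5], [2.0]])
mean, std = gp.predict(X_test, return_std=True)
print(mean, std)
```

Once fit, the surrogate's posterior (and its learned kernel) can be interrogated in place of the network itself, for example to ask which training points most influence predictions at a given input.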

Original language: English (US)
Pages (from-to): 1241-1252
Number of pages: 12
Journal: Proceedings of Machine Learning Research
Volume: 216
State: Published - 2023
Externally published: Yes
Event: 39th Conference on Uncertainty in Artificial Intelligence, UAI 2023 - Pittsburgh, United States
Duration: Jul 31, 2023 – Aug 4, 2023

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability

