Gaussian process product models for nonparametric nonstationarity

Ryan Prescott Adams, Oliver Stegle

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

30 Scopus citations

Abstract

Stationarity is often an unrealistic prior assumption for Gaussian process regression. One solution is to predefine an explicit nonstationary covariance function, but such covariance functions can be difficult to specify and require detailed prior knowledge of the nonstationarity. We propose the Gaussian process product model (GPPM), which models data as the pointwise product of two latent Gaussian processes to nonparametrically infer nonstationary variations of amplitude. This approach differs from other nonparametric approaches to covariance function inference in that it operates on the outputs rather than the inputs, resulting in a significant reduction in computational cost and in the data required for inference. We present an approximate inference scheme using Expectation Propagation. This variational approximation yields convenient GP hyperparameter selection and compact approximate predictive distributions.
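
As a rough illustration of the modeling idea described above, the following Python sketch draws two latent functions from Gaussian process priors and forms their pointwise product, so that one latent function acts as a slowly varying amplitude envelope for the other. The squared-exponential kernels, the length scales, and the choice to exponentiate the envelope so it stays positive are illustrative assumptions, not the paper's exact parameterization or its Expectation Propagation inference scheme.

import numpy as np

def se_kernel(x, lengthscale, variance):
    # Squared-exponential covariance matrix for a 1-D input grid.
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)

# Latent "signal" GP: short length scale gives local structure.
K_signal = se_kernel(x, lengthscale=0.5, variance=1.0)
# Latent "envelope" GP: long length scale gives slow amplitude changes.
K_env = se_kernel(x, lengthscale=3.0, variance=1.0)

jitter = 1e-6 * np.eye(len(x))  # small diagonal term to stabilize the Cholesky factorization
f = np.linalg.cholesky(K_signal + jitter) @ rng.standard_normal(len(x))
g = np.linalg.cholesky(K_env + jitter) @ rng.standard_normal(len(x))

# Pointwise product: exponentiating g keeps the envelope positive
# (an illustrative choice, not necessarily the paper's construction).
y = np.exp(g) * f + 0.05 * rng.standard_normal(len(x))  # add observation noise

print(y[:5])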

Original language: English (US)
Title of host publication: Proceedings of the 25th International Conference on Machine Learning
Publisher: Association for Computing Machinery (ACM)
Pages: 1-8
Number of pages: 8
ISBN (Print): 9781605582054
DOIs
State: Published - 2008
Externally published: Yes
Event: 25th International Conference on Machine Learning - Helsinki, Finland
Duration: Jul 5 2008 - Jul 9 2008

Publication series

Name: Proceedings of the 25th International Conference on Machine Learning

Other

Other: 25th International Conference on Machine Learning
Country/Territory: Finland
City: Helsinki
Period: 7/5/08 - 7/9/08

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Human-Computer Interaction
  • Software
