Probabilistic backpropagation for scalable learning of Bayesian neural networks

José Miguel Hernández-Lobato, Ryan P. Adams

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

170 Scopus citations

Abstract

Large multilayer neural networks trained with backpropagation have recently achieved state-of-the-art results in a wide range of problems. However, using backprop for neural net learning still has some disadvantages, e.g., having to tune a large number of hyperparameters to the data, lack of calibrated probabilistic predictions, and a tendency to overfit the training data. In principle, the Bayesian approach to learning neural networks does not have these problems. However, existing Bayesian techniques lack scalability to large dataset and network sizes. In this work we present a novel scalable method for learning Bayesian neural networks, called probabilistic backpropagation (PBP). Similar to classical backpropagation, PBP works by computing a forward propagation of probabilities through the network and then doing a backward computation of gradients. A series of experiments on ten real-world datasets show that PBP is significantly faster than other techniques, while offering competitive predictive abilities. Our experiments also show that PBP provides accurate estimates of the posterior variance on the network weights.
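The abstract's "forward propagation of probabilities" refers to pushing means and variances, rather than point values, through the network. As a minimal illustrative sketch (not the authors' implementation), the following NumPy code moment-matches a single linear layer under the assumption of independent Gaussian weights and inputs; the function name `linear_moments` and the toy numbers are our own:

```python
import numpy as np

def linear_moments(m_in, v_in, m_w, v_w):
    """Propagate mean/variance through z = W @ a, with W and a
    independent and Gaussian: W[i,j] ~ N(m_w[i,j], v_w[i,j]),
    a[j] ~ N(m_in[j], v_in[j])."""
    # E[z_i] = sum_j m_w[i,j] * m_in[j]
    m_out = m_w @ m_in
    # Var[X*Y] for independent X, Y:
    #   v_x*v_y + v_x*m_y^2 + m_x^2*v_y, summed over j
    v_out = v_w @ v_in + v_w @ (m_in ** 2) + (m_w ** 2) @ v_in
    return m_out, v_out

# Toy example: 2 inputs, 1 output unit.
m, v = linear_moments(np.array([1.0, 2.0]), np.array([0.1, 0.2]),
                      np.array([[0.5, -0.3]]), np.array([[0.01, 0.02]]))
```

In PBP the output distribution of each layer is similarly approximated by a Gaussian, and gradients of the log marginal likelihood with respect to these means and variances are then propagated backward to update the weight posteriors.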

Original language: English (US)
Title of host publication: 32nd International Conference on Machine Learning, ICML 2015
Editors: Francis Bach, David Blei
Publisher: International Machine Learning Society (IMLS)
Pages: 1861-1869
Number of pages: 9
ISBN (Electronic): 9781510810587
State: Published - Jan 1 2015
Externally published: Yes
Event: 32nd International Conference on Machine Learning, ICML 2015 - Lille, France
Duration: Jul 6 2015 to Jul 11 2015

Publication series

Name: 32nd International Conference on Machine Learning, ICML 2015
Volume: 3

All Science Journal Classification (ASJC) codes

  • Human-Computer Interaction
  • Computer Science Applications
