Patterns of scalable Bayesian inference

Elaine Angelino, Matthew James Johnson, Ryan P. Adams

Research output: Contribution to journal, review article, peer-reviewed

47 Scopus citations


Datasets are growing not just in size but in complexity, creating a demand for rich models and quantification of uncertainty. Bayesian methods are an excellent fit for this demand, but scaling Bayesian inference is a challenge. In response to this challenge, there has been considerable recent work based on varying assumptions about model structure, underlying computational resources, and the importance of asymptotic correctness. As a result, there is a zoo of ideas with a wide range of assumptions and applicability. In this paper, we seek to identify unifying principles, patterns, and intuitions for scaling Bayesian inference. We review existing work on utilizing modern computing resources with both MCMC and variational approximation techniques. From this taxonomy of ideas, we characterize the general principles that have proven successful for designing scalable inference procedures and comment on the path forward.
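Among the ideas the abstract alludes to is scaling MCMC by using minibatch (subsampled) gradient estimates. As an illustrative sketch only, not a method from this review, here is a minimal stochastic gradient Langevin dynamics (SGLD) sampler for the conjugate Gaussian-mean model, where the exact posterior is known and can be compared against. All names and parameter values here are hypothetical choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: N observations from a Gaussian with unknown mean.
N, sigma, sigma0 = 10_000, 1.0, 10.0   # data size, noise scale, prior scale
data = rng.normal(2.0, sigma, size=N)

def sgld_sample(data, n_steps=5_000, batch=100, eps=1e-4):
    """SGLD: Langevin updates driven by minibatch gradient estimates."""
    N = len(data)
    theta = 0.0
    samples = []
    for _ in range(n_steps):
        mb = data[rng.integers(0, N, size=batch)]
        # Unbiased minibatch estimate of the log-posterior gradient:
        # grad log prior + (N / batch) * sum of minibatch log-lik gradients.
        grad = -theta / sigma0**2 + (N / batch) * np.sum(mb - theta) / sigma**2
        # Half-step along the gradient plus injected Gaussian noise.
        theta += 0.5 * eps * grad + rng.normal(0.0, np.sqrt(eps))
        samples.append(theta)
    return np.array(samples)

samples = sgld_sample(data)

# Analytic posterior mean for this conjugate model, for comparison.
post_var = 1.0 / (1.0 / sigma0**2 + N / sigma**2)
post_mean = post_var * np.sum(data) / sigma**2
print(post_mean, samples[len(samples) // 2 :].mean())
```

Each update touches only `batch` data points rather than all `N`, which is the basic trade explored by subsampling-based MCMC: cheaper iterations at the cost of extra gradient noise and bias from the fixed step size.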

Original language: English (US)
Pages (from-to): 119-247
Number of pages: 129
Journal: Foundations and Trends in Machine Learning
Issue number: 2-3
State: Published - 2016
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Software
  • Human-Computer Interaction
  • Artificial Intelligence


