Bayesian learning and inference in recurrent switching linear dynamical systems

Scott W. Linderman, Matthew J. Johnson, Andrew C. Miller, Ryan P. Adams, David M. Blei, Liam Paninski

Research output: Contribution to conference › Paper › peer-review

Abstract

Many natural systems, such as neurons firing in the brain or basketball teams traversing a court, give rise to time series data with complex, nonlinear dynamics. We can gain insight into these systems by decomposing the data into segments that are each explained by simpler dynamic units. Building on switching linear dynamical systems (SLDS), we develop a model class and Bayesian inference algorithms that not only discover these dynamical units but also, by learning how transition probabilities depend on observations or continuous latent states, explain their switching behavior. Our key innovation is to design these recurrent SLDS models to enable recent Pólya-gamma auxiliary variable techniques and thus make approximate Bayesian learning and inference in these models easy, fast, and scalable.
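To make the generative structure described in the abstract concrete, the sketch below simulates a toy recurrent SLDS: each discrete state has its own linear dynamics, and the probability of switching to the next discrete state depends on the current continuous latent state. This is a minimal illustrative sketch, not the paper's implementation; all dimensions, parameter names, and noise scales are assumptions, a plain softmax link is used in place of the paper's stick-breaking parameterization, and the Pólya-gamma augmented Bayesian inference that is the paper's contribution is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper)
T, K, D_latent, D_obs = 200, 2, 2, 10

# Per-state linear dynamics: x_t = A[k] x_{t-1} + b[k] + noise
A = [0.95 * np.eye(D_latent) for _ in range(K)]
b = [rng.normal(scale=0.1, size=D_latent) for _ in range(K)]

# "Recurrent" transition weights: logits for the next discrete state
# are a linear function of the current continuous latent state.
R = rng.normal(scale=1.0, size=(K, D_latent))
r = rng.normal(scale=0.5, size=K)

# Linear-Gaussian emissions: y_t = C x_t + noise
C = rng.normal(size=(D_obs, D_latent))

def softmax(logits):
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Sample one trajectory from the toy model
z = np.zeros(T, dtype=int)       # discrete states
x = np.zeros((T, D_latent))      # continuous latent states
y = np.zeros((T, D_obs))         # observations

z[0] = rng.integers(K)
x[0] = rng.normal(size=D_latent)
y[0] = C @ x[0] + 0.1 * rng.normal(size=D_obs)

for t in range(1, T):
    # Switching probabilities depend on the previous continuous state
    p = softmax(R @ x[t - 1] + r)
    z[t] = rng.choice(K, p=p)
    # Continuous dynamics follow the chosen discrete state
    x[t] = A[z[t]] @ x[t - 1] + b[z[t]] + 0.05 * rng.normal(size=D_latent)
    y[t] = C @ x[t] + 0.1 * rng.normal(size=D_obs)
```

In the paper, the softmax link above is replaced by a stick-breaking logistic construction, which is what allows Pólya-gamma auxiliary variables to render the conditional updates conjugate and keep learning and inference fast and scalable.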

Original language: English (US)
State: Published - Jan 1 2017
Event: 20th International Conference on Artificial Intelligence and Statistics, AISTATS 2017 - Fort Lauderdale, United States
Duration: Apr 20 2017 - Apr 22 2017

Conference

Conference: 20th International Conference on Artificial Intelligence and Statistics, AISTATS 2017
Country/Territory: United States
City: Fort Lauderdale
Period: 4/20/17 - 4/22/17

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Statistics and Probability
