ButterflyFlow: Building Invertible Layers with Butterfly Matrices

Chenlin Meng, Linqi Zhou, Kristy Choi, Tri Dao, Stefano Ermon

Research output: Contribution to journal › Conference article › peer-review

8 Scopus citations

Abstract

Normalizing flows model complex probability distributions using maps obtained by composing invertible layers. Special linear layers such as masked and 1 × 1 convolutions play a key role in existing architectures because they increase expressive power while having tractable Jacobians and inverses. We propose a new family of invertible linear layers based on butterfly layers, which are known to theoretically capture complex linear structures including permutations and periodicity, yet can be inverted efficiently. This representational power is a key advantage of our approach, as such structures are common in many real-world datasets. Based on our invertible butterfly layers, we construct a new class of normalizing flow models called ButterflyFlow. Empirically, we demonstrate that ButterflyFlows not only achieve strong density estimation results on natural images such as MNIST, CIFAR-10, and ImageNet-32×32, but also obtain significantly better log-likelihoods on structured datasets such as galaxy images and MIMIC-III patient cohorts, all while being more efficient in terms of memory and computation than relevant baselines.
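The abstract's claim that butterfly layers can be inverted efficiently follows from their factored structure: an n × n butterfly matrix is a product of log₂(n) sparse factors, each consisting of independent 2 × 2 mixing blocks, so inversion reduces to inverting 2 × 2 blocks factor by factor. The sketch below is illustrative only, not the paper's implementation; the construction of each factor as random 2 × 2 blocks is an assumption for demonstration.

```python
import numpy as np

def butterfly_factor(n, stride, rng):
    """One dense n x n butterfly factor: entries i and i+stride inside each
    block of size 2*stride are mixed by a 2x2 block (illustrative choice:
    random Gaussian entries, invertible with probability 1)."""
    B = np.zeros((n, n))
    for start in range(0, n, 2 * stride):
        for i in range(start, start + stride):
            j = i + stride
            a, b, c, d = rng.standard_normal(4)
            B[i, i], B[i, j] = a, b
            B[j, i], B[j, j] = c, d
    return B

rng = np.random.default_rng(0)
n = 8
# A full butterfly matrix is the product of log2(n) factors, strides 1, 2, 4.
factors = [butterfly_factor(n, 2 ** k, rng) for k in range(int(np.log2(n)))]
M = factors[0]
for B in factors[1:]:
    M = M @ B

x = rng.standard_normal(n)
y = M @ x
# Efficient inverse: undo each sparse factor in turn (M = F0 F1 F2, so
# solve F0, then F1, then F2), instead of inverting the dense product.
x_rec = y.copy()
for B in factors:
    x_rec = np.linalg.solve(B, x_rec)
assert np.allclose(x_rec, x)
```

In a flow layer, the same block structure also makes the log-determinant of the Jacobian tractable: it is the sum of log|det| over the 2 × 2 blocks of each factor.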

Original language: English (US)
Pages (from-to): 15360-15375
Number of pages: 16
Journal: Proceedings of Machine Learning Research
Volume: 162
State: Published - 2022
Externally published: Yes
Event: 39th International Conference on Machine Learning, ICML 2022 - Baltimore, United States
Duration: Jul 17, 2022 - Jul 23, 2022

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
