Model reduction with memory and the machine learning of dynamical systems

Chao Ma, Jianchun Wang, E. Weinan

Research output: Contribution to journal › Article › peer-review

52 Scopus citations

Abstract

The well-known Mori-Zwanzig theory tells us that model reduction leads to memory effects. For a long time, modeling the memory effect accurately and efficiently has been an important but nearly impossible task in developing a good reduced model. In this work, we explore a natural analogy between recurrent neural networks and the Mori-Zwanzig formalism to establish a systematic approach for developing reduced models with memory. Two training models, a direct training model and a dynamically coupled training model, are proposed and compared. We apply these methods to the Kuramoto-Sivashinsky equation and the Navier-Stokes equation. Numerical experiments show that the proposed method can produce reduced models with good performance on both short-term prediction and long-term statistical properties.
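The following is a minimal sketch, not the authors' architecture, of the general idea described in the abstract: a reduced model whose right-hand side combines a Markovian term with a memory term summarized by an RNN hidden state, trained by comparing one-step predictions against resolved trajectories (loosely in the spirit of a "direct" training setup). All names (MemoryClosure, hidden_dim, train_direct, and the trajectory data layout) are illustrative assumptions, not from the paper.

```python
# Sketch of an RNN-based memory closure inspired by the Mori-Zwanzig analogy.
# Assumption: the resolved state x_t is low-dimensional and trajectories of
# the true resolved dynamics are available for training.
import torch
import torch.nn as nn


class MemoryClosure(nn.Module):
    """Reduced model dx/dt = markov(x) + memory(history); the GRU hidden
    state plays the role of the Mori-Zwanzig memory integral."""

    def __init__(self, state_dim, hidden_dim=64):
        super().__init__()
        # Markovian part of the reduced dynamics (learned here; it could
        # instead be a known projected right-hand side).
        self.markov = nn.Sequential(
            nn.Linear(state_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, state_dim),
        )
        # Recurrent summary of the trajectory history.
        self.rnn = nn.GRUCell(state_dim, hidden_dim)
        self.memory_out = nn.Linear(hidden_dim, state_dim)

    def step(self, x, h, dt):
        """One explicit-Euler step of the reduced model."""
        h = self.rnn(x, h)                        # update memory summary
        dxdt = self.markov(x) + self.memory_out(h)
        return x + dt * dxdt, h


def train_direct(model, trajectories, dt, epochs=100, lr=1e-3):
    """One-step ('direct') training: predict the next resolved state from
    the current one and the running memory state.
    trajectories: tensor of shape (n_traj, n_steps, state_dim)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        h = torch.zeros(trajectories.shape[0], model.rnn.hidden_size)
        loss = 0.0
        for t in range(trajectories.shape[1] - 1):
            x_pred, h = model.step(trajectories[:, t], h, dt)
            loss = loss + ((x_pred - trajectories[:, t + 1]) ** 2).mean()
        loss.backward()
        opt.step()
    return model
```

A dynamically coupled variant would instead roll the reduced model forward over many steps and penalize the accumulated trajectory error; the sketch above only illustrates the one-step case.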

Original language: English (US)
Pages (from-to): 947-962
Number of pages: 16
Journal: Communications in Computational Physics
Volume: 25
Issue number: 4
DOIs
State: Published - 2019

All Science Journal Classification (ASJC) codes

  • Physics and Astronomy (miscellaneous)

Keywords

  • Model reduction
  • Mori-Zwanzig
  • Recurrent neural networks
