Abstract
The well-known Mori-Zwanzig theory tells us that model reduction leads to memory effects. For a long time, modeling these memory effects accurately and efficiently has been an important but nearly impossible task in the development of a good reduced model. In this work, we explore a natural analogy between recurrent neural networks and the Mori-Zwanzig formalism to establish a systematic approach for developing reduced models with memory. Two training models, a direct training model and a dynamically coupled training model, are proposed and compared. We apply these methods to the Kuramoto-Sivashinsky equation and the Navier-Stokes equations. Numerical experiments show that the proposed method can produce reduced models with good performance on both short-term prediction and long-term statistical properties.
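To make the analogy concrete, here is a minimal, hypothetical sketch (not the authors' implementation) in which the hidden state of a recurrent cell plays the role of the Mori-Zwanzig memory integral acting on the resolved variables. The class `ReducedModelWithMemory`, the callable `markov_term`, and all sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ReducedModelWithMemory(nn.Module):
    """Sketch of a reduced model dx/dt = R(x) + memory(x history),
    where the memory term is approximated by a recurrent cell whose
    hidden state accumulates the trajectory's past (illustrative only)."""

    def __init__(self, dim, hidden, markov_term):
        super().__init__()
        self.markov_term = markov_term         # known resolved dynamics R(x)
        self.cell = nn.GRUCell(dim, hidden)    # hidden state ~ memory integral
        self.readout = nn.Linear(hidden, dim)  # hidden state -> memory force

    def forward(self, x0, steps, dt):
        x = x0
        h = x0.new_zeros(x0.shape[0], self.cell.hidden_size)
        trajectory = []
        for _ in range(steps):
            h = self.cell(x, h)                            # update memory state
            dxdt = self.markov_term(x) + self.readout(h)   # Markov + memory
            x = x + dt * dxdt                              # forward-Euler step
            trajectory.append(x)
        return torch.stack(trajectory, dim=1)
```

Under one plausible reading, a "direct" approach would fit the network by regressing its output against memory terms extracted from fully resolved reference data, while a "dynamically coupled" approach would train through the time stepping of the reduced model itself; the paper should be consulted for the precise formulations.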
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 947-962 |
| Number of pages | 16 |
| Journal | Communications in Computational Physics |
| Volume | 25 |
| Issue number | 4 |
| DOIs | |
| State | Published - 2019 |
All Science Journal Classification (ASJC) codes
- Physics and Astronomy (miscellaneous)
Keywords
- Model reduction
- Mori-Zwanzig
- Recurrent neural networks