Approximation and Optimization Theory for Linear Continuous-Time Recurrent Neural Networks

Zhong Li, Jiequn Han, E. Weinan, Qianxiao Li

Research output: Contribution to journal › Article › peer-review

11 Scopus citations

Abstract

We perform a systematic study of the approximation properties and optimization dynamics of recurrent neural networks (RNNs) when applied to learn input-output relationships in temporal data. We consider the simple but representative setting of using continuous-time linear RNNs to learn from data generated by linear relationships. On the approximation side, we prove a direct and an inverse approximation theorem for linear functionals using RNNs, which reveal the intricate connections between memory structures in the target and the corresponding approximation efficiency. In particular, we show that temporal relationships can be effectively approximated by RNNs if and only if the target possesses sufficient memory decay. On the optimization front, we perform a detailed analysis of the optimization dynamics, including a precise understanding of the difficulty that may arise in learning relationships with long-term memory. The term “curse of memory” is coined to describe the uncovered phenomena, akin to the “curse of dimensionality” that plagues high-dimensional function approximation. Together, these results form a relatively complete picture of the interaction between memory and recurrent structures in the linear dynamical setting.
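To fix ideas, the following is a minimal sketch of the setting described above; the notation (hidden state h_t, parameters W, U, c, and memory kernel ρ) is assumed for illustration and is not taken from this record.

```latex
% Sketch of a continuous-time linear RNN and a linear functional target
% (notation assumed for illustration, not quoted from the paper).
% The hidden state h_t is driven linearly by the input x_t, and the
% prediction is a linear read-out of h_t.
\begin{align*}
  \frac{\mathrm{d}h_t}{\mathrm{d}t} &= W h_t + U x_t, \qquad h_{-\infty} = 0,\\[2pt]
  \hat{y}_t &= c^\top h_t,\\[2pt]
  y_t &= \int_0^{\infty} \rho(s)^\top x_{t-s}\,\mathrm{d}s
        \quad \text{(target: a linear functional with memory kernel } \rho\text{)}.
\end{align*}
```

In this sketch, memory decay corresponds to how quickly ρ(s) vanishes as s grows; the abstract's approximation theorems tie this decay to how efficiently the read-out ŷ_t can match y_t, and slow decay is the regime in which the described optimization difficulty, the “curse of memory”, arises.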

Original language: English (US)
Journal: Journal of Machine Learning Research
Volume: 23
State: Published - 2022

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Software
  • Statistics and Probability
  • Artificial Intelligence

Keywords

  • Approximation
  • Curse of memory
  • Dynamical systems
  • Optimization
  • Recurrent neural networks
