Machine learning from a continuous viewpoint, I

E. Weinan, Chao Ma, Lei Wu

Research output: Contribution to journal › Article › peer-review

Abstract

We present a continuous formulation of machine learning, as a problem in the calculus of variations and differential-integral equations, in the spirit of classical numerical analysis. We demonstrate that conventional machine learning models and algorithms, such as the random feature model, the two-layer neural network model and the residual neural network model, can all be recovered (in a scaled form) as particular discretizations of different continuous formulations. We also present examples of new models, such as the flow-based random feature model, and new algorithms, such as the smoothed particle method and the spectral method, that arise naturally from this continuous formulation. We discuss how the issues of generalization error and implicit regularization can be studied under this framework.
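
As a brief illustration of how a conventional model can appear as a discretization of a continuous formulation, consider the integral (mean-field) representation of a two-layer network. The formulas below are a minimal sketch in generic notation (the symbols ρ, a, w, σ and m are illustrative choices, not necessarily the paper's own):

f(x; \rho) = \int a \, \sigma(w^{\top} x) \, \rho(\mathrm{d}a, \mathrm{d}w)

Replacing the probability measure \rho by an empirical measure over m particles (a_j, w_j) gives

f_m(x) = \frac{1}{m} \sum_{j=1}^{m} a_j \, \sigma(w_j^{\top} x),

which is a two-layer neural network in scaled form; fixing the features w_j at random and optimizing only the coefficients a_j yields a (scaled) random feature model.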

Original language: English (US)
Pages (from-to): 2233-2266
Number of pages: 34
Journal: Science China Mathematics
Volume: 63
Issue number: 11
DOIs
State: Published - Nov 1, 2020

All Science Journal Classification (ASJC) codes

  • General Mathematics

Keywords

  • 41A99
  • 49M99
  • continuous formulation
  • flow-based model
  • gradient flow
  • machine learning
  • particle approximation
