Deep empirical risk minimization in finance: Looking into the future

Anders Max Reppen, Halil Mete Soner

Research output: Contribution to journal › Article › peer-review

8 Scopus citations

Abstract

Many modern computational approaches to classical problems in quantitative finance are formulated as empirical risk minimization (ERM), allowing direct applications of classical results from statistical machine learning. These methods, designed to directly construct the optimal feedback representation of hedging or investment decisions, are analyzed in this framework, demonstrating both their effectiveness and their susceptibility to generalization error. Classical techniques show that over-training renders the trained investment decisions anticipative, and they establish overlearning for large hypothesis spaces. On the other hand, nonasymptotic estimates based on Rademacher complexity show convergence for sufficiently large training sets. These results emphasize the importance of synthetic data generation and the appropriate calibration of complex models to market data. A numerically studied stylized example illustrates these possibilities, including the influence of problem dimension on the degree of overlearning, and the effectiveness of this approach.
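The abstract describes constructing the hedging or investment decision directly as a feedback policy and training it by minimizing an empirical loss over simulated paths. The sketch below is a hypothetical illustration of that ERM formulation, not the authors' implementation: it assumes PyTorch, a one-dimensional Black-Scholes-type market simulated synthetically, a short call payoff, a mean squared hedging error as the loss, and illustrative network and sample sizes.

```python
# Hypothetical sketch only (assumed names and parameters), not the paper's code.
import torch
import torch.nn as nn

torch.manual_seed(0)

n_paths, n_steps = 10_000, 30            # synthetic training set size, hedging dates
dt, sigma, strike = 1.0 / 30, 0.2, 1.0   # assumed market and contract parameters

# Synthetic data generation: geometric Brownian motion paths started at 1.0.
z = torch.randn(n_paths, n_steps)
log_inc = -0.5 * sigma**2 * dt + sigma * dt**0.5 * z
paths = torch.cat([torch.ones(n_paths, 1),
                   torch.exp(torch.cumsum(log_inc, dim=1))], dim=1)

# Feedback policy: hedge ratio as a function of (time, current price) only,
# so the control is nonanticipative by construction.
policy = nn.Sequential(nn.Linear(2, 32), nn.ReLU(),
                       nn.Linear(32, 32), nn.ReLU(),
                       nn.Linear(32, 1))
v0 = nn.Parameter(torch.zeros(1))        # learnable initial capital (premium)

def empirical_risk(paths):
    """Empirical risk: mean squared hedging error of a short call payoff."""
    pnl = torch.zeros(paths.shape[0]) + v0
    for k in range(n_steps):
        state = torch.cat([torch.full((paths.shape[0], 1), k * dt),
                           paths[:, k:k + 1]], dim=1)
        delta = policy(state).squeeze(1)
        pnl = pnl + delta * (paths[:, k + 1] - paths[:, k])
    payoff = torch.clamp(paths[:, -1] - strike, min=0.0)
    return ((pnl - payoff) ** 2).mean()

optimizer = torch.optim.Adam(list(policy.parameters()) + [v0], lr=1e-3)
for epoch in range(200):  # over-training on too few paths is where overlearning appears
    optimizer.zero_grad()
    loss = empirical_risk(paths)
    loss.backward()
    optimizer.step()

print(f"in-sample hedging loss: {empirical_risk(paths).item():.4f}")
```

Because the policy only sees the current time and price it is nonanticipative by construction; the overlearning phenomenon concerns trained decisions that effectively behave anticipatively when the hypothesis space is large relative to the training set. The nonasymptotic convergence estimate mentioned in the abstract is of the Rademacher-complexity type; the generic textbook form (not the paper's specific statement) for a loss class $\mathcal{H}$ bounded in $[0,1]$ and $n$ i.i.d. samples is, with probability at least $1-\delta$,
$$\sup_{f\in\mathcal{H}}\bigl(R(f)-\hat R_n(f)\bigr)\;\le\; 2\,\mathfrak{R}_n(\mathcal{H})+\sqrt{\frac{\log(1/\delta)}{2n}},$$
so the generalization gap vanishes once the training set is large relative to the complexity of the hypothesis space.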

Original language: English (US)
Pages (from-to): 116-145
Number of pages: 30
Journal: Mathematical Finance
Volume: 33
Issue number: 1
State: Published - Jan 2023

All Science Journal Classification (ASJC) codes

  • Accounting
  • Finance
  • Social Sciences (miscellaneous)
  • Economics and Econometrics
  • Applied Mathematics

Keywords

  • ERM
  • bias-variance trade-off
  • deep learning
  • dynamic hedging
  • overlearning
