Optimal convergence rate of the universal estimation error

Weinan E, Yao Wang

Research output: Contribution to journal › Article › peer-review

1 Scopus citations


We study the optimal convergence rate for the universal estimation error. Let F be the excess loss class associated with the hypothesis space and let n be the size of the data set. We prove that if the fat-shattering dimension satisfies fat_ε(F) = O(ε^(−p)), then the universal estimation error is of order O(n^(−1/2)) for p < 2 and O(n^(−1/p)) for p > 2. Among other things, this result gives a criterion for a hypothesis class to achieve the minimax optimal rate of O(n^(−1/2)). We also show that if the hypothesis space consists of the compactly supported convex Lipschitz continuous functions on R^d with d > 4, then the rate is approximately O(n^(−2/d)).
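The rate dichotomy stated in the abstract can be written compactly as follows. This is a sketch of the claimed result only; in particular, expressing the universal estimation error as the expected supremum of an empirical process over the excess loss class F (with i.i.d. samples Z_1, …, Z_n) is a standard formulation assumed here, not spelled out in the abstract.

```latex
% Notation as in the abstract: \mathcal{F} is the excess loss class,
% n the sample size, \mathrm{fat}_\epsilon the fat-shattering dimension.
% The middle expression is the assumed empirical-process form of the error.
\[
\mathrm{fat}_\epsilon(\mathcal{F}) = O(\epsilon^{-p})
\;\Longrightarrow\;
\mathbb{E}\,\sup_{f \in \mathcal{F}}
\left| \frac{1}{n}\sum_{i=1}^{n} f(Z_i) - \mathbb{E}\, f(Z) \right|
=
\begin{cases}
O(n^{-1/2}), & p < 2,\\
O(n^{-1/p}), & p > 2.
\end{cases}
\]
```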

Original language: English (US)
Article number: 2
Journal: Research in Mathematical Sciences
Issue number: 1
State: Published - Dec 1 2017

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Mathematics (miscellaneous)
  • Computational Mathematics
  • Applied Mathematics


Keywords
  • Empirical Process
  • Estimation Error
  • Gaussian Average
  • Hypothesis Space
  • Optimal Convergence Rate


