Abstract
We study the optimal convergence rate for the universal estimation error. Let $F$ be the excess loss class associated with the hypothesis space and let $n$ be the size of the data set. We prove that if the fat-shattering dimension satisfies $\operatorname{fat}_{\epsilon}(F) = O(\epsilon^{-p})$, then the universal estimation error is $O(n^{-1/2})$ for $p < 2$ and $O(n^{-1/p})$ for $p > 2$. Among other things, this result gives a criterion for a hypothesis class to achieve the minimax optimal rate of $O(n^{-1/2})$. We also show that if the hypothesis space consists of the compactly supported, convex, Lipschitz continuous functions on $\mathbb{R}^d$ with $d > 4$, then the rate is approximately $O(n^{-2/d})$.
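The claimed rates can be restated compactly; the following minimal LaTeX sketch reproduces them in the abstract's notation. The closing remark linking the convex Lipschitz case to the general bound assumes the classical $\epsilon^{-d/2}$ scaling of that class's fat-shattering dimension, which is not stated in the abstract itself.

```latex
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}
% Restatement of the abstract's claimed rates (sketch only; precise
% assumptions and constants are given in the paper).
If the excess loss class $F$ satisfies
$\operatorname{fat}_{\epsilon}(F) = O(\epsilon^{-p})$ and the sample size is $n$, then
\[
  \text{universal estimation error} =
  \begin{cases}
    O\bigl(n^{-1/2}\bigr), & p < 2,\\
    O\bigl(n^{-1/p}\bigr), & p > 2.
  \end{cases}
\]
% Assumption (not in the abstract): for compactly supported, convex,
% Lipschitz functions on $\mathbb{R}^d$, the fat-shattering dimension
% scales as $\epsilon^{-d/2}$, i.e. $p = d/2$.
Under that scaling, $d > 4$ gives $p = d/2 > 2$, and the second case yields
the rate $O\bigl(n^{-1/p}\bigr) = O\bigl(n^{-2/d}\bigr)$ quoted in the abstract.
\end{document}
```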
| Original language | English (US) |
|---|---|
| Article number | 2 |
| Journal | Research in Mathematical Sciences |
| Volume | 4 |
| Issue number | 1 |
| DOIs | |
| State | Published - Dec 1 2017 |
All Science Journal Classification (ASJC) codes
- Theoretical Computer Science
- Mathematics (miscellaneous)
- Computational Mathematics
- Applied Mathematics
Keywords
- Empirical Process
- Estimation Error
- Gaussian Average
- Hypothesis Space
- Optimal Convergence Rate