Abstract
Finite-sample bounds on the accuracy of Bhattacharya's plug-in estimator for Fisher information are derived. These bounds are further improved by introducing a clipping step that allows for better control over the score function, leading to sharper upper bounds on the rate of convergence, albeit under slightly different regularity conditions. The performance bounds on both estimators are evaluated for the practically relevant case of a random variable contaminated by Gaussian noise. Moreover, using Brown's identity, two corresponding estimators of the minimum mean-square error are proposed.
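To illustrate the idea, the sketch below implements a generic kernel plug-in estimate of the Fisher information J(Y) with an optional clipping of the estimated score, and converts it to an MMSE estimate via Brown's identity, mmse = σ²(1 − σ²J(Y)) for Y = X + N with N ~ N(0, σ²). This is a minimal illustration, not the paper's exact estimator; the bandwidth, clipping level, and helper name are assumptions.

```python
import numpy as np

def fisher_info_plugin(y, h, clip=None):
    """Kernel plug-in estimate of Fisher information J(Y).

    A Gaussian-kernel density estimate f_h and its derivative f_h' are
    evaluated at the sample points; the score f_h'/f_h is (optionally)
    clipped, and J is estimated by the empirical mean of the squared score.
    (Illustrative sketch only, not the paper's exact construction.)
    """
    y = np.asarray(y, dtype=float)
    n = y.size
    d = (y[:, None] - y[None, :]) / h                 # pairwise scaled differences
    k = np.exp(-0.5 * d**2) / np.sqrt(2 * np.pi)      # Gaussian kernel values
    f = k.sum(axis=1) / (n * h)                       # density estimate f_h(Y_i)
    fp = (-d * k).sum(axis=1) / (n * h**2)            # derivative estimate f_h'(Y_i)
    score = fp / f                                    # estimated score function
    if clip is not None:                              # clipping step controls heavy tails
        score = np.clip(score, -clip, clip)
    return float(np.mean(score**2))

# Example: X ~ N(0, 1) contaminated by Gaussian noise N ~ N(0, sigma^2),
# so Y ~ N(0, 2), the true J(Y) = 1/2, and the true MMSE = 1/2.
rng = np.random.default_rng(0)
sigma = 1.0
x = rng.standard_normal(5000)
y = x + sigma * rng.standard_normal(5000)

J_hat = fisher_info_plugin(y, h=0.3, clip=10.0)
# Brown's identity: mmse = sigma^2 * (1 - sigma^2 * J(Y))
mmse_hat = sigma**2 * (1 - sigma**2 * J_hat)
```

In the fully Gaussian example both estimates can be checked against the closed-form values J(Y) = 1/2 and mmse = 1/2; the clipping level mainly matters for heavier-tailed inputs, where the raw estimated score can blow up in low-density regions.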
| Original language | English (US) |
| --- | --- |
| Article number | 545 |
| Journal | Entropy |
| Volume | 23 |
| Issue number | 5 |
| DOIs | |
| State | Published - May 2021 |
| Externally published | Yes |
All Science Journal Classification (ASJC) codes
- Information Systems
- Electrical and Electronic Engineering
- General Physics and Astronomy
- Mathematical Physics
- Physics and Astronomy (miscellaneous)
Keywords
- Fisher information
- Kernel estimation
- MMSE
- Nonparametric estimation