Abstract
In 1961, James and Stein discovered a remarkable estimator that dominates the maximum-likelihood estimate of the mean of a p-variate normal distribution, provided the dimension p is greater than two. This paper extends the James-Stein estimator and highlights benefits of applying these extensions to adaptive signal processing problems. The main contribution of this paper is the derivation of the James-Stein state filter (JSSF), which is a robust version of the Kalman filter. The JSSF is designed for situations where the parameters of the state-space evolution model are not known with any certainty. Along the way, we derive several other results. We first derive a James-Stein estimator for the regression parameter in a linear regression, together with a recursive implementation, which we call the James-Stein recursive least squares (JS-RLS) algorithm. The resulting estimate, although biased, has a smaller mean-square error than the traditional RLS estimate. Finally, several heuristic algorithms are presented, including a James-Stein version of the Yule-Walker equations for AR parameter estimation.
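To make the opening claim concrete, the sketch below compares the classic (positive-part) James-Stein shrinkage estimator against the maximum-likelihood estimate of a normal mean by Monte Carlo. This is a minimal illustration of the 1961 result the abstract builds on, not the paper's extended estimators; the positive-part variant, the dimension, and the noise level are assumptions chosen for the demonstration.

```python
import numpy as np

def james_stein(x, sigma2=1.0):
    """Positive-part James-Stein estimate of theta from one observation
    x ~ N(theta, sigma2 * I_p); dominates the MLE (x itself) when p > 2."""
    p = x.size
    shrink = max(0.0, 1.0 - (p - 2) * sigma2 / np.dot(x, x))
    return shrink * x

# Monte Carlo comparison of mean-square error (illustrative parameters)
rng = np.random.default_rng(0)
p, trials = 10, 2000
theta = np.full(p, 0.5)          # hypothetical true mean
mse_mle = mse_js = 0.0
for _ in range(trials):
    x = rng.normal(theta, 1.0)   # one draw from N(theta, I_p)
    mse_mle += np.sum((x - theta) ** 2) / trials
    mse_js += np.sum((james_stein(x) - theta) ** 2) / trials
```

For p > 2 the averaged squared error of the shrunken estimate comes out strictly below that of the MLE, which is the domination property the paper exploits.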
| Original language | English (US) |
|---|---|
| Pages (from-to) | 2431-2447 |
| Number of pages | 17 |
| Journal | IEEE Transactions on Signal Processing |
| Volume | 46 |
| Issue number | 9 |
| State | Published - 1998 |
All Science Journal Classification (ASJC) codes
- Signal Processing
- Electrical and Electronic Engineering
Keywords
- James-Stein estimation
- Kalman filter
- Maximum-likelihood estimation
- Minimax estimation
- Recursive least squares
- Robust filtering
- Yule-Walker equations