James-Stein state filtering algorithms

Jonathan H. Manton, Vikram Krishnamurthy, H. Vincent Poor

Research output: Contribution to journal › Article › peer-review

34 Scopus citations

Abstract

In 1961, James and Stein discovered a remarkable estimator that dominates the maximum-likelihood estimate of the mean of a p-variate normal distribution, provided the dimension p is greater than two. This paper extends the James-Stein estimator and highlights the benefits of applying these extensions to adaptive signal processing problems. The main contribution of this paper is the derivation of the James-Stein state filter (JSSF), which is a robust version of the Kalman filter. The JSSF is designed for situations where the parameters of the state-space evolution model are not known with any certainty. In deriving the JSSF, we obtain several other results. We first derive a James-Stein estimator for the regression parameter in a linear regression. A recursive implementation, which we call the James-Stein recursive least squares (JS-RLS) algorithm, is then derived. The resulting estimate, although biased, has a smaller mean-square error than that of the traditional RLS algorithm. Finally, several heuristic algorithms are presented, including a James-Stein version of the Yule-Walker equations for AR parameter estimation.
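To make the dominance result concrete, here is a minimal numerical sketch (not taken from the paper) of the classic James-Stein shrinkage estimator for the mean of a p-variate normal with identity covariance, compared empirically against the maximum-likelihood estimate (which is simply the observation itself). The dimension, true mean, and trial count below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 10                      # dimension (dominance requires p > 2)
theta = np.ones(p)          # true mean (arbitrary illustrative choice)
n_trials = 100_000

# One observation per trial: x ~ N(theta, I)
x = rng.normal(loc=theta, scale=1.0, size=(n_trials, p))

mle = x                                        # maximum-likelihood estimate
norm_sq = np.sum(x**2, axis=1, keepdims=True)
js = (1.0 - (p - 2) / norm_sq) * x             # James-Stein shrinkage toward the origin

# Average squared estimation error over trials
mse_mle = np.mean(np.sum((mle - theta) ** 2, axis=1))
mse_js = np.mean(np.sum((js - theta) ** 2, axis=1))
print(f"MSE(MLE) = {mse_mle:.3f}, MSE(JS) = {mse_js:.3f}")
```

The MLE's total mean-square error is p by construction, while the James-Stein estimator's is strictly smaller for every value of the true mean whenever p > 2, which the simulation reflects.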

Original language: English (US)
Pages (from-to): 2431-2447
Number of pages: 17
Journal: IEEE Transactions on Signal Processing
Volume: 46
Issue number: 9
DOIs
State: Published - 1998

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Electrical and Electronic Engineering

Keywords

  • James-Stein estimation
  • Kalman filter
  • Maximum-likelihood estimation
  • Minimax estimation
  • Recursive least squares
  • Robust filtering
  • Yule-Walker equations
