Abstract
The performance lost to data quantization is considered in the context of minimum-mean-square-error (MMSE) filtering of stationary Gaussian processes. It is seen that, when the data are uniformly quantized into intervals, the optimum MMSE estimator based on the quantized data incurs an increase in mean-square error over that of the optimum estimator based on the original data. It is also seen that essentially the same increase in MSE results when the linear estimation filter designed for the unquantized data is applied directly to the uniformly quantized data. Thus, for small quantization intervals, the performance gained by using an optimum post-quantization estimator rather than simply applying the unquantized filter to the quantized data is negligible. The second-order term in the expression for the increase in MSE due to quantization is the same as the increase in MSE that would be produced in the optimum filter if an orthogonal i.i.d. sequence were added to the unquantized data.
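A minimal numerical sketch of the effect described above, not taken from the paper: it assumes the classical additive-noise model of uniform quantization, in which rounding with step delta is replaced by an orthogonal i.i.d. sequence of variance delta**2/12. The AR(1) signal model, observation noise level, FIR filter length, and step size are arbitrary illustrative choices. It compares the empirical MSE increase from applying the unquantized FIR Wiener filter to quantized data, the increase after re-optimizing the filter for the quantized data, and the additive-noise prediction.

```python
# Illustrative sketch only; model and parameters are assumptions, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Stationary Gaussian AR(1) signal observed in white Gaussian noise.
a, sig_w, sig_v = 0.9, 1.0, 0.5       # AR coefficient, process / observation noise std
N, L, delta = 200_000, 21, 0.4        # sample count, FIR filter length, quantizer step

w = rng.normal(0.0, sig_w, N)
x = np.zeros(N)
for n in range(1, N):
    x[n] = a * x[n - 1] + w[n]
y = x + rng.normal(0.0, sig_v, N)     # unquantized observations
yq = delta * np.round(y / delta)      # uniformly quantized observations

def fir_wiener(obs, target, L):
    """Least-squares FIR estimate of target[n] from obs[n-L+1], ..., obs[n]."""
    Y = np.column_stack([obs[L - 1 - k:len(obs) - k] for k in range(L)])
    return np.linalg.lstsq(Y, target[L - 1:], rcond=None)[0]

def mse(obs, target, h, L):
    """Empirical mean-square error of the FIR filter h on the given data."""
    Y = np.column_stack([obs[L - 1 - k:len(obs) - k] for k in range(L)])
    return np.mean((target[L - 1:] - Y @ h) ** 2)

h = fir_wiener(y, x, L)               # filter designed for unquantized data
mse_unq = mse(y, x, h, L)             # its MSE on unquantized data
mse_q_same = mse(yq, x, h, L)         # same filter applied to quantized data
h_q = fir_wiener(yq, x, L)            # filter re-optimized after quantization
mse_q_opt = mse(yq, x, h_q, L)

# Increase predicted by adding an orthogonal i.i.d. sequence of variance delta^2/12.
predicted = (delta ** 2 / 12.0) * np.sum(h ** 2)

print(f"increase, unquantized filter on quantized data : {mse_q_same - mse_unq:.5f}")
print(f"increase, optimum post-quantization filter     : {mse_q_opt - mse_unq:.5f}")
print(f"additive-noise prediction (delta^2/12 * ||h||^2): {predicted:.5f}")
```

For a small step size the three printed numbers should be close, illustrating the abstract's claim that re-optimizing the filter after quantization buys little over simply reusing the unquantized filter.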
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1430-1435 |
| Number of pages | 6 |
| Journal | Proceedings of the IEEE Conference on Decision and Control |
| State | Published - 1984 |
All Science Journal Classification (ASJC) codes
- Control and Systems Engineering
- Modeling and Simulation
- Control and Optimization