TY - GEN

T1 - On the applications of the minimum mean p-th error (MMPE) to information theoretic quantities

AU - Dytso, Alex

AU - Bustin, Ronit

AU - Tuninetti, Daniela

AU - Devroye, Natasha

AU - Poor, H. Vincent

AU - Shamai, Shlomo

N1 - Funding Information:
The work of Alex Dytso, Daniela Tuninetti, and Natasha Devroye was partially funded by the U.S. National Science Foundation (NSF) under award 1422511.
Publisher Copyright:
© 2016 IEEE.

PY - 2016/10/21

Y1 - 2016/10/21

N2 - This paper considers the minimum mean p-th error (MMPE) estimation problem: estimating a random vector in the presence of additive white Gaussian noise (AWGN) in order to minimize an Lp norm of the estimation error. The MMPE generalizes the classical minimum mean square error (MMSE) estimation problem. This paper derives basic properties of the optimal MMPE estimator and MMPE functional. Optimal estimators are found for several inputs of interest, such as Gaussian and binary symbols. Under an appropriate p-th moment constraint, the Gaussian input is shown to be asymptotically the hardest to estimate for any p ≥ 1. By using a conditional version of the MMPE, the famous 'MMSE single-crossing point' bound is shown to also hold for the MMPE, for all p ≥ 1, up to a multiplicative constant. Finally, the paper develops connections between the conditional differential entropy and the MMPE, which lead to a tighter version of the Ozarow-Wyner lower bound on the rate achieved by discrete inputs on AWGN channels.

AB - This paper considers the minimum mean p-th error (MMPE) estimation problem: estimating a random vector in the presence of additive white Gaussian noise (AWGN) in order to minimize an Lp norm of the estimation error. The MMPE generalizes the classical minimum mean square error (MMSE) estimation problem. This paper derives basic properties of the optimal MMPE estimator and MMPE functional. Optimal estimators are found for several inputs of interest, such as Gaussian and binary symbols. Under an appropriate p-th moment constraint, the Gaussian input is shown to be asymptotically the hardest to estimate for any p ≥ 1. By using a conditional version of the MMPE, the famous 'MMSE single-crossing point' bound is shown to also hold for the MMPE, for all p ≥ 1, up to a multiplicative constant. Finally, the paper develops connections between the conditional differential entropy and the MMPE, which lead to a tighter version of the Ozarow-Wyner lower bound on the rate achieved by discrete inputs on AWGN channels.

UR - http://www.scopus.com/inward/record.url?scp=84998704729&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84998704729&partnerID=8YFLogxK

U2 - 10.1109/ITW.2016.7606797

DO - 10.1109/ITW.2016.7606797

M3 - Conference contribution

AN - SCOPUS:84998704729

T3 - 2016 IEEE Information Theory Workshop, ITW 2016

SP - 66

EP - 70

BT - 2016 IEEE Information Theory Workshop, ITW 2016

PB - Institute of Electrical and Electronics Engineers Inc.

T2 - 2016 IEEE Information Theory Workshop, ITW 2016

Y2 - 11 September 2016 through 14 September 2016

ER -